January 8, 2010 at 3:43 pm
Hi guys
I have an application that was developed in SQL Server 2005 on a 32-bit Windows 2003 server.
On demand by the user, this application reads information from both Access 2007 and SQL Server and combines the results.
In order to access the Access database, the application builds a linked server that points to it.
We have a requirement to move our applications to a 64-bit server.
In order to access the Access database we need the Jet engine, which does not exist for x64 machines.
So in order to tackle this situation I decided to offload part of the application to SSIS.
This piece of the application creates the linked server, then reads and copies the data from Access to SQL Server.
During this process I need to pass a primary key value (INT) to the SSIS package, which will copy data from Access to some table in SQL Server.
1. The SSIS package cannot be scheduled; it must be an automated process fired by the application on demand by the user.
2. A PK has to be passed to the SSIS package in order to complete the load correctly.
For point 1, I was thinking of embedding the SSIS package into a job without a schedule and firing the job from a stored procedure, but what about point 2?
January 8, 2010 at 3:50 pm
There are (of course) a lot of ways to accomplish what you want, and your posited method should work.
Keep in mind that for #1, you're going to have to use an OS Step in the Agent job, as the SSIS step can't run a 32-bit DTExec. For more info: Quick Reference: SSIS in 64- and 32-bits.
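In an OS (CmdExec) step, that means pointing explicitly at the 32-bit DTExec; a minimal sketch, assuming a SQL Server 2005 default install path and a hypothetical package location:

```
"C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\DTExec.exe" /F "C:\Packages\LoadAccessData.dtsx"
```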
For #2, you could always have the command that starts the job write a value (or values) into a "control table". For example, a table that only has one column - PKs. The SSIS package wouldn't take any arguments on the command line; it would just use an Execute SQL Task to extract the "TOP 1" PK from that table and proceed with that.
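The control-table idea above could be sketched roughly like this in T-SQL; the table, procedure, and job names are all assumptions:

```sql
-- Hypothetical control table holding the PK value(s) to process
CREATE TABLE dbo.SSISControl (
    PKValue    INT      NOT NULL,
    InsertedAt DATETIME NOT NULL DEFAULT GETDATE()
);
GO

-- The calling procedure writes the value, then starts the (unscheduled) job
CREATE PROCEDURE dbo.usp_RunAccessLoad
    @PKValue INT
AS
BEGIN
    INSERT INTO dbo.SSISControl (PKValue) VALUES (@PKValue);
    EXEC msdb.dbo.sp_start_job @job_name = N'LoadAccessData'; -- job name is an assumption
END;
GO

-- Inside the package, an Execute SQL Task would run something like:
-- SELECT TOP 1 PKValue FROM dbo.SSISControl ORDER BY InsertedAt;
```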
Todd McDermid - SQL Server MVP, MCSD.Net, MCTS (SQL08BI)
Find great SSIS extensions - most are open source - at SSIS Community Tasks and Components
Are you doing Data Warehouse ETL with SSIS? Don't use the slow SCD Wizard, use the Kimball Method SCD component instead.
January 8, 2010 at 4:35 pm
Dear Todd,
As far as I know, I can include the SSIS package in a job and have the choice to run it in 32-bit or 64-bit mode.
I know that for sure because I have another job with many packages as steps, and some of them run under 32-bit, for example packages that access Excel files.
For point number 2, you are saying that I cannot call an SSIS package passing parameters to it,
so I have to store the parameter dynamically in a table and have the SSIS package read it each time it runs.
Am I correct?
Thanks a lot for your help.
January 8, 2010 at 4:49 pm
Outside of what Todd is suggesting, I know of only one other way to do this. It's a bit out of the box, but I've built an entire automation platform on this so I know it works and works well.
I've combined Service Broker queues and SSIS packages to allow for stored procedure execution of SSIS packages. I needed a reliable way for packages to call each other and for me to have the ability to call individual packages as needed. Also, I wanted my packages to run more like services, in that they can receive a task, handle the task, and pass back results. Errors do not stop package execution; rather, they are passed back to the caller (either another SSIS package or the SP) to be handled there.
I'm not planning on posting the entire solution here as it has a number of moving parts and I don't have the time today to get it into a postable format (I may begin an article for SSC on the topic as I've posted this as a solution a few other times) but I can give you the logical gist of it.
Set yourself up a pair of Service Broker queues and the necessary messages to make up a simple conversation that consists of a request and a reply. Your stored procedure should be designed to pass an XML-based message, containing your PK values for #2, as a Service Broker message into your request queue. The first step in your SSIS package should be an Execute SQL Task that calls a stored procedure coded to WAITFOR messages in the request queue. Upon receiving a new message, use the PK values in the message XML to query your Access DB and get the result set. Build the result set into an XML message and have the SSIS package pass the XML back as the reply.
All the while, your calling SP has been waiting for the reply message. It will receive the message from the reply queue and process the reply.
This has worked great for me in what I'm trying to build out here at work. It's been very reliable and I've had very little maintenance-type work to do once everything was up and running. The downside is that it's a serial process: your SSIS package can only handle one request at a time. But from what I can tell, any other SSIS-centric solution will do the same for you. Service Broker is great for scaling applications using activation procedures, but you miss out on that here. One way you could scale this would be to create copies of your SSIS package and a queue request/reply set for each package, and come up with a way to round-robin the XML requests to the open queues.
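A rough T-SQL sketch of the plumbing described above; every object name here is an assumption, and error handling is omitted:

```sql
-- Message types and contract for a simple request/reply conversation
CREATE MESSAGE TYPE [//ETL/Request] VALIDATION = WELL_FORMED_XML;
CREATE MESSAGE TYPE [//ETL/Reply]   VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT [//ETL/Contract]
    ([//ETL/Request] SENT BY INITIATOR,
     [//ETL/Reply]   SENT BY TARGET);

-- One queue/service pair for each side of the conversation
CREATE QUEUE dbo.RequestQueue;
CREATE QUEUE dbo.ReplyQueue;
CREATE SERVICE [//ETL/RequestService] ON QUEUE dbo.RequestQueue ([//ETL/Contract]);
CREATE SERVICE [//ETL/ReplyService]   ON QUEUE dbo.ReplyQueue;
GO

-- The calling SP sends the PK as XML into the request queue...
DECLARE @h UNIQUEIDENTIFIER,
        @msg XML = N'<Request><PK>123</PK></Request>';
BEGIN DIALOG CONVERSATION @h
    FROM SERVICE [//ETL/ReplyService]
    TO SERVICE '//ETL/RequestService'
    ON CONTRACT [//ETL/Contract]
    WITH ENCRYPTION = OFF;
SEND ON CONVERSATION @h MESSAGE TYPE [//ETL/Request] (@msg);

-- ...and the package's first Execute SQL Task waits on that queue:
-- WAITFOR (RECEIVE TOP (1) @msg = CAST(message_body AS XML)
--          FROM dbo.RequestQueue), TIMEOUT 60000;
```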
Outside of Todd's suggestions and what I've described here, the only other avenue to look at may be CLR procedures. I'm not a CLR developer so I don't know if they can do this, but it may be possible for someone who is proficient in programming in CLR languages. Sorry to be so long winded, I hope this helps.
January 8, 2010 at 5:23 pm
I read from your post that you were using SSIS 2005. Agent jobs in 2005 don't have that handy checkbox, so you have to use an OS Step. If you're using 2008, use the checkbox.
I say that you won't be able to pass parameters to the package because, to do so, you would have to alter the job each time you ran it; the command line for DTExec is stored in the job, and that's what you'd have to modify.
Your alternatives to that are having the package query something for the parameter(s), or using configurations. The problem with configurations is that you can have concurrency issues (what happens if multiple clients start a job... which client's configuration gets used?). It's easier to handle concurrency when you have a "control table", because if you want to get fancy, you can set up "input" columns for the job, and "output" columns to report back results.
You can also use PSExec to remotely execute packages without involving jobs.
Todd McDermid - SQL Server MVP, MCSD.Net, MCTS (SQL08BI)
Find great SSIS extensions - most are open source - at SSIS Community Tasks and Components
Are you doing Data Warehouse ETL with SSIS? Don't use the slow SCD Wizard, use the Kimball Method SCD component instead.
January 8, 2010 at 5:26 pm
Service Broker sounds interesting; it could work for me as well.
Anyway, yes, I use SQL 2008, so I will use the checkbox.
Also, can you tell me how to pass parameters on the command line?
Because if that is possible, I can modify the job each time and run it, each time with a new command line. ;-)
January 11, 2010 at 11:14 am
Here's the docs for DTExec, including command line parameters: http://msdn.microsoft.com/en-us/library/ms162810.aspx
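Per those docs, a package variable can be overridden with DTExec's /SET switch, so the job step's stored command line could be rewritten per run; a hedged sketch (the path, job, step, and variable names are assumptions, and note that concurrent callers would overwrite each other's value):

```sql
DECLARE @PKValue INT = 123; -- hypothetical value passed in by the caller
DECLARE @cmd NVARCHAR(4000) =
      N'dtexec /F "C:\Packages\LoadAccessData.dtsx" '
    + N'/SET "\Package.Variables[User::PKValue].Value";'
    + CAST(@PKValue AS NVARCHAR(10));

-- Rewrite the command line stored in the job step, then start the job
EXEC msdb.dbo.sp_update_jobstep
    @job_name = N'LoadAccessData',
    @step_id  = 1,
    @command  = @cmd;
EXEC msdb.dbo.sp_start_job @job_name = N'LoadAccessData';
```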
Todd McDermid - SQL Server MVP, MCSD.Net, MCTS (SQL08BI)
Find great SSIS extensions - most are open source - at SSIS Community Tasks and Components
Are you doing Data Warehouse ETL with SSIS? Don't use the slow SCD Wizard, use the Kimball Method SCD component instead.