March 18, 2008 at 6:02 am
Hi,
I need to implement SSIS for a bulk download for a client site, but I don't know the procedure for implementing this. This is very urgent, so please help me.
March 19, 2008 at 1:03 am
What exactly do you want to implement?
Please describe it in detail.
March 19, 2008 at 1:48 am
Actually, I want to download a huge amount of data from the database to an Excel sheet. First I wrote a simple SELECT statement for this, but it is taking too much time, so I can't use it. That's why I am now trying SSIS. I need to download more than 10,000 products at a time, so please help me solve this issue.
March 19, 2008 at 8:13 am
You are not being very specific about your issue, and that is going to make it difficult to help.
If your query is taking a very long time, you are probably going to continue to have the same performance problem using SSIS. In most cases, the first thing your package will do is run the query to get the specified data. If the query takes a long time in Management Studio, it is not going to magically be faster as an OLE DB source in SSIS.
I would start by posting the query you are trying to run, along with as much information about the table schema as you can provide (well-formed sample data is also nice).
Also, you have indicated that your output is going to be Excel. Which version of Excel? Excel is a rather poor format for transmitting data in bulk. If you have the option of using XML, you may run into fewer issues later.
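For example, SQL Server can hand you XML directly with FOR XML. This is only a sketch; dbo.Products, ProductID, ProductName, and Price are placeholder names standing in for whatever your real query returns:

-- Minimal sketch: return the rows as a single XML document instead of a flat result set.
-- dbo.Products and the column names are placeholders for your actual query.
SELECT  ProductID,
        ProductName,
        Price
FROM    dbo.Products
FOR XML PATH('Product'), ROOT('Products');

You could then save that output and pass it along instead of (or in addition to) an Excel workbook.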
March 20, 2008 at 5:56 am
From what I understand, your problem is the long duration of the entire process, so we need to identify the problem area. First make sure that the query you are using is performing well; if it is, then most of the time is being spent writing the data to Excel. You need to decide whether you want to do the unloading and the writing to Excel in one step or break it down. By breaking it down, I mean you can unload the data from the database to flat files or XML first.
Also, you can use the parallel processing ability of SSIS by running multiple data flows in parallel and conditionally fetching the data from the database (e.g. SELECT * FROM TAB WHERE COL BETWEEN 0 AND 2500, SELECT * FROM TAB WHERE COL BETWEEN 2501 AND 5000, ...). You will end up with multiple Excel files as well, but you can combine them later, which will take much less time compared to reading the data serially.
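One way to work out those BETWEEN boundaries instead of hard-coding them is to let NTILE split the key range into equal slices. This is only a sketch; dbo.Products and ProductID are placeholder names for your own table and numeric key column:

-- Sketch only: compute boundary values for four roughly equal slices of the key.
-- Each (FromID, ToID) pair then becomes the BETWEEN filter for one parallel data flow.
SELECT  Slice,
        MIN(ProductID) AS FromID,
        MAX(ProductID) AS ToID
FROM    ( SELECT ProductID,
                 NTILE(4) OVER (ORDER BY ProductID) AS Slice
          FROM   dbo.Products ) AS t
GROUP BY Slice
ORDER BY Slice;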
March 22, 2008 at 8:14 am
If you're using Excel, remember that you have a limit of 65,536 rows per sheet (in Excel 2003 and earlier). You said 10K products, so does that mean 10K records, or does each product have more than one record? In my experience (and I do this a lot, I mean a lot), instead of using SSIS, I would do one of these two things:
1) create a stored procedure out of your query and call the stored procedure from Excel. Now your Excel file has become a template for future data retrieval, or
2) create a view out of your query and call that from Excel. Once again, your Excel file can now be used as a template for retrieving data. It will also stay formatted. Now all you have to do is hit the 'Refresh Data' button in Excel every time you need the data. (See the sketch below for both options.)
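Just as a sketch (dbo.Products and its columns are placeholder names; your own query goes inside):

-- Option 1: wrap the query in a stored procedure and call it from Excel
-- (for example through a Microsoft Query / OLE DB connection).
CREATE PROCEDURE dbo.GetProductExport
AS
BEGIN
    SET NOCOUNT ON;
    SELECT ProductID, ProductName, Price
    FROM   dbo.Products;
END;
GO

-- Option 2: wrap the same query in a view and point the Excel connection at it instead.
CREATE VIEW dbo.vProductExport
AS
    SELECT ProductID, ProductName, Price
    FROM   dbo.Products;
GO

Either way, the workbook keeps the connection definition, so 'Refresh Data' just reruns the procedure or the view.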
March 22, 2008 at 9:02 pm
It might be easier to go from your client DB to MS Access, and then to your end DB or Excel.
There are DB tool applications like EMS Import that can make data transfer easier, and the latest version (Import 2007) supports Excel 12 and Access 12, so you don't have to worry about the 65K limit.