March 9, 2011 at 11:03 am
Hi Guys,
I've been searching the forums trying to find the answer to my question, but I'm not having much luck. The majority of our clients require that we deliver data to them in .xls-formatted files because they are behind on upgrading to Office 2007/2010. We have some data sets returning 200,000+ rows. Is it possible to have SSIS break a large dataset up into chunks and automatically place it into multiple tabs in the spreadsheet?
Thanks,
Code
March 9, 2011 at 11:07 am
One way to do this is to dump the data into a staging table that includes a SheetNumber column, break the data up that way, select the distinct SheetNumber values, and loop through them, creating the "tables" (worksheets) in Excel at the start of each iteration and then exporting just the records for that sheet.
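A minimal T-SQL sketch of that staging step, assuming a hypothetical source table dbo.SourceData with an ordering column SomeKeyColumn (65,535 data rows per sheet leaves room for a header within the .xls limit of 65,536 rows per worksheet):

-- Assign each row a SheetNumber so no worksheet exceeds the .xls row limit.
-- dbo.SourceData, SomeKeyColumn and dbo.Staging_Export are placeholder names.
SELECT
    sd.*,
    SheetNumber = ((ROW_NUMBER() OVER (ORDER BY sd.SomeKeyColumn) - 1) / 65535) + 1
INTO dbo.Staging_Export
FROM dbo.SourceData AS sd;

-- A Foreach Loop container can then iterate over the distinct sheet numbers,
-- creating a worksheet and exporting WHERE SheetNumber = ? on each pass.
SELECT DISTINCT SheetNumber
FROM dbo.Staging_Export
ORDER BY SheetNumber;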
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
March 9, 2011 at 11:11 am
Code-1029433 (3/9/2011)
Hi Guys, I've been searching the forums trying to find the answer to my question, but I'm not having much luck. The majority of our clients require that we deliver data to them in .xls-formatted files because they are behind on upgrading to Office 2007/2010. We have some data sets returning 200,000+ rows. Is it possible to have SSIS break a large dataset up into chunks and automatically place it into multiple tabs in the spreadsheet?
Thanks,
Code
You could always give them CSV files instead. Excel is the wrong tool for handling so many lines of data. If they are processing this data after you provide it to them, the fact that it's split over multiple tabs is surely just going to slow them down.
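For illustration, one quick way to produce a CSV outside of SSIS is bcp in character mode (the server, database, column, and file names below are placeholders):

bcp "SELECT Col1, Col2, Col3 FROM MyDatabase.dbo.SourceData" queryout "C:\Exports\ClientData.csv" -c -t, -S MyServer -T

Within SSIS itself, a Flat File Destination with a comma delimiter does the same job, and a flat file has no practical row limit.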
March 9, 2011 at 11:17 am
Thanks for the suggestions, guys! I'll have our sales team work with the clients to see if they will accept .csv. If so, then we'll go that route; otherwise we'll look into using the first suggestion.
March 9, 2011 at 11:20 am
Phil Parkin (3/9/2011)
Code-1029433 (3/9/2011)
Hi Guys, I've been searching the forums trying to find the answer to my question, but I'm not having much luck. The majority of our clients require that we deliver data to them in .xls-formatted files because they are behind on upgrading to Office 2007/2010. We have some data sets returning 200,000+ rows. Is it possible to have SSIS break a large dataset up into chunks and automatically place it into multiple tabs in the spreadsheet?
Thanks,
Code
You could always give them CSV files instead. Excel is the wrong tool for handling so many lines of data. If they are processing this data after you provide it to them, the fact that it's split over multiple tabs is surely just going to slow them down.
CSV files will also be smaller (though still huge with that many rows).
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
March 9, 2011 at 12:33 pm
Phil Parkin (3/9/2011)
You could always give them CSV files instead. Excel is the wrong tool for handling so many lines of data. If they are processing this data after you provide it to them, the fact that it's split over multiple tabs is surely just going to slow them down.
Second that. Excel is a spreadsheet application, not a data transfer format.
When will the world know?!? *cries in despair*
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
March 9, 2011 at 5:55 pm
Phil Parkin (3/9/2011)
You could always give them CSV files instead. Excel is the wrong tool for handling so many lines of data. If they are processing this data after you provide it to them, the fact that it's split over multiple tabs is surely just going to slow them down.
Unless they're trying to use Excel to manipulate the data after it's been exported to them, sigh...
You could try using a Conditional Split in the Data Flow task to redirect some of the rows to other tabs. You'd need some way to specify groups of rows to split out of the data set (see the sketch below).
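For example, if the source query already computes a grouping column (say SheetGroup, derived with ROW_NUMBER() as in the staging sketch earlier in the thread), the Conditional Split conditions might look like this (SSIS expression syntax; the column and output names are hypothetical):

Output "Sheet1":  SheetGroup == 1
Output "Sheet2":  SheetGroup == 2
Output "Sheet3":  SheetGroup == 3

Each output would feed its own Excel Destination. One drawback is that the number of outputs is fixed at design time, so the Foreach Loop approach scales better when the row count varies from run to run.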
March 9, 2011 at 8:42 pm
Here is an article, complete with sample code, that may provide the T-SQL you are searching for:
http://www.mssqltips.com/tip.asp?tip=1202
Additional material, replete with many code samples:
http://www.simple-talk.com/sql/t-sql-programming/sql-server-excel-workbench/
Another possible method: have Excel pull the data from the DB.
http://www.sqlservercentral.com/articles/Stored+Procedures/63537/
March 10, 2011 at 12:59 am
Unless they're trying to use Excel to manipulate the data after it's been exported to them, sigh...
Brings to mind a whole batch of accountants doing multi-tab analyses very, very slowly and knowing that there must be a better way...
March 10, 2011 at 11:44 pm
Phil Parkin (3/10/2011)
Brings to mind a whole batch of accountants doing multi-tab analyses very very slowly and knowing that there must be a better way...
Oh, but they do (or at least they ought to)... there have been so many instances where they are told not to do a VLOOKUP on that many rows, to do it in a database instead, or themes along those lines...
Sadly, many accountants are totally wedded to Excel. And then MS, in their infinite wisdom, removed the 65,536-row worksheet limit, raising it past the million mark (1,048,576 rows) in Office 2007. Trust me, some accountant somewhere will be religiously hitting that limit too, in their download of GL data from <insert package of choice here>, where they then sort the result, match, and reconcile. :-D