April 26, 2013 at 2:42 pm
Hello all,
I am tasked with archiving some very large tables; there are about 30 tables that I need to work on. I need the tables exported to pipe-delimited text files. Because of the size of the tables, I also need to split them by date to keep the file sizes down. Most of the tables have a date field; the ones that do not are smaller, and I can just export each of those to a single file.
The tables that do have a date field need to be split out by YYYYMM.
Can anyone help me with the best approach to take with this?
April 26, 2013 at 3:00 pm
Use SSIS.
Add a loop (don't forget to increment your date)
Within the loop, add a Data Flow task. The source should be OLE DB. In the source you can specify "SQL command from variable" as the data access mode, so as your date increments it changes the variable that is used as the source query.
In this example, the date boundaries would change each pass through the loop:
SELECT * FROM dbo.BigTable WHERE TheDate >= '2011-01-01' AND TheDate < '2011-02-01'
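If it helps, here is a minimal T-SQL sketch of the monthly window that the incrementing variable would produce each pass. The table name (dbo.BigTable), column (TheDate), and starting month are just taken from the example above, so adjust for your own schema:

-- Start of the current month; the loop would advance this by one month per pass
DECLARE @PeriodStart DATE = '2011-01-01';
DECLARE @PeriodEnd   DATE = DATEADD(MONTH, 1, @PeriodStart);

SELECT *
FROM dbo.BigTable
WHERE TheDate >= @PeriodStart
  AND TheDate <  @PeriodEnd;   -- half-open range avoids end-of-month edge cases

-- Loop increment: SET @PeriodStart = DATEADD(MONTH, 1, @PeriodStart);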
Depending on the sizes of your tables and the indexing, this could take a long time. You may want to split the data into separate monthly tables first and then export those. You can do the same thing, but instead of altering a date filter through the variable, you would be changing the table name.
In this example, the last 7 characters of the table name would change each pass through the loop:
SELECT * FROM dbo.BigTable_2011_01
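If you do pre-split, something along these lines would build one month's table (just a sketch using the names from the examples above; the dynamic SQL tacks the _YYYY_MM suffix onto the table name), and you would then point the export at the new table:

DECLARE @PeriodStart DATE = '2011-01-01';
DECLARE @PeriodEnd   DATE = DATEADD(MONTH, 1, @PeriodStart);

-- Build the _YYYY_MM suffix, e.g. 2011_01
DECLARE @Suffix CHAR(7) =
    CONVERT(CHAR(4), YEAR(@PeriodStart)) + '_'
    + RIGHT('0' + CONVERT(VARCHAR(2), MONTH(@PeriodStart)), 2);

DECLARE @Sql NVARCHAR(MAX) =
    N'SELECT * INTO dbo.BigTable_' + @Suffix + N'
      FROM dbo.BigTable
      WHERE TheDate >= @start AND TheDate < @end;';

EXEC sp_executesql @Sql,
     N'@start DATE, @end DATE',
     @start = @PeriodStart, @end = @PeriodEnd;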
April 27, 2013 at 6:19 am
You know, that is pretty close to what I was trying. When I read your post, though, there were one or two things I had not been thinking of. I will test this out as soon as I can (I don't have access again until Monday).
Thanks very much for the input.