Viewing 15 posts - 61 through 75 (of 149 total)
http://www.sqlservercentral.com/articles/SSIS/68025/
The tutorial above covers it... thanks to SQLServerCentral.com
July 15, 2010 at 1:30 pm
MeltonDBA (7/15/2010)
I'm happy that you have data in a table... :hehe: What is your specific question/problem?
Heheheh, that was an accident... I peed before I unzipped my pants 🙂
July 15, 2010 at 10:33 am
Try using SSIS and see if that imports all the columns... how big is one record?
July 9, 2010 at 3:20 pm
Thanks for the response... I am sure I can find a few columns that have mostly NULL values, and for those the sparse property is a good idea. Also, how about downtime...
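Before flipping anything to sparse, it is easy to check how NULL-heavy each candidate column really is. A minimal sketch, assuming a hypothetical table dbo.BigTable with candidate columns Col1 and Col2 (COUNT(column) skips NULLs, so the difference is the NULL count):

-- Count NULLs per candidate column to see which ones are "mostly NULL"
SELECT
    COUNT(*)               AS total_rows,
    COUNT(*) - COUNT(Col1) AS col1_nulls,
    COUNT(*) - COUNT(Col2) AS col2_nulls
FROM dbo.BigTable;

The columns that come back mostly NULL are the ones worth considering for sparse.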
February 22, 2010 at 11:49 am
Thanks for all the responses and the helpful tips, guys.
February 20, 2010 at 8:28 pm
I guess I have to do page-level compression and apply it to each partition one by one... how about sparse columns? Can they be added to an existing table?
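For what it's worth, both of those are doable with ALTER TABLE. A minimal sketch, assuming a hypothetical partitioned table dbo.BigTable and an existing nullable int column Col1 (both are rewrite operations, so expect time and log usage on a large table):

-- Page compression on one partition at a time
ALTER TABLE dbo.BigTable
REBUILD PARTITION = 3
WITH (DATA_COMPRESSION = PAGE);

-- Mark an existing nullable column as sparse (the data type is restated)
ALTER TABLE dbo.BigTable
ALTER COLUMN Col1 int SPARSE NULL;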
February 20, 2010 at 8:28 pm
Yes, the tables are partitioned... so, joshhh, you think page-level compression should work fine... anything else I can do to free up some space? I guess a database shrink should...
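Before rebuilding everything, it may be worth estimating what page compression would actually save. A sketch, assuming SQL Server 2008 or later and a hypothetical table dbo.BigTable:

-- Estimate page-compression savings for every index/partition of the table
EXEC sp_estimate_data_compression_savings
    @schema_name      = 'dbo',
    @object_name      = 'BigTable',
    @index_id         = NULL,
    @partition_number = NULL,
    @data_compression = 'PAGE';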
February 19, 2010 at 8:36 pm
You need to create a workflow that will trigger the package to run and pass the filename as a parameter, or use a Foreach Loop.... Or you can schedule the...
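One rough sketch of the "pass the filename as a parameter" route, assuming a hypothetical package path and a package variable named User::FileName, and assuming xp_cmdshell is enabled if you want to fire it from T-SQL (the Foreach Loop container pointed at the folder is the other option):

-- Build a dtexec call that sets the package's FileName variable, then run it
DECLARE @cmd varchar(1000);
SET @cmd = 'dtexec /F "C:\Packages\LoadFile.dtsx" '
         + '/SET \Package.Variables[User::FileName].Properties[Value];"C:\Data\input.csv"';
EXEC master..xp_cmdshell @cmd;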
November 25, 2009 at 1:00 pm
There is one issue though... They are historical files, and each day's files are in folders named YYYYMMDD. The issue is that some days the files are zipped while other...
November 25, 2009 at 12:53 pm
It is varchar(MAX); I will change it to varchar(8000). Thanks, Jeff, I will try it with this change.
August 14, 2009 at 3:01 pm
Thanks, Jeff... there was a little bit of improvement... I would say around 5%. I will run it a few more times with different file sizes and check out the difference....
August 14, 2009 at 1:32 pm
I made the change you suggested... I get this error:
The ORDER BY clause is invalid in views, inline functions, derived tables, subqueries, and common table expressions, unless TOP or FOR...
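For context, the shape that triggers it is roughly this (hypothetical table name, not my actual code): an ORDER BY inside a derived table or CTE without TOP, and the usual fix is to sort only in the outermost SELECT.

-- Raises the error: ORDER BY inside the derived table, no TOP
SELECT d.Col1
FROM (SELECT Col1 FROM dbo.SomeTable ORDER BY Col1) AS d;

-- Works: the ORDER BY moves to the outer query
SELECT d.Col1
FROM (SELECT Col1 FROM dbo.SomeTable) AS d
ORDER BY d.Col1;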
August 14, 2009 at 12:53 pm
This is the code that runs... but my code is in dynamic SQL, as I will be loading more than one file at a time and the fields and tables will change......
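Roughly, the dynamic piece looks like this; a minimal sketch only, assuming BULK INSERT as the load mechanism, where the table and file names are made-up placeholders and the real values come from whichever file is being processed:

DECLARE @TableName sysname;
DECLARE @FilePath  varchar(260);
DECLARE @sql       nvarchar(max);

SET @TableName = 'dbo.StageFile1';      -- placeholder; changes per file
SET @FilePath  = 'C:\Data\file1.txt';   -- placeholder; changes per file

-- Build and run the BULK INSERT for the current table/file pair
SET @sql = N'BULK INSERT ' + @TableName
         + N' FROM ''' + @FilePath + N''''
         + N' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2);';

EXEC sp_executesql @sql;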
August 13, 2009 at 10:41 pm
Hey Jeff,
This works great except for one issue... using the CTE to create the final table takes a bit more time than expected... a file with only 10K records takes around 1 minute...
August 13, 2009 at 9:55 pm