October 22, 2012 at 7:15 am
I'm building an import package for a CSV file that has about 200 columns. If every column were maxed out, a row would be about 2,200 bytes, which is well below the maximum row size. Should I split this up into multiple tables because of the number of columns?
October 22, 2012 at 9:29 am
AVB (10/22/2012)
I'm building an import package for a CSV file that has about 200 columns. If every column were maxed out, a row would be about 2,200 bytes, which is well below the maximum row size. Should I split this up into multiple tables because of the number of columns?
My guess is that with that many columns the data could stand to be normalized. I would import it into a single wide staging table and then normalize it from there. I don't see any reason to break the import into multiple tables. If the rows were too wide for SQL Server's row-size limit then yes, but otherwise it's more work than needed.
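As a rough sketch of that approach (table, column, and file names here are hypothetical, assuming SQL Server and a comma-delimited file with a header row):

```sql
-- Hypothetical wide staging table: one column per CSV field.
-- At roughly 2,200 bytes fully populated, 200 such columns sit
-- well under SQL Server's 8,060-byte row-size limit, so a single
-- staging table is fine.
CREATE TABLE dbo.ImportStaging (
    Col001 VARCHAR(50) NULL,
    Col002 VARCHAR(50) NULL,
    -- ... remaining columns ...
    Col200 VARCHAR(50) NULL
);

-- Load the CSV straight into the staging table.
BULK INSERT dbo.ImportStaging
FROM 'C:\import\file.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Then normalize from staging into the real tables, e.g.:
INSERT INTO dbo.Customer (Name, City)
SELECT DISTINCT Col001, Col002
FROM dbo.ImportStaging;
```

Loading first and normalizing second keeps the SSIS/import side simple (one flat destination) and moves all the reshaping into set-based T-SQL, where it's easiest to test and rerun.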
_______________________________________________________________
Need help? Help us help you.
Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.
Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.
Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/
January 3, 2013 at 9:23 am
Sean,
Thanks for the reply. Appreciate it.