March 3, 2011 at 7:27 am
Jeff Moden (3/3/2011)
I agree that the cursor will likely be a problem but the Temp Table won't be. It's just a different kind of memory usage. I do stuff similar to this all the time (thanks to some whacko 3rd party data vendors) in T-SQL with million row inputs. Of course, I whack'n'stack the data quite a bit differently... I don't build a million SELECT/UNION statements which would take comparatively forever. 😉
I'm not really a DBA, but if you have several SSIS packages dumping millions of rows into tempdb, that must have some impact?
If it's not the space alone, then certainly disk I/O must suffer? Or am I mistaken?
(And not every company follows best practices regarding tempdb: a separate filegroup on a separate disk, et cetera.)
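If you want to see where that space actually goes, sys.dm_db_file_space_usage breaks tempdb usage down into user objects (temp tables), internal objects (sorts, spools, work tables), and the version store. A minimal sketch, just to illustrate the idea:

-- Rough check of what is consuming tempdb space.
-- Page counts are 8 KB pages, so * 8 / 1024 converts to MB.
USE tempdb;

SELECT
    SUM(user_object_reserved_page_count)     * 8 / 1024 AS user_objects_mb,     -- #temp tables, table variables
    SUM(internal_object_reserved_page_count) * 8 / 1024 AS internal_objects_mb, -- sorts, spools, hash work tables
    SUM(version_store_reserved_page_count)   * 8 / 1024 AS version_store_mb,
    SUM(unallocated_extent_page_count)       * 8 / 1024 AS free_mb
FROM sys.dm_db_file_space_usage;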
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
March 14, 2011 at 6:39 am
LexusR (3/3/2011)
try without SSIS: BULK INSERT into FlatBuffer ...
and parse it into tables
In one or two packages I do the import into a table with an ID column and one column of varchar(max). I then use a stored proc to split it into the correct tables and do some more transformations. This was the best solution for me for these types of files. And I stayed away from cursors :-D
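Roughly, the pattern looks like this; the names, file path, and column positions below are invented for illustration, so treat it as a sketch rather than the exact code:

-- Staging table: an IDENTITY column plus the raw line.
CREATE TABLE dbo.FlatFileStaging
(
    RowID   INT IDENTITY(1,1) PRIMARY KEY,
    RawLine VARCHAR(MAX) NOT NULL
);
GO

-- BULK INSERT expects the file layout to match the target columns,
-- so load through a view that hides the IDENTITY column.
CREATE VIEW dbo.FlatFileStaging_Load
AS
SELECT RawLine FROM dbo.FlatFileStaging;
GO

BULK INSERT dbo.FlatFileStaging_Load
FROM 'C:\Import\SourceFile.txt'
WITH (
    ROWTERMINATOR   = '\n',
    FIELDTERMINATOR = '\0'  -- a terminator that never occurs in the file, so each line lands whole in RawLine
);
GO

-- The stored proc then carves each line into the real tables,
-- e.g. by record-type prefix and fixed positions:
CREATE PROCEDURE dbo.SplitStagedRows
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.OrderHeader (OrderNo, OrderDate)
    SELECT SUBSTRING(RawLine, 2, 10),
           CONVERT(DATE, SUBSTRING(RawLine, 12, 8), 112)  -- yyyymmdd
    FROM dbo.FlatFileStaging
    WHERE LEFT(RawLine, 1) = 'H';  -- header records only
END;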
And sorry, in one package I do use the script component as well, seeing that it was a bit easier to do in the package itself.