February 3, 2010 at 7:29 am
I'm trying to read a file with rows of over 11,000 characters, and I need to create multiple records from each row. The records I need to create depend on the occurrence of segments within the input row (the input record is fixed length, and if not all segments are present, the missing ones are blank filled).
I was hoping I could do a bulk insert into a SQL staging table, but I get an error because the input exceeds the maximum row size of 8,060 bytes.
Does anyone have any ideas?
February 3, 2010 at 9:31 am
Don't use a staging table; instead, use SSIS to do the import, so you can break the records apart on the fly.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
February 3, 2010 at 9:55 am
SSIS will work, though it's more programming.
Are you using varchar(max) as the table field type?
February 3, 2010 at 10:03 am
Yes, I was going to use varchar.
The final table will have a row size of 630 characters, composed of many fields.
February 3, 2010 at 10:58 am
I meant varchar(max) for the staging table, not just varchar.
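A minimal sketch of that approach (all table, column, and file names here are hypothetical, and the 630-character segment size is taken from the post above): a staging table with a single varchar(max) column sidesteps the 8,060-byte row-size limit, because (max) data can be stored off-row, and the long row can then be carved into fixed-length records with SUBSTRING.

```sql
-- Hypothetical staging table: one varchar(max) column holds the entire
-- 11,000+ character input row; (max) values can be pushed off-row, so
-- the 8,060-byte in-row limit does not apply.
CREATE TABLE dbo.Staging
(
    RawLine varchar(max) NOT NULL
);

-- Load the file (path and options are assumptions; adjust for the real file).
BULK INSERT dbo.Staging
FROM 'C:\import\input.txt'
WITH (ROWTERMINATOR = '\n');

-- Carve each fixed-length segment out of the long row, assuming 630-character
-- segments. The derived table generates segment offsets; blank-filled
-- (absent) segments are skipped.
INSERT INTO dbo.FinalTable (RecordData)
SELECT SUBSTRING(s.RawLine, (n.Number - 1) * 630 + 1, 630)
FROM dbo.Staging AS s
JOIN (SELECT TOP (20) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS Number
      FROM sys.objects) AS n
  ON (n.Number - 1) * 630 < LEN(s.RawLine)
WHERE LTRIM(SUBSTRING(s.RawLine, (n.Number - 1) * 630 + 1, 630)) <> '';
```

From there, each 630-character record could be split into its individual fields with further SUBSTRING calls, or the final INSERT could target one column per field.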
February 3, 2010 at 12:24 pm
I didn't think to ask that.