August 30, 2016 at 2:59 pm
I have 12 files - one for each month - each with between 1.5 and 1.9 million rows. The fields are separated by "`".
I am trying to load these into a table and am using StreamReader to do that.
But it dies at around 1.1 million rows.
I am reading into a DataTable in memory and then hoping to use SqlBulkCopy to load it into the table.
I can, of course, split the files into smaller chunks, but that would mean I would end up with over 25 files. Is there any other way of loading txt files into the DB?
Thanks in advance.
August 30, 2016 at 3:14 pm
Can't you just bulk load or bcp the files directly?
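For what it's worth, here is a minimal sketch of that route driven from C# rather than the command line. dbo.MonthlyData, the file path and the connection string are placeholders, and the file has to be readable by the SQL Server service account, since BULK INSERT runs on the server:

// Hedged sketch: BULK INSERT parses and loads the file inside SQL Server,
// so nothing is buffered in a client-side DataTable.
using System.Data.SqlClient;

class DirectBulkLoad
{
    static void Main()
    {
        var connStr = "Server=.;Database=Staging;Integrated Security=true;";

        // With ROWTERMINATOR = '\n', BULK INSERT expects \r\n line endings;
        // use '0x0a' instead if the files end lines with a bare LF.
        var sql = @"
            BULK INSERT dbo.MonthlyData
            FROM 'C:\data\january.txt'
            WITH (FIELDTERMINATOR = '`', ROWTERMINATOR = '\n', TABLOCK);";

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.CommandTimeout = 0;   // a 1.9M-row load can outrun the 30-second default
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}

If the file's columns line up one-to-one with the table's, FIELDTERMINATOR alone is usually enough and no format file is needed.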
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
August 30, 2016 at 4:11 pm
Thanks, that's what I will have to do.
I am dreading creating the format file.
August 30, 2016 at 6:12 pm
You can do it in C# or PowerShell - see a good example here: https://blog.netnerds.net/2015/01/powershell-high-performance-techniques-for-importing-csv-to-sql-server/
The main thing is not to load everything into a DataTable in one go - load a batch of rows (10-50K), SqlBulkCopy them to the server, and repeat until the end of the file.
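Something along these lines is a minimal sketch of that batched approach (dbo.MonthlyData, the path and the connection string are placeholders; it assumes the file's fields line up one-to-one with the destination table's columns, with no identity or computed columns, and that the values convert cleanly to the column types):

// Batched load: read the backtick-delimited file with a StreamReader, buffer
// 50K rows in a DataTable, push them with SqlBulkCopy, clear, and repeat.
using System.Data;
using System.Data.SqlClient;
using System.IO;

class BatchedLoader
{
    const int BatchSize = 50000;   // flush every 50K rows instead of holding 1.8M in memory

    static void Main()
    {
        var connStr  = "Server=.;Database=Staging;Integrated Security=true;";
        var filePath = @"C:\data\january.txt";

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();

            // Clone the destination table's column schema so the DataTable
            // doesn't have to be built by hand.
            var buffer = new DataTable();
            using (var adapter = new SqlDataAdapter("SELECT TOP 0 * FROM dbo.MonthlyData", conn))
                adapter.Fill(buffer);

            using (var bulk = new SqlBulkCopy(conn))
            using (var reader = new StreamReader(filePath))
            {
                bulk.DestinationTableName = "dbo.MonthlyData";
                bulk.BulkCopyTimeout = 0;            // large loads can outrun the default timeout

                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    buffer.Rows.Add(line.Split('`'));    // one backtick-delimited line per row

                    if (buffer.Rows.Count >= BatchSize)
                    {
                        bulk.WriteToServer(buffer);      // push the batch, then free the memory
                        buffer.Clear();
                    }
                }

                if (buffer.Rows.Count > 0)
                    bulk.WriteToServer(buffer);          // flush the final partial batch
            }
        }
    }
}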
August 30, 2016 at 7:53 pm
I noticed you didn't mention WHY it dies around 1.1M rows. If it gets that far before crashing, I'd have to question whether there's something odd in the file.
If there are bad records, chances are most loading routines will crash unless you trap for the error.
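For example, a rough pre-pass along these lines (the paths and the expected field count of 10 are made-up placeholders) would at least show whether there's a malformed row near the 1.1M mark, and it keeps one bad record from killing the whole load:

// Split the raw file into clean rows and rejects based on the field count;
// the clean file can then be bulk loaded and the rejects inspected by hand.
using System.IO;

class BadRowFilter
{
    static void Main()
    {
        const int expectedFields = 10;   // assumption: set to the real column count
        var input   = @"C:\data\january.txt";
        var clean   = @"C:\data\january.clean.txt";
        var rejects = @"C:\data\january.rejects.txt";

        using (var reader  = new StreamReader(input))
        using (var goodOut = new StreamWriter(clean))
        using (var badOut  = new StreamWriter(rejects))
        {
            string line;
            long lineNo = 0;
            while ((line = reader.ReadLine()) != null)
            {
                lineNo++;
                if (line.Split('`').Length == expectedFields)
                    goodOut.WriteLine(line);               // looks well-formed
                else
                    badOut.WriteLine($"{lineNo}`{line}");  // flag with its line number
            }
        }
    }
}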
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user...All right - what was my emergency again?