October 12, 2005 at 6:53 am
I am trying to import a file which is 300+ MB. When using DTS, I get an error after two hours because it says the log file is full; it craps out at about 3 GB.
I researched BCP and BULK INSERT and would like to use one of those instead. The text file has a mixture of text, int, and decimal values, and the destination table is formatted correctly for the file. In DTS, the first row contains column names. The lines are terminated by a carriage return and the field terminator is |
This is what I have come out with so far, but I have had no luck whatsoever. Any help would be greatly appreciated.
BULK INSERT hospital.dbo.[patbill]
FROM 'h:\patbill_200503.txt'
WITH
(
FIRSTROW = 2,
ROWS_PER_BATCH = 1000,
FIELDTERMINATOR = '|',
ROWTERMINATOR = '|\n'
)
October 12, 2005 at 7:26 am
I must confess I've never used bulk insert that much, I still favor bcp for my loads and unloads.
In its simplest form, using bcp, the command would look something like:
bcp hospital.dbo.patbill in h:\patbill_200503.txt -F2 -c -t"|" -r\n -T
The above runs when H: is local to the server and uses a trusted connection.
You would perhaps want to add switches for error and output files, codepage, server name, user ID and password if needed.
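A fuller command line with those extra switches might look like this - a sketch only, where the server name, user ID, password and error/output file paths are placeholders, not values from the original post:

bcp hospital.dbo.patbill in h:\patbill_200503.txt -F2 -c -t"|" -r\n -e h:\patbill_err.txt -o h:\patbill_out.txt -C ACP -S MyServer -U myuser -P mypassword

Here -e writes rows that failed to load to an error file, -o captures bcp's console output, -C sets the codepage (ACP = ANSI), and -S/-U/-P replace -T when SQL authentication against a named server is needed.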
One last note when using the pipe as a delimiter - it's important to wrap the pipe in double quotes, or you risk it being interpreted as a redirection operator instead.
/Kenneth
October 12, 2005 at 9:04 am
Not sure what your conditions are for the import, but I would change the recovery model of your DB to bulk-logged first, then perform the import, and finally change the recovery model back to full.
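The switch can be scripted around the load - a sketch assuming SQL Server 2000 ALTER DATABASE syntax and the database name from the original post:

ALTER DATABASE hospital SET RECOVERY BULK_LOGGED
GO
-- run the BULK INSERT / bcp load here
GO
ALTER DATABASE hospital SET RECOVERY FULL
GO

Taking a log backup after switching back to full is also worth considering, since bulk-logged operations break point-in-time recovery for that interval.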
Cheers,
* Noel