November 15, 2007 at 11:45 am
I have a log file, and I want to copy each "line" of it into a row in my database, verbatim. The BULK INSERT I'm attempting looks like this:
BULK INSERT [LogSamples].[dbo].[RawDataImport]
FROM 'E:\LogsToImport\dcsv9q2ah100004nscw1mr4qm.log'
WITH
(
ROWTERMINATOR = '\n',
TABLOCK
)
I know the file has well over a couple million lines of data, but I only end up with about 59,993 rows.
Am I missing something? Do I need to set another attribute of the BULK INSERT?
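For what it's worth, here is a variant I plan to try in case the file uses Windows-style line endings (carriage return plus line feed) rather than a bare line feed:
BULK INSERT [LogSamples].[dbo].[RawDataImport]
FROM 'E:\LogsToImport\dcsv9q2ah100004nscw1mr4qm.log'
WITH
(
ROWTERMINATOR = '\r\n', -- two-character Windows-style line ending
TABLOCK
)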
November 15, 2007 at 12:03 pm
I am not sure about ROWTERMINATOR.
November 15, 2007 at 12:06 pm
I think you have to use ROWTERMINATOR = '|'.
Check the Microsoft sample:
A. Using pipes to import data from a file
This example imports order detail information into the AdventureWorks.Sales.SalesOrderDetail table from the specified data file by using a pipe (|) as the field terminator and |\n as the row terminator.
BULK INSERT AdventureWorks.Sales.SalesOrderDetail
FROM 'f:\orders\lineitem.tbl'
WITH
(
FIELDTERMINATOR = ' |',
ROWTERMINATOR = ' |\n'
)
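Note that the row terminator in that sample is the two-character sequence |\n, a pipe followed by a newline, so a row only ends where both characters appear together.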
Regards
Ahmed
November 15, 2007 at 12:16 pm
To be clear, my ROWTERMINATOR is actually set to '\n' with no spaces; the forum formatting seems to have mangled it when I first posted. I'll try the suggestion of sticking a pipe or something at the end.
I tried this:
BULK INSERT [LogSamples].[dbo].[RawDataImport]
FROM 'E:\LogsToImport\dcsv9q2ah100004nscw1mr4qm.log'
WITH
(
ROWS_PER_BATCH = 3300,
ROWTERMINATOR = '\n',
TABLOCK
)
But it still stops before it gets to the end of the file.
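If nothing else works, I figure I can load just the rows around the cutoff and eyeball them, since BULK INSERT takes FIRSTROW and LASTROW. Something like this (the row numbers are a guess, and the counting assumes the terminator is being parsed correctly in the first place):
BULK INSERT [LogSamples].[dbo].[RawDataImport]
FROM 'E:\LogsToImport\dcsv9q2ah100004nscw1mr4qm.log'
WITH
(
FIRSTROW = 59900, -- start a little before where the load stops
LASTROW = 60000, -- and end a little after
ROWTERMINATOR = '\n',
TABLOCK
)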
November 15, 2007 at 12:25 pm
Do you get any error displayed?
November 15, 2007 at 12:27 pm
Nope. It just stops at the 59,933rd row and says it completed.
November 15, 2007 at 12:30 pm
Check the following; maybe it could help.
November 15, 2007 at 3:16 pm
Just an update: I've tried a few more things, and I still get stuck at the aforementioned row count.
For some reason the process seems to think it is done. Does BULK INSERT maybe break on nulls or some other default?
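I did find that BULK INSERT can be told to skip bad rows and dump them to a side file instead of stopping, so I may try that next to see what it rejects (the error-file path here is just an example):
BULK INSERT [LogSamples].[dbo].[RawDataImport]
FROM 'E:\LogsToImport\dcsv9q2ah100004nscw1mr4qm.log'
WITH
(
ROWTERMINATOR = '\n',
MAXERRORS = 1000, -- keep going past bad rows instead of stopping
ERRORFILE = 'E:\LogsToImport\badrows.err', -- rejected rows get written here
TABLOCK
)
If rows start landing in the error file, whatever is inside them is the culprit.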
November 15, 2007 at 9:18 pm
Hi,
I tried a bulk insert with a 90 MB file containing about 500,000 rows. I had no problems; all rows were inserted.
I used:
BULK INSERT test1 FROM 'd:\textFile.txt' WITH (FIELDTERMINATOR = ',')
Your problem is not related to the file size.
You said you don't get an error message, so you don't have a NULL problem.
I suggest you retrieve the line where the bulk insert stopped.
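For example, a quick count will tell you where to start looking (assuming the rows were inserted in file order):
SELECT COUNT(*) AS ImportedRows
FROM [LogSamples].[dbo].[RawDataImport]
Then open the log in an editor that can show control characters and look at the line right after that count; a stray control character or a malformed line ending at that spot would explain why the import thinks it is finished.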
Regards,
Ahmed
November 16, 2007 at 3:01 pm
I will try that out and do a little more research into the file. I think you are right that the issue is somewhere inside the actual file.
I tried a 1.2 GB file today and got a good clean 1.2 million rows out of it, so something has to be askew in the file.
...I'll post back my results.