SSIS BULK INSERT attempt

  • I have a log file, and I literally want to copy each "line" of it into a row in my database. The BULK INSERT I'm attempting to use looks like this:

    BULK INSERT [LogSamples].[dbo].[RawDataImport]
    FROM 'E:\LogsToImport\dcsv9q2ah100004nscw1mr4qm.log'
    WITH
    (
        ROWTERMINATOR = '\n',
        TABLOCK
    )

    I know the file has over a couple of million lines of data, but I only end up with about 59,993 rows.

    Am I missing something? Do I need to set another attribute on the BULK INSERT?
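
    If the log file came from another platform, its line endings may not match what ROWTERMINATOR expects. Here is a minimal sketch of the two common variants, assuming the same table and file as above; which one is right depends on how the log file was generated:

    -- '\n' matches Unix-style bare line feeds; '\r\n' matches Windows-style
    -- carriage-return/line-feed pairs.
    BULK INSERT [LogSamples].[dbo].[RawDataImport]
    FROM 'E:\LogsToImport\dcsv9q2ah100004nscw1mr4qm.log'
    WITH
    (
        ROWTERMINATOR = '\r\n',  -- swap in '\n' for Unix-style endings
        TABLOCK
    )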

  • I am not sure about ROWTERMINATOR

  • I think you have to use ROWTERMINATOR = '|'

    Check the Microsoft sample:

    A. Using pipes to import data from a file

    This example imports order detail information into the AdventureWorks.Sales.SalesOrderDetail table from the specified data file by using a pipe (|) as the field terminator and |\n as the row terminator.

    BULK INSERT AdventureWorks.Sales.SalesOrderDetail
    FROM 'f:\orders\lineitem.tbl'
    WITH
    (
        FIELDTERMINATOR = ' |',
        ROWTERMINATOR = ' |\n'
    )

    Regards

    Ahmed

  • My ROWTERMINATOR is actually set to '\n' (no spaces in it); I think the forum formatting stripped it from my post. I'll try the suggestion to stick a pipe or something at the end.

    I tried this:

    BULK INSERT [LogSamples].[dbo].[RawDataImport]
    FROM 'E:\LogsToImport\dcsv9q2ah100004nscw1mr4qm.log'
    WITH
    (
        ROWS_PER_BATCH = 3300,
        ROWTERMINATOR = '\n',
        TABLOCK
    )

    But it still stops before it gets to the end of the file.
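
    One way to make the silent stop visible is to let BULK INSERT log rejected rows instead of discarding them. A sketch using the documented MAXERRORS and ERRORFILE options; the error-file path here is just an assumption:

    -- MAXERRORS lets the load continue past bad rows instead of stopping;
    -- ERRORFILE captures the rejected rows, and SQL Server writes a companion
    -- .Error.Txt file describing why each one failed.
    BULK INSERT [LogSamples].[dbo].[RawDataImport]
    FROM 'E:\LogsToImport\dcsv9q2ah100004nscw1mr4qm.log'
    WITH
    (
        ROWTERMINATOR = '\n',
        MAXERRORS = 100,                           -- tolerate up to 100 bad rows
        ERRORFILE = 'E:\LogsToImport\import.err',  -- hypothetical path
        TABLOCK
    )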

  • Do you have any error displayed?

  • Nope. It just stops at the 59,933rd row. Says it completed.

  • Check the following; maybe it could help.

  • Just an update: I've tried a few more things, and I still get stuck at the aforementioned row count.

    It seems like the process thinks it is done for some reason. Does BULK INSERT maybe break on nulls or some other character by default?
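
    One thing worth ruling out along those lines: an embedded control character such as NUL (0x00) or Ctrl-Z (0x1A, the old DOS end-of-file marker) partway through the file, which can make an import believe the file has ended. A hedged check, assuming the file fits comfortably in memory and that CHARINDEX's binary-argument behavior holds on this server version:

    -- Read the whole file as one binary value, then search for suspect bytes.
    -- CHARINDEX returns 0 when the byte is not found.
    DECLARE @blob VARBINARY(MAX);

    SELECT @blob = BulkColumn
    FROM OPENROWSET(BULK 'E:\LogsToImport\dcsv9q2ah100004nscw1mr4qm.log',
                    SINGLE_BLOB) AS f;

    SELECT
        DATALENGTH(@blob)      AS FileBytes,
        CHARINDEX(0x00, @blob) AS FirstNulByte,   -- NUL
        CHARINDEX(0x1A, @blob) AS FirstEofByte;   -- Ctrl-Z / DOS EOF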

  • Hi,

    I tried a bulk insert with a 90 MB file containing about 500,000 rows; I had no problems, and all rows were inserted.

    I used:

    BULK INSERT test1 FROM 'd:\textFile.txt' WITH (FIELDTERMINATOR = ',')

    Your problem is not related to the file size.

    You said you don't get an error message, so you don't have a null problem.

    I suggest you retrieve the line where the bulk insert stopped.

    Regards,

    Ahmed
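
    To act on that suggestion, the documented FIRSTROW and LASTROW options can load just a narrow window of lines around the point where the import stopped. The row numbers below are taken from the ~59,933 figure mentioned earlier, and this assumes the file parses cleanly up to that point:

    -- Load only the rows surrounding the stopping point so the offending
    -- line can be inspected in isolation.
    BULK INSERT [LogSamples].[dbo].[RawDataImport]
    FROM 'E:\LogsToImport\dcsv9q2ah100004nscw1mr4qm.log'
    WITH
    (
        FIRSTROW = 59930,
        LASTROW  = 59940,
        ROWTERMINATOR = '\n',
        TABLOCK
    )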

  • I will try that out and do a little more research into the file. I think you are right in thinking the issue is somewhere inside the actual file.

    I tried a 1.2 GB file today and got a clean 1.2 million rows out of it. So something has to be askew in the original file.

    ...I'll post back my results.
