bulk insert not inserting everything...

  • Hi all,

    I'm using this code to read the contents of a file into a variable (@filename and @filecontent are declared earlier in the SP):

    -- single-column temp table to hold the whole file as one row
    CREATE TABLE #tempf (filecontent text)

    -- dynamic SQL because BULK INSERT won't take a variable for the file name;
    -- rowterminator '\0' so the entire file ends up in a single row
    EXEC ('BULK INSERT #tempf FROM ''' + @filename + ''' with (datafiletype=''char'',codepage=''raw'',rowterminator = ''\0'')')

    SELECT @filecontent = filecontent FROM #tempf

    DROP TABLE #tempf

    However, @filecontent contains only the first 256 characters of the file (which can be up to 100 MB). I already tried setting TEXTSIZE, but that didn't seem to matter much (though the output did respond to settings lower than 256).
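    For the record, the TEXTSIZE attempt was roughly this (exact value aside):

       SET TEXTSIZE 2147483647   -- 2 GB, the maximum; @@TEXTSIZE reflects the current setting
       SELECT @@TEXTSIZE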

    Some more info: the file is in fact an XML file containing data for various tables. Previously I used ColdFusion to read the file and call the stored procedure with the XML string as an argument. That worked, but it gave me memory issues, so now I'm trying to have SQL Server read the file directly.

    Regards,

    Ruud

  • How do you know that only 256 chars get into the table?

    Have you tried a 'SELECT DATALENGTH(filecontent) FROM #tempf'?

    Remember that many tools used for viewing data have a limit on how wide the displayed column is. For QA (Query Analyzer) this is 256 by default.

    In QA, look in Tools->Options->Results and see what 'Maximum characters per column' is set to.

    /Kenneth

  • Whoops, you're right. It seems I jumped to conclusions a bit too fast.

    The SP returns the error 'XML Parse error: Invalid at the top level of the document'. When I then saw only 256 chars, I figured an incomplete XML string might be causing the error.

    But it's not: DATALENGTH() proved the entire file is indeed read. Thanks.

    But that still leaves me with the original XML parse error. The next line of code in my SP (the one throwing the error) is:

    EXEC sp_xml_preparedocument @idoc OUTPUT, @filecontent

    Note that the XML is perfectly valid, since the SP works fine when I pass in the (unparsed) XML string directly.
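    For reference, the prepare/shred/remove pattern I'm following looks roughly like this (a simplified sketch with dummy element names, not the real document):

       DECLARE @idoc int
       DECLARE @xml varchar(8000)
       SET @xml = '<root><row id="1" name="test"/></root>'   -- dummy test document

       -- parse the string into an internal document handle
       EXEC sp_xml_preparedocument @idoc OUTPUT, @xml

       -- shred the parsed document into rows (attribute-centric mapping)
       SELECT id, name
       FROM OPENXML(@idoc, '/root/row', 1)
            WITH (id int, name varchar(50))

       -- free the memory held by the parsed document
       EXEC sp_xml_removedocument @idoc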

    Is anything happening to the XML data during this BULK INSERT that I'm not aware of?

    Regards,

    Ruud

  • In that department, unfortunately, I have no clue.

    (XML studying is still on the 'to do' list)

    =;o)

    /Kenneth

  • You don't show your declaration for @filecontent, but the largest it could be is varchar(8000). If your file is longer than that, then the statement:

       SELECT @filecontent = filecontent FROM #tempf 

    will truncate your XML, resulting in malformed content stored in @filecontent.

    Unfortunately, you can't use a SELECT as a parameter for sp_xml_preparedocument, and you can't declare local variables as TEXT.
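    A quick way to confirm this (a sketch, assuming SQL Server 2000 where varchar tops out at 8000; run it before the DROP TABLE):

       DECLARE @filecontent varchar(8000)   -- the largest a local string variable can be

       SELECT @filecontent = filecontent FROM #tempf

       -- the text column holds the full file, the variable at most 8000 bytes
       SELECT DATALENGTH(filecontent) AS column_length,
              DATALENGTH(@filecontent) AS variable_length
       FROM #tempf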

    Mike

  • I've learned a lot about SQL Server and XML lately. It turns out the basic problem here is that my XML documents are far too large (90 MB) to feed into sp_xml_preparedocument through varchar(8000) variables. They're even too large for dynamic solutions like the one mentioned here: http://www.experts-exchange.com/Databases/Microsoft_SQL_Server/Q_20670044.html

    The only option so far is calling the procedure externally. I tried ColdFusion and VB; both work. However, they both need a lot of memory: for a 90 MB file, about 400 MB of memory is used. Not an ideal solution; if two imports are going on at once, the whole system hangs.

    So far I can only hope Yukon offers better XML handling...

  • Please correct me if I'm wrong, but your XML is data-specific...

    Can you reduce the total record count per XML document, thus reducing your memory requirements? Make XML pages instead of outputting the entire file at once?


    Regards,

    Coach James

  • Sure, we can split the file up into tiny pieces, but it's still one import, so you need complicated transaction control to make sure you have every piece and none of them failed.

    Besides, even small files take up memory at five times their own size, which still leaves me wondering why. Is loading a file and feeding it into an SP really such a memory-intensive task?

  • Best to rethink the process so it's not a huge transaction.

    If you can't do that, try a workaround to the workaround: temporarily load the 90 MB XML in chunks into a work table, then insert from that work table into the real destination table in one transaction.
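    A rough sketch of that idea (all table and column names here are made up):

       -- work table, filled chunk by chunk, each chunk in its own small transaction
       CREATE TABLE #work (customer_id int, amount money)

       INSERT INTO #work (customer_id, amount) VALUES (1, 10.00)
       INSERT INTO #work (customer_id, amount) VALUES (2, 20.00)
       -- ...one small insert per chunk of shredded XML...

       -- then move everything into the real table ('Orders' is just a placeholder)
       -- in one short transaction
       BEGIN TRAN
          INSERT INTO Orders (customer_id, amount)
          SELECT customer_id, amount FROM #work
       COMMIT TRAN

       DROP TABLE #work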
