Server memory low during XML import in MS SQL (XML size is nearly 4 GB)

  • I am using SQL Server 2005.

    I have one stored procedure which simply takes an XML document as input and inserts values into two tables:

    CREATE TABLE [dbo].[TopKey1]
    (
        [id]             [varchar](50) NOT NULL,
        [key1Id]         [int]         NOT NULL,
        [key1_Frequency] [int]         NULL,
        [Doc_id]         [int]         NULL
    );

    CREATE TABLE [dbo].[Key1]
    (
        [key1Id]         [int]           NOT NULL,
        [Key1]           [nvarchar](255) NULL,
        [doc_Frequency]  [int]           NULL
    );

    My stored procedure is as below:

    CREATE PROCEDURE [dbo].[ImportKey1Key1words]
    (
        @Key1sentenceKey1wordsXML XML,
        @returnCode INT OUTPUT,
        @errorMessage VARCHAR(MAX) OUTPUT
    )
    AS
    BEGIN
        SET NOCOUNT ON;

        DECLARE @xmlDocumentHandler INT;

        SET @returnCode = 0;
        SET @errorMessage = NULL;

        BEGIN TRY
            --BEGIN TRAN
            EXEC sp_xml_preparedocument @xmlDocumentHandler OUTPUT, @Key1sentenceKey1wordsXML;

            INSERT INTO [dbo].[Key1] (key1Id, [Key1])
            SELECT k.id, k.Key1
            FROM OPENXML(@xmlDocumentHandler, '/docs/d/ks', 2)
                 WITH (id varchar(50) '../@id', Key1 nvarchar(max) '.') AS k;

            INSERT INTO [dbo].[TopKey1] ([id], [key1Id], [key1_Frequency])
            SELECT k.id, k.Key1, k.Frequency
            FROM OPENXML(@xmlDocumentHandler, '/docs/d/k', 2)
                 WITH (id varchar(50) '../@id', Key1 nvarchar(max) './@w', Frequency nvarchar(max) './@f') AS k
            WHERE k.Key1 IS NOT NULL;

            EXEC sp_xml_removedocument @xmlDocumentHandler;
            --COMMIT TRAN;
        END TRY
        BEGIN CATCH
            -- ROLLBACK TRAN;
            SET @returnCode = -1;
            SET @errorMessage = ERROR_MESSAGE();
        END CATCH
    END

    Sample XML (the structure implied by the OPENXML paths; the element tags were mangled when posting):

    <docs>
      <d id="...">
        <k w="program" f="1"/>
        <k w="oe" f="13"/>
        <k f="8"/>
        <ks><![CDATA[Command Line:/APP:OE /CALLER:WINNT /INSTALLMODE: Install TIME: MachineReg/Unreg Exes: "C:\Program Files\Outlook Express\msimn.exe" /reg-----[START]: OE / WAB Setup 5.0 started on 05/07/2008 at 15:29MODE: Install TIME: User-----[START]: OE / WAB Setup 5.0 started on 05/22/2008 at 17:51Command Line:/APP:WAB /CALLER:WINNT /user /installCommand Line:/APP:OE /CALLER:WINNT /user /install]]></ks>
      </d>
    </docs>
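    For reference, the proc is invoked along these lines (the variable names here are illustrative, not from the original post):

    ```sql
    DECLARE @xml XML = N'<docs><d id="1"><k w="program" f="1"/><ks>sample</ks></d></docs>'; -- placeholder document
    DECLARE @rc INT, @err VARCHAR(MAX);

    EXEC [dbo].[ImportKey1Key1words]
         @Key1sentenceKey1wordsXML = @xml,
         @returnCode   = @rc  OUTPUT,
         @errorMessage = @err OUTPUT;

    SELECT @rc AS returnCode, @err AS errorMessage;
    ```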

  • The approach itself is indeed simple...

    As far as I can see, it's the document size (4 GB) together with the way it's handled that causes the low memory.

    I recommend changing how the large XML doc is used.

    From what you've posted I can't see where you get the XML file from; I assume it's a file you're loading into the variable @Key1sentenceKey1wordsXML.

    Instead of calling a proc with a 4 GB XML variable, you should import it into an intermediate (temp) table, add an XML index to it, and fill your two tables using XQuery instead of OPENXML.
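    A minimal sketch of that staging approach (the table name and file path here are assumptions, not from the original post; the second INSERT follows the same pattern using nodes('k') and the @w/@f attributes):

    ```sql
    -- Assumed names: dbo.XmlStage and the file path are placeholders.
    CREATE TABLE dbo.XmlStage
    (
        XmlId INT IDENTITY(1,1) PRIMARY KEY,
        Doc   XML NOT NULL
    );

    -- Load the file server-side instead of shipping a 4 GB variable to the proc.
    INSERT INTO dbo.XmlStage (Doc)
    SELECT CAST(b.BulkColumn AS XML)
    FROM OPENROWSET(BULK 'C:\Import\keywords.xml', SINGLE_BLOB) AS b;

    -- A primary XML index speeds up the shredding query below.
    CREATE PRIMARY XML INDEX PXML_XmlStage_Doc ON dbo.XmlStage (Doc);

    -- Shred with nodes()/value() instead of sp_xml_preparedocument,
    -- which pins an in-memory DOM of the entire document.
    INSERT INTO [dbo].[Key1] (key1Id, [Key1])
    SELECT d.x.value('@id', 'int'),
           ks.x.value('.', 'nvarchar(255)')
    FROM dbo.XmlStage AS s
    CROSS APPLY s.Doc.nodes('/docs/d') AS d(x)
    CROSS APPLY d.x.nodes('ks') AS ks(x);
    ```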

    I also wouldn't use a transaction around those statements. Instead of a transaction, I'd prefer either a DELETE or even a TRUNCATE statement in the CATCH block.
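    In sketch form (assuming the two target tables are emptied and reloaded on each import run, so TRUNCATE is safe to use for cleanup):

    ```sql
    -- Sketch: no surrounding transaction; on failure, undo the partial load.
    BEGIN TRY
        -- the two big INSERT ... SELECT statements go here
        SET @returnCode = 0;
    END TRY
    BEGIN CATCH
        TRUNCATE TABLE [dbo].[TopKey1];  -- or DELETE ... WHERE Doc_id = @thisDocId
        TRUNCATE TABLE [dbo].[Key1];
        SET @returnCode = -1;
        SET @errorMessage = ERROR_MESSAGE();
    END CATCH;
    ```

    The point of skipping the transaction is that a single 4 GB load inside one transaction generates a huge amount of log activity; cleanup-on-failure is much cheaper.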

    Maybe some of the folks around here that are more experienced in handling such a data size will jump in with other thoughts...



    Lutz
    A pessimist is an optimist with experience.


  • You are right.

    The given XML is only a sample.

    The actual XML I pass through the variable @Key1sentenceKey1wordsXML.

    OK, thanks.
