Exporting medium-sized SQL tables to an XML file

  • Just starting to find my way with XML, and in need of advice.

    I need to copy a number of SQL tables into an .xml format so that they can be exported to a new site. I have used the following method, which seems to work well for small tables. Unfortunately, I get a failed result when I try it on any table with more than 50 rows.

    Step 1, create a simple text template file called sample.txt:

    <?xml version="1.0"?>
    <Sample>
    <%begindetail%>
    <%insert_data_here%>
    <%enddetail%>
    </Sample>

    Step 2, run the following script:

    USE TIR
    GO
    EXEC sp_makewebtask
     @outputfile = 'C:\temp\TestXML1.xml',
     @query = 'SELECT * FROM dbo.tblHaz_Main FOR XML AUTO',
     @templatefile = 'c:\temp\Sample.txt'

    Step 3: with a small table the result is good; however, with slightly larger tables, here is the result when I open the TestXML1.xml file:

    The XML page cannot be displayed
    Cannot view XML input using XSL style sheet. Please correct the error and then click the Refresh button, or try again later.

    --------------------------------------------------------------------------------

    Missing equals sign between attribute and attribute value. Error processing resource 'file:///C:/temp/TestXML1.xml'. Line 6, Position 1

    zPermanent="0" HazPossDanger="?"/>
    <dbo.tblHaz_Main HazID="604" HazCatID="2" HazSubID="47" HazPermanent="0" HazPossDanger="?"/>
    <dbo.tblHaz_Main HazID="608" HazCatID="2" HazSubID="46" HazPermanent="0" HazPossDanger="?"/>
    <dbo.tblHaz_Main HazID="615" HazCatID="3" HazSubID="43" HazPermanent="0" HazPossDanger="?"/>
    <dbo.tblHaz_Main HazID="642" HazCatID="1" HazSubID="44" HazPermanent="0" HazPossDanger="?"/>
    <dbo.tblHaz_Main HazID="648" HazCatID="1" HazSubID="44" HazPermanent="0" HazPossDanger="?"/>
    <dbo.tblHaz_Main HazID="650" HazCatID="1" HazSubID="44" HazPermanent="0" HazPossDanger="?"/>
    <dbo.tblHaz_Main HazID="651" HazCatID="3" HazSubID="42" HazPermanent="0" HazPossDanger="Incorrect cleaning material could be used in the wrong place.
    ^

    Not quite sure what to make of it; advice would be appreciated.

  • I am not sure about the error. It may be caused by characters that can't be encoded as UTF-8.

    You can try adding encoding="utf-16" to the XML declaration, i.e. <?xml version="1.0" encoding="utf-16"?>
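    Applied to the template from the first post, that suggestion would look like the sketch below. Note this is untested; whether it helps depends on the encoding sp_makewebtask actually uses when it writes the output file.

    ```xml
    <?xml version="1.0" encoding="utf-16"?>
    <Sample>
    <%begindetail%>
    <%insert_data_here%>
    <%enddetail%>
    </Sample>
    ```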


  • We actually have the same thing going on.  It turns out that once the output reaches a certain number of characters (8192), a CRLF is inserted into the query output, which can split an attribute mid-value.  This causes a bit of a mess.

    To get around this, we took this approach:

    1.  Create a stored procedure to generate the XML.

    2.  Use VBScript to execute the stored procedure and stream the result into an ADODB.Stream object.

    3.  Replace all CRLFs in the stream with zero-length strings.

    4.  Write the stream out to a text file.

    The actual script looks something like this:

    ' Open a text stream to receive the XML output from SQL Server
    Dim oStream : Set oStream = CreateObject("ADODB.Stream")
    With oStream
        .Type = 2    ' 2 = adTypeText
        Call .Open()
    End With

    ' Connect to the database (replace the bracketed placeholders with your own values)
    Dim oCnn : Set oCnn = CreateObject("ADODB.Connection")
    oCnn.ConnectionString = "Provider=SQLOLEDB.1;Trusted_Connection=Yes;Initial Catalog=<UserDatabaseName>;Data Source=<SQLServerInstanceName>"
    Call oCnn.Open()

    ' Set up the command that runs the stored procedure and directs its output into the stream
    Dim oCmd : Set oCmd = CreateObject("ADODB.Command")
    With oCmd
        .ActiveConnection = oCnn
        .CommandType = &H0008    ' 8 = adCmdUnknown
        .CommandTimeout = 0      ' no timeout, for large result sets
        .CommandText = "dbo.usp_GenerateXML"
        .Properties("Output Stream") = oStream
    End With

    ' Execute with the stream option so the XML lands in oStream
    Call oCmd.Execute( 1, , &H400 )    ' &H400 = adExecuteStream

    ' Strip the CRLFs that get inserted into long FOR XML output, then write the file
    Dim sXml : sXml = "<?xml version=""1.0""?>" & Replace(oStream.ReadText, vbCrLf, "")
    Dim oFso : Set oFso = CreateObject("Scripting.FileSystemObject")
    Dim oFile : Set oFile = oFso.CreateTextFile("c:\MyDirectory\MyFile.xml", True)
    Call oFile.Write(sXml)
    Call oFile.Close()

    This has been very effective for us.  Our output has been around 3000 rows, 700 MB, with no problems at all.
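    The effect of step 3 can be illustrated in a few lines of Python. The sample XML string here is invented to mimic the wrapped output in the first post: a CRLF inserted at the wrap boundary splits an attribute name in two, which produces exactly the "missing equals sign" class of error, and removing the CRLF rejoins the name so the document parses again.

    ```python
    import xml.etree.ElementTree as ET

    # Simulated FOR XML output where a line wrap inserted a CRLF in the
    # middle of the attribute name "HazPermanent" (hypothetical sample data).
    broken = '<Sample><row HazID="651" Ha\r\nzPermanent="0"/></Sample>'

    def parses(text):
        """Return True if the string is well-formed XML."""
        try:
            ET.fromstring(text)
            return True
        except ET.ParseError:
            return False

    print(parses(broken))                        # False: the split name breaks parsing
    repaired = broken.replace("\r\n", "")        # step 3: strip the inserted CRLFs
    print(parses(repaired))                      # True: the attribute name is whole again
    ```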
