May 22, 2013 at 1:01 am
Hi All!
I have an XML file of 44 GB (not MB, it's really gigabytes),
delivered by the Danish customs authorities.
My problem is simple - how do I import such a beast?
I have seen a 2 GB limit mentioned everywhere.
Best regards
Edvard Korsbæk
May 23, 2013 at 6:20 am
Is there a way to break it up? If not, maybe do the import outside of SQL, such as with a .NET app or something similar.
That's a HUGE XML file. The largest XML file I ever imported was 8 GB, and I created a .NET app to do the importing into the SQL database.
May 23, 2013 at 6:27 am
No, I can't even open it in any text editor I have come across.
It's big!
I really don't think anybody is using it for anything - but it's the only format Danish Customs gives me.
Can you tell me something about that .NET app you made?
Best regards
Edvard Korsbæk
May 23, 2013 at 6:33 am
Since you can't open it, I would kick it back to the sender and request that they separate the data somehow, so it's more manageable to work with.
As for the .NET app, I made it read the XML file, parse out the data based on the XML tags, and insert it into the tables. It used something like SqlBulkCopy to do the inserts.
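For files this size the key is to stream the XML instead of loading it whole. The app described above was .NET with SqlBulkCopy; here is a minimal sketch of the same streaming idea in Python (the element name `Record` and the field tags are hypothetical placeholders - substitute whatever your file actually uses):

```python
import xml.etree.ElementTree as ET

def stream_records(path, tag="Record", batch_size=1000):
    """Yield batches of dicts parsed from repeated <Record> elements,
    keeping memory flat by clearing each element after use."""
    batch = []
    # iterparse reads the file incrementally instead of loading it all
    for event, elem in ET.iterparse(path, events=("end",)):
        if elem.tag == tag:
            batch.append({child.tag: child.text for child in elem})
            elem.clear()  # free the subtree we just consumed
            if len(batch) >= batch_size:
                yield batch
                batch = []
    if batch:
        yield batch
```

Each yielded batch can then be handed to whatever bulk-insert API is available - SqlBulkCopy in .NET, or a parameterized `executemany` from Python. The point is that memory use stays constant no matter how large the file is.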
May 23, 2013 at 6:39 am
Use a .NET app to parse the XML file into a DataSet, then save the DataSet to SQL Server.
May 23, 2013 at 7:44 am
May 23, 2013 at 1:21 pm
Eugene Elutin (5/23/2013)
1. SSIS - use XML Source
2. SQLXMLBulkLoad
I don't have experience with SSIS, but I've seen bulk load do some amazing things. The throughput is impressive. It takes some time to set up, but it's worth it in the end, especially if you're going to have to use the same source file layout again.
May 23, 2013 at 2:12 pm
If you can identify an XML tag immediately under the root that is repeated for each major section of the text data, then you can bulk import the data with that tag as a delimiter, clean up the dangling header and footer tags, and then you'll have more manageable chunks of the overall text. If those chunks are too big to be manageable, then repeat. And so on.
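That splitting step can be scripted without ever parsing the whole document. A rough sketch in Python, assuming the repeated element is `<Item>` under a `<Records>` root (both names hypothetical) and that each record starts on its own line:

```python
def split_xml(path, record_tag, root_tag, per_chunk, out_prefix="chunk"):
    """Split a huge XML file into smaller well-formed files, each with
    `per_chunk` <record_tag> elements, reading line by line so the
    whole file never sits in memory. Naive prefix match on the tag:
    assumes no other tag shares the prefix and records start on
    their own lines."""
    open_marker = f"<{record_tag}"
    close_marker = f"</{record_tag}>"
    header = f'<?xml version="1.0"?>\n<{root_tag}>\n'
    footer = f"</{root_tag}>\n"
    chunk_no, n_records, in_record = 0, 0, False
    out = None
    with open(path) as src:
        for line in src:
            if not in_record and line.lstrip().startswith(open_marker):
                in_record = True
            if in_record:
                if out is None:
                    # start a new chunk: fresh file with its own root
                    out = open(f"{out_prefix}_{chunk_no}.xml", "w")
                    out.write(header)
                out.write(line)
                if close_marker in line:
                    in_record = False
                    n_records += 1
                    if n_records == per_chunk:
                        out.write(footer)
                        out.close()
                        out, n_records = None, 0
                        chunk_no += 1
    if out is not None:  # close a final, partially filled chunk
        out.write(footer)
        out.close()
        chunk_no += 1
    return chunk_no  # number of chunk files written
```

Each output file is a small, well-formed XML document, so the normal import tools apply to the chunks even though they choke on the original.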
May 24, 2013 at 7:37 am
edvard 19773 (5/23/2013)
No, I can't even open it in any text editor I have come across. It's big!
I really don't think anybody is using it for anything - but it's the only format Danish Customs gives me.
I would probably start with a VBScript or .NET script to read in the first 1,000 or so lines (assuming it has line breaks), then write those out to a text file to get some sense of the schema and data contained in the file. Then go from there to determine how the file might be split up and processed.
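Sampling the head of the file is a few lines of script in any language; here is a sketch of the idea in Python, with a fallback for the case where the file has no line breaks at all:

```python
from itertools import islice

def sample_head(path, out_path, n_lines=1000, fallback_bytes=1_000_000):
    """Copy the first n_lines of a huge file into a small sample file
    so the structure can be inspected in an ordinary editor."""
    with open(path, errors="replace") as src:
        lines = list(islice(src, n_lines))  # reads only the head, never the whole file
    if len(lines) <= 1:
        # No line breaks (one giant line): fall back to a fixed byte slice.
        with open(path, errors="replace") as src:
            lines = [src.read(fallback_bytes)]
    with open(out_path, "w") as out:
        out.writelines(lines)
    return len(lines)
```

The sample file shows the declaration, the root element, and a few complete records, which is usually enough to pick the tag to split on.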
June 5, 2013 at 12:12 pm
There are xml split tools available. Some GUI, some command-line. Some free, some commercial. Just Google "xml splitter tool"
March 20, 2015 at 3:00 am
DECLARE @Xml_prod xml;

SELECT @Xml_prod = CAST(c1 AS xml)
FROM OPENROWSET(BULK 'C:\xmldownload\adzunajobfeeds.xml', SINGLE_BLOB) AS T1(c1);
Msg 6365, Level 16, State 1, Line 1 An XML operation resulted an XML data type exceeding 2GB in size. Operation aborted.
If someone has a solution, please reply.
It's very urgent.
March 20, 2015 at 3:33 am
govind.sharma1 (3/20/2015)
declare @Xml_prod xml
select @Xml_prod = cast(c1 as xml) from OPENROWSET (BULK 'C:\xmldownload\adzunajobfeeds.xml', SINGLE_BLOB) as T1(c1)
Msg 6365, Level 16, State 1, Line 1 An XML operation resulted an XML data type exceeding 2GB in size. Operation aborted.
If someone has a solution, please reply. It's very urgent.
It's a SQL Server limit of 2 GB for the XML data type. Your file can't be stored in a column or variable as a single value. You need to process it somehow when loading it into the DB: extract the data and store it across a number of columns, rows, or variables.
Give us more info if more help is needed.
March 20, 2015 at 10:52 am
govind.sharma1:
I think you need to textually parse your XML from the source - an "old school" trick.
Use BULK INSERT to read it like a text file and split it up by consecutive end/start tags. Your choice of tags should be whatever does the most effective job of breaking up your file into more manageable chunks.
Be advised that there may be a lot of textual cleanup required after this breakup of what had been a well-formed XML document, depending on the degree of homogeneity/heterogeneity of your XML content. But at least the initial import will be fast and you will be able to "munge" text in SQL. If your document is fairly homogenous then this kind of solution will probably be faster than anything else.
Please let me know here how it goes.
- Brian
November 29, 2015 at 11:27 pm
Hi,
Please suggest how to load such a huge XML file, greater than 2 GB in size, into SQL Server.
My requirement: the XML file is huge - greater than 2 GB, and it could even be 20 GB.
I need to load it into a SQL Server table column of the XML data type.
Is there any workaround to load such huge XML files? If yes, please suggest how to proceed.
Your quick response is highly appreciated.
Thanks in advance.
November 30, 2015 at 12:52 am
The 2 GB limit is still there, even in SQL Server 2016. Why do you need to store 20 GB of data as a single value in an MS SQL database? That's definitely not what SQL Server was designed for. Can you provide more info about your problem?