April 11, 2002 at 1:26 am
I am looking for a way to read large XML data using T-SQL. I got a tip that you can define an ntext variable in the stored procedure that can accommodate 2 GB of data, but the question remains: how do you read an XML file located on the C: drive?
April 11, 2002 at 4:29 am
A couple of questions which could help formulate an answer:
Which version of SQL Server are you using? (2000 has ace support for XML; prior to that your options are more limited.)
What do you want to do with the XML Data once you've got it?
I have found it is possible to please all of the people all of the time if you do exactly what they want. Harold Macmillan 1961
April 11, 2002 at 7:49 am
quote:
A couple of questions which could help formulate an answer:
Which version of SQL Server are you using? (2000 has ace support for XML; prior to that your options are more limited.)
==> SQL 2000
What do you want to do with the XML Data once you've got it?
==> Use sp_xml_preparedocument, read data and update the db
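For reference, the shredding side of that usually looks something like the sketch below. It is untested, and the procedure name, the <Customer>/@Id/@Name structure and the Customers table are all made up, so substitute your own document layout:

CREATE PROCEDURE dbo.usp_LoadCustomerXml
    @xmldoc ntext   -- SQL 2000 allows ntext as a parameter, though not as a local variable
AS
DECLARE @idoc int

-- Parse the document and get a handle to its in-memory representation
EXEC sp_xml_preparedocument @idoc OUTPUT, @xmldoc

-- Shred the XML into a rowset and update the target table
UPDATE c
SET    c.CustomerName = x.CustomerName
FROM   Customers c
JOIN   OPENXML(@idoc, '/Root/Customer', 1)
       WITH (CustomerId   int           '@Id',
             CustomerName nvarchar(50)  '@Name') AS x
       ON c.CustomerId = x.CustomerId

-- Always release the handle; the parsed document sits in SQL Server memory until you do
EXEC sp_xml_removedocument @idoc
GO

Getting the file off the C: drive and into that ntext parameter is the harder part; on SQL 2000 that is normally done from the calling application (e.g. an ADO command with an ntext parameter) rather than from pure T-SQL.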
April 14, 2002 at 1:37 pm
Not sure if this will help you, and I've never tried it myself, but at least it's something to look at:
sp_addlinkedserver [ @server = ] 'server'
[ , [ @srvproduct = ] 'product_name' ]
[ , [ @provider = ] 'provider_name' ]
[ , [ @datasrc = ] 'data_source' ]
[ , [ @location = ] 'location' ]
[ , [ @provstr = ] 'provider_string' ]
[ , [ @catalog = ] 'catalog' ]
H. Use the Microsoft OLE DB Provider for Jet to access a text file
This example creates a linked server for directly accessing text files, without linking the files as tables in an Access .mdb file. The provider is Microsoft.Jet.OLEDB.4.0 and the provider string is 'Text'.
The data source is the full pathname of the directory that contains the text files. A schema.ini file, which describes the structure of the text files, must exist in the same directory as the text files. For more information about creating a schema.ini file, refer to Jet Database Engine documentation.
--Create a linked server
EXEC sp_addlinkedserver txtsrv, 'Jet 4.0',
'Microsoft.Jet.OLEDB.4.0',
'c:\data\distqry',
NULL,
'Text'
GO
--Set up login mappings
EXEC sp_addlinkedsrvlogin txtsrv, FALSE, Admin, NULL
GO
--List the tables in the linked server
EXEC sp_tables_ex txtsrv
GO
--Query one of the tables: file1#txt
--using a 4-part name
SELECT *
FROM txtsrv...[file1#txt]
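For what it's worth, a minimal schema.ini for that directory might look something like the lines below; the file name, column name and types here are only guesses, so check the Jet Database Engine documentation for the exact options:

[file1.txt]
ColNameHeader=False
Format=CSVDelimited
CharacterSet=ANSI
Col1=LineText Memo

Bear in mind the Jet text driver reads the file as delimited rows, so this route is probably better suited to a CSV-style feed than to pulling a whole XML document across in one piece.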
April 15, 2002 at 7:12 am
Sorry I didn't get back sooner, but I'm very interested in this issue as I've got to do this next week. One route I've pursued is to utilise DTS's ability to call OLE objects; you might find this article very interesting: http://www.swynk.com/friends/szabo/DTSFileArrival.asp. The implication is that you could call the MSXML parser in the DTS ActiveX script and process directly into the database tables. An alternative would be to use the FSO (FileSystemObject) to open the file, parse it and convert it into CSV form, then call a second DTS package to process the CSV you've just created. I can't believe that MS didn't include a straight XML import option in DTS!
I have found it is possible to please all of the people all of the time if you do exactly what they want. Harold Macmillan 1961
October 23, 2004 at 8:36 pm
Here's the rub: How big is "big"??
If you are talking about a 200 MB file with lots of nested XML elements, as they say in the Northeast, "can't get thar' from here...". Microsoft often uses the DOM (Document Object Model) parser in their XML tools, which is quite robust in terms of its programmatic functionality, but it requires you to load the entire document into memory. Have you tried loading a 200 MB file into memory? I have, using Microsoft's SQL XML tools, and the error messages you get when it croaks can be somewhat less than helpful!
Microsoft does offer a SAX parser for XML; the SAX parser API does not offer the context awareness that DOM does, but it works on big files. You can't ask a SAX parser "what was the prior element of this one?", but you can add that context awareness yourself through the use of state and lag variables. I haven't used Microsoft's SAX parser; I just did a quick search and found somebody grousing about its poor performance too [http://forums.devshed.com/t184289/s.html; maybe they just didn't set up the calls right??]
My two cents is (are?): if you're going to work with a big XML (really big), you will have to use SAX; DOM just doesn't practically cut it...
Steve
logicom-inc..com