February 7, 2022 at 6:39 pm
Hello,
We currently have a stored procedure that processes an XML document stored in a table.
It uses EXEC sp_xml_preparedocument @idoc OUTPUT, then loops through the document, runs OPENXML queries against @idoc, assigns variables from the XML data, and inserts the results into several different tables.
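Roughly, the pattern looks like this (object, element, and column names are changed for illustration):

DECLARE @idoc INT,
        @xmlDoc NVARCHAR(MAX);

SELECT @xmlDoc = CONVERT(NVARCHAR(MAX), XmlData)
FROM dbo.XmlStaging
WHERE Id = 1;

-- Parse the document and get a handle to the internal representation
EXEC sp_xml_preparedocument @idoc OUTPUT, @xmlDoc;

-- Shred it with OPENXML (flags = 2 for element-centric mapping)
SELECT OrderId, CustomerName
FROM OPENXML(@idoc, '/Orders/Order', 2)
WITH (
    OrderId      INT           'Id',
    CustomerName NVARCHAR(100) 'Customer'
);

-- Release the handle, or the parsed document stays in memory
EXEC sp_xml_removedocument @idoc;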
This worked just fine before on small files (around 6 KB), but with a new client we will be getting much bigger files to process.
Is there a more efficient way to process larger XML files instead of using OPENXML with sp_xml_preparedocument?
Thanks,
Daniel
February 7, 2022 at 6:50 pm
Have you tried importing the file into a table containing a column of the XML data type and then querying that? I have no idea whether it will be faster, but it might be worth a test.
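Something like this, as a rough sketch (the table, element, and path names are invented, so adjust them to your structure):

SELECT
    o.n.value('(Id)[1]',       'INT')           AS OrderId,
    o.n.value('(Customer)[1]', 'NVARCHAR(100)') AS CustomerName
FROM dbo.XmlStaging AS s
CROSS APPLY s.XmlData.nodes('/Orders/Order') AS o(n);  -- XmlData is a column of the XML data type

With the XML data type you skip the sp_xml_preparedocument / sp_xml_removedocument handle management entirely, and adding a primary XML index on the column is another thing that might be worth testing on the bigger files.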
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
February 7, 2022 at 7:17 pm
We currently do insert the XML file into a table, into a column of the XML data type.
February 7, 2022 at 7:54 pm
Instead of shoving it into an XML field, can you parse it into a real table or 3?
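A rough sketch of what I mean, reusing the made-up names from earlier in the thread (swap in your real tables and paths):

-- Shred the XML once at load time into a persisted staging table,
-- so the business-rule queries run against relational data, not XML
INSERT INTO dbo.OrderStaging (OrderId, CustomerName)
SELECT
    o.n.value('(Id)[1]',       'INT'),
    o.n.value('(Customer)[1]', 'NVARCHAR(100)')
FROM dbo.XmlStaging AS s
CROSS APPLY s.XmlData.nodes('/Orders/Order') AS o(n);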
For better, quicker answers, click on the following...
http://www.sqlservercentral.com/articles/Best+Practices/61537/
For better answers on performance questions, click on the following...
http://www.sqlservercentral.com/articles/SQLServerCentral/66909/
February 7, 2022 at 8:11 pm
Hello, that could be an option. The current process is to bring the XML into a SQL table first and then parse it into our data model with some business logic/rules. Simple queries using OPENXML that previously took milliseconds are now taking close to a minute because of the size of the XML.
I was trying to see if there are alternative ways to help process it.
February 7, 2022 at 9:01 pm
Why not move the parsing out of SQL? Parse the raw files in something else first (PowerShell, C#, whatever), dump them into the table structure you need, and then load from those?
February 17, 2022 at 6:28 am
Can you tell me another way to process XML files of greater than 2 MB? I tried your way, but still ran into a problem: my file showed an error during processing. I also asked for help from a friend taking CompTIA Core training courses, but he said he is new to this field and does not know how to process the file. That is why I am here asking for help; this file is very important to me and I cannot afford to lose it, so I am looking for a professional to do it. If you have another way to do it, kindly contact me.
February 17, 2022 at 8:40 am
If you want help resolving an error, you need to tell us what the error was and the conditions under which it occurs.
Also, it's unclear which post you are responding to, so we don't know exactly what 'your way' refers to.
There are only so many ways of parsing XML. If you've already tried PowerShell, SQL Server and C# and they're all slow, it might be down to the complexity of the file.
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin