March 12, 2007 at 9:22 am
Comments posted here are about the content posted at http://www.sqlservercentral.com/columnists/jSebastian/2908.asp
March 26, 2007 at 9:15 pm
This is a very good article for me, a .NET developer of 5 years, who does less database work and more application development than anything else. The sample code is clean and concise, with just enough detail to be clear and useful. Thanks again -
March 26, 2007 at 9:55 pm
I notice a lot of people still using OPENXML to shred XML documents. I've only done light testing, but it appears there are definite resource advantages (much less memory), and there might even be some speed advantages, if you use the built-in XML data type and its .nodes() method instead of OPENXML to shred XML documents.
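For comparison, a minimal sketch of the .nodes() approach (SQL Server 2005 and later; the @orderXml document and its Item attributes are made up for illustration):

```sql
-- Shredding with the built-in xml type: no sp_xml_preparedocument /
-- sp_xml_removedocument pair, and no MSXML document handle to leak.
DECLARE @orderXml xml
SET @orderXml = '<Order>
                   <Item ItemNumber="1001" Qty="2"/>
                   <Item ItemNumber="1002" Qty="5"/>
                 </Order>'

SELECT
    x.Item.value('@ItemNumber', 'int') AS ItemNumber,
    x.Item.value('@Qty', 'int')        AS Qty
FROM @orderXml.nodes('/Order/Item') AS x(Item)
```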
March 26, 2007 at 10:49 pm
All well and good, but a lot of applications will need to actually retrieve data from the database for each line - like codes, costs, discounts, etc. as each line is used - perhaps that is why a lot of apps hit the DB each time a line is required?
March 26, 2007 at 11:39 pm
Hi Mike,
This article presents a solution that works with SQL Server 2000 specifically. Towards the end of this series, I have an article that presents the same with the XML data type of SQL Server 2005.
Thankx
Jacob
March 26, 2007 at 11:46 pm
Hi mike,
I agree with you. There are many cases where you still need to hit the DB many times. But I suppose most of those scenarios can be avoided if we re-design the process a little bit. I have seen this a few times in the past.
Do you have a scenario in your mind where it is absolutely necessary to hit the DB for each item of an entry?
Thankx
Jacob
March 27, 2007 at 2:08 am
I don't understand why you do half of it with passed variables and the other half with XML. For consistency, and also to make the XML meaningful on its own, it would make sense to pass the order number, date, and customer number in with the rest of the XML.
March 27, 2007 at 3:39 am
Hi Confucius247,
I agree with you. Actually this article is part of a series, which demonstrates more and more features with each installment. The purpose of the first installment is to present a very simple usage. The second part of this series does use a single XML parameter which contains the order header information as well as item details. It also demonstrates the use of XML attributes.
Thankx
Jacob
March 27, 2007 at 7:16 am
Why would it matter if you send it all in XML or just the line items if you are not storing it as XML?
March 27, 2007 at 7:52 am
Other than using XML, have you tried generating a flat file to a shared folder that SQL Server can get to? Then use either OPENROWSET or BULK INSERT to load the data into a staging table where you can look up prices, validate data, etc. From there, you can move valid records into the final table and display errors for any records left behind in the staging table.
By using a staging table, you can do bulk updates on the records. For example:
update <staging table> set
    unitcost = i.unitcost,
    ...
from <staging table> s
join items i on s.itemID = i.itemID
where s.poID = @someID
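The bulk-load step of that flow might look roughly like this (the file path, format options, and staging table name are all assumptions, not from the article):

```sql
-- Hypothetical sketch: bulk-load a flat file from a shared folder
-- into a staging table, then validate before moving rows onward.
BULK INSERT StagingOrderItems
FROM '\\appserver\share\order_items.csv'
WITH (
    FIELDTERMINATOR = ',',   -- comma-delimited file
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2      -- skip the header row
)
```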
March 27, 2007 at 7:53 am
I don't think the purpose of this particular use of XML is to make it meaningful. It is used as a method to transport data.
For the purposes of this demo, the author takes a clean approach and concisely shows how to accomplish the task. I would object to taking this approach in the real world, however.
It is a small point, but worthy of consideration IMHO: if the insert fails in such a way that the sproc aborts, you leave an open XML document with no handle available. If this happens enough times you will exhaust the "MEMTOLEAVE" area and be forced to restart your SQL Server. I would load the XML into a temporary table, destroy the XML document, and then perform the insert, perhaps after validating the data.
This is a good article and it demonstrates a reasonable use of XML in the database.
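A rough sketch of that safer ordering, with made-up table and column names (shred into a temp table, release the handle, then insert):

```sql
-- Sketch only: shred the XML into a temp table first, release the
-- document handle immediately, and only then run the insert, so an
-- aborted insert can no longer strand an open XML document.
DECLARE @hDoc int
DECLARE @xmlString varchar(8000)
SET @xmlString = '<Order><Item ItemNumber="1001" Qty="2"/></Order>'

EXEC sp_xml_preparedocument @hDoc OUTPUT, @xmlString

SELECT ItemNumber, Qty
INTO #OrderItems
FROM OPENXML(@hDoc, '/Order/Item', 1)   -- flag 1: attribute-centric mapping
     WITH (ItemNumber int, Qty int)

-- Handle is freed before any insert has a chance to fail
EXEC sp_xml_removedocument @hDoc

INSERT INTO OrderDetail (ItemNumber, Qty)
SELECT ItemNumber, Qty FROM #OrderItems
```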
March 27, 2007 at 8:16 am
I think the article was clear and very much to the point, but besides the advantages of using this technique, the drawbacks should also be pointed out.
I have had memory issues with OPENXML in the past, even though it seems like a nice tool for the job. The MSXML parser is a memory-intensive component that allocates outside the buffer pool, and you should be really careful in highly transactional environments.
* Noel
March 27, 2007 at 8:24 am
As I mentioned earlier, the last part of this four-part article has an error handling example, where the XML handle is released if an error occurs.
March 27, 2007 at 8:28 am
Hi jacob,
That's what I get for attempting comprehension before the coffee kicks in.
March 27, 2007 at 8:38 am
🙂