October 8, 2009 at 4:57 pm
Comments posted to this topic are about the item Using XML to pass and return values to procs
Note that in the article the query to return the xml in "highly nested" format is missing the actual FROM clauses that will cause it to nest properly. As-is, each new response or request would generate a new entity parent.
Below is the SQL to nest properly (so multiple requests and responses for the same entity appear under that one entity).
Because the denormalized temp table (which acts as the source for this query) will have been previously sorted in a variety of ways, I added a sort_seq_nbr column to it, and then I have this final query order by the sort_seq_nbr (otherwise the FOR XML will re-order the data in its own way).
(Adding the sort_seq_nbr to the temp table uses another new feature of SQL 2005: we define the column as
ROW_NUMBER() OVER (ORDER BY ~) AS sort_seq_nbr
where ~ matches the contents of the ORDER BY clause exactly. This gives us a sequentially numbered column in our sorted order.)
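For illustration, a minimal sketch of adding that column while building the temp table (the source table name and the ORDER BY columns here are hypothetical):

```sql
-- Hypothetical sketch: number the rows in the sort order we want FOR XML
-- to preserve, so the final query can ORDER BY sort_seq_nbr.
SELECT  entity_ver_id,
        req_id,
        resp_id,
        ROW_NUMBER() OVER (ORDER BY entity_ver_id, req_id, resp_id) AS sort_seq_nbr
INTO    #myInfo
FROM    dbo.SomeDenormalizedSource;   -- hypothetical source table
```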
DECLARE @rslt NVARCHAR(MAX);
SELECT @rslt = CONVERT(NVARCHAR(MAX), x.rslt)
FROM
(
    SELECT
        'A' 'status',
        MAX(MSG.id) 'msg/id',
        MAX(MSG.[TEXT]) 'msg/text',
        (SELECT
            entity_ver_id 'entity/@id',
            ENTITY_CAT_ID 'entity/cat',
            client_id 'entity/client_id',
            ver_name 'entity/entity_name',
            ver_desc 'entity/entity_desc',
            create_date 'entity/entity_date',
            created_by_user_pin 'entity/entity_creator',
            entity_ver_id 'entity/entity_ver',
            entity_amt 'entity/entity_amt',
            (SELECT
                req_id 'request/@id',
                req_date 'request/req_date',
                amt 'request/req_amt',
                priority_code 'request/req_priority',
                Entr_by_pin 'request/entr_by_id',
                entr_by_name 'request/entr_by_name',
                Req_by_pin 'request/requestor_id',
                requestor_name 'request/requestor_name',
                [status] 'request/req_current_status',
                status_date 'request/req_current_status_date',
                req_cmnt 'request/req_cmnt',
                (SELECT
                    resp_id 'response/@id',
                    resp_date 'response/resp_date',
                    Resp_user_pin 'response/resp_user_id',
                    respondent_name 'response/resp_name',
                    resp_type 'response/resp_type',
                    final_approver 'response/final_approver',
                    resp_status 'response/resp_status',
                    (SELECT
                        CASE WHEN RESP_USER_PIN IS NULL
                               OR COALESCE(resp_delegator_user_pin, RESP_USER_PIN) = RESP_USER_PIN
                             THEN 'N'
                             ELSE 'Y'
                        END 'delegated_ind',
                        CASE WHEN RESP_USER_PIN IS NULL
                               OR COALESCE(resp_delegator_user_pin, RESP_USER_PIN) = RESP_USER_PIN
                             THEN CAST(NULL AS VARCHAR(30))
                             ELSE resp_delegator_user_pin
                        END 'delegator_pin',
                        CASE WHEN RESP_USER_PIN IS NULL
                               OR COALESCE(resp_delegator_user_pin, RESP_USER_PIN) = RESP_USER_PIN
                             THEN CAST(NULL AS VARCHAR(30))
                             ELSE Delegator_name
                        END 'delegator_name'
                     FROM #myInfo delgt
                     WHERE delgt.entity_ver_id = topLvl.entity_ver_id
                       AND delgt.req_id = reqLvl.req_id
                       AND delgt.resp_id = respLvl.resp_id
                     ORDER BY sort_seq_nbr
                     FOR XML PATH(''), TYPE
                    ) 'response/delegator'
                 FROM #myInfo respLvl
                 WHERE respLvl.entity_ver_id = topLvl.entity_ver_id
                   AND respLvl.req_id = reqLvl.req_id
                 ORDER BY sort_seq_nbr
                 FOR XML PATH(''), TYPE
                ) 'request/responses'
             FROM #myInfo reqLvl
             WHERE reqLvl.entity_ver_id = topLvl.entity_ver_id
             GROUP BY req_id, req_date, amt, priority_code, Entr_by_pin,
                      entr_by_name, Req_by_pin, requestor_name, [status],
                      status_date, req_cmnt
             ORDER BY MAX(sort_seq_nbr)
             FOR XML PATH(''), TYPE
            ) 'entity/requests'
         FROM #myInfo topLvl
         GROUP BY entity_ver_id, ENTITY_CAT_ID, client_id, ver_name,
                  ver_desc, create_date, created_by_user_pin, entity_amt
         ORDER BY MAX(sort_seq_nbr)
         FOR XML PATH(''), TYPE
        ) 'entities'
    FROM (SELECT '900004' AS id, 'Request Accepted' AS [TEXT]) MSG
    FOR XML PATH(''), ROOT('doc'), TYPE
) x(rslt);
SELECT @rslt AS Xml_Result;
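For illustration only, the nested result takes roughly this shape (element and attribute names come from the aliases above; the values and the "..." placeholders are hypothetical):

```xml
<doc>
  <status>A</status>
  <msg><id>900004</id><text>Request Accepted</text></msg>
  <entities>
    <entity id="...">
      <cat>...</cat>
      <requests>
        <request id="...">
          <responses>
            <response id="...">
              <delegator><delegated_ind>N</delegated_ind></delegator>
            </response>
          </responses>
        </request>
      </requests>
    </entity>
  </entities>
</doc>
```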
October 18, 2009 at 11:19 pm
Steve,
Thank you for your article. I would like to add my 2 cents:
- the xml data type should be strongly typed (i.e. the client has to pass not just well-formed XML but XML that conforms to a particular schema; each xml parameter, in or out, could have a different schema);
- there is an alternative to the xml data type in SQL Server 2008 called Table-Valued Parameters.
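To illustrate the second point, here is a minimal, hypothetical sketch of a Table-Valued Parameter (SQL Server 2008+; the type, procedure, and column names are mine, not from the article):

```sql
-- Hypothetical sketch of a Table-Valued Parameter.
CREATE TYPE dbo.RequestList AS TABLE
(
    req_id  INT           NOT NULL,
    req_amt DECIMAL(12,2) NULL
);
GO
CREATE PROCEDURE dbo.usp_SaveRequests
    @requests dbo.RequestList READONLY   -- TVPs must be declared READONLY
AS
BEGIN
    SELECT req_id, req_amt FROM @requests;
END;
GO
-- Caller side: declare, fill, and pass the table variable.
DECLARE @r dbo.RequestList;
INSERT @r (req_id, req_amt) VALUES (1, 100.00), (2, 250.50);
EXEC dbo.usp_SaveRequests @requests = @r;
```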
Cheers
Regards from Down Under
October 19, 2009 at 4:53 am
Hi
I have actually been doing this for years, but going much further because of my C++ background, since XML integrates better with classes, including apparent inheritance.
It lets you control front-end business logic via control status, and even provide alternative front-end event handling.
The idea is to build the business logic into the back-end database.
What is excellent about using XML in this way is that you break away from the relational constraints (hiding them) and treat the data source as object streams.
It's very powerful and, for me, proven. The idea is to send and receive XML in both directions. Forming the XML at the server end requires work, but the result is awesome because it is as if the server were an object database.
Best regards
Roland
October 19, 2009 at 5:59 am
Great article - thank you!
October 19, 2009 at 6:56 am
We've successfully been using XML as a data transport mechanism with our .Net web application. However, things are a bit cleaner in that we can pass native XML (datatype XML) back and forth without having to convert to NVARCHAR.
One comment: Your use of OPENXML in a web application is putting a lot of pressure on your system.
See: Stop Using OPENXML
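For comparison, a minimal sketch of shredding a native xml parameter with the nodes() method instead of sp_xml_preparedocument/OPENXML (the element names and values here are hypothetical):

```sql
-- Hypothetical sketch: shred an xml variable with nodes() rather than OPENXML.
DECLARE @xml XML = N'<doc>
  <request id="1"><req_amt>100.00</req_amt></request>
  <request id="2"><req_amt>250.50</req_amt></request>
</doc>';

SELECT  r.n.value('@id', 'INT')                     AS req_id,
        r.n.value('(req_amt)[1]', 'DECIMAL(12,2)')  AS req_amt
FROM    @xml.nodes('/doc/request') AS r(n);
```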
October 19, 2009 at 8:37 am
It is an interesting twist on a popular technique. The only point where I was a bit doubtful was to use EXECUTE sp_xml_preparedocument to get the edge table. Do you have any test data on the time it takes and the memory overhead (Temporary, I know!). It is possible to do this in TSQL, though not at all pleasant!
Best wishes,
Phil Factor
October 19, 2009 at 11:15 am
Yes, it comes with a lot of flexibility for smaller xml documents. However, it consumes a lot of cpu cycles to read the xml data as SQL is not natively optimized for XML.
We have high CPU usage under high volume transactions, causing slow response times.
October 19, 2009 at 12:38 pm
The XML that is input to this is usually small: a couple of dozen lines. It takes 6 milliseconds to populate the @xmlEdgeTbl via OPENXML.
When I try this with 2,607 lines of XML (more than we would ever process) it takes 474 milliseconds. These are acceptable times for our purposes.
Please see the warning from Mauve, above, regarding its allocation of 1/8 of the available memory.
October 21, 2009 at 1:33 am
Steve McRoberts-357330 (10/19/2009)
Please see the warning from Mauve, above, regarding its allocation of 1/8 of the available memory.
Two points, one quick, and one not so quick:
1. (The quick one) Service Broker uses OPENXML internally, not nodes(). I am not a fan of OPENXML for many reasons, but I just want to point out that it gets a bad press in general. If it's good enough for Service Broker...:-P
2. (The long one) OPENXML certainly does not automatically allocate one eighth of server memory. The referenced MS article is extremely badly worded. Let me reproduce the answers I gave a few months ago at http://www.sqlservercentral.com/Forums/Topic757986-640-1.aspx:
me
I think this is the thing that causes the confusion: sp_xml_preparedocument. So that's just saying that when you use the XML parser to open a document for use with OPENXML, up to one-eighth of SQL Server memory is potentially available to Msxmlsql.dll - should it need it.
This is the first MSFT answer I found with Google to back that up: MSDN Forums Thread.
My impression (though the method has so far escaped deprecation) is that XML DML methods/XQuery/XPath/FOR XML and the gang are to be preferred to using sp_xml_preparedocument and OPENXML wherever possible. There are many difficulties associated with OPENXML so personally I avoid it.
To be clear, all that applies specifically to prepare/remove document and OPENXML. XML variables, columns and so on allocate memory in the usual fashion. That applies to the XML portion of the query optimizer too. One interesting thing about xml variables (like all the MAX datatypes when used as variables) is that they require tempdb workspace.
Many of the XML features use streaming interfaces to minimize memory consumption - bulk loading XML is a good example of this. See XML Support in Microsoft SQL Server 2005
me again
Yes, the wording could be better. I refreshed my memory on the detail last night (one of Ken Henderson's excellent books), so I just want to add: SQL Server uses MSXML and the DOM to process documents you load via sp_xml_preparedocument. It restricts the virtual memory MSXML can use for DOM processing to one-eighth of the physical memory on the machine or 500MB, whichever is less.
It's unlikely that MSXML would be able to get 500MB of virtual memory since SQL Server reserves most of the user mode address space for use by the buffer cache. MSXML allocates memory via the multi-page allocator (the so-called MemToLeave region) which has a default maximum size of 256MB. Since that region is used for all sorts of stuff (by SQL Server as well) it's likely to be much less than that.
MSXML is limited to 500MB of virtual memory regardless of the amount of memory on the machine because SQL Server calls the GlobalMemoryStatus Win32 API function to determine the amount of available physical memory. On machines with more than 4GB of physical memory, GlobalMemoryStatus can return incorrect information, so Windows returns a -1 to indicate an overflow. The Win32 API function GlobalMemoryStatusEx exists to address this shortcoming, but SQLXML does not call it. :laugh:
One last thing that I had forgotten - MSXML had an ordinal-only DLL entry point added specifically for SQL Server to allow it to set the maximum virtual memory to use.
Cheers,
Paul
October 21, 2009 at 1:41 am
Anil-465177 (10/19/2009)
However, it consumes a lot of cpu cycles to read the xml data as SQL is not natively optimized for XML.
Nonsense. SQL Server contains many optimizations for generating and consuming XML. A well-designed system can process XML extremely efficiently. I seem to keep mentioning Service Broker recently, but it is an excellent example of how well an XML-message-based system can scale out and up.
Anil-465177 (10/19/2009)
We have high CPU usage under high volume transactions, causing slow response times.
No doubt. 😉
October 21, 2009 at 2:01 am
Hey Steve,
I just want to say how much I enjoyed reading your article today (even if I don't necessarily agree with all the content!)
I have the greatest admiration for anyone that can dedicate so much personal time and effort to produce such an interesting and well-presented article, with the obvious intent of sharing a creative and thought-provoking idea with the masses.
So, I just wanted to say "well done", and don't pay too much attention to the 'drive-by' posters.
Paul
October 21, 2009 at 5:12 am
Yes, I enjoyed the article too. It has really got me thinking.
Is there an alternative to getting the 'Edge table' other than ...
EXECUTE sp_xml_preparedocument @hdl OUTPUT, @xml;
INSERT INTO @xmlEdgeTbl
SELECT *
FROM OPENXML(@hdl, N'/doc');
EXECUTE sp_xml_removedocument @hdl;
...?
If it is only in the region of 6 ms and isn't a memory hog, then it looks like a good way of doing it.
Best wishes,
Phil Factor
October 21, 2009 at 6:33 am
Thanks, Paul; you made my day!
Phil: I also started looking for a replacement for OPENXML. One suggestion was to use nodes(), but in our system we have a variety of XML formats coming in for various requests, and they have different tags. The nice thing about OPENXML is that you don't need to know the tag names in your XML ahead of time; it gives them to you in the edge table. So, I second Phil's question: does anyone know of a better alternative for producing the edge table?
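One possible direction, sketched here with hypothetical data: nodes('//*') combined with local-name() can list every element without knowing the tag names in advance, though rebuilding the parent/child links of a true edge table is the awkward part that this sketch does not solve:

```sql
-- Hypothetical sketch: list all elements and their text without knowing
-- the tag names up front. This is not a full edge table (no parent ids).
DECLARE @xml XML = N'<doc><entity><name>Widget</name><amt>42</amt></entity></doc>';

SELECT  t.n.value('local-name(.)', 'SYSNAME')    AS localname,
        t.n.value('text()[1]', 'NVARCHAR(MAX)')  AS text_value
FROM    @xml.nodes('//*') AS t(n);
```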
October 21, 2009 at 7:34 am
irozenberg (10/18/2009)
Steve, thank you for your article. I would like to add my 2 cents:
- xml data-type should be and must be strongly-typed (i.e. client has to pass not just well-formed XML but also XML with a particular schema, each xml parameter in or out could have different schema);
- there is an alternative to xml data type in SQL Server 2008 it called Table-Valued Parameters.
Cheers
Regards from Down Under
Table-valued parameters miss out on a number of the XML advantages. A flexible data structure is one of them. Also, web services are pretty much ubiquitous these days, and the XML for those fits right into the pattern in the article of accepting and returning XML. That's got all the usual consistency advantages.
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
October 21, 2009 at 8:27 am
It's an interesting chat.
My use of XML here has been to enable the server to apply various, or even alternative, business-rules logic to an application, even to the extent of changing event handling. As I say, the XML can be used as an object stream and interface with back-end relational data.
Another use is to cut out the need for an SSIS layer altogether.