June 11, 2007 at 7:38 pm
How much do you want to bet that the mechanism for this passing of data will be XML-based?? HAH!!! Who cares if TDS or some other binary mechanism is 5 times more efficient?!?! :-))
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on Google's mail service
September 12, 2007 at 6:55 am
Nice article. Thanks for sharing this info with us.
September 13, 2007 at 2:15 am
Eric Wilson writes:
>Why not ASK MICROSOFT and other DBMS vendors to build in Table-Vars as
>fully fledged things?! Why are we stuck with XML work-arounds
>(which I too use) when a relational system should handle
>relational-freakin'-variables?
Eric and I speak the same language. Unfortunately, most SQL users do not understand the meaning of a table as a 'variable' (as opposed to an SQL table as a 'value') and its significant implications and benefits in database technology. I've attempted to illustrate such a database with table variables. If you'd like to learn about this and see the contrast with SQL, visit my blog:
http://www.beyondsql.blogspot.com
In particular see these articles:
http://beyondsql.blogspot.com/2007/09/dataphor-all-tables-are-typed-variables.html
http://beyondsql.blogspot.com/2007/06/dataphor-13-passing-table-as-parameter.html
http://beyondsql.blogspot.com/2007/08/dataphor-creating-super-function.html
September 20, 2007 at 3:39 pm
When tables are 'variables' you can create what I call a 'Super Function': pass any table of the appropriate 'type' to the function; it's that simple. The example was a function that returned a dense rank, including when the target column of the rank repeats. The function assumed a single primary key, and only the PK and the target column of the rank constituted the table passed to the function.
http://beyondsql.blogspot.com/2007/08/dataphor-creating-super-function.html
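In T-SQL terms, here's a rough analogue (hypothetical table and column names) of the result such a function produces, using SQL Server 2005's DENSE_RANK, where ties in the target column share a rank and no rank values are skipped:

-- A rough T-SQL 2005 analogue (hypothetical names) of the 'Super Function' result
SELECT PK,
       TargetCol,
       DENSE_RANK() OVER (ORDER BY TargetCol) AS DenseRank
FROM dbo.AnyTable

The difference in Dataphor is that the table itself is a typed variable, so one function body works for any table of that type.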
It's a simple matter to overload the function for tables that have a compound primary key, and you can envision a library of different functions based on table 'type'.
http://beyondsql.blogspot.com/2007/09/dataphor-super-function-ii.html
This concept is applicable universally, independent of any particular database.
http://www.beyondsql.blogspot.com
September 23, 2007 at 5:38 am
Very good article!
Is there any way to do a similar thing with SQL Server 2000?
Thanks.
September 23, 2007 at 7:25 am
Carlos Shoji (23/09/2007)
Very good article! Is there any way to do a similar thing with SQL Server 2000?
Thanks.
Hi Carlos,
You can do this in SQL Server 2000 too. SQL Server 2000 does not support XQuery or the XML data type, so you will have to pass the data as NVARCHAR or NTEXT and then use OPENXML to retrieve a resultset from the XML data.
You can find an example here:
http://www.sqlservercentral.com/articles/Miscellaneous/2908/
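A minimal sketch of the OPENXML approach, assuming hypothetical procedure, element, and column names:

-- Receive the XML as NTEXT and shred it with OPENXML (SQL Server 2000)
CREATE PROCEDURE dbo.SaveOrderItems
    @ItemData NTEXT
AS
BEGIN
    DECLARE @hDoc INT

    -- Parse the XML text into an internal document handle
    EXEC sp_xml_preparedocument @hDoc OUTPUT, @ItemData

    -- Flag 1 = attribute-centric mapping; shred the XML into a rowset
    SELECT ItemNumber, Qty, Price
    FROM OPENXML(@hDoc, '/Items/Item', 1)
    WITH (
        ItemNumber VARCHAR(20),
        Qty        INT,
        Price      MONEY
    )

    -- Release the handle to free the parsed document from memory
    EXEC sp_xml_removedocument @hDoc
END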
November 18, 2007 at 11:14 pm
What a great solution to a common problem. I prefer not to work with XML in SQL unless there is a very compelling need, but this is the best usage of it that I have seen. Very clever.
Thanks
Karen Gayda
MCP, MCSD, MCDBA
gaydaware.com
December 7, 2007 at 5:31 am
Scott Gammans (5/30/2007)
I don't think Jacob is the one suffering from a lack of knowledge, Sergiy. What happens when two different users simultaneously call the same stored procedure? Kaboom, that's what happens. And if you think that scenario is unlikely, you haven't worked on OLTP systems.
Nothing happens, that's what happens. A distinct temp table is created for every connection that uses the proc. Only a global temp table is shared by all the active connections.
Honestly, I don't see why passing a table to a stored procedure as an XML document would be preferred. Using temp tables is the way. I doubt that XML gives better performance, not to mention the hassle of dealing with XML query syntax.
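A minimal sketch of the temp-table approach (hypothetical names): each connection gets its own copy of a local #temp table, so two users calling the proc at the same time never see each other's rows.

-- The proc reads a local temp table created by the caller on the same connection
CREATE PROCEDURE dbo.ProcessItems
AS
BEGIN
    -- #Items is resolved per connection, so concurrent callers are isolated
    SELECT ItemNumber, Qty FROM #Items
END
GO
-- Caller side:
CREATE TABLE #Items (ItemNumber VARCHAR(20), Qty INT)
INSERT INTO #Items VALUES ('100', 5)
EXEC dbo.ProcessItems
DROP TABLE #Items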
Don't just give the hungry man a fish, teach him how to catch it as well.
the sqlist
December 7, 2007 at 6:39 am
Hi,
Good article. I have one question, though: what is the maximum number of records in a table that you have worked with using this approach?
My problem is that I may have more than 100,000 records and at least 20 concurrent users. Do you suggest this as a good approach in this case?
Thanks
December 7, 2007 at 6:58 am
I have not done a "MAXIMUM" test yet. I do not recommend this as an option for BULK INSERT/UPDATE operations; there are other utilities for that. This approach might be good for passing information between stored procedures in a normal OLTP application.
I think, rather than thinking in terms of MAX records, we should think in terms of MAX size. An XML variable can hold up to 2 GB of data. Depending upon the size of your record, the total number of records that you can store will vary.
You can find further information about MAX sizes here: http://msdn2.microsoft.com/en-us/library/ms143432.aspx
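As a rough sanity check (hypothetical element and attribute names), you can measure how many bytes one row costs in XML form and extrapolate against the 2 GB limit:

-- Measure the stored size of a one-row XML fragment (SQL Server 2005)
DECLARE @x XML
SET @x = '<Item ItemNumber="100" Qty="5" Price="9.99" />'
SELECT DATALENGTH(@x) AS BytesPerRow
-- 2 GB divided by the bytes-per-row gives a rough upper bound on row count.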
December 7, 2007 at 8:20 am
Temp tables aren't necessarily better, even local (to the connection) ones. The reason is that tempdb can be a source of contention, and you are guaranteeing disk access. If you send in an XML document, especially in 2005, there's a chance it will be parsed and exist only in memory.
As the data sets grow larger (and you'd have to test what "large" is on your system), I would lean more towards temp tables over time.
December 7, 2007 at 10:48 am
You can pass table-valued parameters in SQL 2008...
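A minimal sketch of that feature, assuming hypothetical type and procedure names (SQL Server 2008 syntax):

-- Define a table type, then declare a READONLY parameter of that type
CREATE TYPE dbo.ItemList AS TABLE
(
    ItemNumber VARCHAR(20) PRIMARY KEY,
    Qty        INT
)
GO
CREATE PROCEDURE dbo.SaveItems
    @Items dbo.ItemList READONLY   -- table-valued parameters must be READONLY
AS
BEGIN
    SELECT ItemNumber, Qty FROM @Items
END
GO
-- Caller side:
DECLARE @t dbo.ItemList
INSERT INTO @t VALUES ('100', 5), ('101', 2)
EXEC dbo.SaveItems @Items = @t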
December 7, 2007 at 2:02 pm
The reason is tempdb can be a source of contention and you are guaranteeing disk access. If you send in an XML document, especially in 2005, there's a chance it will parse and exist only in memory.
Temp tables are created in memory. Only if a temp table becomes too big to fit in memory is it written to disk.
So, no advantage for XML here.
But there is a disadvantage: XML will typically take 4 times more memory than a temp table holding the same data.
_____________
Code for TallyGenerator
December 7, 2007 at 2:16 pm
In SQL 2005, it has been my experience that parsing an XML data type using XQuery is more resource-intensive than other methods.
December 7, 2007 at 2:59 pm
I have been doing this for a long time.
But I have also been passing control state with XML, and responding with business logic back from T-SQL, including naming events. In fact, I began to drop recordsets and use only XML, because it ties together control state, event handling, and data, rather like object streaming!