August 11, 2009 at 4:56 pm
Comments posted to this topic are about the item Sending multiple rows to the Database from an Application: Part II
August 18, 2009 at 3:34 pm
How can I use a WHERE clause?
August 19, 2009 at 2:18 am
I'm sorry... I can't see any benefit, ever, from sending multiple rows of data to a database. The mechanisms of transactional behaviour on the client side, or destaging mechanisms where transactions are the problem, should negate the need for ever doing this. I just think it's bad practice.
August 19, 2009 at 2:50 am
aay (8/18/2009)
How can I use a WHERE clause?
If I understand correctly, you are asking how to put a WHERE clause in the stored procedure. You can filter on the shredded XML values directly:
WHERE T.Item.value('@Salary', 'INT') < 10
This filters to records whose Salary is less than 10.
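In context, a minimal sketch of the full query, assuming the article's attribute-centric XML shape (the variable name and sample values here are illustrative):

```sql
-- Sketch only: assumes an XML parameter shaped like
-- <ROWS><ROW Name="..." Salary="..." /> ... </ROWS>
DECLARE @xmlRows XML
SET @xmlRows = N'<ROWS>
  <ROW Name="Richard" Salary="1100" />
  <ROW Name="Cliff" Salary="5" />
</ROWS>'

SELECT T.Item.value('@Name',   'VARCHAR(50)') AS Name,
       T.Item.value('@Salary', 'INT')         AS Salary
FROM   @xmlRows.nodes('/ROWS/ROW') AS T(Item)
WHERE  T.Item.value('@Salary', 'INT') < 10   -- only Cliff qualifies here
```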
August 19, 2009 at 2:56 am
conradr (8/19/2009)
I'm sorry... I can't see any benefit, ever, from sending multiple rows of data to a database. The mechanisms of transactional behaviour on the client side, or destaging mechanisms where transactions are the problem, should negate the need for ever doing this. I just think it's bad practice.
Whether or not to use this kind of approach (bulk data manipulation) will always be a matter of debate.
In my application it is required. Moreover, that application is based on a cross-platform distributed architecture, and our end users sometimes work in disconnected mode as well. So we came up with this approach.
August 25, 2009 at 7:49 am
I prefer OPENXML. I think it is easy to then use WHERE, JOIN, and ORDER BY clauses. Using the data in the article, it would look like this:
Declare @xml XML, @idoc int
SET @xml = N'<ROWS>
<ROW Name="Richard" Salary="1100" />
<ROW Name="Cliff" Salary="1200" />
<ROW Name="Donna1" Salary="3000" />
<ROW Name="Ann" Salary="1500" />
</ROWS>'
EXEC sp_xml_preparedocument @idoc OUTPUT, @xml
SELECT Name, Salary FROM OPENXML (@idoc, '/ROWS/ROW', 1)
WITH (Name varchar(50), Salary int)
WHERE Salary >= 1500
ORDER BY Name
August 25, 2009 at 7:51 am
Nice article(s). I'd be interested in seeing a 3rd installment that compares performance between delimited strings and XML; you might also want to include an OPENXML implementation.
Jack Corbett
Consultant - Straight Path Solutions
Check out these links on how to get faster and more accurate answers:
Forum Etiquette: How to post data/code on a forum to get the best help
Need an Answer? Actually, No ... You Need a Question
August 25, 2009 at 7:59 am
In the theoretical sense, it should not be necessary to insert multiple rows in one command (as stated above). But in the practical sense, we have had to do this. The number of round trips to the database (for us) greatly affected performance.
Our technique (for MS SQL Server and Oracle) is not discussed here. We generate Insert statements and concatenate them into a string and execute them using an ADO.NET command object. It reduces the number of round trips to one--and it works.
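A rough sketch of what that concatenated batch might look like (the table and column names are made up for illustration; in ADO.NET the whole string would become the CommandText of a single command object):

```sql
-- One round trip: several INSERT statements concatenated into one batch.
-- Each statement is generated by the application and appended to a string.
INSERT INTO Employee (Name, Salary) VALUES ('Richard', 1100);
INSERT INTO Employee (Name, Salary) VALUES ('Cliff', 1200);
INSERT INTO Employee (Name, Salary) VALUES ('Ann', 1500);
```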
August 25, 2009 at 12:12 pm
lafleurh (8/25/2009)
In the theoretical sense, it should not be necessary to insert multiple rows in one command (as stated above). But in the practical sense, we have had to do this. The number of round trips to the database (for us) greatly affected performance.Our technique (for MS SQL Server and Oracle) is not discussed here. We generate Insert statements and concatenate them into a string and execute them using an ADO.NET command object. It reduces the number of round trips to one--and it works.
Sure that will work, but it also opens up the server (SQL Server and Oracle) for SQL Injection. What are you doing to reduce that risk?
Jack Corbett
Consultant - Straight Path Solutions
Check out these links on how to get faster and more accurate answers:
Forum Etiquette: How to post data/code on a forum to get the best help
Need an Answer? Actually, No ... You Need a Question
August 25, 2009 at 1:01 pm
We are using the xml technique mentioned in this article in our system and it works like a charm.
August 25, 2009 at 1:45 pm
>Since, delimited string would be passed as a string data type, so limitation of size of string data supported in SQL Server 2005 comes into picture (4000 / 8000 characters max including the delimiter character).
I'm not sure that is true. We use VARCHAR(MAX) to pass comma-delimited strings; I believe these can hold up to 2 GB.
August 25, 2009 at 4:28 pm
If objects are secured properly and authentication mechanisms are secure, then SQL injection should not be an issue. Remember that you can send any command over ADO.NET as long as the authenticated user has permission to perform that command. Since the code is controlling the SQL that is generated and sent to the server, you have to take measures to make sure the code is secure (apostrophes are escaped, etc.).
This is the same issue if you are using something like NHibernate--it needs CRUD permissions on all tables--versus using stored procedures for all access while locking out table access.
SQL injection is an issue when you are generating DML, so you just have to take proper measures to deal with it and then thoroughly test for known injection methods. This should be part of your coding/testing to start off with. Bunching commands together in one string vs. individual strings is the only difference.
I'm not saying this is the best way to do things, and there are a lot of things that can go wrong if you don't anticipate problems (i.e. injection), but it is another technique.
August 25, 2009 at 5:35 pm
there is another choice:
Update SQL Server Data by Using XML Updategrams
An updategram is a data structure that you can use to express a change in the data. INSERT, UPDATE, and DELETE commands are represented in an updategram by the difference of the image of the data before and the image of the data after a change.
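A minimal updategram sketch (the table and attribute names are illustrative, not from the article): the row in the before block identifies the existing data, and the row in the after block describes the desired state. An empty before block expresses an INSERT; an empty after block expresses a DELETE.

```xml
<ROOT xmlns:updg="urn:schemas-microsoft-com:xml-updategram">
  <updg:sync>
    <updg:before>
      <Employee Name="Cliff" Salary="1200" />
    </updg:before>
    <updg:after>
      <Employee Name="Cliff" Salary="1300" />
    </updg:after>
  </updg:sync>
</ROOT>
```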
August 25, 2009 at 7:54 pm
Very insightful article!
Any idea if UDT/CLR types can be used?
Say I have AddressInfo User Defined Type and table like this:
CREATE TABLE [AddressBook](
[ID] [int] IDENTITY(1,1) NOT NULL,
[LastName] [varchar](25) NOT NULL,
[FirstName] [varchar](25) NOT NULL,
[Address] [dbo].[AddressInfo] NULL,
)
XML that I am trying to insert looks like this:
<AddressBook>
  <Contact>
    <LastName>DOE</LastName>
    <FirstName>JOHN</FirstName>
    <Address>123 STREET CITY NAME STATE NAME</Address>
  </Contact>
</AddressBook>
I tried this stored procedure:
CREATE PROCEDURE [dbo].[P_AddressBook_Insert]
(
@xmlAddressBook XML
)
AS
BEGIN
SET NOCOUNT OFF
INSERT
INTO [AddressBook]
([LastName], [FirstName],[Address])
SELECT
T.Item.query('./LastName').value('.', 'VARCHAR(25)') [LastName],
T.Item.query('./FirstName').value('.', 'VARCHAR(25)') [FirstName],
T.Item.query('./Address').value('.', 'AddressInfo') [Address]
FROM @xmlAddressBook.nodes('/AddressBook/Contact') AS T(Item)
END
However, I am not able to use AddressInfo UDT because it's not one of SQL Server built-in types.
I tried dbo.AddressInfo without single quotes, and it wouldn't take that either.
I also tried to use OpenXML for this and got the error message "CLR types cannot be used in an OpenXML WITH clause".
Please let me know if there is a syntax that I am missing or there is just no way to use UDTs with XML inserts.
Thank you!
August 25, 2009 at 8:50 pm
Need to pass as many delimited strings as the number of parameters.
Not true... please consider the following...
http://www.sqlservercentral.com/articles/T-SQL/63003/
Since, delimited string would be passed as a string data type, so limitation of size of string data supported in SQL Server 2005 comes into picture (4000 / 8000 characters max including the delimiter character).
Definitely not true. Please check Books Online for VARCHAR(MAX) and NVARCHAR(MAX). It's not even true in SQL Server 2000 or SQL Server 7 if you don't mind working with the TEXT or NTEXT datatypes which can be passed as parameter in a stored procedure. There are also a fair number of fairly simple splitters for TEXT and NTEXT.
I would also like to see a performance and resource usage comparison between passing delimited strings and XML strings. It might also be interesting to take a look at what very few people take pause to look at... the impact on the "pipe" and resources. For example...
XML String:
<ROWS><ROW Name="Richard" Salary="1100" /><ROW Name="Cliff" Salary="1200" /><ROW Name="Donna1" Salary="3000" /><ROW Name="Ann" Salary="1500" /></ROWS>
According to MS-Word, that bit of XML is 151 characters including spaces.
Delimited String:
Richard,1100|Cliff,1200|Donna1,3000|Ann,1500|
According to MS-Word, that bit of delimited computational heaven is only 45 characters including the delimiters.
Let's do a quick bit of math... (151-45)/45 = ~2.356. Translation: You have to pass more than 2-1/3 times more data over the "pipe" AND through the I/O resources of SQL Server (and the underlying Windows Server) for XML to do the same thing as the delimited string. THEN, you have to shred the XML or parse the delimited string. I haven't done the performance/resource usage comparisons between shredding and parsing, but rumor has it that shredding is a bit resource hungry compared to well-formed parsing techniques (cteTally or just a Tally table... recursion is as bad as most loops).
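For reference, a tally-style splitter along the lines mentioned above might be sketched like this (the variable name and sample data are illustrative; splitting each name/salary pair on the comma is left out for brevity):

```sql
-- Sketch of a cteTally-based splitter for a pipe-delimited string.
DECLARE @Parm VARCHAR(8000)
SET @Parm = 'Richard,1100|Cliff,1200|Donna1,3000|Ann,1500'
SET @Parm = '|' + @Parm + '|'   -- wrap so every element sits between two pipes

;WITH cteTally(N) AS (
    SELECT TOP (LEN(@Parm)) ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
    FROM sys.all_columns
)
SELECT SUBSTRING(@Parm, N + 1, CHARINDEX('|', @Parm, N + 1) - N - 1) AS Pair
FROM cteTally
WHERE N < LEN(@Parm)
  AND SUBSTRING(@Parm, N, 1) = '|'   -- each delimiter marks the start of an element
```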
I've heard all the supposed advantages of passing XML like this... I'm just not finding they're all true especially the parts I listed at the beginning of this post.
--Jeff Moden
Change is inevitable... Change for the better is not.