July 9, 2010 at 7:49 am
Scott - Thanks for the article. It reaffirms some of the information I found while investigating this option for some of our performance improvements.
One other thing I discovered through my testing is that table valued parameters are not a good solution if you need to pass rows of relatively long text. We had one instance where five strings needed to be passed into the stored procedure for each new row. Using table valued parameters to pass that data showed a huge degradation in performance compared to alternative solutions, while passing numeric values showed a huge increase in performance.
My testing showed (as always) that you need to consider all the alternatives: if you are passing purely numeric data, table valued parameters are an excellent solution, but as you add varchar() and nvarchar() columns, you need to analyze the performance and weigh the alternatives.
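For illustration, the kind of table type I mean looked roughly like this (the type name, column names, and sizes here are invented):
-- Hypothetical TVP type carrying several long text columns per row
CREATE TYPE dbo.LongTextRowType AS TABLE
(
    Note1 NVARCHAR(1000),
    Note2 NVARCHAR(1000),
    Note3 NVARCHAR(1000),
    Note4 NVARCHAR(1000),
    Note5 NVARCHAR(1000)
);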
July 9, 2010 at 9:20 am
Hi Scott
m.scott.bryant (7/9/2010)
... rows of relatively long text...
You probably want to have a look at the new SQL Server 2008 .WRITE() clause for VARCHAR(MAX) and VARBINARY(MAX):
Concatenation of Text and Binary Data in T-SQL
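A minimal sketch of the syntax (the table and column names are made up):
-- Append a chunk to a VARCHAR(MAX) column; a NULL offset means "append at the end"
DECLARE @Chunk VARCHAR(MAX) = REPLICATE('x', 8000);
UPDATE dbo.LongTextTest
SET Body.WRITE(@Chunk, NULL, 0)
WHERE RowID = 1;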
Greets
Flo
July 9, 2010 at 9:20 am
Thank you; that is true. As the size of your data row grows, the performance boost from table valued parameters may decrease. If the data passed in contains several large fields, such as varchar(max) columns, the single-row insert approach may perform better. Thanks for pointing that out.
July 9, 2010 at 8:38 pm
Great article! Phil, this technique is great when used with MERGE. Another thing to consider is the use of TVPs for optional-parameter queries.
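For example, a rough sketch of pairing a TVP with MERGE (the table type and column names follow the article; the procedure name and matching logic are assumptions):
-- Hypothetical upsert: the TVP feeds MERGE as its source rowset
CREATE PROCEDURE dbo.MergePODetail
    @Details PODetailTableType READONLY
AS
BEGIN
    MERGE dbo.PODetailtest AS target
    USING @Details AS source
        ON target.PurchaseOrderDetailID = source.PurchaseOrderDetailID
    WHEN MATCHED THEN
        UPDATE SET OrderQty = source.OrderQty, UnitPrice = source.UnitPrice
    WHEN NOT MATCHED THEN
        INSERT (PurchaseOrderDetailID, OrderQty, UnitPrice)
        VALUES (source.PurchaseOrderDetailID, source.OrderQty, source.UnitPrice);
END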
July 10, 2010 at 2:42 am
I'm sorry I don't have a real example, but I know trouble when I see it at a distance.
Please don't get me wrong; it's not that I don't like this feature, just that it will not be effective for a large table that expects 80-90% read operations under "read committed".
Updates like this will not help replication (transactional replication) either; if I remember right, transactions may get stuck at the distributor and may even cause deadlocks at the subscriber. The more granular the transaction, the better its chance of completing without causing significant pain to the database.
Let me know if you want to take this conversation offline; I would love to share what I know.
July 10, 2010 at 9:24 am
Thank you for your input. The primary benefit of table valued parameters in my example is eliminating the overhead of multiple calls to the database. For each call there is overhead in ADO.NET to set up the connection and configure the parameters and procedure call; table valued parameters reduce the repetitive creation of that ADO.NET plumbing for each call. For a true bulk insert operation, this feature may not be the optimal solution. Like most tools in the SQL Server toolbox, table valued parameters are a good fit in certain scenarios. I am using stored procedures with table valued parameters in production on a database that is part of transactional replication, and I have not seen any adverse effect on latency. It is a good thought to consider what side effects a new technology can have on something like replication. Thanks.
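To make that concrete, the server-side shape looks roughly like this (reconstructed from the article's names; the update body here is illustrative, not the article's exact code):
-- One EXEC carries the whole batch; the TVP arrives as a READONLY table parameter
CREATE PROCEDURE dbo.UpdateTableViaTableValuedParameter
    @TableParamExample PODetailTableType READONLY
AS
BEGIN
    UPDATE pd
    SET pd.OrderQty = tp.OrderQty,
        pd.UnitPrice = tp.UnitPrice
    FROM dbo.PODetailtest AS pd
    INNER JOIN @TableParamExample AS tp
        ON pd.PurchaseOrderDetailID = tp.PurchaseOrderDetailID;
END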
July 12, 2010 at 7:59 am
Scott,
Did you decide to skip the section where you populate the table valued parameter? In the script you create it, but I don't see where it's populated. Please let us know.
Thanks,
S
--
:hehe:
July 12, 2010 at 8:07 am
In the article, right below the declaration of the UpdateTableViaTableValuedParameter stored procedure, I populate the table valued parameter with data from a table named PODetailtest:
DECLARE @ProcTableParamExample PODetailTableType
INSERT INTO @ProcTableParamExample
SELECT TOP 750 PurchaseOrderDetailID,
OrderQty,
UnitPrice
FROM PODetailtest
You may just need to scroll down to see it. I hope this helps clarify. Thanks for your interest.
July 12, 2010 at 8:14 am
Scott,
I think the inconsistently named parameters confused me: you used @TableParamExample in one place and @ProcTableParamExample in another.
Anyhow, thanks.
Regards,
S
--
:hehe:
July 15, 2010 at 5:32 am
Many thanks to everybody for your contributions on this subject.
This is a problem we have had in our applications with SQL Server 2000.
We are now considering the upgrade to SQL Server 2008.
We will do an extensive performance analysis of this kind of solution.
Again, many thanks!
January 2, 2013 at 3:51 am
Hello,
Did you run any tests measuring whole-operation performance (CPU and memory usage, IO reads, and execution time) of:
I. preparing XML in an external application
II. executing a procedure with this XML as a parameter
versus
I. preparing (populating with data) a DataTable on the application side
II. executing a procedure with this table as a parameter (which also includes sending the whole data table through the network connection to the server and inserting rows into the table on the server side [II-B])?
Time and resources used in step II-B may be significant.
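For concreteness, the two shapes I mean are roughly these (the procedure names and XML layout are invented; the TVP type is the one from the article):
-- Variant A: shred an XML parameter on the server
CREATE PROCEDURE dbo.InsertViaXml
    @Rows XML
AS
BEGIN
    INSERT INTO dbo.PODetailtest (PurchaseOrderDetailID, OrderQty, UnitPrice)
    SELECT r.value('@id', 'INT'),
           r.value('@qty', 'INT'),
           r.value('@price', 'MONEY')
    FROM @Rows.nodes('/rows/row') AS x(r);
END
GO
-- Variant B: pass a TVP; the rows arrive already typed, with no shredding needed
CREATE PROCEDURE dbo.InsertViaTvp
    @Rows PODetailTableType READONLY
AS
BEGIN
    INSERT INTO dbo.PODetailtest (PurchaseOrderDetailID, OrderQty, UnitPrice)
    SELECT PurchaseOrderDetailID, OrderQty, UnitPrice
    FROM @Rows;
END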
With best regards