August 18, 2011 at 12:56 pm
Hey Everyone,
Has anyone seen performance issues when using table-valued parameters? I ask because I know there are no statistics on them, which could impact performance. I have never used them before, but our development team wants to, so I am playing it safe and learning from others who have used them already.
Thanks
-Roy
August 18, 2011 at 12:58 pm
Much the same as with table variables, large datasets can become problematic.
However, it depends on what you are coming from. We switched to TVPs from string splitting and saw a huge performance improvement. :-D
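For illustration, a rough sketch of the TVP side of that switch (the type and object names here are made up, not from any real system):

CREATE TYPE dbo.IntList AS TABLE (Id INT NOT NULL PRIMARY KEY);
GO
CREATE PROCEDURE dbo.GetOrders
    @OrderIds dbo.IntList READONLY  -- TVP parameters must be declared READONLY
AS
BEGIN
    SELECT o.OrderId, o.OrderDate
    FROM dbo.Orders AS o
    INNER JOIN @OrderIds AS ids ON ids.Id = o.OrderId;
END;

The application passes the list as a structured parameter instead of a delimited string, so there is no splitting to do on the server at all.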
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
August 18, 2011 at 1:03 pm
These guys want to use it to reduce JOINs by passing the values (int) in as a TVP.
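Something along these lines, I believe (table and column names made up just to illustrate, and assuming a simple one-column table type like the one above):

DECLARE @Ids dbo.IntList;
INSERT INTO @Ids (Id) VALUES (42), (97), (108);

SELECT c.CustomerId, c.CustomerName
FROM dbo.Customers AS c
INNER JOIN @Ids AS ids ON ids.Id = c.CustomerId
OPTION (RECOMPILE);  -- no column statistics on a TVP; RECOMPILE at least
                     -- gives the optimizer the actual row count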
-Roy
August 18, 2011 at 1:07 pm
I could see that improving performance too. I would be careful with the implementation, though, and keep an eye on it as the data changes.
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
August 18, 2011 at 1:10 pm
Yep. Advantage for me is that I can review what they are implementing before it hits QA.
-Roy
August 18, 2011 at 2:06 pm
If the dataset is above a certain size (it depends on how you use it), insert it into a temp table before you join it to other things.
Mainly, I've seen them used as a way to pass multiple values for search/filter type pages. They're a huge improvement over using a string parser or XML for that kind of thing. But those are very small lists, no more than about 100 items in most cases, and the lack of stats on them really doesn't matter.
The one time I did see it used for potentially large datasets, it was a multiselect with ranges, and each range value was passed in as a row. So if someone chose "1-1000, and 2001-3000", for example, it would pass in every number in those ranges, one number per row, and suddenly you've got a table variable with thousands of rows. Expand that out logically, and you could have even larger sets. That's where I'd use a temp table as an intermediate step. Worked well enough.
But, for the most part, they work well for what they're designed for, which is small datasets passed from applications or reports to procs.
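A rough sketch of that intermediate temp-table step (all object names hypothetical, again assuming a one-column table type like the ones above):

CREATE PROCEDURE dbo.SearchByNumbers
    @Numbers dbo.IntList READONLY
AS
BEGIN
    -- Stage the TVP in a temp table so the optimizer gets real statistics
    SELECT Id
    INTO #Numbers
    FROM @Numbers;

    CREATE UNIQUE CLUSTERED INDEX IX_Numbers ON #Numbers (Id);

    SELECT t.SomeColumn
    FROM dbo.BigTable AS t
    INNER JOIN #Numbers AS n ON n.Id = t.SomeKey;
END;

For the small search/filter lists, skip the temp table; the extra copy only pays for itself once the row counts get big enough that the missing stats produce a bad plan.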
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
August 18, 2011 at 2:19 pm
Thanks Gus. Appreciate the explanation. 🙂
-Roy