February 16, 2009 at 4:17 am
This article focuses on passing a table (a set of tabular data) to a stored procedure for further processing. It does not refer to passing one of the tables of a database to a stored procedure.
February 16, 2009 at 8:28 am
Jose,
The difference comes down to a few things.
First, if you pass a table name in, you are forced to use dynamic SQL the whole time you're in the SPROC (messy if you have, say, 5 joins), or you have to create a temp table/variable, load it via dynamic SQL, and then reference it for the rest of the SPROC, which is also costly due to the overhead involved in using tempdb and all the operations to insert, read, destroy, etc.
When you pass a table into a SPROC, you literally bring the data to the doorstep of the SPROC. So you get the benefit of directly referencing the data plus the convenience of having it disappear after you leave the SPROC. After all, if this was data I wanted to keep, I probably would have created a table and put it there in the first place. 🙂
The limitation of SPROCs in this area has always been that you can only pass in scalar values, e.g. @parm='name'. This gives you the ability to pass in so much more. It's like going from an array to a multi-dimensional array.
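To make the contrast concrete, here is a hedged sketch (the procedure and parameter names are made up for illustration) of what passing just a table name forces you into:

```sql
-- Hypothetical sketch: when only a table NAME is passed in, every
-- reference to that data inside the proc has to go through dynamic SQL.
create procedure usp_ProcessByName (@TableName sysname)
as
begin
    declare @sql nvarchar(max);
    -- quotename() guards against injection through the table name
    set @sql = N'select count(*) from ' + quotename(@TableName);
    exec sp_executesql @sql;
end
```

With the data itself passed in, that same query would just be a plain `select count(*)` against the parameter, with no dynamic SQL anywhere in the proc.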
I also read the piece on Table-Valued Parameters from VG statement
Looks promising thanks for pointing that out.
Peace out...
February 16, 2009 at 4:21 pm
As a possible alternative...
http://www.sqlservercentral.com/articles/T-SQL/63003/
--Jeff Moden
Change is inevitable... Change for the better is not.
February 18, 2009 at 1:31 pm
There is a very strong reason for not allowing tables to be passed to procedures as parameters.
Text, varchar(max) allow you to pass a variable up to 2GB in size.
To accept it Server needs to allocate 2GB memory for this variable. And it may be ONLY IN MEMORY - variables don't use tempdb for dumping data exceeding memory limitations.
It's not gonna be a problem while there are 2-3 developers passing small tables to procedures.
But when system goes to production and 100 users start to pass real datasets to SPs - server just goes down. Because 100 GB are gonna swallow even well sized pagefile.
And then we're all gonna read another topic on the forum - "my system worked well in DEV but started to fail in PROD - what to do?"
I tell you what to do - think ahead.
If you're a programmer then this is the first thing you have to do - think ahead.
And never pass tables to procedures. At least such tables which require variables more than 1 page long.
_____________
Code for TallyGenerator
August 27, 2009 at 3:25 am
Hello Jacob,
Your article is very interesting, but this raised some questions in my mind.
One can adopt this approach as long as the two procedures/ functions are in the same database.
What if we have to pass information to a Linked server. Let me take an example here -
We have two servers, both running SQL Server 2005. Let me name them as Server A and Server B.
We execute a procedure on Server A, which fetches some data from tables in Server A, then forms a CSV value from the resulting ID column and stores it in an NVARCHAR(MAX). At the end of this step we have something like -
set @ConcatId = '1,2,3,4,5'
This variable then acts as an input parameter for another procedure (invoked from the parent procedure) on Server B. (Server B is a linked server; instead of executing a linked query, we are invoking the procedure. Linked queries have a query size limit of 8 KB.)
This procedure gets data from the tables in Server B based on the values being passed to it as a parameter. At the moment, we are using a Dynamic query with an IN clause for the values being passed.
The resulting data is placed in a Temp table and is used by the parent procedure (procedure on Server A) to produce the final result.
If you have faced a similar situation and found an efficient solution for the problem, it will be nice if you could share your thoughts/ knowledge.
Thanks for your time,
Vikas
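One commonly used alternative to the dynamic IN clause described above is to split the CSV parameter into a temp table on Server B and join against it, which avoids dynamic SQL entirely. A hedged sketch (the table name `dbo.SomeTable` and column names are invented for illustration):

```sql
-- Hedged sketch: split the CSV parameter into a temp table and join,
-- instead of building a dynamic IN (...) clause.
create procedure spRemoteB (@ConcatId varchar(4000))
as
begin
    create table #ids (Id int primary key);
    declare @pos int;
    while len(@ConcatId) > 0
    begin
        set @pos = charindex(',', @ConcatId);
        if @pos = 0
        begin
            insert #ids (Id) values (cast(@ConcatId as int));
            set @ConcatId = '';
        end
        else
        begin
            insert #ids (Id) values (cast(left(@ConcatId, @pos - 1) as int));
            set @ConcatId = substring(@ConcatId, @pos + 1, len(@ConcatId));
        end
    end
    -- now a plain join, no dynamic SQL needed:
    select t.* from dbo.SomeTable t join #ids i on t.Id = i.Id;
end
```

A set-based splitter (e.g. using a Tally table, as discussed elsewhere in this thread) would scale better than the WHILE loop, but the shape of the solution is the same.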
August 27, 2009 at 4:29 am
Hi Vikas,
I did not get a chance to work on this. I will let you know If I could find a better way of doing this.
I suppose some of the experts in this forum might jump in and share their thoughts.
thanks
jacob
August 27, 2009 at 8:36 am
vikas_kalra (8/27/2009)
Hello Jacob, Your article is very interesting, but this raised some questions in my mind.
One can adopt this approach as long as the two procedures/ functions are in the same database.
What if we have to pass information to a Linked server. Let me take an example here -
We have two servers, both running SQL Server 2005. Let me name them as Server A and Server B.
We execute a procedure on Server A, which fetches some data from tables in Server A, then forms a CSV value from the resulting ID column and stores it in an NVARCHAR(MAX). At the end of this step we have something like -
set @ConcatId = '1,2,3,4,5'
This variable then acts as an input parameter for another procedure (invoked from the parent procedure) on Server B. (Server B is a linked server; instead of executing a linked query, we are invoking the procedure. Linked queries have a query size limit of 8 KB.)
This procedure gets data from the tables in Server B based on the values being passed to it as a parameter. At the moment, we are using a Dynamic query with an IN clause for the values being passed.
The resulting data is placed in a Temp table and is used by the parent procedure (procedure on Server A) to produce the final result.
If you have faced a similar situation and found an efficient solution for the problem, it will be nice if you could share your thoughts/ knowledge.
Thanks for your time,
Vikas
You can try to execute a select * from #tmp as the last statement in the remote stored procedure, making it return the resultset from the temp table. This can be used in the calling stored procedure to insert the result into a local temp table, something like this:
on the remote server (B) you have
create procedure spRemoteB (@ConcatId varchar(4000)) as
...
declare @sql varchar(4000)
create table #tmp_remote (...)
select @sql='insert into #tmp_remote ... select ... from ... where id in ('+@ConcatId+')'
exec (@sql)
...
select * from #tmp_remote
GO -- sp ends here
on local server (A)
create procedure spLocalA (...)
as
...
declare @ConcatId varchar(4000)
set @ConcatId = '1,2,3,4,5'
create table #tmp_local (...)
...
insert into #tmp_local
exec linkedB.database.dbo.spRemoteB @ConcatId
...
GO -- sp ends here
Of course you'll have to make sure that the sp's resultset and the temp table have the same number of columns and the same types.
If this doesn't work, though I don't see why it wouldn't, you might try to create on the remote server a permanent table which is emptied and refilled every time you call the remote sp (you might need a date stamp column in the permanent table to deal with concurrent calls), and then you can query that table from the local server.
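As a rough sketch of that fallback (all names are invented, and a batch id is used alongside the date stamp to keep concurrent calls apart):

```sql
-- Hedged sketch of the permanent-table fallback; names are illustrative.
create table dbo.RemoteResults (
    BatchId   uniqueidentifier not null,              -- identifies one call
    CreatedAt datetime         not null default getdate(),
    Id        int              not null
);
-- The remote sp inserts its rows tagged with a fresh BatchId and returns
-- that BatchId to the caller; the local server then reads only its batch:
--   select Id from linkedB.database.dbo.RemoteResults
--   where BatchId = @BatchId;
-- Old batches can be purged periodically using CreatedAt.
```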
Don't just give the hungry man a fish, teach him how to catch it as well.
the sqlist
April 26, 2010 at 4:49 am
Just wanted to add that there is a follow up article on this, which explains how to do the same using the table valued parameters introduced in SQL Server 2008. http://www.sqlservercentral.com/articles/News/3182/
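For anyone reading this on SQL Server 2008 or later, a minimal sketch of what a table-valued parameter looks like (the type, table, and procedure names are invented for illustration):

```sql
-- Minimal TVP sketch (SQL Server 2008+); all names are illustrative.
create type dbo.IdList as table (Id int primary key);
go
-- TVPs must be declared READONLY inside the procedure.
create procedure dbo.usp_GetRowsByIds (@Ids dbo.IdList readonly)
as
    select t.*
    from dbo.SomeTable t
    join @Ids i on t.Id = i.Id;
go
-- caller:
declare @ids dbo.IdList;
insert @ids (Id) values (1), (2), (3);
exec dbo.usp_GetRowsByIds @ids;
```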
April 26, 2010 at 6:57 am
Carlos Shoji-387174 (9/23/2007)
Very good article! Is there any way to do a similar thing with SQL Server 2000?
Thanks.
Kind of... Please see the following article with the understanding that it's limited to 8 kbyte "chunks" (unless you delve into the realm of the TEXT datatype).
http://www.sqlservercentral.com/articles/T-SQL/63003/
--Jeff Moden
Change is inevitable... Change for the better is not.
April 27, 2010 at 2:38 am
Thanks for your explanations. I think I may use a piece of code rather than a stored procedure to do such a thing, as it would be more adequate for me to use a business layer for this rather than a data layer. Does that make sense?
Thanks
Jose
April 27, 2010 at 6:59 am
jose-276033 (4/27/2010)
Thanks for your explanations. I think I may use a piece of code rather than a stored procedure to do such a thing, as it would be more adequate for me to use a business layer for this rather than a data layer. Does that make sense? Thanks
Jose
Actually, no. It doesn't always make sense, to me anyway (i.e., It Depends). Are you going to have embedded code do the job for you? If so, there's a strong chance of unnecessary recompile(s) on SQL Server and, depending on how you write the embedded code, it could very well be RBAR on steroids.
--Jeff Moden
Change is inevitable... Change for the better is not.