July 25, 2007 at 5:14 am
Hi, my name is Avi. I'm developing an application in VB6 and SQL Server 2000 that runs on a LAN.
I want to send bulk data (essentially an array) to a stored procedure. Currently I concatenate the values with "," and split the string inside the procedure, but I don't want to do this; I'd like some more effective method. I'd like to pass XML instead, but I don't know how to pass XML to a stored procedure or how to parse it there. Can anyone explain this with a code sample?
Thanks in advance.
July 25, 2007 at 6:08 am
Why do you want to move to XML?
The parsing technique you are using is fine, and efficient compared to XML.
The only other option is to create a temp table from the client side, send all the IDs there, then inner join on the temp table for filtering/processing.
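A minimal sketch of that temp-table approach, assuming the client creates the temp table and calls the procedure on the same connection (table and column names here are illustrative, not from the original post):

```sql
-- Run on the client's connection; #Ids lives for the life of that connection.
CREATE TABLE #Ids (Id int NOT NULL PRIMARY KEY)

-- The client fills the table (e.g. via ADO batched inserts).
INSERT INTO #Ids (Id) VALUES (1)
INSERT INTO #Ids (Id) VALUES (2)

-- A procedure called on the same connection can then join to it directly.
SELECT O.id, O.name
FROM dbo.SysObjects O
INNER JOIN #Ids I ON O.id = I.Id
```

The key point is that a `#temp` table is scoped to the connection, so the insert and the procedure call must share one connection.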
July 25, 2007 at 7:40 am
I found passing bulk data as XML (text) to a stored procedure as an ntext parameter, then using OPENXML to load the data into table variables/temp tables, much easier than writing your own SQL text parser.
July 26, 2007 at 12:15 am
Hi,
first of all, thanks for your reply.
I'm not sure which parsing technique you're talking about; could you please explain it?
Thanks.
July 26, 2007 at 7:36 am
--setup
IF EXISTS (Select * from dbo.SysObjects where name = 'Numbers' and XType = 'U' and user_name(uid) = 'dbo')
DROP TABLE Numbers
GO
CREATE TABLE dbo.Numbers (PkNumber int identity(1,1) primary key clustered, dude bit null)
GO
INSERT INTO dbo.Numbers (dude)
SELECT NULL FROM (SELECT TOP 100 NULL AS A FROM master.dbo.spt_values) dt100 cross join (SELECT TOP 80 null AS A FROM master.dbo.spt_values) dt80
GO
ALTER TABLE dbo.Numbers
DROP COLUMN dude
GO
CREATE FUNCTION [dbo].[fnSplit_Set] (@IDS as varchar(8000), @vcDelimiter varchar(3))
RETURNS TABLE
WITH SCHEMABINDING
AS
Return
Select dtSplitted.EachID, dtSplitted.Rank from (
SELECT SUBSTRING(@vcDelimiter + @IDs + @vcDelimiter, N.PkNumber + len(@vcDelimiter),
CHARINDEX(@vcDelimiter, @vcDelimiter + @IDs + @vcDelimiter, N.PkNumber + len(@vcDelimiter)) - N.PkNumber - len(@vcDelimiter)) as EachID
, (LEN(SUBSTRING(@IDs, 1, N.PkNumber)) - LEN(REPLACE(SUBSTRING(@IDs, 1, N.PkNumber), @vcDelimiter, ''))) / LEN(@vcDelimiter) + 1 AS Rank -- count @vcDelimiter occurrences, not a hard-coded ','
FROM dbo.Numbers N
WHERE SUBSTRING(@vcDelimiter + @IDs + @vcDelimiter, N.PkNumber, len(@vcDelimiter)) = @vcDelimiter
AND PkNumber < LEN(@vcDelimiter + @IDs + @vcDelimiter)
) dtSplitted where len(dtSplitted.EachID) > 0
GO
--start demo
--this builds a list to split later on
DECLARE @Ids as varchar(8000)
SET @IDS = ''
Select @IDS = @IDS + CAST(id as varchar(10)) + ',' from (Select TOP 10 id from dbo.SysObjects order by NEWID()) dtIds
SET @IDS = left(@IDS, ABS(len(@IDS) - 1))
PRINT @IDS
--2 examples of the split function
Select O.id, dtIDS.EachId, O.name, dtIDS.Rank from dbo.SysObjects O inner join (Select CAST(EachID as INT) AS EachID, Rank from dbo.fnSplit_Set (@IDS, ',')) dtIDS on O.id = dtIDS.EachID order by O.Name
Select O.id, dtIDS.EachId, O.name, dtIDS.Rank from dbo.SysObjects O inner join (Select CAST(EachID as INT) AS EachID, Rank from dbo.fnSplit_Set (@IDS, ',')) dtIDS on O.id = dtIDS.EachID order by dtIDS.Rank
--Don't ever forget to cast the EachID column to the same datatype as in the join, to avoid table scans.
July 27, 2007 at 2:06 am
Hi.
From VB, set a reference to the MS XML library.
Create an XML document object and load your XML document into it.
Call your stored procedure using the ADO Command object, and pass the xml property of the XML document object as a parameter to the stored procedure.
In the stored procedure you can use:
sp_xml_preparedocument and then OPENXML.
You then use XPath expressions to select data from the node tree of the XML.
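As a rough sketch of the stored-procedure side, assuming the XML arrives as an ntext parameter shaped like <ids><id val="1"/><id val="2"/></ids> (the procedure, element, and attribute names are illustrative):

```sql
CREATE PROCEDURE dbo.GetObjectsByXmlIds @xml ntext
AS
BEGIN
    DECLARE @hDoc int

    -- Parse the XML text into an internal DOM and get a handle to it.
    EXEC sp_xml_preparedocument @hDoc OUTPUT, @xml

    -- Shred the nodes matched by the XPath row pattern into a rowset.
    DECLARE @Ids TABLE (Id int NOT NULL)
    INSERT INTO @Ids (Id)
    SELECT Id
    FROM OPENXML(@hDoc, '/ids/id', 1)   -- 1 = attribute-centric mapping
         WITH (Id int '@val')

    -- Always release the handle, or the parsed document stays in server memory.
    EXEC sp_xml_removedocument @hDoc

    SELECT O.id, O.name
    FROM dbo.SysObjects O
    INNER JOIN @Ids I ON O.id = I.Id
END
```

Note the sp_xml_removedocument call: each prepared document consumes server memory until it is explicitly removed.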
Look in Books Online and search the web, you should find plenty of examples.
Tim.
July 27, 2007 at 6:37 am
You could use the aforementioned method of parsing a comma-delimited string. However, OPENXML is better in this case because you can use it to mass insert large amounts of data into a table, or update the records if they already exist. Using XML this way also allows for future flexibility, whereas a delimited list does not: if you later have to add more elements to the XML, or pass in a hierarchy (say Orders and OrderDetails), your solution will not change that much.
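The insert-or-update pattern described above could be sketched like this, assuming a hypothetical dbo.Orders(OrderId, Amount) table and XML shaped like <orders><order id="1" amount="9.99"/></orders> (none of these names come from the thread):

```sql
CREATE PROCEDURE dbo.UpsertOrders @xml ntext
AS
BEGIN
    DECLARE @hDoc int
    EXEC sp_xml_preparedocument @hDoc OUTPUT, @xml

    -- Shred the XML once into a temp table so it is only parsed one time.
    SELECT OrderId, Amount
    INTO #Incoming
    FROM OPENXML(@hDoc, '/orders/order', 1)
         WITH (OrderId int '@id', Amount money '@amount')

    EXEC sp_xml_removedocument @hDoc

    -- Update the rows that already exist...
    UPDATE O
    SET O.Amount = I.Amount
    FROM dbo.Orders O
    INNER JOIN #Incoming I ON O.OrderId = I.OrderId

    -- ...and insert the ones that don't (SQL Server 2000 has no MERGE).
    INSERT INTO dbo.Orders (OrderId, Amount)
    SELECT I.OrderId, I.Amount
    FROM #Incoming I
    WHERE NOT EXISTS (SELECT 1 FROM dbo.Orders O WHERE O.OrderId = I.OrderId)
END
```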
July 27, 2007 at 6:56 am
I totally see your point about reusability, but here's why I think it's off topic:
When I decide to pass a long list of IDs, I do so because I want to do operation X on list Y of IDs (like set a flag on those rows). If I need to do one more operation (a second flag), I'll want to do that on THAT list in the same proc, and since all the data I need is already on the server, I won't need to send anything else.
However, if I ever see the need to send whole sets of tables, then I'm doing data import, or ETL, which I'll be sure to treat in a very different way altogether. That in turn will make me create a new procedure, because this new process is totally different from the first one and I won't want to handle the two in the same procedure.
And by the way, I've still never seen a case where XML was needed (the only way, or the best way) to do that. I know XML has its place, but it has never crept into my code.
July 28, 2007 at 8:23 am
I agree with Ninja. I use the "Split" function when I can, say for passing in a list of IDs I want retrieved from the database. However, I was talking about saving large amounts of data back to the database; in that case, XML is probably the better choice. You could also use a combination to shorten your XML (sending XML in and parsing certain elements with the Split function), for example storing lists of values inside a single XML element if need be, instead of creating multiples of the same element. However, I've never seen a situation where I could do that.
July 28, 2007 at 12:34 pm
My opinion: XML = bloatware. Why parse 200 MB when I can parse 2 MB for the same data? I understand the point when working with heterogeneous systems, or with web services... but those are the only "good" uses I've found so far for XML (not to say that there are no others).