April 13, 2010 at 7:06 am
Finnlake (4/12/2010)
Hi! So we will probably create a procedure taking all parameters and pass the set of positions to the stored proc as XML.
Using XML as a multi-row data transport mechanism to a stored procedure is another solution. We're using it with SQL Server 2005 and a .NET application. The .NET middle-tier code constructs the XML document which is passed as a single parameter of type XML to the stored procedure. The stored procedure then inserts all of the rows (elements) found in the XML via a single INSERT ... SELECT statement using XQuery to extract the appropriate data elements from the XML.
It is one way to perform bulk inserts into a table.
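For anyone searching later, a minimal sketch of that pattern might look like this (the table name, element names, and procedure name below are illustrative, not the poster's actual schema):
-- Assumes a hypothetical target table dbo.Positions (PositionId, Symbol, Quantity)
CREATE PROCEDURE usp_InsertPositions_FromXml
(
    @Positions xml
)
AS
BEGIN
    SET NOCOUNT ON;

    -- Shred the XML and insert every row (element) in a single INSERT ... SELECT
    INSERT INTO dbo.Positions (PositionId, Symbol, Quantity)
    SELECT
        p.value('(PositionId)[1]', 'int'),
        p.value('(Symbol)[1]',     'varchar(20)'),
        p.value('(Quantity)[1]',   'int')
    FROM @Positions.nodes('/Positions/Position') AS x(p);
END

-- Example call with two rows in one round trip:
-- EXEC usp_InsertPositions_FromXml
--     @Positions = N'<Positions>
--                      <Position><PositionId>1</PositionId><Symbol>ABC</Symbol><Quantity>100</Quantity></Position>
--                      <Position><PositionId>2</PositionId><Symbol>XYZ</Symbol><Quantity>250</Quantity></Position>
--                    </Positions>';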
April 15, 2010 at 1:05 am
In a SQL Server database we can check whether the record already exists and insert it only if it does not, by creating a stored procedure as follows:
CREATE PROCEDURE usp_AddTestData_UsingExists
(
    @Name varchar(50),
    @Qualification varchar(50)
)
AS
DECLARE @Result int

BEGIN TRANSACTION

-- Check for an existing row first. UPDLOCK holds the lock until the transaction
-- ends; adding HOLDLOCK as well would also block a concurrent insert of the same
-- key between this check and our own INSERT.
IF EXISTS
(
    SELECT NULL
    FROM TestData WITH (UPDLOCK)
    WHERE [Name] = @Name
      AND Qualification = @Qualification
)
BEGIN
    -- A matching row already exists; signal the caller with -1
    SELECT @Result = -1
END
ELSE
BEGIN
    INSERT INTO TestData
    (
        [Name],
        Qualification
    )
    VALUES
    (
        @Name,
        @Qualification
    )

    -- Capture the outcome of the INSERT (0 = success)
    SELECT @Result = @@ERROR
END

-- Roll back on a duplicate or an error, otherwise commit
IF @Result <> 0
BEGIN
    ROLLBACK
END
ELSE
BEGIN
    COMMIT
END

RETURN @Result
Hope this adds value. Any suggestions are appreciated.
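To call it, something along these lines (the sample parameter values are mine, not from the original post):
DECLARE @ReturnValue int

EXEC @ReturnValue = usp_AddTestData_UsingExists
    @Name = 'John Smith',        -- illustrative sample values
    @Qualification = 'BSc'

-- 0 = row inserted, -1 = a matching row already existed
SELECT @ReturnValue AS Result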
March 19, 2012 at 3:24 pm
Finnlake (4/1/2010)
In our case we cannot batch requests, i.e. messages, since we would then have to keep the messages in memory, and if the power goes, all messages are lost. We need to insert them into the database as fast as possible. And if TVPs are slower than calling an ordinary stored proc, then we will not use them.
We can now insert ~100 messages per second using TVPs, and if we can insert 200 or 300 messages per second by not using TVPs, then we will not use them.
Finnlake, we're attempting something nearly identical to what you reported here, and finding the same results you were reporting.
Contrary to the feedback you got in this post, there is something odd at work here, and the particular use case we share is not performing as we'd expect either. One thing we noticed is that when performance testing with TVPs in a proc (our proc also takes 3 of them), we see SQL Server waiting quite a lot on logging, and tempdb gets hot, presumably from materializing the TVPs there. We're coming to a similar conclusion as you: for this use case, because of recompiles and how TVPs are materialized by SQL Server, they don't seem to perform very well. Luckily we can fall back to the pre-TVP methods of accomplishing this (which don't suffer from the recompile or disk-heating issues, but do require more custom code).
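For anyone comparing approaches, the TVP pattern under discussion looks roughly like this (the type, table, and column names are placeholders, not our actual schema, and the real procedure takes three such parameters):
-- Illustrative table type for the incoming messages
CREATE TYPE dbo.MessageTableType AS TABLE
(
    MessageId int          NOT NULL,
    Payload   varchar(500) NOT NULL
)
GO

CREATE PROCEDURE usp_InsertMessages
(
    @Messages dbo.MessageTableType READONLY   -- TVP parameters must be declared READONLY
)
AS
BEGIN
    SET NOCOUNT ON;

    -- Single set-based insert from the TVP into a hypothetical target table
    INSERT INTO dbo.Messages (MessageId, Payload)
    SELECT MessageId, Payload
    FROM @Messages;
END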
March 19, 2012 at 3:59 pm
Brian.cs (3/19/2012)
Finnlake (4/1/2010)
In our case we cannot batch requests, i.e. messages, since we would then have to keep the messages in memory, and if the power goes, all messages are lost. We need to insert them into the database as fast as possible. And if TVPs are slower than calling an ordinary stored proc, then we will not use them.
We can now insert ~100 messages per second using TVPs, and if we can insert 200 or 300 messages per second by not using TVPs, then we will not use them.
Finnlake, we're attempting something nearly identical to what you reported here, and finding the same results you were reporting.
Contrary to the feedback you got in this post, there is something odd at work here, and the particular use case we share is not performing as we'd expect either. One thing we noticed is that when performance testing with TVPs in a proc (our proc also takes 3 of them), we see SQL Server waiting quite a lot on logging, and tempdb gets hot, presumably from materializing the TVPs there. We're coming to a similar conclusion as you: for this use case, because of recompiles and how TVPs are materialized by SQL Server, they don't seem to perform very well. Luckily we can fall back to the pre-TVP methods of accomplishing this (which don't suffer from the recompile or disk-heating issues, but do require more custom code).
Wow, a 2-year-old thread, and the OP has not logged in to the site since then either.
(For those who may come after)
For performance issues, see how we like them posted here: How to Post Performance Problems - Gail Shaw
Need to split some strings? Jeff Moden's DelimitedSplit8K
Jeff Moden's Cross Tab and Pivots Part 1
Jeff Moden's Cross Tab and Pivots Part 2
March 20, 2012 at 1:40 pm
Maybe the performance problem is not in the DBMS but in the application.
You are creating and populating a lot of objects just to use them as TVPs.
.NET is a great tool, but it is not that performant.
Maybe the problem lies in creating/populating the DataTables and database objects.
Maybe the problem lies at a deeper level, when the framework marshals these objects as parameters to make the call to the DBMS.
You can try to find the bottleneck in the whole operation and fix it, but...
When Paul says the problem lies in the design, there are a couple of things you can do:
First: Are you persisting redundant data?
Does that data already live somewhere in the database? Can you just store an FK to it (a sketch follows after this list)?
Second: There is a lot of data being pushed to the DB at once; can you change how it is done?
Can you send the data another way?
Can you save "vehicleOperationalData" and "driverSupportEvaluationData" at different times (supposing these objects are created at different times)?
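On the redundant-data point, a minimal sketch of replacing repeated columns with an FK might look like this (the table and column names are only placeholders):
-- Descriptive data lives once in a lookup table...
CREATE TABLE dbo.Vehicle
(
    VehicleId   int IDENTITY PRIMARY KEY,
    VehicleName varchar(100) NOT NULL
)

-- ...and each message row just references it
CREATE TABLE dbo.VehicleMessage
(
    MessageId int IDENTITY PRIMARY KEY,
    VehicleId int NOT NULL REFERENCES dbo.Vehicle (VehicleId),  -- FK instead of repeating the vehicle details
    Payload   varchar(500) NOT NULL
)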
Update: Oddly, this topic was labeled as HOT. Not so oddly, one long line on the first page made the browser render the page in a way that hid the page counter, so I missed the second page and didn't notice how old the topic was. My bad; I wasted time on it. 🙁