October 20, 2014 at 7:12 am
Hi,
I'm trying to use the User-Defined Table Type and having a couple of issues.
1. Transactions seem to be getting all messed up. I'm using the Enterprise Library in my C# code and normally handle transactions there, but for some reason it always throws an exception on the .Commit. So I moved the transaction handling into my stored procedure, and now if there is an error and I try to roll back, I get an error saying there isn't a transaction to roll back.
2. Returning records from a temp table. I need to get the auto-increment IDs back from the stored procedure so I can use them to insert child records, but the stored procedure only seems to be returning one ID, not all of them.
Can someone lend some guidance as to what I'm doing wrong?
Thanks in advance.
User-Defined Table Type:
CREATE TYPE [dbo].[DesignTableType] AS TABLE
(
    [Id] [int] NULL,
    [DesignTableIndex] [int] NULL,
    [Name] [varchar](512) NULL
)
GO
Main Stored Procedure:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Description: Saves a collection of DesignCases
-- =============================================
ALTER PROCEDURE [dbo].[DesignTableBulkAdd]
    @DesignTable DesignTableType READONLY
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    DECLARE @err_message AS NVARCHAR(512)
    DECLARE @CurrentId SMALLINT = 1
    DECLARE @Id INT
    DECLARE @Name VARCHAR(512)
    DECLARE @Identity INT

    -- Create a table variable to hold the Ids to pass back to the caller
    DECLARE @DesignTableReturnTable TABLE
    (
        Id INT,
        Name NVARCHAR(512)
    )

    BEGIN TRANSACTION

    WHILE @CurrentId <= ( SELECT MAX( DesignTableIndex ) FROM @DesignTable )
    BEGIN
        SELECT @Id = Id, @Name = Name FROM @DesignTable WHERE DesignTableIndex = @CurrentId

        BEGIN TRY
            EXECUTE DesignTableAdd @Id, @Name, @Identity OUTPUT

            INSERT INTO @DesignTableReturnTable ( Id, Name )
            SELECT @Identity, Name FROM @DesignTable WHERE DesignTableIndex = @CurrentId
        END TRY
        BEGIN CATCH
            ROLLBACK TRANSACTION
            SELECT @err_message = ERROR_MESSAGE();
            RAISERROR( @err_message, 11, 1 ) -- Severity must be higher than 10 to get an exception in C#
        END CATCH;

        SET @CurrentId = @CurrentId + 1
    END

    COMMIT TRANSACTION

    SELECT * FROM @DesignTableReturnTable
END
Child Stored Procedure:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Description: Saves a DesignTable
-- =============================================
ALTER PROCEDURE [dbo].[DesignTableAdd]
    @Id INT,
    @Name NVARCHAR(512),
    @Identity INT OUT
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    DECLARE @err_message AS NVARCHAR(512)

    IF @Id > 0
    BEGIN
        SET @Identity = @Id
        SELECT @Identity AS Id
    END
    ELSE
    BEGIN
        IF EXISTS ( SELECT Name FROM DesignTable WHERE Name = @Name )
        BEGIN
            SET @err_message = @Name + ' already exists.'
            RAISERROR( @err_message, 11, 1 ) -- Severity must be higher than 10 to get an exception in C#
        END
        ELSE
        BEGIN
            INSERT INTO DesignTable ( Name )
            VALUES ( @Name )

            SET @Identity = SCOPE_IDENTITY()
            SELECT @Identity AS Id
        END
    END
END
C# Code:
public bool BulkSave( ObservableCollection<DesignTable> designTables )
{
    Database db = DatabaseFactory.CreateDatabase();

    using( DbConnection connection = db.CreateConnection() )
    {
        if( connection.State != System.Data.ConnectionState.Open )
            connection.Open();

        //using( DbTransaction transaction = connection.BeginTransaction() )
        //{
        DataTable designTableDataTable = new DataTable( "DesignTable" );
        DataColumn dc = new DataColumn( "Id", typeof( Int32 ) );
        designTableDataTable.Columns.Add( dc );
        dc = new DataColumn( "DesignTableId", typeof( Int32 ) );
        designTableDataTable.Columns.Add( dc );
        dc = new DataColumn( "Name" );
        designTableDataTable.Columns.Add( dc );

        int counter = 1;
        foreach( DesignTable designTable in designTables )
        {
            int id = designTable.Id;
            if( id <= 0 )
            {
                id = 0;
            }
            designTableDataTable.Rows.Add( new object[] { id, counter, designTable.Name } );
            counter++;
        }

        try
        {
            using( DbCommand command = db.GetStoredProcCommand( "DesignTableBulkAdd" ) )
            {
                SqlParameter param = new SqlParameter();
                param.ParameterName = "@DesignTable";
                param.SqlDbType = SqlDbType.Structured;
                param.Value = designTableDataTable;
                command.Parameters.Add( param );

                IDataReader dr = null;
                try
                {
                    dr = db.ExecuteReader( command ); //, transaction );
                }
                catch( Exception e )
                {
                    this.LastError = "Message: " + e.Message + "\rInner Exception: " + e.InnerException;
                    throw new Exception( "BulkSave method failed!" );
                }
                //transaction.Commit();

                try
                {
                    int index = 0;
                    while( dr.Read() )
                    {
                        designTables[ index ].Id = Convert.ToInt32( dr[ "ID" ] );
                        index++;
                    }
                }
                catch( Exception e )
                {
                    this.LastError = "Message: " + e.Message + "\rInner Exception: " + e.InnerException;
                    throw new Exception( "BulkSave method failed!" );
                }
            }
        }
        catch( SqlException ex )
        {
            //transaction.Rollback();
            this.LastError = ex.Message + "\r" + ex.InnerException;
            return false;
        }
        catch( Exception e )
        {
            //transaction.Rollback();
            this.LastError = e.Message + "\r" + e.InnerException;
            return false;
        }
        finally
        {
            connection.Close();
        }

        return true;
        //}
    }
}
October 20, 2014 at 7:21 am
I don't have time to dig through all of that but I will say 2 things:
1) I don't see a table definition that actually has an identity on it.
2) You are calling this a "bulk" insert mechanism, but you are doing it in a VERY painful and slow row-by-agonizing-row (RBAR) process. I would think this could be done as a single set-based operation, as sketched below. That would not only be orders of magnitude more efficient, it would also remove the concurrency issue(s) your mechanism faces, whereby you can get bad data loaded or bad values returned if multiple of these run at the same time (or other processes DELETE from the table).
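For example, a minimal set-based sketch (this assumes the real target is a table named dbo.DesignTable with an IDENTITY column named Id, which isn't shown in the posted DDL):
-- One INSERT...SELECT from the table-valued parameter replaces the whole loop
INSERT INTO dbo.DesignTable ( Name )
SELECT Name
FROM   @DesignTable
WHERE  Id = 0;   -- per the original logic, only rows that don't already have an Id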
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
October 20, 2014 at 7:24 am
Wrap the COMMIT transaction in a check to see whether or not there are any open transactions. At the moment, if an error occurs and the catch block runs, SQL will roll the transaction back, then try to commit a transaction that's already been rolled back, since you don't have any checks around the COMMIT.
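For example, a minimal sketch of that kind of check, using @@TRANCOUNT to test for an open transaction:
-- In the CATCH block:
IF @@TRANCOUNT > 0   -- only roll back if a transaction is actually open
    ROLLBACK TRANSACTION
-- At the end of the procedure:
IF @@TRANCOUNT > 0   -- skip the COMMIT if the CATCH block already rolled back
    COMMIT TRANSACTION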
This is going to be slow as hell. What are you trying to do here? Perhaps a change of approach is necessary.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
October 20, 2014 at 7:30 am
I'm actually trying to fix a performance issue, if you can believe that :). It seems I've not chosen the correct fix for my issue. Here is what I've got: we are employing the Repository model in our application code, which normally just inserts one row of data at a time. We added some functionality and now we need to insert upwards of 20k records. Inserting 20k records one at a time is EXTREMELY slow, so I thought that if I could send multiple rows of data into the stored procedure it would be faster.
I would really be interested in hearing other ways to do this, maybe I should have put this in the newbie section.
Thanks.
October 20, 2014 at 7:35 am
Sending multiple rows to the procedure's a great idea, but not if you then sit and loop through those rows one by one and insert them. You've done nothing to fix the inherent slowness of processing one row at a time; you've just moved it down a layer.
Now, with more useful detail, what exactly are you trying to do here?
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
October 20, 2014 at 7:44 am
I'm trying to insert around 20k records into one of my tables from a C# application. We currently employ the Repository design model in the application, which knows about the structure of the database. So, for example, I have a DesignTable that contains a collection of DesignCases, and each DesignCase has a collection of DesignLevels (which is where my 20k records exist).
The way it currently works is: I save a DesignTable, the repository adds it one record at a time and then calls the DesignCases repository, which adds each item individually and then calls the DesignLevels repository, adding 20k records one at a time. I guess you can see why this is so slow and why I'm trying to get better performance.
So my idea was to offload all of the inserts into new stored procedures called DesignTableBulkAdd, DesignCasesBulkAdd, and DesignLevelsBulkAdd. In this new design, the repositories would all accept collections and send them to the stored procedures to be processed. I still need to get back the IDs of the newly added items so that I can send them in with the child objects and my foreign keys stay intact.
Am I making the scenario more confusing?
October 20, 2014 at 7:52 am
Ok, you don't need to get the IDs back. What you want to do is send parent and child data to the stored proc (table-type parameters are fine for this), then do two inserts: one into the parent table, with OUTPUT capturing the identity values and whatever the unique columns are into a table variable/temp table, then a second insert into the child table.
Two inserts, not two loops or two sets of one row at a time.
Look up the OUTPUT clause.
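For example, a rough sketch of the pattern; the table, column, and second parameter names here are assumptions (you haven't posted the real DDL), and it assumes Name is unique in the parent table:
DECLARE @NewIds TABLE ( Id INT, Name NVARCHAR(512) );
-- Insert 1: parent rows, with OUTPUT capturing the new identity values
INSERT INTO dbo.DesignTable ( Name )
OUTPUT inserted.Id, inserted.Name INTO @NewIds ( Id, Name )
SELECT Name
FROM   @DesignTable;
-- Insert 2: child rows, joining back on the unique column to pick up the new Ids
INSERT INTO dbo.DesignCase ( DesignTableId, CaseName )
SELECT n.Id, c.CaseName
FROM   @DesignCases AS c
       INNER JOIN @NewIds AS n
           ON n.Name = c.ParentName;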
If you can't figure it out, please post table definitions and some sample data in an easy-to-use format (INSERT statements) and someone with some free time will likely write it for you.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
October 20, 2014 at 7:54 am
Thanks Gail, I'll look up the OUTPUT clause and see what I can come up with.