October 9, 2015 at 12:12 pm
Hello SSC,
Currently another developer and I are prototyping the use of a Table Type parameter in a C# application.
The idea is to fill the Table Type with “EntityID”. The Table Type is very simple and holds just 1 column EntityID INT.
We tested filling this Table Type with 50,000 EntityIDs and passed it through via the code below.
The code passes the Table Type to a stored proc we call a "driver" SP, which adds the EntityIDs to a global temp table that we can use later in some other stored procs.
using (var sqlCmd = new SqlCommand())
{
    sqlCmd.CommandType = CommandType.StoredProcedure;
    sqlCmd.CommandText = this.StoredProcName;

    // Build the table-valued parameter from the search results.
    DataTable dataTable = new DataTable("SearchEntity");
    dataTable.Columns.Add("ID", typeof(int));
    foreach (PropertySearchResult result in resultContainer.Items)
    {
        dataTable.Rows.Add(result.PropertyId);
    }

    SqlParameter parameter = new SqlParameter();
    parameter.ParameterName = "@pSearchEntityIDs";
    parameter.SqlDbType = System.Data.SqlDbType.Structured;
    parameter.Value = dataTable;

    var parameters = new List<DbParameter>
    {
        new SqlParameter("@pTableName", tableName),
        //new SqlParameter("@pUserResultGuid", analyticSearchCriteria.ResultGuid),
        parameter,
        new SqlParameter(UseExistingTableParameterName, useExistingTable)
    };

    var outputTableName = new SqlParameter(OutputTableParameterName, SqlDbType.VarChar)
    {
        Size = 1000,
        Direction = ParameterDirection.Output
    };
    parameters.Add(outputTableName);
    this.AddCustomParameters(parameters, analyticSearchCriteria);

    sqlCmd.Connection = connection as SqlConnection;
    watch.Restart();
    this.databaseOperations.ExecuteNonQuery(sqlCmd, parameters.ToArray());
    executionTimes.Add("DriverTableSPTime", watch.ElapsedMilliseconds);
    this.ElkLog(executionTimes);

    return outputTableName.Value.ToString();
}
What we are seeing in SQL Profiler is 50,000 INSERT statements. I am also thinking these calls to the database are passed over the wire, which is a lot of data.
declare @p2 dbo.SearchEntity
insert into @p2 values(35337)
insert into @p2 values(35338)
insert into @p2 values(35341)
insert into @p2 values(35342)
insert into @p2 values(35343)
insert into @p2 values(35344)
insert into @p2 values(35346)
.....
exec Driver.uspCreateAndPopulateProperties2 @pTableName=N'a120582dcf9c44f892a2bcdd0afb758f',@pSearchEntityIDs=@p2,@pReUseExistingTable=1
I am wondering if anyone in the community has used Table Types in stored procs and called them from C# with thousands of records in the TT. If so, did you also see performance issues?
I was not expecting to see so much data passed over the wire, but maybe that is by design.
Is there a better way to pass 50,000 INTs over the wire?
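One variation that might avoid materializing the DataTable client-side is streaming the rows as SqlDataRecord objects. Here is a rough, untested sketch (assuming the table type is dbo.SearchEntity, as in our code above):

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using Microsoft.SqlServer.Server;

static class TvpHelper
{
    // Yield one reusable record per ID; ADO.NET streams each row as it is yielded.
    static IEnumerable<SqlDataRecord> ToRecords(IEnumerable<int> ids)
    {
        var record = new SqlDataRecord(new SqlMetaData("ID", SqlDbType.Int));
        foreach (var id in ids)
        {
            record.SetInt32(0, id);
            yield return record;
        }
    }

    public static void AddSearchEntityIds(SqlCommand cmd, IEnumerable<int> ids)
    {
        var parameter = cmd.Parameters.Add("@pSearchEntityIDs", SqlDbType.Structured);
        parameter.TypeName = "dbo.SearchEntity"; // needed if not calling a stored proc
        parameter.Value = ToRecords(ids);
    }
}
```

I do not know yet whether this changes what Profiler shows, but it should at least avoid building 50,000 DataRow objects in memory first.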
Thank you for the time,
Brad
October 9, 2015 at 12:59 pm
Why not generate one INSERT command:
INSERT @p Values
(id0), (id2), ..., (idn)
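Roughly, in C# (an untested sketch; note SQL Server limits a single VALUES table value constructor to 1000 rows, so the IDs are chunked into batches):

```csharp
using System.Collections.Generic;
using System.Linq;

static class InsertBuilder
{
    // Build one multi-row INSERT per batch instead of one INSERT per ID.
    public static IEnumerable<string> BuildInsertBatches(
        IReadOnlyList<int> ids, int batchSize = 1000)
    {
        for (int i = 0; i < ids.Count; i += batchSize)
        {
            var batch = ids.Skip(i).Take(batchSize);
            yield return "INSERT INTO @p VALUES " +
                         string.Join(", ", batch.Select(id => "(" + id + ")"));
        }
    }
}
```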
Gerald Britton, Pluralsight courses
October 9, 2015 at 1:07 pm
The code in Profiler is auto-generated. All we are doing is calling the stored proc with the Table Type populated.
Maybe there is a more efficient way to call a table type from C# that will not generate all these inserts.
October 9, 2015 at 1:15 pm
brad.mason5 (10/9/2015)
Is there a better way to pass 50,000 INTs over the wire?
Are the numbers being generated by the app? What I mean by that is it seems like the app isn't actually reading this data from somewhere. Instead, you're providing some "criteria" to an algorithm in the app and then it's generating the ints based on that algorithm.
With that in mind, why not write a stored procedure that you pass the small number of parameters to and have SQL Server generate the numbers? It will be much faster and pipe friendly.
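Something like this, as a rough sketch (the @StartID/@EndID parameters are just placeholders for whatever your real criteria are):

```sql
-- Generate a range of EntityIDs server-side instead of shipping them from the app.
CREATE PROCEDURE dbo.GenerateEntityIDs
    @StartID INT,
    @EndID   INT
AS
SELECT TOP (@EndID - @StartID + 1)
       EntityID = @StartID + ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1
  FROM sys.all_columns ac1
 CROSS JOIN sys.all_columns ac2;
```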
--Jeff Moden
Change is inevitable... Change for the better is not.
October 9, 2015 at 1:45 pm
Hi Jeff,
A little background. The application we developed allows a user to search properties in the U.S. The user can select any area and click "view results".
Behind the scenes we take the primary keys of the matching rows in the middle tier and store them in a VARBINARY column in the DB. This is working pretty well, as we only pass the binary string across the wire.
We have a CLR function which we use to de-serialize this VARBINARY column to a list of EntityIDs.
The developer and I were experimenting with removing the dependency on this DB and seeing how well the app would perform if we send a table type or list of IDs to a stored proc.
Sending a table type with, say, 50k properties is taking much longer than expected. Our current solution (storing VARBINARY, then using the CLR function to de-serialize) is faster as of now.
The goal of this project was to remove the dependency on the DB, and our first thought was to try a Table Type parameter.