May 9, 2008 at 11:39 am
I'm having a problem executing a SQLCLR function: this function
calls a web service that processes a query against a database and
returns a table to be used in a stored procedure. In a low-concurrency
scenario (not too many clients connected), the function
returns correctly; however, when the concurrency level is increased
we get a SQLCLR command execution problem (all the SQLCLR
processes hang), making the server unavailable to all web service
processes.
At first we thought the problem might be SQLCLR itself, since the web
service is 100% available all the time (we monitored it to reach
this conclusion). Do you know of any SQLCLR bug?
Could someone help me with this? I'm in a difficult situation with
my client, considering that we championed the Microsoft SQL Server technology
and now it's not working properly.
Regards,
Assis Marques
DBA
May 9, 2008 at 6:36 pm
I think you might be running into a straightforward resource issue. As far as I know (I haven't been able to make this work), CLR table-valued functions can't stream results. Instead, they have to materialize the table FIRST, then dequeue rows out of it. If the result sets get big, or there's a lot of concurrency, things start breaking.
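One thing worth noting about the materialization point: if the C# side builds a collection (like an ArrayList) before returning, the entire result is held in managed memory per call, which multiplies under concurrency. A hypothetical sketch of the alternative pattern, returning an iterator with yield return so the function's own code never holds more than one row at a time (all names below are made up for illustration):

```csharp
using System.Collections;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public partial class StreamingTvf
{
    [SqlFunction(FillRowMethodName = "FillRow",
        TableDefinition = "Id int, Name nvarchar(max)")]
    public static IEnumerable GetRows()
    {
        // Yield each row as it is produced instead of collecting
        // everything into an ArrayList first.
        for (int i = 0; i < 3; i++)
            yield return new object[] { i, "row" + i };
    }

    public static void FillRow(object obj,
        out SqlInt32 id, out SqlString name)
    {
        object[] row = (object[])obj;
        id = (int)row[0];
        name = (string)row[1];
    }
}
```

Whether the engine then buffers the rows again on its side is a separate question, but at least the function's own working set stays small.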
Out of curiosity, what SP level is the server at? Have you applied any of the post-SP2 hotfixes? Prior to SP2, SQLCLR resource management just sucked, and it would run itself out of resources fairly regularly.
Otherwise, how much out-of-process memory are you reserving for SQL Server? You might want to bring that number up (it's the -g startup parameter). The default is 256 MB, which again may not be enough if you have high concurrency.
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part... unless you're my manager... or a director and above... or a really loud-spoken end-user. All right, what was my emergency again?
May 11, 2008 at 7:40 am
You have one database remotely invoking another database's process through web services.
Web services are capable of transferring large amounts of data, but in that case you would have to resort to a special type of asynchronous invocation, which requires a different design for both the web service and the client doing the invocation.
I could not find anything in the SQL Server 2005 documentation saying that it supports this kind of web service standard (WSE 3.0 MTOM support).
Are the two databases SQL Server 2005 DBs?
If that is so, you should try Service Broker instead of Web Services.
If that is not the case, can you redesign the process in such a way that it could work with smaller sets of data?
Do you really need the entire table?
If you really need the entire table for your process, why don't you try using Integration Services for the transfer part of the process?
I hope these ideas help you solve this problem.
Best regards, Gastón
May 13, 2008 at 6:58 am
Hi Matt and Gastón,
First of all, thanks for all your help.
Matt,
The SP applied to the server is SP2, and all post-SP2 hotfixes were applied as well. Regarding the reserved memory, we have already tuned that; we tried 1 GB, and the modification appeared to have no effect.
Gastón,
One of the DBs is SQL Server 2005 9.00.3175 (the client); the other is SQL Server 2000. We already work with small sets of data (we do not retrieve the entire table; usually no more than 15 records, and there are no CLOB or BLOB columns). We are not using Integration Services because our client works under a SOA architecture. It's kind of a requirement.
Again, thanks a lot for your help, guys. We are really looking forward to a solution for this problem, since it is impacting our client and our perceived service quality.
Assis Marques
May 14, 2008 at 7:22 pm
Try hitting your web server directly from a .NET console app to simulate the activity that's causing the failure (get SQLCLR out of the picture). Many public services, Google Maps in particular, have a governor; putting a delay between posts (~100-200 ms) seems to work.
Also, I found that if your payload is large, it's best to stream to a FileStream first, not a MemoryStream.
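The idea being that a chunked copy keeps only one buffer's worth in memory at a time. A minimal sketch of that pattern (the class and method names here are just for illustration):

```csharp
using System.IO;

public static class StreamToFile
{
    // Copy in fixed-size chunks so only one buffer's worth of the
    // payload is in memory at a time, instead of accumulating the
    // whole response in a MemoryStream.
    public static long Copy(Stream source, Stream target)
    {
        byte[] buffer = new byte[8192];
        long total = 0;
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            target.Write(buffer, 0, read);
            total += read;
        }
        return total;
    }
}
```

For a large web-service payload you would point the target at a FileStream opened with FileMode.Create, and pass the response stream as the source.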
What datatype are you returning from your function? Can you post the code?
-Tom
May 15, 2008 at 6:17 am
Hi Tom,
Regarding the tests in isolation (removing SQLCLR from the scenario): we already did that, and we could not reproduce the problem. In fact, we increased concurrency without anything else going wrong.
The code follows below. We are going to try changing from MemoryStream to FileStream, but in any case we do not have a high payload, only high concurrency.
[SqlFunction(Name = "fn_getDadosUsuarioContratante",
    FillRowMethodName = "FillRowDadosUsuarioContratante",
    TableDefinition = "CodUsuario int, NomUsuario NVarchar(max), CodMatricula NVarchar(max), CodCPF NVarchar(max), DesUsuarioRede NVarchar(max), IndAtivo int")]
public static IEnumerable SqlArrayDadosUsuarioContratante(Int32 CodAplicacao, string strSenha, Int32 CodUsuario)
{
    UsuarioContratanteTOCollectionXML objCollectionXML;
    // using disposes the proxy object even if the call throws
    using (WSACS_Associado service = new WSACS_Associado())
    {
        objCollectionXML = service.GetDadosUsuarioContratante(CodAplicacao, strSenha, CodUsuario);
    }
    ArrayList rowsArray = new ArrayList();
    foreach (UsuarioContratanteTOXML item in objCollectionXML.usuario)
    {
        // six columns per row, matching the TableDefinition above
        object[] column = new object[6];
        column[0] = item.codUsuario;
        column[1] = item.nomUsuario;
        column[2] = item.codMatricula;
        column[3] = item.codCPF;
        column[4] = item.desUsuarioRede;
        column[5] = item.indAtivo;
        rowsArray.Add(column);
    }
    return rowsArray;
}
private static void FillRowDadosUsuarioContratante(Object obj, out SqlInt32 CodUsuario, out SqlString NomUsuario, out SqlString CodMatricula, out SqlString CodCPF, out SqlString DesUsuarioRede, out SqlInt32 IndAtivo)
{
    object[] row = (object[])obj;
    CodUsuario = (Int32)row[0];
    NomUsuario = (string)row[1];
    CodMatricula = (string)row[2];
    CodCPF = (string)row[3];
    DesUsuarioRede = (string)row[4];
    IndAtivo = (Int32)row[5];
}
May 15, 2008 at 12:35 pm
I have not tried to send an nvarchar(max), let alone four of them, to a TVF, so I really can't comment on whether that's the issue, but I thought the limit was 8000 characters per CLR TVF row. I could have gotten things mixed up, though.
Anyhow I have done the following with success:
Break your TVF into two separate functions:
1. Create a scalar UDF that calls the web service and returns the response as new SqlXml(stream) (serialize if necessary):
SqlXml udf_getUsuarioContratanteXML(Int32 CodAplicacao, string strSenha, Int32 CodUsuario)
2. Create a TVF in T-SQL that parses the XML:
tvf_parseUsuarioContratanteXML(@UsuarioContratanteXML xml)
Call it like this:
SELECT * FROM dbo.tvf_parseUsuarioContratanteXML(dbo.udf_getUsuarioContratanteXML(@CodAplicacao, @strSenha, @CodUsuario))
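A rough sketch of what step 1 could look like, assuming the proxy classes (WSACS_Associado and friends) from the code posted earlier in the thread; the XmlSerializer usage here is one possible way to get a stream for SqlXml, not necessarily how your proxy serializes:

```csharp
using System.Data.SqlTypes;
using System.IO;
using System.Xml.Serialization;
using Microsoft.SqlServer.Server;

public partial class UserDefinedFunctions
{
    [SqlFunction(Name = "udf_getUsuarioContratanteXML")]
    public static SqlXml udf_getUsuarioContratanteXML(
        int CodAplicacao, string strSenha, int CodUsuario)
    {
        UsuarioContratanteTOCollectionXML result;
        using (WSACS_Associado service = new WSACS_Associado())
        {
            result = service.GetDadosUsuarioContratante(
                CodAplicacao, strSenha, CodUsuario);
        }
        // Serialize the response object and hand the stream to SqlXml;
        // the T-SQL TVF then shreds this XML into rows.
        MemoryStream stream = new MemoryStream();
        new XmlSerializer(typeof(UsuarioContratanteTOCollectionXML))
            .Serialize(stream, result);
        stream.Position = 0;
        return new SqlXml(stream);
    }
}
```

This keeps the CLR side doing only the web-service call, and leaves the row production to the T-SQL TVF.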
Hope this helps. If you need more details, let me know.
-tom
May 15, 2008 at 1:26 pm
From the little experience I have with CLR: I wouldn't use it with web services. Many forget that a web service is to be considered asynchronous.
If it needs too much of your RAM, it may cause your instance to crash.
Workarounds:
- Could this not be performed using a linked server?
- A replication scenario?
Johan
Learn to play, play to learn!
Don't drive faster than your guardian angel can fly...
but keeping both feet on the ground won't get you anywhere.