October 12, 2007 at 2:23 pm
rudy komacsar (10/11/2007)
On a more personal note, my deepest sympathies to those converting time in Indiana. Whilst being a life-long Hoosier, I have been lucky to live in one of the more 'normal' areas (the Northwest corner, just outside of Chicago).
sympathy much appreciated!
😛
---------------------------------------
elsasoft.org
October 12, 2007 at 11:33 pm
A couple of things I'm considering: a CLR function to validate a hash, and a CLR proc triggered from Service Broker to send a formatted TCP/IP message to a different machine. As others have said, it's a small niche, but it definitely adds some useful functionality if used wisely.
October 13, 2007 at 10:54 am
Grant, I'm with you... I see no compelling reason for CLRs. My opinion, for the most part, is that they're there for the same reason as DTS and Cursors... they were created to make it easier for folks that don't really know SQL.
Of course, there are always exceptions to the rule... for example, if one does not know about the undocumented xp_DirTree extended stored procedure and all 3 of the parameters it can take, a person might be tempted to write a CLR to do such a thing... especially since Microsoft could change the functionality at any moment or remove the xp in future releases as they did with xp_GetFileInfo.
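For reference, a minimal sketch of how xp_DirTree is commonly called (it is undocumented, so the parameter behavior described here is an assumption and may change between releases):

```sql
-- Hedged sketch: xp_DirTree is undocumented. The three parameters are
-- commonly understood to be: 1) the directory to list, 2) recursion
-- depth (0 = unlimited), 3) whether to include files too (1 = yes).
EXEC master.sys.xp_DirTree 'C:\Temp', 1, 1;

-- Capturing the results to drive, say, a series of BULK INSERTs:
CREATE TABLE #DirList (SubDirectory sysname, Depth int, IsFile bit);
INSERT INTO #DirList (SubDirectory, Depth, IsFile)
    EXEC master.sys.xp_DirTree 'C:\Temp', 1, 1;
SELECT SubDirectory FROM #DirList WHERE IsFile = 1;
```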
Another reason might be to create functionality not allowed by some DBAs. For example, shelling out to certain DOS/CMD functions might be impossible at some sites because the DBA (wisely) does not allow excursions to the command shell.
But, compelling reasons? No... just convenience, for the most part.
--Jeff Moden
Change is inevitable... Change for the better is not.
October 13, 2007 at 10:19 pm
Jeff Moden (10/13/2007)
Another reason might be so that you can create the functionality not allowed by some DBA's. For example, shelling out to certain DOS/CMD functions might be impossible at some sites because the DBA (wisely) does not allow the use of excursions to the command shell.
this is unlikely, unless the DBA is uneducated about the CLR.
any DBA that doesn't allow use of xp_cmdshell is likely not going to allow the import of UNSAFE or EXTERNAL_ACCESS assemblies, which is what would be required to touch the filesystem/registry/etc from a CLR proc or udf.
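For context on that point, the permission set is declared in plain T-SQL when the assembly is loaded, so nothing touches the filesystem without the DBA seeing the request. A minimal sketch, with hypothetical assembly names and paths:

```sql
-- SAFE is the default permission set: no filesystem, registry,
-- or network access from the assembly's code.
CREATE ASSEMBLY MyUtilities
FROM 'C:\Assemblies\MyUtilities.dll'
WITH PERMISSION_SET = SAFE;

-- EXTERNAL_ACCESS (files, network) or UNSAFE (unrestricted) must be
-- requested explicitly, and need extra setup such as TRUSTWORTHY ON
-- or a signed assembly. This is exactly where a cautious DBA says no:
-- CREATE ASSEMBLY MyFileTools
-- FROM 'C:\Assemblies\MyFileTools.dll'
-- WITH PERMISSION_SET = EXTERNAL_ACCESS;
```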
---------------------------------------
elsasoft.org
October 14, 2007 at 8:10 am
My opinion, for the most part, is that they're there for the same reason as DTS and Cursors... they were created to make it easier for folks that don't really know SQL.
I think you are really missing the point here. Cursors may be more inefficient than writing some code in C# or VB.NET. I'm not saying this is always the case, but it is another useful tool in a database developer's armoury. Before, I may have had to write an app that connected to the server. Now that the CLR exists, the code can run on the server, managed by SQL Server itself, making it much more efficient than in the past. I'm not saying don't use SQL wherever you can; that has to be the bottom line. If there is a set-based solution, use it, but where you need to work one row at a time, or use functionality that isn't available in SQL, this is a great new feature.
October 14, 2007 at 9:32 am
jezemine (10/13/2007)
this is unlikely, unless the DBA is uneducated about the CLR. any DBA that doesn't allow use of xp_cmdshell is likely not going to allow the import of UNSAFE or EXTERNAL_ACCESS assemblies, which is what would be required to touch the filesystem/registry/etc from a CLR proc or udf.
Not quite true... Much like an extended stored procedure, a CLR assembly is precompiled and not changeable from SQL. Since it is precompiled, it can be set to do only certain things, very much like an extended stored procedure (for example, produce a directory listing for BULK INSERT imports), which makes it very safe. The key is that the DBA must do his/her job by doing the proper code review before allowing it onto the server... of course, the other part of that is Developers should NEVER have write access to a production database nor the machine/disk space it lives on.
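As a side note on the code-review point, a DBA can also audit what has already been loaded through the catalog views; a small sketch:

```sql
-- One way a DBA can audit loaded assemblies: sys.assemblies shows
-- each user assembly's declared permission set and creation date.
SELECT a.name,
       a.permission_set_desc,  -- SAFE_ACCESS, EXTERNAL_ACCESS, UNSAFE_ACCESS
       a.create_date,
       a.is_visible
FROM sys.assemblies AS a
WHERE a.is_user_defined = 1;
```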
matty_p75 (10/14/2007)
I think you are really missing the point here. Cursors may be more inefficient than writing some code in C# or VB.NET. I'm not saying this is always the case, but it is another useful tool in a database developer's armoury. Before, I may have had to write an app that connected to the server. Now that the CLR exists, the code can run on the server, managed by SQL Server itself, making it much more efficient than in the past. I'm not saying don't use SQL wherever you can; that has to be the bottom line. If there is a set-based solution, use it, but where you need to work one row at a time, or use functionality that isn't available in SQL, this is a great new feature.
Ah... you miss the point... I said cursors are for folks that don't really know SQL. The ONLY reason I can ever see for using a cursor is in a control loop to step through databases or, perhaps, similarly named tables (even then, I'll use a different method)... In both cases, there is likely something wrong with the design of the database(s) if you need to do that, or maybe the collective doesn't understand how to use partitioned views. RBAR processing of individual rows in a table using cursors or While loops will always be slower than a proper set-based solution and is the wrong solution.
If you're using a While loop to limit the number of rows affected by a set-based operation (for example, deleting a million rows from a 100 million row table), that would be OK in my book... but you certainly don't need and shouldn't use a cursor for such a thing.
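The batched-delete pattern described above can be sketched roughly like this (the table, column, and batch size are hypothetical; tune the batch size to your system):

```sql
-- Delete in batches so the transaction log and lock footprint
-- stay manageable instead of one huge single-statement delete.
DECLARE @BatchSize int;
SET @BatchSize = 10000;

WHILE 1 = 1
BEGIN
    DELETE TOP (@BatchSize)
    FROM dbo.BigTable
    WHERE ArchiveDate < '20070101';

    IF @@ROWCOUNT < @BatchSize BREAK;  -- last partial batch: done
END;
```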
You said "but where you need to work one row at a time"... like for what? Running total? Running Count? Statistical Medians? Running Averages? Splitting CSV columns? Grouped Running Counts? Grouped Running Totals? Ranking? Ugh... Reporting? Etc, etc... All of those "problems" have very high speed set-based methods that require no Cursor or While loop even in SQL Server 7 and 2000 never mind 2005/2008. I find that people simply give up too quickly because they "think" (oxymoron in this case 😀 ) they "need" to work one row at a time and are mistaken... Yes... cursors are a " useful tool in a database developer's armory"... for those who don't really know T-SQL and SQL Server.
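As one illustration of the kind of set-based alternative being described, here is a running total with no cursor or While loop that works even on SQL Server 7/2000 (the table is hypothetical, and this is not necessarily the fastest method):

```sql
-- Running total via a correlated subquery (a "triangular join").
-- Fine for small-to-medium row counts; note the self-join cost
-- grows quadratically with the number of rows.
SELECT o.OrderID,
       o.Amount,
       (SELECT SUM(o2.Amount)
        FROM dbo.Orders AS o2
        WHERE o2.OrderID <= o.OrderID) AS RunningTotal
FROM dbo.Orders AS o
ORDER BY o.OrderID;
```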
There's an article out there on CLRs that has one particularly good set of sentences that sums up the use of CLRs (and DTS/Cursors/While loops, in my opinion)... and I quote...
"This is the hammer that, I suspect, will be used to hammer in nails, screws, and 2x4s in projects [...] It will be overused. Don't let your project be the ones caught guilty of this."
--Jeff Moden
Change is inevitable... Change for the better is not.
October 14, 2007 at 11:46 am
Jeff Moden (10/14/2007)
Not quite true... Much like an extended stored procedure, a CLR assembly is precompiled and not changeable from SQL. Since it is precompiled, it can be set to do only certain things, very much like an extended stored procedure (for example, produce a directory listing for BULK INSERT imports), which makes it very safe. The key is that the DBA must do his/her job by doing the proper code review before allowing it onto the server... of course, the other part of that is Developers should NEVER have write access to a production database nor the machine/disk space it lives on.
agreed. 🙂
---------------------------------------
elsasoft.org
October 15, 2007 at 4:44 am
I use CLR to access SSAS. When data changes in the data warehouse, the cube has to be processed immediately based on the changes, and I have found that using CLR to process cubes (using AMO) is the most efficient way and is really fast.
Proactive caching or using ROLAP are other options, but MOLAP with CLR to update (based on complex business logic) works out very well for me, as we have complete control.
October 15, 2007 at 5:42 am
Jeff Moden (10/13/2007)
Grant, I'm with you... I see no compelling reason for CLR's. My opinion, for the most part, for them being there is the same as DTS and Cursors... they were created to make it easier for folks that don't really know SQL....
I'm not sure I agree 100% on this one. I think there may be a very compelling reason to have & use CLR within the database. So far, I haven't found any for myself, and the few I've seen that make sense only nibble around the edges, solving a niche problem in more elegant fashion than it could have been solved some other way without CLR. I haven't seen the "Oh, that's why it's there, how wonderful" application of it, unlike other additions like CTEs, CROSS APPLY, OVER, etc., which have an immediate positive impact on code and/or performance. So far, CLR within the db feels like we're keeping up with the Joneses more than actually providing useful service.
But, I'm open minded and here to be convinced.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
October 15, 2007 at 5:46 am
Jeff Moden (10/14/2007)
"This is the hammer that, I suspect, will be used to hammer in nails, screws, and 2x4s in projects [...] It will be overused. Don't let your project be the ones caught guilty of this."
Now that's a statement I can get behind 100%.
We have some local developers that are now putting CTE's into every query they write, whether they need to be there or not. I'd hate to see the pile of excrement we'd be dealing with if we let unexamined, untested CLR into our db's.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
October 15, 2007 at 9:43 am
Grant Fritchey (10/15/2007)
I'd hate to see the pile of excrement we'd be dealing with if we let unexamined, untested CLR into our db's.
...which of course would be true if you inserted ANYTHING in there instead of CLR. Try replacing CLR with: cursors, UDFs, CTEs, T-SQL, triggers, jobs. As the "holy guardians of the data", letting in anything untested = we didn't do our job.
Remember - any tool can be abused.
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
October 16, 2007 at 1:42 pm
The one place I have seen a cursor used where it would be difficult to use set-based logic was where each row of a query's results had to be used to call a sequence of stored procedures, in a specific order, for those records.
The first stored procedure inserted a record then returned the id for the newly inserted record for use in the 2nd procedure etc.
If you had to, I dare say it would be possible to boil it down to set-based logic, but the cost/benefit of doing so would be prohibitive.
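The pattern being described might look roughly like this sketch (procedure and column names are hypothetical):

```sql
-- Cursor-driven orchestration: each row's output feeds the next proc.
DECLARE @CustomerID int, @NewOrderID int;

DECLARE cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT CustomerID FROM dbo.PendingOrders;

OPEN cur;
FETCH NEXT FROM cur INTO @CustomerID;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Proc 1 inserts a row and hands back the new id...
    EXEC dbo.CreateOrder @CustomerID = @CustomerID,
                         @NewOrderID = @NewOrderID OUTPUT;
    -- ...which proc 2 needs before it can run.
    EXEC dbo.AddOrderDetails @OrderID = @NewOrderID;

    FETCH NEXT FROM cur INTO @CustomerID;
END;
CLOSE cur;
DEALLOCATE cur;
```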
In the world of the purist, if there isn't the time to do it properly, there isn't the time, full stop. In the world of the pragmatist, the solution will last until 2012.
October 16, 2007 at 10:04 pm
Possibly, David... in most cases I've come across like that, the multiple procs were actually written (initially) to support the natural RBAR that occurs in GUI-related code... such designs take a comparatively very long time when batch code is run against them. If you have a large number of records to process, it's usually worth rewriting. For example, we've done just that against many 8-16 hour jobs and have gotten most of them to run in 10 to 30 minutes... it's really cut our costs for night operators.
--Jeff Moden
Change is inevitable... Change for the better is not.
October 16, 2007 at 10:05 pm
Matt Miller (10/15/2007)
...which of course, would be true if you inserted ANYTHING in there instead of CLR. Try replacing CLR with: cursor's, UDF's, CTE's, T-SQL, triggers, jobs. As the "holy guardians of the data", letting in anything untested = we didn't do our job.
Hear, hear!
--Jeff Moden
Change is inevitable... Change for the better is not.
October 17, 2007 at 2:11 am
Well, the compelling reason for me (and the only time I've felt it necessary) was to convert datetime differences into a human-readable form in days, hours, and minutes.
I could have done it in T-SQL but the code would have looked horrible and been difficult to maintain. In C# it took about fifteen minutes to type and test - it's clear, easily maintainable, and can quickly be adapted.
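For comparison, a pure T-SQL version of that conversion might look something like this sketch (the exact output format is an assumption):

```sql
-- Convert the difference between two datetimes into 'Nd Nh Nm'.
DECLARE @Start datetime, @End datetime, @TotalMinutes int;
SET @Start = '20071015 08:30';
SET @End   = '20071017 14:05';

SET @TotalMinutes = DATEDIFF(minute, @Start, @End);

SELECT CAST(@TotalMinutes / 1440 AS varchar(10)) + 'd '
     + CAST((@TotalMinutes % 1440) / 60 AS varchar(10)) + 'h '
     + CAST(@TotalMinutes % 60 AS varchar(10)) + 'm' AS Elapsed;
-- For the sample dates above: 2d 5h 35m
```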
You can't have memory leaks in pure managed code, but you can if you involve components that wrap unmanaged code.
It's definitely something I will use in the future. I can't right now, being locked in to 2000, but will be jumping to 2008 asap. As a developer working on a severely complex Windows app, I find I spend far too much time in SSMS checking and testing sprocs, and most problems boil down to sproc problems. A quick poll of my team members says they feel they spend about 70% of their time in Query Analyzer or SSMS rather than in the much richer and more intuitive VS. We're burning serious money here...
Right here, right now, CLR integration is the future. Grow and change... Or are you the sort of person who doesn't freely share knowledge? Pure DBAs will die out soon, to be replaced by systems engineers and application programmers.
I know where I want to spend my debugging time, and CAST, CONVERT, etc. isn't where I want to be.