May 28, 2012 at 5:41 pm
I did not try the query without referencing a function; the query below took a bit more than a second in my environment.
-- 2 seconds
-- Self cross join of sys.system_columns; the SUM forces the
-- halving calculation to run once per pair of rows.
SELECT SUM(sc.column_id * sc2.column_id / 2)
FROM sys.system_columns AS sc
CROSS JOIN sys.system_columns AS sc2;
It seems the OP wants to implement a function to encapsulate a business rule that will be used in several places in her solution (see the sketch after this post).
Hope this helps,
Rock from VbCity
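[Editor's note: for illustration, encapsulating such a rule as an in-line table-valued function might look like the sketch below. The function name dbo.HalfProduct and the halving rule are placeholders standing in for the OP's actual business rule.]

CREATE FUNCTION dbo.HalfProduct (@a int, @b int)
RETURNS TABLE
AS
RETURN SELECT @a * @b / 2 AS result;
GO

-- Called with CROSS APPLY, the optimizer expands the definition in-line,
-- so there is no per-row function-call overhead:
SELECT SUM(hp.result)
FROM sys.system_columns AS sc
CROSS JOIN sys.system_columns AS sc2
CROSS APPLY dbo.HalfProduct(sc.column_id, sc2.column_id) AS hp;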
May 28, 2012 at 5:53 pm
Rock from VbCity (5/28/2012)
I did not try the query without referencing a function; ... It seems the OP wants to implement a function to encapsulate a business rule that will be used in several places in her solution.
I know. The OP has gone away happy. The code I posted is for Jeff, to compare the in-line approach with SQLCLR.
May 28, 2012 at 11:07 pm
Jeff Moden (5/28/2012)
Rock from VbCity (5/28/2012)
The in-line table valued function handled the 1.6 billion records in around 1 second, the CLR function took a bit more than 6 seconds, while the remaining two approaches took more than 100 seconds!
This is why I always want to see the test code and method of measurement, etc. If your test bed is simply using a Cross Join on sys.system_columns and it only has 4,611 rows in it, then it is NOT possible for the table valued function to have handled 1.6 billion rows. In fact, using that number, there should only be 21,261,321 rows.
I missed it in my first read. I see that the iTVF won here, which answers my previous question, but I'd still like to do my own testing; I'm no C# (or other flavor) programmer, though. Paul, any chance of you posting the T-SQL script to make the SQLCLR function? Thanks.
This is probably a misunderstanding. On SQL Server 2008 R2, my test query returns 1.6 billion as the result, but that's the sum it's calculating, not the row count.
[font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
Proactive Performance Solutions, Inc. [/font][font="Verdana"] "Performance is our middle name."[/font]
May 28, 2012 at 11:32 pm
That's totally true :blush:
Hope this helps,
Rock from VbCity
May 29, 2012 at 6:05 am
SQL Kiwi (5/28/2012)
OK, here's a SQLCLR implementation (with integer overflow checking):
Outstanding! Thanks, Paul! And thanks for your own timings on this.
--Jeff Moden
Change is inevitable... Change for the better is not.
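[Editor's note: for readers who want to wire up such a function themselves, the T-SQL side of a SQLCLR deployment generally follows the shape below. This is only a sketch: the assembly name, the truncated hex literal, and the class/method names are placeholders, not Paul's actual script. The real 0x... literal is the full compiled assembly emitted by the C# compiler.]

CREATE ASSEMBLY HalfProductClr
FROM 0x4D5A9000... -- truncated; paste the full hex of the compiled DLL
WITH PERMISSION_SET = SAFE;
GO

CREATE FUNCTION dbo.HalfProductClr (@a int, @b int)
RETURNS int
AS EXTERNAL NAME HalfProductClr.UserDefinedFunctions.HalfProductClr;
GO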
May 29, 2012 at 6:13 am
RBarryYoung (5/28/2012)
Jeff Moden (5/28/2012)
Rock from VbCity (5/28/2012)
The in-line table valued function handled the 1.6 billion records in around 1 second, the CLR function took a bit more than 6 seconds, while the remaining two approaches took more than 100 seconds!
This is why I always want to see the test code and method of measurement, etc. If your test bed is simply using a Cross Join on sys.system_columns and it only has 4,611 rows in it, then it is NOT possible for the table valued function to have handled 1.6 billion rows. In fact, using that number, there should only be 21,261,321 rows.
I missed it in my first read. I see that the iTVF won here, which answers my previous question, but I'd still like to do my own testing; I'm no C# (or other flavor) programmer, though. Paul, any chance of you posting the T-SQL script to make the SQLCLR function? Thanks.
This is probably a misunderstanding. On SQL Server 2008 R2, my test query returns 1.6 billion as the result, but that's the sum it's calculating, not the row count.
Uh-huh... agreed, and it didn't take much to figure that out, but the posted results listed 1.6 billion as the number of rows, and it was also stated to be 1.6 billion rows in the text. I understand that mistakes happen, but it further supports why I want to actually see the test code, harness, etc., used for any and all claims of performance. 🙂
--Jeff Moden
Change is inevitable... Change for the better is not.
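[Editor's note: the row-count-versus-sum distinction discussed above is easy to verify directly, since both numbers come from the same cross join. A sketch follows; the row count of sys.system_columns varies by version, so the figures will differ by environment.]

-- Number of rows produced by the cross join (4,611 * 4,611 = 21,261,321 here):
SELECT COUNT_BIG(*)
FROM sys.system_columns AS sc
CROSS JOIN sys.system_columns AS sc2;

-- The ~1.6 billion figure is this aggregate value, not a row count:
SELECT SUM(sc.column_id * sc2.column_id / 2)
FROM sys.system_columns AS sc
CROSS JOIN sys.system_columns AS sc2;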