March 7, 2019 at 7:50 pm
Jeff Moden - Thursday, March 7, 2019 4:12 PM
Heh... not quite right. The Tally Table is still faster than the table constructors. It's not by much and it's less noticeable for most folks due to the speed of today's machines but, when you're willing to trade reads for speed, a physical Tally Table is difficult to beat.
And, no... I have no bad words for Itzik's method (which I refer to as cCTEs or Cascading CTEs). Done correctly, it's blazing fast and produces no reads.
As with all else in SQL Server, "It Depends". 😀
I believe that Drew meant that you shouldn't read physical tables to create a tally table, not that you shouldn't use a physical tally table. I'm not sure if someone has actually tested the performance of an In-Memory table. I haven't had the opportunity to work with those on real systems.
I have. No faster than a CTE tally in my experience.
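For anyone who hasn't seen it, a cascading-CTE tally along the lines Jeff describes looks roughly like this (a generic sketch rather than anyone's exact code; the CTE names and the TOP value are just for illustration):
WITH
 E1(N) AS (SELECT 1 FROM (VALUES (1),(1),(1),(1),(1),(1),(1),(1),(1),(1)) AS v(N)), -- 10 rows
 E2(N) AS (SELECT 1 FROM E1 a CROSS JOIN E1 b), -- 100 rows
 E4(N) AS (SELECT 1 FROM E2 a CROSS JOIN E2 b)  -- 10,000 rows
SELECT TOP (11000)
 N = ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
FROM E4 a CROSS JOIN E4 b; -- plenty of rows, no base-table reads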
-- Itzik Ben-Gan 2001
March 8, 2019 at 9:33 am
Luis Cazares - Thursday, March 7, 2019 6:32 PM
Jeff Moden - Thursday, March 7, 2019 4:12 PM
Heh... not quite right. The Tally Table is still faster than the table constructors. It's not by much and it's less noticeable for most folks due to the speed of today's machines but, when you're willing to trade reads for speed, a physical Tally Table is difficult to beat.
And, no... I have no bad words for Itzik's method (which I refer to as cCTEs or Cascading CTEs). Done correctly, it's blazing fast and produces no reads.
As with all else in SQL Server, "It Depends". 😀
I believe that Drew meant that you shouldn't read physical tables to create a tally table, not that you shouldn't use a physical tally table. I'm not sure if someone has actually tested the performance of an In-Memory table. I haven't had the opportunity to work with those on real systems.
I have. No faster than a CTE tally in my experience.
Like I said, it's not by much but, as always, "It Depends".
--Jeff Moden
Change is inevitable... Change for the better is not.
March 8, 2019 at 11:03 am
Alan.B - Thursday, March 7, 2019 7:50 PM
Luis Cazares - Thursday, March 7, 2019 6:32 PM
Jeff Moden - Thursday, March 7, 2019 4:12 PM
Heh... not quite right. The Tally Table is still faster than the table constructors. It's not by much and it's less noticeable for most folks due to the speed of today's machines but, when you're willing to trade reads for speed, a physical Tally Table is difficult to beat.
And, no... I have no bad words for Itzik's method (which I refer to as cCTEs or Cascading CTEs). Done correctly, it's blazing fast and produces no reads.
As with all else in SQL Server, "It Depends". 😀
I believe that Drew meant that you shouldn't read physical tables to create a tally table, not that you shouldn't use a physical tally table. I'm not sure if someone has actually tested the performance of an In-Memory table. I haven't had the opportunity to work with those on real systems.
I have. No faster than a CTE tally in my experience.
Like I said, it's not by much but, as always, "It Depends".
I could be mistaken, but I thought Luis was talking about the performance of an in-memory tally table vs. a traditional tally table.
I was looking for this yesterday and just found it: https://www.sqlservercentral.com/Forums/1101315/Tally-OH-An-Improved-SQL-8K-CSV-Splitter-Function?PageIndex=36 It would appear that, in that case, the memory-optimized table was faster, but I have not had the same level of success; in my own experience I have never seen a performance improvement from switching a CTE tally to a memory-optimized tally table.
That said, I have never had a primary key on mine; here's the DDL for the one I use:
CREATE TABLE dbo.eTally
(
N INT NOT NULL,
UNIQUE NONCLUSTERED (N ASC)
)
WITH (MEMORY_OPTIMIZED = ON , DURABILITY = SCHEMA_ONLY);
The one Magoo used in his testing had a PK (nonclustered).
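In case it helps anyone trying this, here's one way such a table might be populated (just a sketch; the row count and the row source here are arbitrary examples, not necessarily what I actually used):
INSERT INTO dbo.eTally (N)
SELECT TOP (1000000)
 ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
FROM sys.all_columns ac1
CROSS JOIN sys.all_columns ac2; -- SCHEMA_ONLY durability means this has to be re-run after a restart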
On a separate note - here's a great example of "there being no spoon or default ORDER BY in SQL Server":
SELECT TOP (10) t.N
FROM dbo.eTally AS t
Returns: 998753, 998754, ..., 998762
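Of course, if you actually want the first ten values you have to ask for them; an explicit ORDER BY is the only thing that guarantees it:
SELECT TOP (10) t.N
FROM dbo.eTally AS t
ORDER BY t.N; -- now returns 1 through 10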
-- Itzik Ben-Gan 2001
March 9, 2019 at 7:54 pm
Ah... sorry, Alan. I flat out missed (even though it was bolded and underlined :blush:) that folks were talking about "memory optimized" tables. I can't speak to that because I don't use them... at least not yet. I can say that my understanding is that "memory optimized" tables work best for non-unique data, and I'd speculate that there'd be little difference made by using one for a Tally Table. Again, I don't actually know because I've not tested it and probably won't in the near future.
--Jeff Moden
Change is inevitable... Change for the better is not.
July 10, 2019 at 9:09 am
A minor point I came across today - your "favourite way" to produce the Tally table does not work in an Azure database:
SELECT TOP 11000 --equates to more than 30 years of dates
IDENTITY(INT,1,1) AS N
INTO dbo.Tally
FROM Master.dbo.SysColumns sc1,
Master.dbo.SysColumns sc2
Access to Master.dbo.SysColumns not allowed.
You can change it to dbo.SysColumns - which seems to work fine.
July 10, 2019 at 2:38 pm
Young man (or woman),
Implicit joins are evil. Always. Even in examples or practice exercises. Repent of your wicked ways.
July 10, 2019 at 11:11 pm
A minor point I came across today - your "favourite way" to produce the Tally table does not work in an Azure database:
SELECT TOP 11000 --equates to more than 30 years of dates
IDENTITY(INT,1,1) AS N
INTO dbo.Tally
FROM Master.dbo.SysColumns sc1,
Master.dbo.SysColumns sc2
Access to Master.dbo.SysColumns not allowed. You can change it to dbo.SysColumns - which seems to work fine.
Your point is well taken, and it IS time to update the article. I wrote it about 12 years ago (2007) and (IIRC) I was still stuck in SQL Server 2000, where dbo.SysColumns hadn't been deprecated yet. I'm also one of those folks who won't publish code unless I've actually executed it, which is why I used deprecated but still-working objects even though 2005 had been out for a couple of years.
That being said, here's what I normally use nowadays when I want a quick "row source" and don't happen to have my fnTally function available. Does it work in Azure?
SELECT TOP 11000
N = IDENTITY(INT,1,1)
INTO dbo.Tally
FROM sys.all_columns ac1
CROSS JOIN sys.all_columns ac2
;
ALTER TABLE dbo.Tally
ADD CONSTRAINT PK_Tally PRIMARY KEY CLUSTERED (N)
;
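As a quick illustration of the "30 years of dates" comment in that code, here's one way dbo.Tally can be used as a row source once it's built (the start date here is just an arbitrary example):
SELECT TheDate = DATEADD(dd, t.N - 1, '20190101') -- 11,000 rows covers a bit more than 30 years of dates
  FROM dbo.Tally AS t
;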
--Jeff Moden
Change is inevitable... Change for the better is not.
July 10, 2019 at 11:28 pm
Young man (or woman), Implicit joins are evil. Always. Even in examples or practice exercises. Repent of your wicked ways.
BWAAA-HAAAA!!!!! Someone needs to tell MS that. Properly written correlated sub-queries require them.
--Jeff Moden
Change is inevitable... Change for the better is not.
July 11, 2019 at 1:33 am
Don't make me post some examples. The issue isn't so much the implicit join but rather understanding the criteria in the WHERE clause when it contains the join information as well as any specific criteria, especially when the {lengthy list of redacted adjectives} developers intermixed the two. Mix in an inline select on one side of the criteria and it goes south even faster.
July 11, 2019 at 2:03 am
Don't make me post some examples. The issue isn't so much the implicit join but rather understanding the criteria in the WHERE clause when it contains the join information as well as any specific criteria, especially when the {lengthy list of redacted adjectives} developers intermixed the two. Mix in an inline select on one side of the criteria and it goes south even faster.
THAT's what I'm getting at. It's not the implicit joins that are the problem. They're merely a symptom of the rest that follows.
Still, and like I said previously, correlated sub-queries work using implicit joins. In fact, they do the equi-join in the WHERE clause in exactly the manner that gets labeled the "non-ANSI" join.
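For example, something like this (the table and column names are made up purely for illustration):
SELECT o.CustomerID, o.OrderDate
  FROM dbo.Orders AS o
 WHERE o.OrderDate = (SELECT MAX(i.OrderDate)
                        FROM dbo.Orders AS i
                       WHERE i.CustomerID = o.CustomerID) -- the correlation is an equi-join done in the WHERE clause
;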
--Jeff Moden
Change is inevitable... Change for the better is not.
July 11, 2019 at 10:29 am
Are summary executions off the table?
July 11, 2019 at 12:43 pm
Yes Jeff, your new version works in Azure just fine:
SELECT TOP 11000
N = IDENTITY(INT,1,1)
INTO dbo.Tally
FROM sys.all_columns ac1
CROSS JOIN sys.all_columns ac2
;
I didn't realise sysColumns was deprecated - I've changed our version of the Tally table accordingly.
Many Thanks
July 11, 2019 at 2:50 pm
david.holley wrote:
Don't make me post some examples. The issue isn't so much the implicit join but rather understanding the criteria in the WHERE clause when it contains the join information as well as any specific criteria, especially when the {lengthy list of redacted adjectives} developers intermixed the two. Mix in an inline select on one side of the criteria and it goes south even faster.
THAT's what I'm getting at. It's not the implicit joins that are the problem. They're merely a symptom of the rest that follows. Still, and like I said previously, correlated sub-queries work using implicit joins. In fact, they do the equi-join in the WHERE clause in exactly the manner that gets labeled the "non-ANSI" join.
Are you saying that this:
SELECT
* -- I know, should be an actual column list
FROM
 dbo.tableA as a, dbo.tableB as b
WHERE
a.AJoinColumn = b.AJoinColumn
is not ANSI Standard?
But it is; it's the ANSI-89 standard. That said, I personally never learned to use this style, having started with ANSI-92 style joins when I first began working with SQL Server 6.5 back in 1996.
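For comparison, the ANSI-92 version of the same query just moves the join predicate into an ON clause:
SELECT
 * -- I know, should be an actual column list
FROM
 dbo.tableA as a
 INNER JOIN dbo.tableB as b
 ON a.AJoinColumn = b.AJoinColumn;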
July 12, 2019 at 7:37 pm
I can't speak as to the ANSI standards. I never knew that implicit joins existed until about 5 years ago despite being introduced to SQL 20 years ago. It was something about supporting a legacy app with code that makes a plate of spaghetti look like a straight line.
July 12, 2019 at 7:43 pm
I can't speak as to the ANSI standards. I never knew that implicit joins existed until about 5 years ago despite being introduced to SQL 20 years ago. It was something about supporting a legacy app with code that makes a plate of spaghetti look like a straight line.
Hey, I've seen that code before! It was a 1300-line trigger in Oracle that was used to balance out hours accounting records for a Baan ERP system where the ERP couldn't get it right. I inherited it from someone else, cleaned it up, organized it, and got it to work, but it was still a mess. The original gave new meaning to spaghetti. I'm so thankful I don't live in that world any more.