July 31, 2014 at 6:15 am
SQLRNNR (7/30/2014)
Lynn Pettis (7/30/2014)
Really?? Let's put a 500 million row table into an in-memory table.
It could work - depending on what the data is in that table and the data types. Of course you'd have to have enough memory for it. I'm sure Microsoft is hoping for bigger than that at some point.
Yes, probably. Unfortunately the OP didn't say how big the rows of data are nor how much memory his server has so one can only speculate that he doesn't understand the new in-memory tables. I don't, but from what I read there are restrictions and conditions one must understand before using them. Not something I would want to jump into without having time to test and play in a sandbox first.
July 31, 2014 at 6:22 am
Lynn Pettis (7/30/2014)
Really?? Let's put a 500 million row table into an in-memory table.
I'd rather use a clustered columnstore index.
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
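Koen's suggestion is a one-liner in SQL Server 2014; the table name below is invented for illustration. Worth noting: in 2014 a clustered columnstore index cannot coexist with any other index on the table, so the 90 nonclustered indexes mentioned later in this thread would all have to be dropped first.

```sql
-- Hypothetical sketch: convert a large fact table to a clustered
-- columnstore index (SQL Server 2014). Table name is made up.
CREATE CLUSTERED COLUMNSTORE INDEX CCI_BigFactTable
    ON dbo.BigFactTable;
```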
July 31, 2014 at 6:28 am
Koen Verbeeck (7/31/2014)
Lynn Pettis (7/30/2014)
Really?? Let's put a 500 million row table into an in-memory table.
I'd rather use a clustered columnstore index.
Depending on how heavily read/modified it is...
Clustered columnstore and in-memory are kinda on the opposite ends of that spectrum. Columnstore great for read-heavy, in-memory designed for massive high inserts/sec.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
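For contrast with the columnstore option, a minimal sketch of the insert-heavy end of the spectrum: a SQL Server 2014 memory-optimized table. All names and the bucket count are invented for illustration, and the database needs a MEMORY_OPTIMIZED_DATA filegroup before this will run.

```sql
-- Hypothetical memory-optimized table (SQL Server 2014 syntax).
-- IDENTITY on in-memory tables in 2014 only supports seed/increment (1,1).
CREATE TABLE dbo.HotInserts
(
    Id       BIGINT IDENTITY(1,1) NOT NULL
             PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1048576),
    Payload  VARCHAR(200) NOT NULL,
    LoadedAt DATETIME2 NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```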
July 31, 2014 at 6:30 am
Lynn Pettis (7/31/2014)
SQLRNNR (7/30/2014)
Lynn Pettis (7/30/2014)
Really?? Let's put a 500 million row table into an in-memory table.
It could work - depending on what the data is in that table and the data types. Of course you'd have to have enough memory for it. I'm sure Microsoft is hoping for bigger than that at some point.
Yes, probably. Unfortunately the OP didn't say how big the rows of data are nor how much memory his server has so one can only speculate that he doesn't understand the new in-memory tables. I don't, but from what I read there are restrictions and conditions one must understand before using them. Not something I would want to jump into without having time to test and play in a sandbox first.
Okay, he has the memory;
Size of the table: 195 GB
Number of Indexes: 90
220GB memory is allocated only for the sql server....out of total 260 GB Server memory
That leaves only 25 GB for everything else SQL Server has to do. Curious what will happen when large queries are run against that table.
July 31, 2014 at 6:31 am
Lynn Pettis (7/31/2014)
Lynn Pettis (7/31/2014)
SQLRNNR (7/30/2014)
Lynn Pettis (7/30/2014)
Really?? Let's put a 500 million row table into an in-memory table.
It could work - depending on what the data is in that table and the data types. Of course you'd have to have enough memory for it. I'm sure Microsoft is hoping for bigger than that at some point.
Yes, probably. Unfortunately the OP didn't say how big the rows of data are nor how much memory his server has so one can only speculate that he doesn't understand the new in-memory tables. I don't, but from what I read there are restrictions and conditions one must understand before using them. Not something I would want to jump into without having time to test and play in a sandbox first.
Okay, he has the memory;
Size of the table: 195 GB
Number of Indexes: 90
220GB memory is allocated only for the sql server....out of total 260 GB Server memory
That leaves only 25 GB for everything else SQL Server has to do. Curious what will happen when large queries are run against that table.
I wonder what the design of that database looks like...
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
July 31, 2014 at 6:34 am
Lynn Pettis (7/31/2014)
Okay, he has the memory;
Size of the table: 195 GB
Number of Indexes: 90
220GB memory is allocated only for the sql server....out of total 260 GB Server memory
No way is that a good idea for in-memory (and why the hell 90 indexes...). Really nasty things happen when an in-memory table runs out of memory.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
July 31, 2014 at 6:48 am
Lynn Pettis (7/31/2014)
Lynn Pettis (7/31/2014)
SQLRNNR (7/30/2014)
Lynn Pettis (7/30/2014)
Really?? Let's put a 500 million row table into an in-memory table.
It could work - depending on what the data is in that table and the data types. Of course you'd have to have enough memory for it. I'm sure Microsoft is hoping for bigger than that at some point.
Yes, probably. Unfortunately the OP didn't say how big the rows of data are nor how much memory his server has so one can only speculate that he doesn't understand the new in-memory tables. I don't, but from what I read there are restrictions and conditions one must understand before using them. Not something I would want to jump into without having time to test and play in a sandbox first.
Okay, he has the memory;
Size of the table: 195 GB
Number of Indexes: 90
220GB memory is allocated only for the sql server....out of total 260 GB Server memory
That leaves only 25 GB for everything else SQL Server has to do. Curious what will happen when large queries are run against that table.
I'd say no, they don't have the memory. Microsoft recommends having 3-5 times the size of the table available in memory to support row versioning and other aspects of in-memory management.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
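Running that guideline against the numbers posted above is sobering. A back-of-envelope query (the 195 GB and 220 GB figures come from the thread; the 3-5x multiplier is Microsoft's sizing guidance):

```sql
-- 195 GB table vs. the 3-5x sizing guideline for in-memory tables
SELECT 195 * 3 AS MinRecommendedGB,   -- 585 GB
       195 * 5 AS MaxRecommendedGB,   -- 975 GB
       220     AS MaxServerMemoryGB;  -- what's actually allocated
```

Even the low end of the guideline is more than double what the whole instance has been given.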
July 31, 2014 at 6:48 am
Lynn Pettis (7/31/2014)
Lynn Pettis (7/31/2014)
SQLRNNR (7/30/2014)
Lynn Pettis (7/30/2014)
Really?? Let's put a 500 million row table into an in-memory table.
It could work - depending on what the data is in that table and the data types. Of course you'd have to have enough memory for it. I'm sure Microsoft is hoping for bigger than that at some point.
Yes, probably. Unfortunately the OP didn't say how big the rows of data are nor how much memory his server has so one can only speculate that he doesn't understand the new in-memory tables. I don't, but from what I read there are restrictions and conditions one must understand before using them. Not something I would want to jump into without having time to test and play in a sandbox first.
Okay, he has the memory;
Size of the table: 195 GB
Number of Indexes: 90
220GB memory is allocated only for the sql server....out of total 260 GB Server memory
That leaves only 25 GB for everything else SQL Server has to do. Curious what will happen when large queries are run against that table.
90 indexes? I wonder how many columns. I wouldn't be surprised if there were more indexes than columns.
--------------------------------------
When you encounter a problem, if the solution isn't readily evident go back to the start and check your assumptions.
--------------------------------------
It’s unpleasantly like being drunk.
What’s so unpleasant about being drunk?
You ask a glass of water. -- Douglas Adams
July 31, 2014 at 6:51 am
GilaMonster (7/31/2014)
Lynn Pettis (7/31/2014)
Okay, he has the memory;
Size of the table: 195 GB
Number of Indexes: 90
220GB memory is allocated only for the sql server....out of total 260 GB Server memory
No way is that a good idea for in-memory (and why the hell 90 indexes...). Really nasty things happen when an in-memory table runs out of memory.
I don't have the knowledge or experience to say that it isn't a good idea.
My gut, however, says they are looking for trouble.
July 31, 2014 at 6:56 am
patrickmcginnis59 10839 (7/30/2014)
Grant Fritchey (7/30/2014)
patrickmcginnis59 10839 (7/30/2014)
Then whats the problem with defending SO?
I wasn't aware there was one. Someone offered an opinion. You disagreed. Others disagreed with you. Etc.
I think the implication came about because you mentioned that you were specifically curious why it was so important to me to defend SO, and that you were also moderating your behavior strictly in reply to my posts.
Well, to be blunt, you seem hyper-sensitive about this topic, so I was attempting to address the issue without setting you off. It's called being nice.
Let's put it this way. SO and their approach is absolutely fine. It doesn't work in all situations and it doesn't work for everyone. But, a very healthy chunk of the inhabitants at SO are extremely rude and very unhelpful, so I generally stay away.
Now, that's my opinion. You don't have to agree with it. Nothing you say at this point is going to change it because it's based on my experiences posting over there. Where, I'll point out again, I'm in the top 12% of posters, so it's not like I'm taking an ego hit (as you've stated several times was the issue for people here). The SO approach works for certain types of problems, but the SO attitude leaves a lot to be desired.
Honestly, at this point, I'm beginning to think this is bordering on you being a troll. And we all know, don't feed the trolls. See ya!
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
July 31, 2014 at 7:00 am
Koen Verbeeck (7/31/2014)
Lynn Pettis (7/31/2014)
Lynn Pettis (7/31/2014)
SQLRNNR (7/30/2014)
Lynn Pettis (7/30/2014)
Really?? Let's put a 500 million row table into an in-memory table.
It could work - depending on what the data is in that table and the data types. Of course you'd have to have enough memory for it. I'm sure Microsoft is hoping for bigger than that at some point.
Yes, probably. Unfortunately the OP didn't say how big the rows of data are nor how much memory his server has so one can only speculate that he doesn't understand the new in-memory tables. I don't, but from what I read there are restrictions and conditions one must understand before using them. Not something I would want to jump into without having time to test and play in a sandbox first.
Okay, he has the memory;
Size of the table: 195 GB
Number of Indexes: 90
220GB memory is allocated only for the sql server....out of total 260 GB Server memory
That leaves only 25 GB for everything else SQL Server has to do. Curious what will happen when large queries are run against that table.
I wonder what the design of that database looks like...
90 INDEXES! And the performance is still so poor that they're trialling in-memory tables? That smells to me like cr@p queries.
For fast, accurate and documented assistance in answering your questions, please read this article.
Understanding and using APPLY, (I) and (II) Paul White
Hidden RBAR: Triangular Joins / The "Numbers" or "Tally" Table: What it is and how it replaces a loop Jeff Moden
July 31, 2014 at 7:03 am
ChrisM@Work (7/31/2014)
Koen Verbeeck (7/31/2014)
Lynn Pettis (7/31/2014)
Lynn Pettis (7/31/2014)
SQLRNNR (7/30/2014)
Lynn Pettis (7/30/2014)
Really?? Let's put a 500 million row table into an in-memory table.
It could work - depending on what the data is in that table and the data types. Of course you'd have to have enough memory for it. I'm sure Microsoft is hoping for bigger than that at some point.
Yes, probably. Unfortunately the OP didn't say how big the rows of data are nor how much memory his server has so one can only speculate that he doesn't understand the new in-memory tables. I don't, but from what I read there are restrictions and conditions one must understand before using them. Not something I would want to jump into without having time to test and play in a sandbox first.
Okay, he has the memory;
Size of the table: 195 GB
Number of Indexes: 90
220GB memory is allocated only for the sql server....out of total 260 GB Server memory
That leaves only 25 GB for everything else SQL Server has to do. Curious what will happen when large queries are run against that table.
I wonder what the design of that database looks like...
90 INDEXES! And the performance is still so poor that they're trialling in-memory tables? That smells to me like cr@p queries.
Maybe they want to improve inserts. No locking and all with in-memory tables 😀
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
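A hedged sketch of what Koen means: inserts through a natively compiled procedure run lock- and latch-free under optimistic concurrency. All names here are hypothetical, assuming a memory-optimized table dbo.HotInserts(Payload VARCHAR(200), LoadedAt DATETIME2) already exists; SQL Server 2014 requires the SCHEMABINDING, EXECUTE AS, and ATOMIC block clauses shown.

```sql
-- Hypothetical natively compiled insert procedure (SQL Server 2014).
CREATE PROCEDURE dbo.InsertHotRow
    @Payload VARCHAR(200)
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
    INSERT INTO dbo.HotInserts (Payload, LoadedAt)
    VALUES (@Payload, SYSDATETIME());
END;
```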
July 31, 2014 at 7:49 am
GilaMonster (7/31/2014)
Lynn Pettis (7/31/2014)
Okay, he has the memory;
Size of the table: 195 GB
Number of Indexes: 90
220GB memory is allocated only for the sql server....out of total 260 GB Server memory
No way is that a good idea for in-memory (and why the hell 90 indexes...). Really nasty things happen when an in-memory table runs out of memory.
90 indexes tosses in-memory out the window; SQL Server 2014 allows at most 8 indexes on a memory-optimized table. The OP will need to cut down on the number of indexes first.
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
July 31, 2014 at 7:50 am
Koen Verbeeck (7/31/2014)
Lynn Pettis (7/31/2014)
Lynn Pettis (7/31/2014)
SQLRNNR (7/30/2014)
Lynn Pettis (7/30/2014)
Really?? Let's put a 500 million row table into an in-memory table.
It could work - depending on what the data is in that table and the data types. Of course you'd have to have enough memory for it. I'm sure Microsoft is hoping for bigger than that at some point.
Yes, probably. Unfortunately the OP didn't say how big the rows of data are nor how much memory his server has so one can only speculate that he doesn't understand the new in-memory tables. I don't, but from what I read there are restrictions and conditions one must understand before using them. Not something I would want to jump into without having time to test and play in a sandbox first.
Okay, he has the memory;
Size of the table: 195 GB
Number of Indexes: 90
220GB memory is allocated only for the sql server....out of total 260 GB Server memory
That leaves only 25 GB for everything else SQL Server has to do. Curious what will happen when large queries are run against that table.
I wonder what the design of that database looks like...
My gut says it is a warehouse and not so much an OLTP-style database.
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
July 31, 2014 at 7:54 am
Koen Verbeeck (7/31/2014)
ChrisM@Work (7/31/2014)
Koen Verbeeck (7/31/2014)
Lynn Pettis (7/31/2014)
Lynn Pettis (7/31/2014)
SQLRNNR (7/30/2014)
Lynn Pettis (7/30/2014)
Really?? Let's put a 500 million row table into an in-memory table.
It could work - depending on what the data is in that table and the data types. Of course you'd have to have enough memory for it. I'm sure Microsoft is hoping for bigger than that at some point.
Yes, probably. Unfortunately the OP didn't say how big the rows of data are nor how much memory his server has so one can only speculate that he doesn't understand the new in-memory tables. I don't, but from what I read there are restrictions and conditions one must understand before using them. Not something I would want to jump into without having time to test and play in a sandbox first.
Okay, he has the memory;
Size of the table: 195 GB
Number of Indexes: 90
220GB memory is allocated only for the sql server....out of total 260 GB Server memory
That leaves only 25 GB for everything else SQL Server has to do. Curious what will happen when large queries are run against that table.
I wonder what the design of that database looks like...
90 INDEXES! And the performance is still so poor that they're trialling in-memory tables? That smells to me like cr@p queries.
Maybe they want to improve inserts. No locking and all with in-memory tables 😀
That is possible. I think it is to prove they can do their nightly load into the warehouse faster.
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events