February 8, 2018 at 10:41 am
In the SQL 2016 documentation it states that you must create a new filegroup called Memory_Optimized_Data, but when I open my database, which is on SQL 2016 SP1 CU6, there's
already a filegroup called [Memory Optimized Data] that I did not create, and I'm the only DBA who would have. Do I still need to create another filegroup with the same name?
February 8, 2018 at 4:01 pm
The name of the filegroup doesn't matter so much as whether or not it was created using CONTAINS MEMORY_OPTIMIZED_DATA.
If so, yeah, that's the one you'll use for your database. If not, regardless of what you call it, you still have to create one.
One HUGE point: you can't delete a memory-optimized filegroup after you add it to the database. Don't experiment on production.
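If you want to double-check what you've got, something like this should tell you (a rough sketch; the database name, filegroup name, and container path are just placeholders):

-- Memory-optimized filegroups show up in sys.filegroups as type FX
SELECT name, type, type_desc
FROM sys.filegroups
WHERE type = 'FX';

-- If nothing comes back, add one
ALTER DATABASE MyDB
    ADD FILEGROUP imoltp_fg CONTAINS MEMORY_OPTIMIZED_DATA;

ALTER DATABASE MyDB
    ADD FILE (NAME = 'imoltp_dir', FILENAME = 'D:\SQLData\imoltp_dir')
    TO FILEGROUP imoltp_fg;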
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
February 9, 2018 at 6:39 am
Grant Fritchey - Thursday, February 8, 2018 4:01 PM
One HUGE point: you can't delete a memory-optimized filegroup after you add it to the database. Don't experiment on production.
Yowch... that sucks. So much for temporary usage during the night for large ETL jobs. Don't need it or even want it for anything else.
I guess the workaround for me will be to create a temporary "scratch" database, do the work, and then drop that "scratch" database.
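Something along these lines is what I'm picturing (just a sketch; the names and the container path are made up):

CREATE DATABASE ETL_Scratch;

ALTER DATABASE ETL_Scratch
    ADD FILEGROUP scratch_imoltp CONTAINS MEMORY_OPTIMIZED_DATA;

ALTER DATABASE ETL_Scratch
    ADD FILE (NAME = 'scratch_imoltp_dir', FILENAME = 'D:\SQLData\scratch_imoltp_dir')
    TO FILEGROUP scratch_imoltp;

-- ... build the memory-optimized staging tables and run the ETL here ...

USE master;
DROP DATABASE ETL_Scratch;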
--Jeff Moden
Change is inevitable... Change for the better is not.
February 9, 2018 at 8:07 am
Jeff Moden - Friday, February 9, 2018 6:39 AM
Yowch... that sucks. So much for temporary usage during the night for large ETL jobs. Don't need it or even want it for anything else.
I guess the workaround for me will be to create a temporary "scratch" database, do the work, and then drop that "scratch" database.
Yeah, it has burned more than a few people. Still, memory-optimized table types are handy for table variables too. Just saying.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
February 15, 2018 at 8:21 am
Grant Fritchey - Friday, February 9, 2018 8:07 AM
Jeff Moden - Friday, February 9, 2018 6:39 AM
Yowch... that sucks. So much for temporary usage during the night for large ETL jobs. Don't need it or even want it for anything else.
I guess the workaround for me will be to create a temporary "scratch" database, do the work, and then drop that "scratch" database.
Yeah, it has burned more than a few people. Still, memory-optimized table types are handy for table variables too. Just saying.
I basically forbid the use of Table Variables because of their non-persistent nature for troubleshooting and the single-row estimates they normally kick up unless you do a recompile. The only place we use them is where we have to, such as in iTVFs. I'm a bit new to the in-memory thing... is there a way to tell iTVFs to use In_Memory? Is it as simple as creating the iTVFs on the in-memory group?
--Jeff Moden
Change is inevitable... Change for the better is not.
February 17, 2018 at 8:49 am
Yeah, you can create table variables using memory-optimized table types and it really does speed them up a bunch. I wouldn't recommend it for general use for all the reasons you list. However, if you're hitting recompile issues, stuff like that, where table variables MIGHT be useful, then implementing them using In-Memory makes them a lot less painful.
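Roughly like this (a sketch; the type, column, and index names are placeholders, and the database needs a memory-optimized filegroup already in place):

-- A memory-optimized table type; it must have at least one index
CREATE TYPE dbo.OrderStaging AS TABLE
(
    OrderID   INT       NOT NULL,
    OrderDate DATETIME2 NOT NULL,
    INDEX ix_OrderID NONCLUSTERED HASH (OrderID) WITH (BUCKET_COUNT = 1000)
)
WITH (MEMORY_OPTIMIZED = ON);

-- Declare a table variable of that type; it lives in memory, not tempdb
DECLARE @Orders AS dbo.OrderStaging;

INSERT INTO @Orders (OrderID, OrderDate)
VALUES (1, SYSDATETIME());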
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
February 17, 2018 at 12:02 pm
Grant Fritchey - Saturday, February 17, 2018 8:49 AM
Yeah, you can create table variables using memory-optimized table types and it really does speed them up a bunch. I wouldn't recommend it for general use for all the reasons you list. However, if you're hitting recompile issues, stuff like that, where table variables MIGHT be useful, then implementing them using In-Memory makes them a lot less painful.
Cool. Thanks for the lessons, Grant.
--Jeff Moden
Change is inevitable... Change for the better is not.