March 21, 2016 at 11:43 am
The data warehouse team has a large SSIS package that needs to access a temp-type table of about 2 million rows and roughly 400 MB (this is Enterprise Edition). From a DBA perspective, the question is which would be better: a #temp table (the table is referenced 4 times in the package by different steps, so I'm wondering whether a ##temp or #temp table is even viable), or a memory-optimized table. The server is a new high-end Dell with SSDs and 192 GB of RAM. Their current SSIS ETL unfortunately creates this same table 4 times and kills memory (yes, I realize this may be a design issue). I'm more interested in whether anyone has done any comparisons between the temp table and the memory-optimized solution.
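For reference, a minimal sketch of the two alternatives under discussion (the table name, columns, and bucket count are hypothetical, not taken from the actual package):

```sql
-- Hypothetical staging table as a local temp table (backed by tempdb).
-- A #temp table is scoped to the creating session; a ##temp table is
-- global and visible to other sessions, which matters if the 4 SSIS
-- steps run on separate connections.
CREATE TABLE #Staging
(
    StagingID   INT          NOT NULL PRIMARY KEY,
    SourceKey   INT          NOT NULL,
    Payload     VARCHAR(200) NULL
);

-- The same table as a memory-optimized table. The database needs a
-- MEMORY_OPTIMIZED_DATA filegroup first. SCHEMA_ONLY durability skips
-- logging row data, which suits transient ETL staging but means the
-- rows are lost on restart.
CREATE TABLE dbo.Staging_InMem
(
    StagingID   INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 4000000),
    SourceKey   INT          NOT NULL,
    Payload     VARCHAR(200) NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY);
```

A memory-optimized table, unlike a #temp table, persists as a schema object, so the package could create it once and reuse it across all 4 steps instead of rebuilding it each time.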
March 22, 2016 at 4:57 am
Testing locally on your system is the only way to be sure. It really depends on whether the process loads the temp tables/table variables once or updates them constantly. You're going to get different behavior if you need statistics than if you don't. Without seeing your code at all, testing is the only advice I can give.
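One way to run that local comparison is to time each of the four reads the package performs against both variants; a rough sketch (table and column names are hypothetical placeholders for the real schema):

```sql
-- Time one representative read against each variant and compare.
-- Repeat for each of the 4 steps' actual queries; discard the first
-- run against the disk-based table so the buffer pool is warm.
DECLARE @start DATETIME2(7) = SYSDATETIME();

SELECT COUNT(*)               -- stand-in for one of the package's reads
FROM dbo.Staging_InMem        -- swap in #Staging for the temp-table run
WHERE SourceKey BETWEEN 1000 AND 2000;

SELECT DATEDIFF(MICROSECOND, @start, SYSDATETIME()) AS elapsed_us;
```

SET STATISTICS IO won't report reads for memory-optimized tables, so wall-clock timing like this (or Query Store / Extended Events) is the more even-handed measure.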
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning