February 16, 2016 at 10:12 pm
I have a few databases in my environment, around 2 TB and 1 TB of data, and we have partitioned a few tables for better performance and implemented page compression for disk space savings. Now the number of tables to compress has become large and I want to automate it through jobs. Can you provide me T-SQL code, please?
February 16, 2016 at 10:15 pm
Note the section titled "Using Transact-SQL" in this link: Enable Compression on a Table or Index
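For reference, the basic T-SQL from that documentation section looks roughly like the sketch below (the table and index names here are placeholders, not from the original post):

```sql
-- Enable PAGE compression on a whole table (rebuilds the heap or clustered index).
ALTER TABLE dbo.SalesHistory
REBUILD WITH (DATA_COMPRESSION = PAGE);

-- Enable PAGE compression on a single nonclustered index.
ALTER INDEX IX_SalesHistory_BusinessDate
ON dbo.SalesHistory
REBUILD WITH (DATA_COMPRESSION = PAGE);
```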
-- Itzik Ben-Gan 2001
February 16, 2016 at 10:25 pm
Hi Alan,
Thanks for the quick reply, but my requirement is to keep yesterday's data as it is and compress the remaining partitions.
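In T-SQL, compressing all partitions except the newest one could look like the following sketch (the table name and partition numbers are illustrative; here the latest partition, holding yesterday's data, is assumed to be partition 10):

```sql
-- Rebuild all partitions, but apply PAGE compression only to partitions 1-9,
-- leaving partition 10 (yesterday's data) uncompressed.
ALTER TABLE dbo.SalesHistory
REBUILD PARTITION = ALL
WITH (DATA_COMPRESSION = PAGE ON PARTITIONS (1 TO 9));
```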
February 17, 2016 at 1:12 am
From BoL
Limitations and Restrictions
--------------------------------------------------------------------------------
•System tables cannot be enabled for compression.
•If the table is a heap, the rebuild operation for ONLINE mode will be single threaded. Use OFFLINE mode for a multi-threaded heap rebuild operation. For more information about data compression, see Data Compression.
•You cannot change the compression setting of a single partition if the table has nonaligned indexes.
https://msdn.microsoft.com/en-us/library/hh710070.aspx#Restrictions
My suggestion is to use the wizard to set up the compression, then script it out before committing. That should give you an idea of the sequence you need to follow in order to automate the compression.
_________________________________________________________________________
SSC Guide to Posting and Best Practices
February 17, 2016 at 1:25 am
Hi,
I uploaded a script that I used for compressing databases. I had a similar issue to yours.
Igor Micev,My blog: www.igormicev.com
February 17, 2016 at 2:06 am
Hi Igor,
Thanks for your script. In my environment we have partitioned the tables on a business_date column, so we need to keep yesterday's data uncompressed, while all previous dates should be compressed. Please tell me what modifications I need to make to your script to meet my requirement. As mine is an OLTP environment, for better performance we have created multiple filegroups with 8 files in each filegroup. The partition scheme maps to the filegroups and the partition column is business_date.
Regards,
Prem
February 17, 2016 at 5:45 am
prem.m38 (2/17/2016)
Hi Igor, Thanks for your script. In my environment we have partitioned the tables on a business_date column, so we need to keep yesterday's data uncompressed, while all previous dates should be compressed. Please tell me what modifications I need to make to your script to meet my requirement. As mine is an OLTP environment, for better performance we have created multiple filegroups with 8 files in each filegroup. The partition scheme maps to the filegroups and the partition column is business_date.
Regards,
Prem
Interesting. In this case you'll need to determine which filegroups are not going to be compressed. Can you identify those filegroups?
The following query will find all filegroups and indexes on them for a database:
SELECT ds.name AS [FG_name],
       sch.name AS [schema_name],
       o.type_desc AS [object_type],
       o.name AS [object_name],
       i.name AS [index_name]
FROM sys.data_spaces AS ds
INNER JOIN sys.allocation_units AS au
    ON ds.data_space_id = au.data_space_id
INNER JOIN sys.partitions AS p
    ON (au.type IN (1, 3) AND au.container_id = p.hobt_id)
    OR (au.type = 2 AND au.container_id = p.partition_id)
INNER JOIN sys.objects AS o
    ON p.object_id = o.object_id AND o.is_ms_shipped = 0
INNER JOIN sys.schemas AS sch
    ON o.schema_id = sch.schema_id
LEFT JOIN sys.indexes AS i
    ON p.object_id = i.object_id AND p.index_id = i.index_id
WHERE i.name IS NOT NULL
ORDER BY ds.name, sch.name, o.name, i.name;
See spCompressDatabase2.txt for how filegroups are used in the selection.
If you can add a condition to this query, then you're a step closer to your goal.
/*****************
UPDATE #tmpFGandIndexes
SET is_for_compresssion = 0
WHERE <your condition here>
*****************/
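As one hedged example of filling in that condition, keeping the column and temp-table names from the placeholder above (the filegroup name here is hypothetical; substitute the filegroup that holds yesterday's partition):

```sql
-- Exclude the filegroup holding the current (uncompressed) partition.
-- 'FG_Current' is a placeholder name, not from the original script.
UPDATE #tmpFGandIndexes
SET is_for_compresssion = 0
WHERE FG_name = N'FG_Current';
```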
Igor Micev,My blog: www.igormicev.com
February 23, 2016 at 5:10 am
Hi Igor,
Now we are trying to compress all partitions, including future dates (with and without rows). Will it affect my performance? In this case, while inserting the data, it will compress it right away, correct? Even I/O will be faster... please throw some light on this.
Regards,
Prem
February 23, 2016 at 8:26 am
Hi,
It depends on several factors, but in general you can expect to gain from it. Since your system seems to be more OLAP-like (or DWH-like), because you're populating it with jobs (am I right?), I think you'll see a lot of gain.
Please see the research in attachment that I did for a company.
I recommend that you do the compression on a test environment first, with some tests like those in my research example, and then apply it to production if the results are satisfying.
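Before running such tests, SQL Server can also estimate the savings up front with the built-in procedure `sp_estimate_data_compression_savings` (schema and table names below are placeholders):

```sql
-- Estimate PAGE compression savings for all indexes and partitions of a table.
-- NULL for @index_id / @partition_number means "all".
EXEC sp_estimate_data_compression_savings
    @schema_name      = N'dbo',
    @object_name      = N'SalesHistory',
    @index_id         = NULL,
    @partition_number = NULL,
    @data_compression = N'PAGE';
```

The procedure samples the data into tempdb and reports current versus estimated compressed sizes, which makes it a cheap first check before rebuilding anything.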
I hope you'll share good results with us.
Igor Micev,My blog: www.igormicev.com
March 2, 2016 at 3:24 am
Hi Igor,
We tried compressing for future dates. In my case it is taking the same time and there is not much difference, as the jobs run as a batch process, but we are still going to compress the future dates since it is a one-time process.
Regards,
Prem
March 2, 2016 at 9:01 pm
prem.m38 (2/16/2016)
I have a few databases in my environment, around 2 TB and 1 TB of data, and we have partitioned a few tables for better performance and implemented page compression for disk space savings. Now the number of tables to compress has become large and I want to automate it through jobs. Can you provide me T-SQL code, please?
Just curious... assuming that you're talking about the performance of queries, did you actually measure the performance of queries before partitioning and then again after?
--Jeff Moden
Change is inevitable... Change for the better is not.
March 2, 2016 at 10:02 pm
Jeff, if you're gonna go there, why not go to Gila Monster's Howlers - Partitioning?
March 3, 2016 at 12:12 am
pietlinden (3/2/2016)
Jeff, if you're gonna go there, why not go to Gila Monster's Howlers - Partitioning?
Like partitioning, compressing a database may not bring any improvement in terms of faster queries. That's why you should run some checks in a test environment first.
However, compression at least brings you space savings. It can also bring query improvements, but that depends on other factors as well.
Igor Micev,My blog: www.igormicev.com