Storing large LOB data

  • This is more of a curiosity question to gather some insight on how best to store and lay out data in a table.  Other than configuring multiple filegroups with multiple files to strategically place data in a database that you know will be large, what options are there at the "table" level?

    Consider a database with several large tables that each have several XML columns, where roughly 40% of the total volume comes from LOB storage.  Is there a way to segregate the data so that the LOB-type data gets stored elsewhere?  I believe these columns are mostly for auditing/historical purposes and aren't used much in querying or reporting, yet they are a part of the table, wreaking havoc on regular database maintenance tasks.

    I don't think MSSQL allows a table's columns to be stored in different places, but I'm curious how everyone else has handled similar situations.

    ______________________________________________________________________________
    Never argue with an idiot; they'll drag you down to their level and beat you with experience

  • sp_tableoption [ @TableNamePattern = ] 'table', [ @OptionName = ] 'large value types out of row', [ @OptionValue = ] '1'

    will force large-value data out of row; a sketch of applying it follows the links below.

    https://docs.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sp-tableoption-transact-sql?view=sql-server-ver15

    https://www.mssqltips.com/sqlservertip/3134/sql-server-performance-comparison-storing-xml-data-inrow-versus-outofrow/
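
    A rough sketch of applying the option and confirming it took effect (the table name here is hypothetical):

      -- hypothetical table; the option is set per table
      EXEC sp_tableoption 'dbo.AuditLog', 'large value types out of row', 1;

      -- confirm the setting via sys.tables
      SELECT name, large_value_types_out_of_row
      FROM sys.tables
      WHERE name = 'AuditLog';

    Note that the option only affects values written after it is set; the next reply covers moving existing data off page.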

     

  • (1) Force large value types out of row (as above).  Note that for an existing table, you need to run an UPDATE to move the existing column values off page, and you'll then definitely want to REBUILD the base table (see the first sketch after this list).

    (2) Run sys.sp_estimate_data_compression_savings to see whether DATA_COMPRESSION would be useful.  If you have a lot of varchar columns in the table, it usually will be (second sketch below).

    (3) Use COMPRESS, when applicable, on the very large character data being stored.  This may include data in row and out of row (more commonly out of row).  With the proper views (and trigger(s)), you can make the COMPRESS / DECOMPRESS transparent to the queries: that is, the queries won't know the data in the table is COMPRESSed; they SELECT and INSERT the columns normally (third sketch below).
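
    A minimal sketch of point (1), assuming a hypothetical dbo.AuditLog table with an XML column named AuditXml and the out-of-row option from the previous reply already set:

      -- existing rows keep their in-row LOB values until they are rewritten;
      -- a self-assignment forces SQL Server to push them off page
      UPDATE dbo.AuditLog
      SET AuditXml = AuditXml;

      -- reclaim the space the old in-row values occupied
      ALTER TABLE dbo.AuditLog REBUILD;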
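
    For point (2), one way to estimate PAGE compression savings on the same hypothetical table:

      EXEC sys.sp_estimate_data_compression_savings
          @schema_name      = 'dbo',
          @object_name      = 'AuditLog',
          @index_id         = NULL,      -- all indexes
          @partition_number = NULL,      -- all partitions
          @data_compression = 'PAGE';

    Keep in mind that ROW/PAGE compression works on in-row data, so it helps the varchar columns rather than the LOB pages stored off row.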
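
    For point (3), a sketch of the view-plus-trigger approach (requires SQL Server 2016 or later for COMPRESS/DECOMPRESS; all object names here are invented for illustration):

      -- base table stores the GZip-compressed bytes
      CREATE TABLE dbo.AuditLog_Stored
      (
          AuditId    int            NOT NULL PRIMARY KEY,
          AuditXmlGz varbinary(max) NULL
      );
      GO

      -- the view decompresses on the way out, exposing plain text
      CREATE VIEW dbo.AuditLogView
      AS
      SELECT AuditId,
             CAST(DECOMPRESS(AuditXmlGz) AS nvarchar(max)) AS AuditXml
      FROM dbo.AuditLog_Stored;
      GO

      -- an INSTEAD OF trigger compresses on the way in
      CREATE TRIGGER dbo.trg_AuditLogView_Insert
      ON dbo.AuditLogView
      INSTEAD OF INSERT
      AS
      BEGIN
          SET NOCOUNT ON;
          INSERT dbo.AuditLog_Stored (AuditId, AuditXmlGz)
          SELECT AuditId, COMPRESS(AuditXml)
          FROM inserted;
      END;
      GO

      -- callers read and write the view as if it held the uncompressed text
      INSERT dbo.AuditLogView (AuditId, AuditXml)
      VALUES (1, N'<event type="audit"><detail>example</detail></event>');

      SELECT AuditXml FROM dbo.AuditLogView WHERE AuditId = 1;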

     

    SQL DBA, SQL Server MVP (07, 08, 09). "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".

  • Great stuff!  Thanks for the advice thus far... I'll play with this a bit and see if it works for this scenario.

    ______________________________________________________________________________
    Never argue with an idiot; they'll drag you down to their level and beat you with experience

  • Depending on the actual data in the XML, as well as the XML structure of the various records, you may find storage improvements by storing the XML as [n]varchar(max) rather than as the xml data type.
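
    A quick way to check that for your own documents (the sample document below is made up) is to compare the byte counts of the two representations:

      -- compare internal storage size of one document in both forms
      DECLARE @doc nvarchar(max) =
          N'<order id="1"><item sku="A" qty="2"/><item sku="B" qty="1"/></order>';

      SELECT DATALENGTH(CAST(@doc AS xml)) AS xml_bytes,      -- internal binary XML size
             DATALENGTH(@doc)              AS nvarchar_bytes; -- UTF-16 text size

    Which representation comes out smaller depends heavily on the element and attribute density, so test with representative rows before converting.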


