Splitting a 120 GB SQL database (mdf file) into separate ndf files

  • Hi,

    Currently we have a Live database of 120 GB in a single SQL data file (mdf file). In order to increase performance, I have heard that splitting the database into different ndf files can help a lot.

    I would request that somebody let me know the complete process to split the live database into different ndf files so that performance can be increased. If there is any document, article, or weblink with the details, that would be great.

    Additional info:

    1) This SQL database is used for Dynamics Axapta 2009.

    2) A few tables have more than 2 crore (20 million) records.

    Thanks in advance.

  • I think the correct term you're looking for is partitioning. Search for "sql server 2008 database partitioning" on Google and you will find the articles you're looking for. BTW, I've found that for ETL purposes partitioning can have a huge performance impact. So make sure you do your partitioning in a QA environment first and test it to see whether the changes you have made are working positively.

    Try different ideas with the partitioning, such as moving your huge tables to their own disks, or moving all the indexes to a separate disk from your tables. You can even split 1 table over multiple disks or split 1 table multiple times on a single disk.
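
    To give a flavour of the syntax, here is a minimal sketch (the filegroup names, boundary dates, and the dbo.SalesHistory table are made up for illustration, not taken from AX; the filegroups must already exist):

        -- Partition function: splits rows into ranges by date.
        -- RANGE RIGHT puts each boundary value into the partition to its right.
        CREATE PARTITION FUNCTION pfByYear (datetime)
        AS RANGE RIGHT FOR VALUES ('2008-01-01', '2009-01-01', '2010-01-01');

        -- Partition scheme: maps the four resulting partitions to filegroups.
        CREATE PARTITION SCHEME psByYear
        AS PARTITION pfByYear TO (FG2007, FG2008, FG2009, FG2010);

        -- Create the table on the partition scheme, keyed on the date column.
        CREATE TABLE dbo.SalesHistory
        (
            SalesId  int      NOT NULL,
            SaleDate datetime NOT NULL
        ) ON psByYear (SaleDate);

    Each filegroup can have its files on its own disk, which is what gives you the disk-spreading options above.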

  • Separating your data into different data files is something that largely depends on usage patterns and the scheme you employ to break out the data. There are many different methods to split the data: giving various tables their own filegroup and data file, partitioning, splitting out indexes to a separate data file, or simply adding an additional file to the same filegroup and then rebuilding the indexes to "level" the data between the files in that filegroup.

    The best thing to do is to test a few methods and then employ what you think will work best for your environment.
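
    For example, the filegroup route might look something like this (a rough sketch only; the database name, file path, sizes, and the IX_CustTrans_AccountNum index are placeholders, and the index is assumed to already exist):

        -- Add a new filegroup and give it a data file on another disk.
        ALTER DATABASE AxDB ADD FILEGROUP FG_Indexes;
        ALTER DATABASE AxDB
        ADD FILE
        (
            NAME = 'AxDB_Indexes1',
            FILENAME = 'E:\SQLData\AxDB_Indexes1.ndf',
            SIZE = 10GB,
            FILEGROWTH = 1GB
        )
        TO FILEGROUP FG_Indexes;

        -- Rebuild an existing nonclustered index onto the new filegroup.
        CREATE NONCLUSTERED INDEX IX_CustTrans_AccountNum
        ON dbo.CustTrans (AccountNum)
        WITH (DROP_EXISTING = ON)
        ON FG_Indexes;

    To "level" data instead, you would add the new file TO FILEGROUP [PRIMARY] (the existing filegroup) and rebuild the indexes in place; proportional fill then spreads the pages across the files.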

    Jason...AKA CirqueDeSQLeil
    _______________________________________________
    I have given a name to my pain...MCM SQL Server, MVP
    SQL RNNR
    Posting Performance Based Questions - Gail Shaw
    Learn Extended Events

  • Yes, for sure. I am going to try this first on a UAT environment database; only if it succeeds will it be done on the Live database.

    I have heard of the various methods mentioned below, but have no confirmation of whether they work with Dynamics AX or not:

    a) Splitting tables into different files based on a time frame

    b) Moving heavy tables from the main database to a separate database

    Can anyone let me know which way is best suited to Dynamics Axapta 2009, and whether this really helps?
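
    As a first step, a query like this shows which tables are the heaviest (a sketch against the standard DMVs; nothing AX-specific is assumed):

        -- Top 20 user tables by reserved space, with approximate row counts.
        SELECT TOP (20)
               OBJECT_SCHEMA_NAME(ps.object_id) AS SchemaName,
               OBJECT_NAME(ps.object_id)        AS TableName,
               SUM(ps.reserved_page_count) * 8 / 1024 AS ReservedMB,
               SUM(CASE WHEN ps.index_id IN (0, 1)
                        THEN ps.row_count ELSE 0 END) AS TotalRows
        FROM sys.dm_db_partition_stats AS ps
        WHERE OBJECTPROPERTY(ps.object_id, 'IsUserTable') = 1
        GROUP BY ps.object_id
        ORDER BY ReservedMB DESC;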

  • Please also double-check the support guidelines of the software that uses your SQL database.

    In some cases your software may fall out of support if you modify the physical layout. :sick:

    Did you check their performance blog? http://blogs.msdn.com/b/axperf/

    Johan

    Learn to play, play to learn !

    Don't drive faster than your guardian angel can fly ...
    but keeping both feet on the ground won't get you anywhere :w00t:

    - How to post Performance Problems
    - How to post data/code to get the best help

    - How to prevent a sore throat after hours of presenting ppt

    press F1 for solution, press shift+F1 for urgent solution 😀

    Need a bit of Powershell? How about this

    Who am I? Sometimes this is me but most of the time this is me

  • Before you go to all the effort of splitting the files, check that you really are bottlenecked on IO (not memory or CPU) and that the queries are optimised as much as possible (well written, proper indexes). Otherwise you may well be wasting your time splitting the DB up.
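
    For example, the file-level latencies will show whether IO is actually the problem (a sketch against the standard DMVs; the figures are cumulative since the last restart, and what counts as "slow" depends on your storage):

        -- Average read/write latency per database file.
        SELECT DB_NAME(vfs.database_id) AS DatabaseName,
               mf.physical_name,
               vfs.num_of_reads,
               vfs.num_of_writes,
               vfs.io_stall_read_ms  / NULLIF(vfs.num_of_reads, 0)  AS AvgReadMs,
               vfs.io_stall_write_ms / NULLIF(vfs.num_of_writes, 0) AS AvgWriteMs
        FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
        JOIN sys.master_files AS mf
          ON mf.database_id = vfs.database_id
         AND mf.file_id     = vfs.file_id
        ORDER BY AvgReadMs DESC;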

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
