December 15, 2013 at 11:33 pm
We have a 400 GB database, and inserts, updates and deletes are now taking too much time. Fetching data is also very slow. Please tell me what I should do to speed up data processing.
Regards,
Arjun
December 16, 2013 at 12:45 am
That is a very generic question for a 400GB database.
You really need to look at some SQL performance tuning articles: how to analyse wait statistics, PerfMon counters, SQL Server DMVs, etc.
If you find your slowest query / most executed query / most IO intensive query and post that, along with the table schema, you might get some more specific answers on how to improve performance.
Some questions you might want to answer:
What is the largest table in number of rows and pages?
What are some of the worst performing queries? How can they be improved?
Which wait stats are the highest?
Run through all the queries in Glenn Berry's diagnostic performance scripts, post some results, and people may be able to provide a better answer.
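As a starting point for the "most IO intensive query" part, here is a sketch of the kind of DMV query those diagnostic scripts run (assuming SQL Server 2008 or later; adjust the ORDER BY to rank by CPU or duration instead):

```sql
-- Top 10 cached query statements by total logical reads (most IO-intensive)
SELECT TOP (10)
    qs.total_logical_reads,
    qs.execution_count,
    qs.total_elapsed_time / 1000 AS total_elapsed_ms,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_logical_reads DESC;
```

Note this only covers plans still in cache, so run it during or shortly after your busy period.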
December 16, 2013 at 1:10 am
Please check index fragmentation; if the fragmentation percentage is more than 60%, you need to rebuild the indexes.
That should improve your performance.
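A sketch of how to check fragmentation for the current database (the 30%/1000-page filters below are commonly cited rules of thumb, not fixed requirements — adjust to your own thresholds):

```sql
-- Average fragmentation per index, ignoring tiny indexes where it rarely matters
SELECT OBJECT_NAME(ps.object_id) AS table_name,
       i.name AS index_name,
       ps.avg_fragmentation_in_percent,
       ps.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ps
JOIN sys.indexes AS i
  ON i.object_id = ps.object_id
 AND i.index_id  = ps.index_id
WHERE ps.avg_fragmentation_in_percent > 30
  AND ps.page_count > 1000
ORDER BY ps.avg_fragmentation_in_percent DESC;
```

The 'LIMITED' scan mode keeps this query relatively cheap on a 400 GB database; 'DETAILED' is far more expensive.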
December 16, 2013 at 4:41 am
You're going to have to go through all the processes of tuning the server, the database and the code. I'd suggest getting a copy of my book on query tuning. You're all the way back to the fundamentals. You need to gather metrics on the server in order to understand exactly what is running slow and where. Then you need to identify what is causing that slowness, if it's missing or incorrect indexes, out of date statistics, or just bad T-SQL.
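For the "gather metrics" step, one common sketch is a snapshot of the server's top wait types since the last restart (the exclusion list below is an illustrative subset of the usual benign waits, not exhaustive):

```sql
-- Top waits since service restart, excluding some common benign system waits
SELECT TOP (10)
    wait_type,
    wait_time_ms / 1000.0 AS wait_time_s,
    waiting_tasks_count
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN ('LAZYWRITER_SLEEP', 'SLEEP_TASK', 'CHECKPOINT_QUEUE',
                        'SQLTRACE_BUFFER_FLUSH', 'WAITFOR',
                        'REQUEST_FOR_DEADLOCK_SEARCH', 'XE_TIMER_EVENT',
                        'BROKER_TASK_STOP', 'BROKER_TO_FLUSH')
ORDER BY wait_time_ms DESC;
```

Whatever wait type dominates tells you where to focus: IO, CPU, locking, memory, and so on.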
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
December 16, 2013 at 4:48 am
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
December 16, 2013 at 4:49 am
Sagar Akhare (12/16/2013)
Please check index fragmentation, if the fragmentation % is more than 60% need to rebuild the indexes. It will improve your performance.
Maybe, but probably not, and if it does it would likely have been because the stats got updated as part of the rebuild. Rebuilding indexes is not a panacea for performance problems.
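If the benefit really does come from the statistics update, you can get that much more cheaply than a full rebuild. A sketch (the table name is illustrative, not from this thread):

```sql
-- Update statistics on one table with a full scan, no rebuild needed
UPDATE STATISTICS dbo.YourLargeTable WITH FULLSCAN;

-- Or refresh statistics database-wide using the default sampling
EXEC sp_updatestats;
```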
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
December 16, 2013 at 4:56 am
First follow the advice given by Grant and Gail.
But also consider archiving some data. Does the business really need all 400 GB of it online? Try to move older rows from the largest tables into archive tables (or an archive database). Besides proper indexing, up-to-date statistics, etc., smaller datasets are generally faster to query.
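A minimal sketch of that archiving step, assuming a hypothetical Orders table with an OrderDate column (names are illustrative; on a table this size you would normally delete in small batches inside a loop rather than in one statement):

```sql
-- Copy rows older than two years into the archive table, then remove them
BEGIN TRANSACTION;

INSERT INTO dbo.Orders_Archive
SELECT *
FROM dbo.Orders
WHERE OrderDate < DATEADD(YEAR, -2, GETDATE());

DELETE FROM dbo.Orders
WHERE OrderDate < DATEADD(YEAR, -2, GETDATE());

COMMIT TRANSACTION;
```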