December 19, 2005 at 5:21 am
Hi,
I am facing a database performance issue; it is very slow. The size of my data is about 1 GB, but data retrieval is very slow and users frequently get "timeout expired" errors. We have 50-60 users accessing the data at a time.
Please help me with what areas I should focus on in order to improve performance and execution time.
I shall be very thankful if you can provide a solution.
With Regards,
Noman
December 19, 2005 at 5:47 am
I need a lot more information to give anything other than general advice.
Is your data access via stored procedures?
Are there indexes where they should be on the fields?
What is the spec of your SQL Server?
Is your SQL server a dedicated server?
Do you use loads of foreign key constraints?
Do you use loads of triggers?
Is your database recovery mode set to simple?
Do you have autoshrink and autoclose set to true?
Have you tried running SQL Profiler to see what queries are being run against the database?
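Several of the settings above can be checked in one go. As a sketch (the database name is a placeholder, and `DATABASEPROPERTYEX` is available from SQL Server 2000 onwards):

```sql
-- Check recovery mode, autoshrink and autoclose in one query
-- (replace 'MyDatabase' with your actual database name)
SELECT
    DATABASEPROPERTYEX('MyDatabase', 'Recovery')     AS RecoveryModel,  -- SIMPLE / FULL / BULK_LOGGED
    DATABASEPROPERTYEX('MyDatabase', 'IsAutoShrink') AS IsAutoShrink,   -- 1 = on
    DATABASEPROPERTYEX('MyDatabase', 'IsAutoClose')  AS IsAutoClose;    -- 1 = on
```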
December 19, 2005 at 6:18 am
Thanks for the consideration.
Yes, most of our reports use stored procedures, but not all.
We have indexes on some tables, but not on all.
The specs of our dedicated server are:
RAID level - 5 (37 GB/drive)
SCSI - 3
3.6 GHz processor
2 GB RAM
and we have the 'SIMPLE' recovery mode.
------------------
I want to ask about a couple of points:
"Do you use loads of triggers?"
We have triggers, but what do you mean by "loads of triggers"?
"Do you have autoshrink and autoclose set to true?"
I am also confused about the effect of checking these boxes... what is their purpose? We don't want to take any risks.
"Have you tried running SQL Profiler to see what queries are being run against the database?"
Yes, mostly SELECT and UPDATE queries are running on the DB.
Please guide me as soon as possible.
Really appreciated,
Noman
December 19, 2005 at 8:25 am
I don't want to tackle everything (don't have the time), but for starters I would make sure that autoclose and autoshrink are not ticked in the DB properties. Autoclose can cause problems because if the DB is inactive for a while it closes, and the next query against it then takes ages because the DB has to be opened again. Auto-shrink is also a false economy: if a DB grows to a particular size then it should probably stay that large, and auto-shrink introduces fragmentation into the mix.
My next step would be to use Profiler to see which of your queries are taking the longest. I'd probably start by filtering out queries with a duration under 5000 (5 seconds), and if that still returns too many results then double it so that you have a smaller set of things to look at. Once you've identified the causes of the long-running queries, there are a couple of hundred other things you can look at.
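Turning those two options off can also be done from a query window rather than the properties dialog; a sketch, with the database name as a placeholder:

```sql
-- Disable auto-close and auto-shrink
-- (replace 'MyDatabase' with your actual database name)
ALTER DATABASE MyDatabase SET AUTO_CLOSE OFF;
ALTER DATABASE MyDatabase SET AUTO_SHRINK OFF;
```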
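If you save the Profiler trace to a file, you can query it afterwards and sort by duration instead of eyeballing the GUI. A sketch (the file path is a placeholder; note that Duration is in milliseconds in SQL Server 2000 traces but in microseconds in SQL Server 2005 server-side traces):

```sql
-- Load a saved Profiler trace file and list the slowest statements first
SELECT TextData, Duration, CPU, Reads, Writes
FROM ::fn_trace_gettable('C:\Traces\slow_queries.trc', DEFAULT)
WHERE Duration >= 5000          -- only statements taking 5 seconds or more (SQL 2000 units)
ORDER BY Duration DESC;
```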
Performance tuning isn't a one time thing, it's a constantly evolving task and what works well one day may not work properly in the future, it all depends on the data you have in the tables.
Let us know how you get on or feel free to ask if you need any additional pointers,
Mike
December 19, 2005 at 10:38 am
"Have you tried running SQL Profiler to see what queries are being run against the database?
yes mostly the Select, Update queries are running on DB."
Are you doing updates at the same time as the selects? You might want to see if forcing the updates to use row locks will help, because otherwise the selects won't be able to access various pages or even entire tables while an update is happening. That could easily cause your users to be timed out if updates are locking at the table level.
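The row-lock suggestion above would look something like this (table and column names are made up for illustration):

```sql
-- Hint the engine to take row locks rather than escalating to page/table locks,
-- so concurrent SELECTs are blocked on as few rows as possible
UPDATE Orders WITH (ROWLOCK)
SET Status = 'Shipped'
WHERE OrderID = 12345;
```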
I've been working with indexes on my formerly slow database and it has made a huge difference. Put the selects and updates from your stored procedures into query analyzer and look at the estimated plan. Anything that comes up as a table scan is VERY BAD for performance. Make an index on whatever column is causing the table scan so that it becomes an index scan. NULLs don't index well so if your where clauses have any 'where x is not null' or 'where x is null' then see if you can change the NULL data to an unused value instead, like -999999 and index that.
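The index and NULL-replacement advice above could be sketched like this (table, column, and the -999999 sentinel are illustrative; only replace NULLs if that value can never occur as real data):

```sql
-- Index the column that is causing the table scan
CREATE INDEX IX_Orders_CustomerID ON Orders (CustomerID);

-- If a WHERE clause filters on NULL, consider replacing NULLs with an
-- unused sentinel value so the index can be used effectively
UPDATE Orders SET CustomerID = -999999 WHERE CustomerID IS NULL;
```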
December 19, 2005 at 6:25 pm
Without much detail to go on, the first two things that come to mind are indexes and statistics. Run a profiler trace for 15 minutes or so and save it to a trace file, then feed it to the Index Tuning wizard. See what recommendations come from that.
Also look at whether your statistics are current; sp_updatestats is a quick way to update statistics on all tables in a database. If your stats are not accurate, indexes won't always be effective.
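The statistics refresh mentioned above is a one-liner; the database and table names below are placeholders:

```sql
-- Refresh statistics on every user table in the current database
USE MyDatabase;
EXEC sp_updatestats;

-- Or target a single table with a full scan for more accurate statistics
UPDATE STATISTICS Orders WITH FULLSCAN;
```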