September 2, 2010 at 2:43 am
Hi all,
Our main product is a system that constantly receives input from different sources (wireless and radio communication, GSM signals, etc.), and for the client branch that uses the system the most, speed is becoming an issue (mostly in report generation).
We have optimized as far as we can, but the client is pressuring us to find another solution. They have suggested we archive data more than a month old to another database on a monthly basis, but they still want to be able to access it from the same system at any time. To summarize, they want their main database (which currently hovers around 22GB) to be as small as possible (preferably less than 10GB, as they complain the backups take too long and require too much space).
I've had a look at commands such as DBCC SHRINKFILE, and I have been playing with the DBCC SHRINKDATABASE command (on a database in our test environment) and have managed to reclaim about 5GB with it. Whether or not this will be enough is up to the client when we present our findings at the next meeting.
Are there any other tips you guys can think of that I can use to optimize and shrink the database as much as possible?
Any tips welcome!
September 2, 2010 at 2:54 am
Firstly, shrinking the database is possibly the worst thing you can do for performance.
Have a look at Gail's article for more information:
http://sqlinthewild.co.za/index.php/2007/09/08/shrinking-databases/
If the database is constantly growing, shrinking is also futile.
22GB really isn't big at all. If they're worried about the time backups take, it's worth investing in a third-party backup compression tool (unless you're on SQL Server 2008 Enterprise Edition, or 2008 R2, where compression is available with native backups).
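If native compression is available, it's a single option on the backup command. A minimal sketch - the database name and path below are placeholders:

```sql
-- Hypothetical database name and backup path.
-- COMPRESSION requires 2008 Enterprise, or 2008 R2 Standard and above.
BACKUP DATABASE YourDatabase
TO DISK = N'D:\Backups\YourDatabase.bak'
WITH COMPRESSION,  -- compressed backups are typically a fraction of the data size
     CHECKSUM,     -- verify page checksums while backing up
     INIT;         -- overwrite rather than append to the backup file
```

Third-party tools do broadly the same thing on editions without native compression.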
You need something specific to work with. What reports are taking the most time? Are they truly optimised as much as they can be (I'll bet there's a lot more that can be done to them).
There's no magic switch to fix performance problems, you just have to identify the bottlenecks and fix them.
Edit: Make sure you rebuild your indexes in a maintenance window if you've performed a SHRINKDATABASE - there's a good index optimisation script here
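For reference, a minimal version of such a rebuild (database and table names are hypothetical; a proper maintenance script iterates over fragmented indexes only):

```sql
USE YourDatabase;
GO
-- Rebuild every index on a table to undo the fragmentation
-- that SHRINKDATABASE causes:
ALTER INDEX ALL ON dbo.YourLargeTable
REBUILD WITH (FILLFACTOR = 90);  -- leave 10% free per page for new rows
```

Bear in mind that rebuilds need free space to work in, so they will grow the files you just shrank - another reason shrinking is usually futile.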
September 2, 2010 at 3:33 am
Hi,
Thanks for the reply. When I read about DBCC ShrinkDatabase, I saw an addendum at the bottom of the page that it trashes your indexes, but I was already halfway through the process, and I didn't feel like canceling it and screwing up my database. It's just a test DB anyway.
I only started researching solutions for this request today, so I'm trying everything.
I agree with your suggestion of using a backup tool. Since I've heard such rave reviews of Red Gate's products, I am downloading a trial of their backup solution now. I doubt the client will have an issue with buying their own licensed version. The guy who complains the most about everything doesn't necessarily have the final say 🙂
I had a look at the script you suggested, I'm testing it now to see how it goes, before I recommend anything to the client.
Thanks a lot for the replies!!!!!!
September 2, 2010 at 3:40 am
How long do these backups take?
22GB is not a large database by any measure and a backup of it should not take very long at all.
Are you backing up locally or to a network?
The difference in backup time between 22GB and 10GB will be marginal.
Also before you worry about the size of the database I would look at optimizing the report query.
Run a server side trace on the worst performing reports and capture what is happening.
Then look at these queries to see where the bottlenecks are - you could simply be missing an important index. I have seen systems speed up 1000% after correct indexing.
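As a rough first pass, the missing-index DMVs will surface candidates. Treat the output as hints, not gospel - this is a generic sketch, not tailored to your schema:

```sql
-- Indexes the optimizer wished it had, roughly ranked by potential impact.
SELECT TOP (10)
       mid.statement            AS table_name,
       mid.equality_columns,
       mid.inequality_columns,
       mid.included_columns,
       migs.user_seeks,
       migs.avg_user_impact     -- estimated % query cost reduction
FROM sys.dm_db_missing_index_details     AS mid
JOIN sys.dm_db_missing_index_groups      AS mig
  ON mig.index_handle = mid.index_handle
JOIN sys.dm_db_missing_index_group_stats AS migs
  ON migs.group_handle = mig.index_group_handle
ORDER BY migs.user_seeks * migs.avg_user_impact DESC;
```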
For reporting you could look at warehousing the data, but at 22GB I would not worry about it. SQL Server can handle that size easily.
September 2, 2010 at 3:46 am
22GB is tiny. I have test databases bigger than that and they back up in minutes. Check your IO performance as that's about the only thing that affects backup speed. Make sure you're backing up to a separate physical drive, not across the network.
Shrinking won't help backup size or speed. Backups don't back up free space. It will also not help performance (quite the opposite)
If you're having performance problems, you need to identify the worst performing queries and try to optimise those. There's no 'run SQL faster' option.
Try this series - http://www.simple-talk.com/sql/performance/finding-the-causes-of-poor-performance-in-sql-server,-part-1/ or consider getting someone in to look at optimising for you.
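One quick way to find those worst performers, assuming the instance hasn't restarted recently, is the plan-cache stats DMV (a sketch only - a server-side trace gives fuller detail):

```sql
-- Top cached statements by total CPU time since they were cached.
SELECT TOP (10)
       qs.execution_count,
       qs.total_worker_time  / qs.execution_count AS avg_cpu_microsec,
       qs.total_elapsed_time / qs.execution_count AS avg_duration_microsec,
       SUBSTRING(st.text,
                 (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                     WHEN -1 THEN DATALENGTH(st.text)
                     ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;
```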
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
September 2, 2010 at 4:09 am
Hi Horatio,
Your post raises quite a few issues. Here is my take on them...
Shrinking the database - DON'T
When new data is inserted the database will auto-grow again and your data files will be physically fragmented on disk, introducing a further performance problem (if they aren't already).
Given you're using the shrink commands, you probably haven't pre-sized the database to account for growth. What is autogrowth set to? If it is the default (1MB), then the database will have expanded in 1MB chunks across the drive and the file will be massively fragmented.
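A sketch of fixing both problems - pre-size the files and set a sensible fixed growth increment. File and database names (and the sizes) are hypothetical; check yours in sys.database_files first:

```sql
-- Pre-size the data file so routine inserts never trigger auto-grow,
-- and set a fixed-size increment (never a percentage) as the safety net.
ALTER DATABASE YourDatabase
MODIFY FILE (NAME = N'YourDatabase_Data', SIZE = 30GB, FILEGROWTH = 1GB);

ALTER DATABASE YourDatabase
MODIFY FILE (NAME = N'YourDatabase_Log', SIZE = 4GB, FILEGROWTH = 512MB);
```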
Backup time
Consider using filegroup backups.
Move your nonclustered indexes to a filegroup for indexes and periodically script the index definitions.
This could reduce the backup time (and db/backup size) significantly.
If you put the indexes filegroup on a different drive you may also see a small performance benefit too.
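Roughly, the setup looks like this. Filegroup, file, index, and table names are all hypothetical, and note that restoring individual read/write filegroups needs the full recovery model plus log backups:

```sql
-- 1. Add a filegroup and file for nonclustered indexes (hypothetical path):
ALTER DATABASE YourDatabase ADD FILEGROUP INDEXES;
ALTER DATABASE YourDatabase
ADD FILE (NAME = N'YourDatabase_Indexes',
          FILENAME = N'E:\Data\YourDatabase_Indexes.ndf',
          SIZE = 5GB, FILEGROWTH = 512MB)
TO FILEGROUP INDEXES;

-- 2. Move an index there by rebuilding it in place:
CREATE NONCLUSTERED INDEX IX_Signals_ReceivedAt
ON dbo.Signals (ReceivedAt)
WITH (DROP_EXISTING = ON)
ON INDEXES;

-- 3. Back up only the filegroup(s) holding the real data:
BACKUP DATABASE YourDatabase
FILEGROUP = N'PRIMARY'
TO DISK = N'D:\Backups\YourDatabase_Primary.bak'
WITH INIT;
```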
Database size
I second HowardW's suggestion re SQL 2008 Enterprise for data and backup compression.
It is an expensive licence though for a small company.
Speed of generating reports
This is a can of worms, at least without seeing the system.
Query optimization is probably the most important thing to do.
Otherwise:
Do you maintain indexes and statistics? (indexes may well be fragmented)
Do you build the reports one at a time or in parallel?
Is the system under stress as you do so?
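On the first question, a quick fragmentation check - the thresholds here are just common starting points, not hard rules:

```sql
-- Fragmented indexes of non-trivial size in the current database.
SELECT OBJECT_NAME(ips.[object_id])      AS table_name,
       i.name                            AS index_name,
       ips.avg_fragmentation_in_percent,
       ips.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.[object_id] = ips.[object_id]
 AND i.index_id    = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30  -- rebuild candidates
  AND ips.page_count > 1000                  -- ignore tiny indexes
ORDER BY ips.avg_fragmentation_in_percent DESC;
```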
Apologies if all this is obvious. I've seen all these for real though.
rich
September 2, 2010 at 4:49 am
Thanks all.
I'll go into the code a bit more and have a look. I have optimized some of the procs for the reports before, but not all of them, as there are quite a lot.
We are also replicating the database to another server, and we have recommended that our client use that one for reporting, but since the reporting interface is a Windows app, configuration is a bit difficult. We will push that notion again in the future.
I also just found out the production database is 55GB. The DB I was using earlier is a very old one.
Using SQL Backup Pro from Redgate, I tested the backup speed on my dev machine, 19GB DB to 2GB backup in 10 minutes, which is not too bad I think.
Thanks loads for all the tips and recommendations.
September 2, 2010 at 4:53 am
Hi there, me again,
Just another question, the database server is currently running on Windows Server 2003 Enterprise R2 (x86), with SQL 2008 Standard (x86) with SP1.
Would upgrading the server to a 64-bit OS and SQL 2008 x64 (Standard; the client doesn't want to pay for Enterprise) help performance at all? The primary reason we are running on 32-bit is that one of the apps that accesses the database is 32-bit only, and the client doesn't like to separate the app and DB servers (don't ask me why). But the 32-bit app (actually, it's a DLL) could easily be re-written using SQL 2008 CLR - it is not that big a task, and it would help future-proof our application a lot.
Any thoughts would be helpful!!!
September 2, 2010 at 4:57 am
Maybe, but not necessarily. Doubtful that it will be significant if it does.
The 32-bit app will run fine on an x64 OS under WOW64 (Windows-on-Windows). There's no need to rewrite anything in CLR.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
September 2, 2010 at 4:59 am
One more thing: the application being on the same server as SQL means you'll probably have to pay attention to memory settings to make sure they don't fight over memory.
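Capping SQL Server's memory is one sp_configure setting. The 6GB figure below is purely illustrative - size it to what the app and OS actually need:

```sql
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Leave headroom for the OS and the co-hosted application
-- (value is illustrative, in MB):
EXEC sp_configure 'max server memory (MB)', 6144;
RECONFIGURE;
```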
Replicating to a reporting server is a good idea if you can afford the hardware/licence.
Remember that you can apply a different set of indexes on the replicated db than on the source db. That way the application isn't bogged down by having to maintain reporting indexes.
r
September 2, 2010 at 5:04 am
The app on the server does not use that much resources, the tasks it has to do are relatively mundane.
I also suspect that server was not really set up correctly (the server was set up before I started working at the company).
I'll have a look and discuss it with the sysadmins a bit.
September 2, 2010 at 6:44 am
Horatio (9/2/2010)
Hi there, me again,
Just another question, the database server is currently running on Windows Server 2003 Enterprise R2 (x86), with SQL 2008 Standard (x86) with SP1.
Would upgrading the server to a 64 bit OS and SQL 2008 x64 (Standard, the client doesn't want to pay for Enterprise) help performance at all? The primary reason we are running on 32 bit is because one of the apps that access the database is 32 bit only and the client doesn't like to separate the app and d.b. servers (don't ask me why). But the 32 bit app (actually, it's a DLL) can easily be re-written using SQL 2008 CLR, since it is not that big of a task, and would assist in future proofing our application a lot.
Any thoughts would be helpful!!!
Whoa! The app and the database are running on the same box? That's a problem. If you want to optimize SQL Server, it should not have to face contention from other applications. You can make the db scream, but if the app is using all the CPU power, you've got nowhere to go. From the sounds of things, I suspect the db and the code have a number of tuning opportunities, but the first thing I'd do is separate the app and database, absolutely.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning