October 23, 2008 at 2:47 am
Hi All,
On our production server we have a 2 GB SQL Server database that is running slowly; it is an online database serving a web application. I want to know which areas I can look at to improve the database's performance without changing either of the following:
1. Database design
2. SQL queries
In other words, what can I do with DBCC and other commands to improve performance?
Cheers!
Sandy.
--
October 23, 2008 at 2:55 am
You really need to run Profiler to capture the queries being run, and then determine whether there are areas that can be improved, such as adding indexes or tuning the queries. You can also check whether the statistics need updating. Perfmon will give you an indication of whether the database is CPU- or memory-bound, and you can collect other counters to show how SQL Server is performing, but ultimately you will need to run Profiler.
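As a quick first pass before you set up a full trace (and only as a complement to Profiler, not a replacement), something like the following works on SQL 2005 and later. It lists the cached statements that have used the most CPU since their plans were cached; the TOP (10) cut-off is just an example.
-- Top 10 cached statements by total CPU; a rough first look only
SELECT TOP (10)
       qs.total_worker_time / 1000 AS total_cpu_ms,
       qs.execution_count,
       qs.total_elapsed_time / 1000 AS total_elapsed_ms,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                       WHEN -1 THEN DATALENGTH(st.text)
                       ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;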
October 23, 2008 at 3:04 am
One more thing here...
If I want to do some DBA activity like DBCC while my database is online, will it cause an issue for my end users?
Can you please suggest some DBCC commands for database performance?
Cheers!
Sandy.
--
October 23, 2008 at 3:05 am
Supposing your hardware is well sized (enough RAM, processor power, etc.), correctly configured and working, the first thing you can do is check your indexes!
Use Profiler to check whether the tables have the correct indexes.
Check index fragmentation and, if needed, reorganize or rebuild the indexes (or build a maintenance plan to do this on a regular schedule).
Check index usage: maybe you have indexes that are never used, or you are missing a useful one.
EDIT: sorry, I was writing while DNA was answering!
See The Jenga blog
October 23, 2008 at 3:14 am
Can you also list the commands for each of the points you explained so nicely in your last post, so I can use them directly?
Cheers!
Sandy.
--
October 23, 2008 at 7:13 am
Run Profiler with the tuning template to capture the queries being run. Then either pick out the worst-performing queries and get an execution plan manually in Management Studio, or load the trace file into the Database Engine Tuning Advisor. This will help identify missing or incorrect indexes. To check whether stats should be updated, you can see when they were last refreshed using the STATS_DATE function (look in Books Online).
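For example, something along these lines lists when each statistic in the current database was last updated (this assumes SQL 2005 or later; on SQL 2000 you would query sysindexes instead of sys.stats):
-- When was each statistic last updated? (current database, user tables only)
SELECT OBJECT_NAME(s.object_id) AS table_name,
       s.name                   AS stats_name,
       STATS_DATE(s.object_id, s.stats_id) AS last_updated
FROM sys.stats AS s
JOIN sys.objects AS o ON o.object_id = s.object_id
WHERE o.type = 'U'          -- user tables only
ORDER BY last_updated;      -- oldest first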
October 23, 2008 at 8:09 am
To check index status you can use:
DBCC SHOWCONTIG ('tabname', 'indexname')
Or, if you have SQL 2005, use sys.dm_db_index_physical_stats to show the status of your indexes.
For Example:
DECLARE @DB_NUMERO AS INT
-- Select id for current database
SELECT @DB_NUMERO=DB_ID()
-- Select the indexes in the current database with fragmentation > 30 percent
SELECT a.object_id AS id_object, c.name AS object_name, b.name AS index_name, d.name AS schema_name,
a.avg_fragmentation_in_percent, a.fragment_count, a.avg_fragment_size_in_pages, a.page_count
FROM sys.dm_db_index_physical_stats(@DB_NUMERO, NULL, NULL, NULL, NULL) AS a
INNER JOIN sys.objects AS c ON a.object_id = c.object_id
INNER JOIN sys.indexes AS b ON a.object_id = b.object_id AND a.index_id = b.index_id
INNER JOIN sys.schemas AS d ON c.schema_id = d.schema_id
WHERE a.avg_fragmentation_in_percent > 30
AND a.index_type_desc <> 'HEAP'
ORDER BY a.avg_fragmentation_in_percent DESC
Here I filter on fragmentation > 30 percent to extract only the indexes that seem to need a rebuild.
This works fine for my environment.
There are also other DMVs for checking index usage (sys.dm_db_index_usage_stats) and operational status.
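As a rough sketch (SQL 2005 and later), a query like this shows how often each index has been read versus written since the instance last started, which helps you spot indexes that are maintained but never used:
-- Index read vs. write activity since the last SQL Server restart (current database)
SELECT OBJECT_NAME(i.object_id) AS table_name,
       i.name                   AS index_name,
       u.user_seeks + u.user_scans + u.user_lookups AS reads,
       u.user_updates                                AS writes
FROM sys.indexes AS i
LEFT JOIN sys.dm_db_index_usage_stats AS u
       ON u.object_id = i.object_id
      AND u.index_id  = i.index_id
      AND u.database_id = DB_ID()
WHERE OBJECTPROPERTY(i.object_id, 'IsUserTable') = 1
  AND i.index_id > 0          -- skip heaps
ORDER BY reads;               -- least-read indexes first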
On indexes, take a look at this:
http://technet.microsoft.com/en-us/magazine/cc162476.aspx
This might not solve all your problems, but a well indexed table is always good!
See The Jenga blog
October 23, 2008 at 10:05 am
Sandy (:P),
I have a nice script for you for DB performance; it's given below.
But I am facing one problem: every time I run it, my database log file grows. Why is that?
"I want to know from the SQL Server Central DBA gurus why the log file grows after I run this script against my DB, and whether my script is correct for DB performance as a DBA activity."
It is meant to cover these points:
1. Check Database Integrity
2. Reorganize Index
3. Rebuild Index
4. Update Statistics
5. Clean Up History
6. Back Up Database (Full)
7. Maintenance Cleanup Task
My script is given below...
-- NOTE: @DBName, @FileName1 and @FileName2 are never declared in this script;
-- the declarations below use placeholder values only
DECLARE @DBName sysname, @FileName1 sysname, @FileName2 sysname
SELECT @DBName = N'MyDatabase', @FileName1 = N'MyDatabase_Data', @FileName2 = N'MyDatabase_Log'
-- Updating the stats of the database
EXEC sp_updatestats
-- Shrinking the database
DBCC SHRINKDATABASE (@DBName, TRUNCATEONLY) WITH NO_INFOMSGS;
-- Shrinking the database files
DBCC SHRINKFILE (@FileName1, TRUNCATEONLY) WITH NO_INFOMSGS;
DBCC SHRINKFILE (@FileName2, TRUNCATEONLY) WITH NO_INFOMSGS;
-- Added for maintenance plan
DECLARE @db sysname, @sql nvarchar(4000)
DECLARE CRDB CURSOR LOCAL FAST_FORWARD FOR
SELECT name FROM master.dbo.sysdatabases
WHERE name IN (@DBName)
OPEN CRDB
FETCH CRDB INTO @db
WHILE @@FETCH_STATUS=0
BEGIN
PRINT '======= '+QUOTENAME(@db)+' "'+CAST(GETDATE() as varchar(100))+'" ======='
SET @sql= N'SET XACT_ABORT OFF'
SET @sql=@sql+N' DBCC DBREINDEX (''?'') WITH NO_INFOMSGS'
SET @sql=@sql+N' IF EXISTS (SELECT * FROM sysindexes WHERE indid=1 AND id=OBJECT_ID(''?''))'
SET @sql=@sql+N' DBCC INDEXDEFRAG (0, ''?'', 1) WITH NO_INFOMSGS'
SET @sql=@sql+N' UPDATE STATISTICS ? EXEC sp_recompile ''?'''
SET @sql= N' USE ['+@db+'] EXEC sp_msforeachtable N''' +
REPLACE(@sql,'''','''''')+N''''
EXEC(@sql)
DBCC UPDATEUSAGE (@db) WITH NO_INFOMSGS
--DBCC CHECKDB (@db) WITH NO_INFOMSGS
FETCH CRDB INTO @db
END
CLOSE CRDB
DEALLOCATE CRDB
-- Cleaning the procedure cache of the database
DBCC PROCCACHE WITH NO_INFOMSGS;
DBCC FREEPROCCACHE WITH NO_INFOMSGS
DBCC DROPCLEANBUFFERS WITH NO_INFOMSGS
DBCC PROCCACHE WITH NO_INFOMSGS;
--DBCC CHECKALLOC WITH NO_INFOMSGS;
--DBCC CHECKDB WITH NO_INFOMSGS;
My problem here is that after running this, my database performance improves, but my log file keeps growing, which in turn degrades performance.
What I want is to run this script without growing the log.
Please suggest any additional commands, DBA gurus.
And if Gail can comment on it, I will be even happier... :)
Luv's
Milu 🙂
October 23, 2008 at 12:48 pm
Do not run DBCC SHRINKDATABASE on a PRODUCTION database :w00t:, especially when it is ONLINE.
Do an UPDATE STATISTICS ... WITH FULLSCAN instead.
Clearing the procedure cache forces everything to recompile, thereby slowing things down.
One thing I would say is to check the Buffer Cache Hit Ratio (ideally > 99%) for a production database.
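If you prefer T-SQL to Perfmon, a rough way to read that counter is the sketch below (it is a point-in-time value, so trend it over time rather than reading it once):
-- Buffer cache hit ratio from the SQL Server performance counters
SELECT CAST(100.0 * a.cntr_value / NULLIF(b.cntr_value, 0) AS decimal(5, 2)) AS buffer_cache_hit_ratio_pct
FROM sys.dm_os_performance_counters AS a
JOIN sys.dm_os_performance_counters AS b
  ON a.object_name = b.object_name
WHERE a.counter_name = 'Buffer cache hit ratio'
  AND b.counter_name = 'Buffer cache hit ratio base'
  AND a.object_name LIKE '%Buffer Manager%';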
Also, cursors are one big thing to look at; check the source code for cursor usage.
Then run DBCC SHOWCONTIG and check the logical scan fragmentation on key tables: if it is greater than 10% (and less than 30%), reorganize the indexes; if it is greater than 30%, rebuild them (see the sketch below).
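On SQL 2005 you can apply that rule with ALTER INDEX; this is only a minimal sketch, and the 10%/30% thresholds and the table name dbo.MyKeyTable are just examples (single-partition table assumed):
-- Reorganize between 10% and 30% fragmentation, rebuild above 30% (example table)
DECLARE @frag float
SELECT @frag = avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.MyKeyTable'), 1, NULL, 'LIMITED')

IF @frag > 30
    ALTER INDEX ALL ON dbo.MyKeyTable REBUILD
ELSE IF @frag > 10
    ALTER INDEX ALL ON dbo.MyKeyTable REORGANIZE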
Hope this helps..
Thanks!!
The_SQL_DBA
MCTS
"Quality is never an accident; it is always the result of high intention, sincere effort, intelligent direction and skillful execution; it represents the wise choice of many alternatives."
October 23, 2008 at 12:53 pm
Milu (10/23/2008)
Sandy (:P), my problem here is that after running this, my database performance improves, but my log file keeps growing, which in turn degrades performance.
What I want is to run this script without growing the log.
Please do not run so many DBCC commands; they are resource intensive, and that many DBCCs on a PRODUCTION database :w00t: is a NO MOJO. If you have to run them on a scheduled basis to check performance, change the RECOVERY MODEL to BULK_LOGGED first, and once you are done you can always switch back to FULL recovery (see the sketch below).
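A minimal sketch of that switch, assuming the database is simply called MyDB and that you understand the effect on point-in-time recovery while it sits in BULK_LOGGED (the backup path is a placeholder):
-- Switch to BULK_LOGGED for the maintenance window
ALTER DATABASE MyDB SET RECOVERY BULK_LOGGED;

-- ... run the index maintenance here ...

-- Switch back to FULL, then take a log backup so point-in-time recovery is available again
ALTER DATABASE MyDB SET RECOVERY FULL;
BACKUP LOG MyDB TO DISK = N'C:\Backups\MyDB_log.trn';  -- placeholder path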
Waiting for advice from Gail..:)
Thanks!!!
The_SQL_DBA
MCTS
"Quality is never an accident; it is always the result of high intention, sincere effort, intelligent direction and skillful execution; it represents the wise choice of many alternatives."
October 23, 2008 at 9:09 pm
The_SQL_DBA (10/23/2008)
Do not run DBCC SHRINKDATABASE on a PRODUCTION database :w00t:, especially when it is ONLINE.
Heh... wanna tell us why not? 🙂
--Jeff Moden
Change is inevitable... Change for the better is not.
October 23, 2008 at 11:37 pm
Thanks... DNA, Meely (:P), Igor Brusetti & The_SQL_DBA...
Jeff, can you please help me with the same?
Cheers!
Sandy.
--
October 24, 2008 at 7:43 am
Please don't think me too rude, but you need to get a DBA. You simply can't run a database server - including MS SQL Server - for non-trivial applications without someone who knows how it works and how to tune it.
You need to do the basic work of profiling the server engine and the hardware utilization. These are minimum skills for any DBA, and they can be learned from books and websites, so it doesn't mean you have to pay someone big bucks.
You seem very focused on DBCC. Why? Your problem is performance (although you haven't quantified or qualified that in any way).
You say "without changing the schema" - does that include not adding indices? I understand about running 3rd party code - although I have more than once gone to a vendor with proof that they could improve a query or a schema and gotten them to do so.
However, sometimes you can add an index, which arguably does not change the schema (arguably, it does).
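For what it's worth, adding an index is a one-liner; the table and column names below are purely hypothetical:
-- Hypothetical example: support lookups by OrderDate without touching the table definition
CREATE NONCLUSTERED INDEX IX_Orders_OrderDate
    ON dbo.Orders (OrderDate)
    INCLUDE (CustomerID);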
You really need to crack the books and learn this stuff - it's learnable, and a badly performing server is just the lab you need - or pay someone who can. Or else realize you don't have the resources to support this application.
roger reid
Roger L Reid
October 24, 2008 at 2:21 pm
Jeff...correct me if I am wrong 🙂
After the shrink, as users add rows etc. to the database, the file has to grow again.
Growing a database file is an expensive operation: it takes time and it hurts performance (lots of resource usage). Also, some modifications will be blocked until the grow operation has finished.
(Applies to shrinking of both data and log files.)
Heavy shrinking and growing of database files will fragment your file system, which will further hurt performance.
(Applies to shrinking of both data and log files.)
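One practical mitigation (a sketch only; the database and file names are placeholders) is to check the current autogrowth settings and, if they are percentage-based or tiny, set a sensible fixed increment so growth events are rare and predictable:
-- Check current size and autogrowth settings for every file of every database
SELECT DB_NAME(database_id) AS database_name,
       name                  AS logical_file_name,
       size * 8 / 1024       AS size_mb,
       CASE WHEN is_percent_growth = 1
            THEN CAST(growth AS varchar(10)) + ' %'
            ELSE CAST(growth * 8 / 1024 AS varchar(10)) + ' MB'
       END                   AS growth_setting
FROM sys.master_files;

-- Set a fixed growth increment on a data file (names are placeholders)
ALTER DATABASE MyDB
    MODIFY FILE (NAME = N'MyDB_Data', FILEGROWTH = 256MB);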
Sandy, with just 2 GB of data, only you can determine whether you want to shrink or not... :P
Thanks!!
The_SQL_DBA
MCTS
"Quality is never an accident; it is always the result of high intention, sincere effort, intelligent direction and skillful execution; it represents the wise choice of many alternatives."
October 24, 2008 at 5:27 pm
The_SQL_DBA (10/24/2008)
Jeff... correct me if I am wrong 🙂 After the shrink, as users add rows etc. to the database, the file has to grow again.
Growing a database file is an expensive operation: it takes time and it hurts performance (lots of resource usage). Also, some modifications will be blocked until the grow operation has finished.
(Applies to shrinking of both data and log files.)
Heavy shrinking and growing of database files will fragment your file system, which will further hurt performance.
(Applies to shrinking of both data and log files.)
Sandy, with just 2 GB of data, only you can determine whether you want to shrink or not... :P
Thanks!!
Heh... that's better. Even if ya have to say it in a hundred posts, when you tell people to do something a certain way, it's very beneficial to them and a dozen other readers to also say "why".
By the way, your explanation is pretty much spot on except for the part about the 2GB shrinks... same rules apply no matter how big or small the DB is.
--Jeff Moden
Change is inevitable... Change for the better is not.