February 22, 2010 at 7:09 pm
The version of SQL Server is 2008.
I just restored a backup of a database from the development server to the production server. The database performed very well on the development server; all the seeks were fast there. After I restored the same database onto the production server, data seeks became very slow, even though the production server has much better hardware than the development box. When I try to read data from the application in production, even from a single table, it is very slow. I have tried everything I can think of on the production box, such as rebuilding indexes, but nothing is helping. I am trying to figure out how to troubleshoot this performance issue.
Please let me know how I can find out where the problem is. Thanks.
February 22, 2010 at 7:46 pm
Try SQL Server Profiler, and see what the apps are doing.
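If a full Profiler trace is more than you need right away, a rough alternative (the cached query stats DMV, not Profiler itself) should surface the most expensive statements since the plan cache was last cleared; a minimal sketch:
-- Top 10 statements by total elapsed time (times are in microseconds)
SELECT TOP 10
       qs.total_elapsed_time,
       qs.execution_count,
       qs.total_logical_reads,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                       WHEN -1 THEN DATALENGTH(st.text)
                       ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_elapsed_time DESC;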
SQL Server Database Administrator
February 23, 2010 at 8:27 am
The application we have is MS Dynamics Navision ERP, and its data seeks are slow in production. Like I said, it's fast in the test environment. We ran a trace for a while; the duration of command execution is more than 300 ms in prod.
One thing I don't understand is that our production environment is far better than the test environment, and the database has the same structures and the same indexes in both prod and test. Now I am trying to figure out how to dump all the configuration/switch information on the servers, as they should be close to identical. Please let me know how I can get the configuration/switch information from both environments so that I can compare them.
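For comparing the instance-level settings, I am assuming something like the following, run on each server and then diffed, would list them (sys.configurations is the catalog view I have in mind):
-- Instance-level configuration options; run on both prod and test, then compare the output
SELECT name, value, value_in_use
FROM sys.configurations
ORDER BY name;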
Thanks.
February 23, 2010 at 11:07 am
Here is the configuration of our Production server:
Windows Server 2008 R2 Enterprise
4 processors
Installed memory(RAM): 128GB
64 bit OS
Below is the memory info, which I got from the SQL Server properties:
Memory:
Server memory options:
Use AWE to allocate memory: Not checked
Minimum server memory(in MB): 102400
Maximum server memory(in MB): 102400
Given the above information, I am trying to see whether there is a memory issue where SQL Server is not using the available memory.
How do I find out how much memory SQL Server is actually using?
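A minimal sketch, assuming the SQL Server 2008 memory DMVs, that should show this:
-- How much memory the SQL Server process is actually using
SELECT physical_memory_in_use_kb / 1024 AS sql_memory_in_use_mb,
       memory_utilization_percentage,
       process_physical_memory_low
FROM sys.dm_os_process_memory;

-- How much memory the OS reports as present and available
SELECT total_physical_memory_kb / 1024 AS total_physical_memory_mb,
       available_physical_memory_kb / 1024 AS available_physical_memory_mb,
       system_memory_state_desc
FROM sys.dm_os_sys_memory;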
I am not sure how to resolve this issue, any help is greatly appreciated. Thanks.
February 23, 2010 at 11:50 am
There is one big no-no in your configuration:
Minimum server memory(in MB): 102400
Maximum server memory(in MB): 102400
Don't set the same value for min and max server memory! It confuses the SQL Server memory manager!
With 64-bit you don't need to enable AWE, because the engine can directly address all of your RAM (2 TB+).
Johan
February 23, 2010 at 12:00 pm
What value shall I assign to the Minimum server memory?
Is the Maximum server memory value okay?
Thanks.
February 23, 2010 at 12:24 pm
ALZDBA (2/23/2010)
There is one big no-no in your configuration:
Minimum server memory (in MB): 102400
Maximum server memory (in MB): 102400
Don't set the same value for min and max server memory! It confuses the SQL Server memory manager!
With 64-bit you don't need to enable AWE, because the engine can directly address all of your RAM (2 TB+).
FYI, per this article, both min and max memory should be set to the same value: http://www.sqlmag.com/Articles/ArticleID/37890/pg/2/2.html
February 23, 2010 at 1:24 pm
Have you compared the execution plans between the two environments to see if they are the same? How about hits to the disk? How hard is the hardware working on other queries in production?
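One way to compare plans, assuming you can run the same statement in Management Studio on both servers, is to capture the actual plan as XML and diff the two; the table name below is only a placeholder:
SET STATISTICS XML ON;   -- returns the actual execution plan as XML after the query runs

SELECT TOP 100 *
FROM dbo.SomeNavisionTable;   -- hypothetical; substitute one of the slow Navision queries

SET STATISTICS XML OFF;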
February 23, 2010 at 3:51 pm
Consider looking at the following:
page life expectancy - (I think 5 mins is the MS best practice) this will tell you how long data pages are living in memory
buffer cache hit ratio - (I think 99+% is the MS best practice) this will tell you how frequently SQL goes to memory for data
pagefile reads - (the lower, the better) this will tell you if the system is using the page file and reading from it (a memory issue)
disk queue reads / writes - this will tell you if the box is having trouble fetching data from disks assuming it's not in the cache
To me, memory is something that should be configured according to the duties of the box. If, say, the box is solely for SQL, I'd recommend using the majority (90%) for the database engine. Note: this will probably set off alarms (SCOM or other alert software) because it'll look like the box is slammed, but it really isn't. It can also result in crazy-high page life expectancies which isn't a bad thing, but can indicate that the database doesn't need that much memory and can possibly take on more work. I hope that kind of makes sense.
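The first two counters can be pulled straight from the instance; a minimal sketch using sys.dm_os_performance_counters (the hit ratio has to be divided by its base counter):
-- Page life expectancy (seconds a data page stays in the buffer pool)
SELECT cntr_value AS page_life_expectancy_seconds
FROM sys.dm_os_performance_counters
WHERE [object_name] LIKE '%Buffer Manager%'
  AND counter_name = 'Page life expectancy';

-- Buffer cache hit ratio (ratio counter divided by its base counter)
SELECT 100.0 * r.cntr_value / NULLIF(b.cntr_value, 0) AS buffer_cache_hit_ratio_pct
FROM sys.dm_os_performance_counters AS r
JOIN sys.dm_os_performance_counters AS b
  ON b.[object_name] = r.[object_name]
WHERE r.[object_name] LIKE '%Buffer Manager%'
  AND r.counter_name = 'Buffer cache hit ratio'
  AND b.counter_name = 'Buffer cache hit ratio base';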
---FR
February 23, 2010 at 11:57 pm
Mh-397891 (2/23/2010)
http://www.sqlmag.com/Articles/ArticleID/37890/pg/2/2.html
The article is dated April 2003 (SQL 2000).
The no-no is about 64-bit.
Experience with 64-bit has shown it is better to let these values differ.
Johan
February 24, 2010 at 12:21 am
Hmm... I think the number you're looking for will be based on the SLA / user experience you're bound to deliver. I also think that the exact number is dynamic in nature, as it will change over time while the database grows. Once you determine the level of performance you want to deliver, and can meet it given the hardware you have, allocate your resources so that you meet your goals. If you can exceed them, perhaps do so until the spare resources are needed by other databases seeking a home. Use the page life expectancy, query response times, cache hit ratio, and the other metrics listed earlier, along with execution plans, to find out what you're working with.
---FR
February 24, 2010 at 11:37 am
Memory settings:
• If server memory is <= 3 GB, set min memory to 1 GB and max memory to server memory less 512 MB.
• If server memory is > 3 GB, the settings depend on the server platform (32-bit or 64-bit), the presence of /3GB in boot.ini, and whether AWE is used.
• For servers on a 64-bit platform, AWE is not required; min memory should be server memory less 2 GB and max memory should be server memory less 1 GB.
• On a 32-bit platform with AWE, min and max memory should be server memory less 1 GB; without AWE, if the /3GB switch is not used, max memory = 2 GB; with the switch it is 3 GB (with min memory 1 GB less than that).
February 24, 2010 at 11:39 am
1. Are both builds of SQL Server the same? (A quick way to check is sketched after this list.)
2. Do the execution plans look the same?
3. How are you comparing response times between the two systems?
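For the first question, a short sketch that should show the exact build and edition on each server:
SELECT SERVERPROPERTY('ProductVersion') AS product_version,   -- e.g. 10.0.xxxx for SQL Server 2008
       SERVERPROPERTY('ProductLevel')   AS product_level,     -- RTM, SP1, SP2, ...
       SERVERPROPERTY('Edition')        AS edition;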
February 24, 2010 at 11:54 am
BaldingLoopMan (2/24/2010)
Memory settings:
• If server memory is <= 3 GB, set min memory to 1 GB and max memory to server memory less 512 MB.
• If server memory is > 3 GB, the settings depend on the server platform (32-bit or 64-bit), the presence of /3GB in boot.ini, and whether AWE is used.
• For servers on a 64-bit platform, AWE is not required; min memory should be server memory less 2 GB and max memory should be server memory less 1 GB.
• On a 32-bit platform with AWE, min and max memory should be server memory less 1 GB; without AWE, if the /3GB switch is not used, max memory = 2 GB; with the switch it is 3 GB (with min memory 1 GB less than that).
If you can, I would leave at least 3 GB for the OS on a 64-bit system. Things like replication, connections, and third-party items all take up pieces of that memory space. I've run into quite a few paging issues when not leaving plenty of room.
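As a worked example under that guideline (just a sketch, not a recommendation tuned to this workload): with 128 GB of RAM (131072 MB) and roughly 3 GB reserved for the OS, max server memory lands around 128000 MB, with min server memory set lower so the two values differ, as suggested earlier in the thread; the 64 GB min below is purely illustrative:
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- 131072 MB total - ~3072 MB for the OS and other processes = ~128000 MB for SQL Server
EXEC sp_configure 'max server memory (MB)', 128000;

-- keep min well below max so the memory manager can release memory when needed
EXEC sp_configure 'min server memory (MB)', 65536;
RECONFIGURE;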
February 6, 2013 at 1:47 am
Yeah, you should check the actual query plan.
One more thing: maybe you should also check how much IO or scanning actually happened, using SET STATISTICS IO ON and then executing your query or stored procedure.
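A minimal sketch of that, with a placeholder table name; the logical reads reported in the Messages tab are what you would compare between prod and test:
SET STATISTICS IO ON;

SELECT COUNT(*)
FROM dbo.SomeNavisionTable;   -- hypothetical; use one of the slow Navision tables

SET STATISTICS IO OFF;
-- The Messages tab then shows lines like:
-- Table 'SomeNavisionTable'. Scan count ..., logical reads ..., physical reads ...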