Viewing 15 posts - 1 through 15 (of 60 total)
Applied DBCC UPDATEUSAGE for 3 databases, which resolved the issue. These databases were recently migrated from another box (2000 to 2005).
Any more maintenance to be performed now?
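For reference, this is roughly what I ran per database ('MyDatabase' is a placeholder name, not the real one):
-- Corrects row and page counts that can be left inaccurate after a 2000-to-2005 migration
DBCC UPDATEUSAGE ('MyDatabase') WITH COUNT_ROWS;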
June 24, 2009 at 10:44 pm
3 databases are corrupted. I am unable to fix them using DBCC CHECKDB with REPAIR_ALLOW_DATA_LOSS. Any other ideas other than refreshing?
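For context, the repair attempt was along these lines (a sketch; the database name is a placeholder, and REPAIR_ALLOW_DATA_LOSS needs single-user mode):
ALTER DATABASE MyDatabase SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
-- Last-resort repair; it may discard data to restore consistency
DBCC CHECKDB ('MyDatabase', REPAIR_ALLOW_DATA_LOSS);
ALTER DATABASE MyDatabase SET MULTI_USER;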
June 24, 2009 at 10:31 pm
It is OS memory, and I have allocated 15 GB to SQL Server and 4 GB to the OS. I will leave more memory to the OS and see if...
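The split was set with sp_configure, something like this sketch (the value is in MB; 15360 MB is roughly the 15 GB mentioned above):
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Cap the buffer pool so about 4 GB stays available to Windows
EXEC sp_configure 'max server memory (MB)', 15360;
RECONFIGURE;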
June 24, 2009 at 10:30 pm
On 2 drives, file fragmentation is 99% and 95%. I just took over these servers a few weeks back. I am trying to defragment them and it seems to be taking...
June 24, 2009 at 7:20 pm
This is stored in the memory dump:
===================================================================== ...
June 24, 2009 at 8:59 am
Point noted, thanks Pradeep. But I just wonder: is the CommVault backup not that fast, or is the process a complicated one?
June 22, 2009 at 8:45 am
I just disabled an audit trigger and I think the errors have gone down. But I still need to recycle the service to clear everything.
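For the record, disabling it was a one-liner along these lines (the trigger and table names here are made up, not the real ones):
-- Hypothetical names; the actual audit trigger and table differ
DISABLE TRIGGER trg_Audit ON dbo.AuditedTable;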
Thanks,
Sudhie.
June 22, 2009 at 5:40 am
Is it at the Windows level? We have SP2 on both Windows and SQL Server. How can I proceed with debugging the issue?
Additional info:
-> Server is on a cluster
Memory configuration
Total Physical Memory:...
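As a starting point for the debugging, a quick way to pull the memory picture from SQL Server 2005 itself (a sketch using the standard DMVs):
-- Physical memory visible to the instance, in bytes (SQL Server 2005 column name)
SELECT physical_memory_in_bytes FROM sys.dm_os_sys_info;
-- Current min/max server memory settings
SELECT name, value_in_use
FROM sys.configurations
WHERE name IN ('min server memory (MB)', 'max server memory (MB)');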
June 22, 2009 at 4:58 am
Thanks for the reply,
This morning I observed something: just before the service stopped, I saw the below 2 SPIDs occupying most of the memory.
60981200x00422656PAGEIOLATCH_SH ...
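To capture that kind of detail more cleanly next time, a sketch of the query I would use to list sessions by memory usage along with their current wait (standard 2005 DMVs):
-- memory_usage is in 8-KB pages; sessions with no active request show NULL waits
SELECT s.session_id, s.memory_usage, r.wait_type, r.wait_time
FROM sys.dm_exec_sessions AS s
LEFT JOIN sys.dm_exec_requests AS r ON r.session_id = s.session_id
ORDER BY s.memory_usage DESC;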
May 29, 2009 at 7:05 am
It's the Cognos data reader, a third-party tool pushing data to SQL Server. The databases were created for it; I am trying to find someone who should be able to tell me...
May 28, 2009 at 5:00 am
Thanks Lock, I will go through them.
Thanks,
Sudhie.
May 28, 2009 at 4:12 am
The client confirmed today that this is not purely reporting and they want the full recovery model. Thanks a lot for your inputs; I may go for a 10 GB increase and will enable the...
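What I have in mind is roughly this (database and logical file names are placeholders; the SIZE value is illustrative, i.e. current log size plus the 10 GB increase):
-- Regular log backups become necessary once the recovery model is FULL
ALTER DATABASE MyDatabase SET RECOVERY FULL;
-- SIZE sets the new total size of the log file, not the increment
ALTER DATABASE MyDatabase MODIFY FILE (NAME = MyDatabase_log, SIZE = 20480MB);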
May 6, 2009 at 1:54 pm
Glenn Dorling (5/5/2009)
The required size for the log file of a database in Simple recovery is dependent on how much data is changed by the largest transaction. Assuming that...
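One easy way to see how full each log actually gets, and so whether the largest transaction really drives its size, is to sample this periodically (a minimal sketch):
-- Reports log size (MB) and percent of log space used for every database
DBCC SQLPERF (LOGSPACE);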
May 5, 2009 at 4:31 pm
It gets refreshed during the first week; for the rest of the month it will have calculated data and fewer transactions coming in. I am fine with suggesting to the client changing...
May 5, 2009 at 4:25 pm
I should have said, 8 Processors 😛
Thanks for the link. Yes, I think running UPDATE STATISTICS on each table individually will be good.
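By individually I mean per table rather than sp_updatestats across everything, along these lines (the table name is a placeholder, and FULLSCAN is just one sampling choice):
-- Rebuild statistics for one table at a time, scanning all rows
UPDATE STATISTICS dbo.MyLargeTable WITH FULLSCAN;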
April 27, 2009 at 1:45 pm