February 20, 2012 at 2:58 am
GilaMonster (2/20/2012)
My point still stands, if SQL is on a non-dedicated server, then setting a sensible max server memory is almost essential
Agreed, that and removing wacko maintenance plans 😀
-----------------------------------------------------------------------------------------------------------
"Ya can't make an omelette without breaking just a few eggs" 😉
February 20, 2012 at 8:31 am
Perry Whittle (2/20/2012)
GilaMonster (2/20/2012)
My point still stands, if SQL is on a non-dedicated server, then setting a sensible max server memory is almost essential
Agreed, that and removing wacko maintenance plans 😀
+1000
WRACK, with a maintenance plan like that you ABSOLUTELY MUST get a professional on board to give your systems a performance review and health check. That plan is simply horrible, and based on that there is no telling how badly other things will be done - including things that could put your company's data at risk.
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
February 20, 2012 at 2:50 pm
Perry Whittle (2/20/2012)
Wow, you do like to make your server work for a living don't you 😉
I am not a DBA for this company; this is just a temp assignment for 4 weeks. I am just following and cleaning up what the previous DBA left behind and do not have enough time to set everything straight. This works for the moment, so I shall continue with it.
WRACK
CodeLake
February 20, 2012 at 2:53 pm
TheSQLGuru (2/20/2012)
Perry Whittle (2/20/2012)
GilaMonster (2/20/2012)
My point still stands, if SQL is on a non-dedicated server, then setting a sensible max server memory is almost essential
Agreed, that and removing wacko maintenance plans 😀
+1000
WRACK, with a maintenance plan like that you ABSOLUTELY MUST get a professional on board to give your systems a performance review and health check. That plan is simply horrible, and based on that there is no telling how badly other things will be done - including things that could put your company's data at risk.
Thanks all. I will be setting the MAX memory soon, when management approves it! Again, I am on a temp assignment until they get a proper DBA here. I need to keep things running for the time being.
I will be reviewing all indexes soon, as a few tables have around 70 indexes on them (yeah, 70; one had 124 and I removed the duplicate ones) and the update performance just sucks. It is a little better now that I have removed the duplicate indexes, but I still need to review the unnecessary ones.
Thanks all for the advice.
WRACK
CodeLake
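For anyone wanting to chase the duplicate-index angle, a rough sketch that compares key-column lists built from sys.indexes and sys.index_columns; it ignores included columns, so treat any match as a candidate to review, not an automatic drop list:

-- Indexes on the same table with identical key columns (duplicate candidates).
WITH IndexKeys AS
(
    SELECT  i.object_id,
            i.index_id,
            i.name AS index_name,
            key_cols = STUFF((
                SELECT ',' + c.name
                       + CASE WHEN ic.is_descending_key = 1 THEN ' DESC' ELSE '' END
                FROM sys.index_columns ic
                JOIN sys.columns c
                  ON c.object_id = ic.object_id AND c.column_id = ic.column_id
                WHERE ic.object_id = i.object_id
                  AND ic.index_id  = i.index_id
                  AND ic.is_included_column = 0
                ORDER BY ic.key_ordinal
                FOR XML PATH('')), 1, 1, '')
    FROM sys.indexes i
    WHERE i.index_id > 0
      AND OBJECTPROPERTY(i.object_id, 'IsUserTable') = 1
)
SELECT  OBJECT_SCHEMA_NAME(a.object_id) AS schema_name,
        OBJECT_NAME(a.object_id)        AS table_name,
        a.index_name,
        b.index_name AS same_keys_as,
        a.key_cols
FROM IndexKeys a
JOIN IndexKeys b
  ON  a.object_id = b.object_id
  AND a.key_cols  = b.key_cols
  AND a.index_id  < b.index_id
ORDER BY table_name, a.key_cols;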
February 20, 2012 at 3:03 pm
I would strongly suggest that you consider removing that shrink from the maint plan. There's just about no good reason to have one running on a regular basis.
Also, you may find this useful: http://www.simple-talk.com/books/sql-books/troubleshooting-sql-server-a-guide-for-the-accidental-dba/
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
February 20, 2012 at 3:41 pm
GilaMonster (2/20/2012)
I would strongly suggest that you consider removing that shrink from the maint plan. There's just about no good reason to have one running on a regular basis.
Also, you may find this useful: http://www.simple-talk.com/books/sql-books/troubleshooting-sql-server-a-guide-for-the-accidental-dba/
Thanks. I will remove the shrink operation but I still don't really see it having any ill effect.
I will check out that book and pick up any pointers 🙂 TA 🙂
WRACK
CodeLake
February 20, 2012 at 3:47 pm
So a large amount of CPU and time spent shrinking the data file and fragmenting the indexes, followed by another large amount of CPU and time spent rebuilding those indexes and regrowing the data file (possibly leaving the file the same size it was before the shrink), plus possible file fragmentation, internal log fragmentation and resultant slow backups from the regrowth, are not a problem?
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
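If it helps to see that point in numbers, a small sketch that reads fragmentation from sys.dm_db_index_physical_stats; run it before and after a shrink and watch avg_fragmentation_in_percent jump:

-- LIMITED mode keeps the scan cheap; small indexes are skipped because their fragmentation is noise.
SELECT  OBJECT_NAME(ps.object_id) AS table_name,
        i.name                    AS index_name,
        ps.avg_fragmentation_in_percent,
        ps.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') ps
JOIN sys.indexes i
  ON i.object_id = ps.object_id AND i.index_id = ps.index_id
WHERE ps.page_count > 1000
ORDER BY ps.avg_fragmentation_in_percent DESC;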
February 20, 2012 at 4:31 pm
WRACK (2/20/2012)
Perry Whittle (2/20/2012)
Wow, you do like to make your server work for a living don't you 😉
I am not a DBA for this company; this is just a temp assignment for 4 weeks. I am just following and cleaning up what the previous DBA left behind and do not have enough time to set everything straight. This works for the moment, so I shall continue with it.
Forgive me for laughing, and I don't mean to be rude, but!
You're saying you don't have an hour or so of your time to go find a decent maintenance script (and there are plenty of them on this site alone), but you have all the time in the world to mess around with indexes!!
-----------------------------------------------------------------------------------------------------------
"Ya can't make an omelette without breaking just a few eggs" 😉
February 20, 2012 at 4:55 pm
GilaMonster (2/20/2012)
So a large amount of CPU and time spent shrinking the data file and fragmenting the indexes, followed by another large amount of CPU and time spent rebuilding those indexes and regrowing the data file (possibly leaving the file the same size it was before the shrink), plus possible file fragmentation, internal log fragmentation and resultant slow backups from the regrowth, are not a problem?
Point taken. It shall be done. I wasn't aware of LOG fragmentation.
WRACK
CodeLake
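On the log fragmentation point: internally it shows up as an excessive number of virtual log files (VLFs), typically the result of the log regrowing in many small increments after a shrink. A quick way to eyeball it (one row per VLF, so the row count is the VLF count):

-- Undocumented but long-standing; run it in the database you care about.
DBCC LOGINFO;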
February 20, 2012 at 5:03 pm
Perry Whittle (2/20/2012)
Forgive me for laughing, and I don't mean to be rude, but!
You're saying you don't have an hour or so of your time to go find a decent maintenance script (and there are plenty of them on this site alone), but you have all the time in the world to mess around with indexes!!
No offence taken 🙂
I understand your point, and finding a good script is not the problem here. I am at this place to do a temp job for a few weeks, and I have been told to keep things running and not change too many things! That's the whole reason I haven't done anything drastic yet.
If I had my way, I would start by deleting the many redundant indexes on a few of the core tables used by the LOB app, and trust me when I say this, you would be shocked to find out the number of indexes on each one of those tables.
e.g. a table with 3.39 million rows (small, I would say) and 46 columns has 124 indexes on it alone. Now imagine the update performance! I am here to clean that up slowly and have been asked to just keep an eye on things in the background.
That being said, there is no reason for me not to improve the background tasks, but I need to be very cautious as there is a struggle between departments over who to blame when things go wrong!
WRACK
CodeLake
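Not mine to run on your systems of course, but a rough sketch against sys.dm_db_index_usage_stats is usually how the case for dropping some of those 124 gets made; the counters reset on every SQL Server restart, so judge them over a representative period:

-- Nonclustered indexes ordered by how little they are read versus how often they are written.
SELECT  OBJECT_NAME(i.object_id) AS table_name,
        i.name                   AS index_name,
        us.user_seeks, us.user_scans, us.user_lookups, us.user_updates
FROM sys.indexes i
LEFT JOIN sys.dm_db_index_usage_stats us
  ON  us.object_id   = i.object_id
  AND us.index_id    = i.index_id
  AND us.database_id = DB_ID()
WHERE i.index_id > 1
  AND OBJECTPROPERTY(i.object_id, 'IsUserTable') = 1
ORDER BY ISNULL(us.user_seeks, 0) + ISNULL(us.user_scans, 0) + ISNULL(us.user_lookups, 0),
         ISNULL(us.user_updates, 0) DESC;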
February 20, 2012 at 11:55 pm
Completely understand, but in my experience pointing out their flawed maint plan and correcting it should serve to gain you a little kudos and maybe even an extension to your contract 😉
-----------------------------------------------------------------------------------------------------------
"Ya can't make an omelette without breaking just a few eggs" 😉
February 21, 2012 at 1:30 am
Perry Whittle (2/20/2012)
Completely understand, but in my experience pointing out their flawed maint plan and correcting it should serve to gain you a little kudos and maybe even an extension to your contract 😉
It has already been pointed out, so that is no issue. They are our long-time clients, though for software development rather than DBA work. But I take your point.
Thank you all for help and suggestions. Appreciate it. 🙂
WRACK
CodeLake
February 22, 2012 at 8:46 am
Let me guess - someone got happy (probably multiple times) with the DTA?? 🙂
This sounds like my typical client. I love going to places like this because there are so many things I can quickly improve it makes me look like I walk on water! 😎
BTW, can you please pass them my contact information?? 😀
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
February 27, 2012 at 8:41 am
Just a quick poll(ish) question. [No pun intended.]
True or False: If the SQL Server service is the only thing running (dedicated server), the default max memory setting of (bignum) is actually preferred as SQL Server is intelligent enough to handle its own memory management automatically.
February 27, 2012 at 8:52 am
False.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
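For the record, a quick way to tell whether an instance is still sitting at the default cap (2147483647 MB):

SELECT name, value_in_use
FROM sys.configurations
WHERE name = 'max server memory (MB)';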