February 2, 2007 at 10:51 pm
Hi all,
I have a Windows 2000 server running SQL Server 2000 Standard. I have a few databases of 40 GB, 25 GB, and 10 GB, plus a few smaller ones. Only 4 GB RAM and 4 x 2.0 GHz CPUs.
The application is the Epicor ERP system.
But our server's 4 CPUs sit at 50-88% all the time and users are complaining about slowness.
Please tell me, do I need to upgrade to SQL Enterprise, or share the load between 2 servers?
If I run the big databases on a new server, what sort of server and HDD configuration should I use?
AusNetIT Solutions
Web Design | Web Hosting | SEO | IT Support
February 5, 2007 at 6:15 am
We run Epicor here as well, and we have noticed that some user actions in the application cause most of our performance problems. Between the FRx reporting application and some of the Epicor application screens, there are places where users should be selecting criteria but are not forced to. This can cause queries that run for a long time. The worst part is that our users then "give up" and close the application, but Epicor continues to run the queries to completion in the background. You may want to run Profiler and find out exactly what is running when your performance problems are happening.
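A rough starting point while a Profiler trace is being set up (a sketch only, assuming SQL Server 2000 and its master..sysprocesses view) is to look at which connections have been busiest, which will show Epicor sessions still grinding away after the user has closed the client:

```sql
-- Sketch only: on SQL Server 2000, list active connections ordered by
-- cumulative CPU, to spot queries still running after users "give up".
SELECT spid,
       loginame,
       hostname,
       program_name,
       cpu,            -- cumulative CPU time (ms)
       physical_io,    -- cumulative reads + writes
       lastwaittype,
       last_batch
FROM   master..sysprocesses
WHERE  spid > 50                 -- skip system spids
       AND status <> 'sleeping'
ORDER  BY cpu DESC
```

Once you have a suspect spid, DBCC INPUTBUFFER(spid) will show the last statement that connection submitted.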
From what you have described, you probably have enough hardware.
February 6, 2007 at 4:39 am
Hi Michael,
Thanks. I know there are lots of DB locks and system freezes due to that same sort of user action. I have eBO, eFO, and eDistribution, and I think running all 3 databases on one system is causing the issue.
I also have 52 SQL jobs running on this server. Does anyone run their jobs from another server (a test server)?
Do I need more CPU power?
Is anyone running Epicor 7.3 on MS SQL 2005? And if so, what sort of server is required to run the above databases?
Please send me the server spec and MS SQL version.
February 6, 2007 at 5:15 am
Hi Asela,
From just what you've said above, processors are not your problem. We've had boxes regularly running at 80-90% absolutely fine. Unless you're seeing your CPUs maxed out for extended periods, you're not likely to see much improvement in performance (I think MS says over 95% for extended periods). More likely you've got memory or I/O issues, which are possibly linked to what Michael says.
It may be worth looking at your buffer cache hit ratio, and also your wait stats and disk queue lengths, to get first indications before looking even deeper.
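On SQL Server 2000, a sketch of how to pull the first two of those indicators from T-SQL (disk queue lengths come from Perfmon's PhysicalDisk counters rather than from SQL Server itself):

```sql
-- Buffer cache hit ratio: the sysperfinfo counter is a ratio, so you
-- divide the raw value by its "base" counter yourself.
SELECT (a.cntr_value * 1.0 / b.cntr_value) * 100 AS buffer_cache_hit_ratio_pct
FROM   master..sysperfinfo a
       JOIN master..sysperfinfo b
         ON b.counter_name = 'Buffer cache hit ratio base'
            AND b.object_name = a.object_name
WHERE  a.counter_name = 'Buffer cache hit ratio'
       AND a.object_name = 'SQLServer:Buffer Manager'

-- Wait statistics accumulated since they were last cleared:
DBCC SQLPERF(WAITSTATS)
```

As a rough rule of thumb from that era, people generally wanted the buffer cache hit ratio above about 90% on an OLTP box, but treat that as a heuristic and look at the wait stats and disk queues alongside it.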
rgds iwg
February 6, 2007 at 5:42 am
Hi,
Thanks. Can you tell me the target values for the above counters? I have Idera SQL Diagnostic Manager installed, so I can check them.
February 6, 2007 at 7:44 am
Sounds to me like it's probably your application - does it use lots of dynamic/embedded SQL, or leading-wildcard (contains) searches?
I have a CRM system that doesn't perform too well: 7 procs at 100% for extended periods. I threw memory at it and I turned off parallelism (the SQL was so full of scans that virtually all queries ran parallel plans and blocked themselves), and I'm still working on it. But it's inherently poor in as much as it makes extensive use of cursors, can't re-use most query plans, and of course doesn't use procs. Oh, and many of the queries allow an almost totally ad-hoc build, meaning I can't optimise indexes for best performance - although I'm working on it.
If you profile for all queries over 1,000 reads, that may give you an indication < grin >
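For reference, the parallelism switch-off mentioned above is just the standard sp_configure change (a sketch; test it on a non-production box first, as it affects the whole instance):

```sql
-- 'max degree of parallelism' is an advanced option, so expose it first.
EXEC sp_configure 'show advanced options', 1
RECONFIGURE

-- 1 = no parallel plans at all; 0 (the default) = use all available CPUs.
EXEC sp_configure 'max degree of parallelism', 1
RECONFIGURE
```

A gentler alternative is raising the 'cost threshold for parallelism' value instead, so only genuinely expensive queries get parallel plans rather than disabling them outright.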
As Ian says you need to find your bottleneck before making any changes.
[font="Comic Sans MS"]The GrumpyOldDBA[/font]
www.grumpyolddba.co.uk
http://sqlblogcasts.com/blogs/grumpyolddba/
February 11, 2007 at 4:11 pm
Hi,
Bumping my earlier question: I have 52 SQL jobs running on this server. Does anyone run their jobs from another server (a test server)?