February 9, 2009 at 10:18 pm
Any rules of thumb/wisdom about how many processors/cores per interactive user? We've got a canned client, so the queries are "standardized" and tuned; the users can't make up crazy ad-hoc queries on their own. The "main" table could grow to around 100M-150M rows, and the rest of the tables are under 500k rows, with old data regularly archived to keep only 'current' data.
We're running about 15-20 users now; the DB is under 24GB (only 50k rows in the main table), and CPUs rarely spike above 25%, usually staying in single digits. How linear is CPU load vs. user count?
tks much.
February 9, 2009 at 10:22 pm
The classic answer is "it depends".
I'm sure you see some queries that consume more CPU and resources than others. If all your users run those, the server can be overloaded; if they all run well-indexed queries, there may be almost no load at all.
If you have enough users, you can probably correlate CPU usage to user count to some extent, but with a small number of users I'm not sure how well that correlation holds. It's really a matter of the workloads (and their timing) compared to CPU usage.