January 24, 2011 at 3:05 pm
Hi All,
I have the following situation.
An instance of SQL Server 2008 R2 running on Windows Server 2008 R2.
I have a website that connects to this database and inserts records into about 6 tables (all have an identity column as the primary key).
We get a lot of inserts into these 6 tables when there are active users on the website.
The last time the website had heavy traffic, it slowed down.
The problem is that we might have a lot of users (25,000) on the website simultaneously.
How do I make this scalable so the DB doesn't slow down? And how do I handle the identity columns on these tables?
Thanks,
Suri
January 24, 2011 at 3:15 pm
The identity() columns themselves are not a problem; I read an article years ago showing that SQL 2000 could handle 400 new identity values per second in a table, and that was on hardware far less powerful than today's.
Lots of constraints (FKs, check constraints, etc.) tend to slow things down far more than identity() columns ever will.
Tuning this situation is going to take more details and less generality; I think you should get some performance monitoring tools in place, see where the bottlenecks really are, and tune from there.
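To illustrate the "see where the bottlenecks really are" advice, here is a minimal sketch (not from the original post) of how you might look at the server's top wait types on SQL Server 2008 R2, using the documented sys.dm_os_wait_stats DMV. The list of benign wait types to exclude is an assumption and is far from exhaustive; it requires a live server to run.

```sql
-- Sketch: top wait types accumulated since the last restart.
-- The excluded wait types are common "idle" waits; extend this list as needed.
SELECT TOP (10)
    wait_type,
    wait_time_ms,
    waiting_tasks_count
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN ('SLEEP_TASK', 'LAZYWRITER_SLEEP',
                        'SQLTRACE_BUFFER_FLUSH', 'BROKER_TASK_STOP')
ORDER BY wait_time_ms DESC;
```

High write-related waits (e.g. WRITELOG or PAGELATCH) would point at very different fixes than high CPU or lock waits, which is why measuring first matters.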
Lowell
January 24, 2011 at 3:23 pm
Thanks for the reply.
I cannot load-test my solution because I don't know how to simulate 25,000 concurrent users on the website, or their navigation paths through the site.
I am running SQL Server Standard Edition. Will clustering help in any way, and how would it handle the identity issue?
January 24, 2011 at 3:29 pm
No, I was thinking that you should run monitoring tools against your live server and gather statistics that help identify your bottlenecks.
There's a link in this thread with a good writeup comparing different tools, and if you search for "monitoring" here on SSC there are a lot of good articles and threads:
http://www.sqlservercentral.com/Forums/Topic872079-146-1.aspx
Lowell
January 24, 2011 at 3:59 pm
Hello Lowell,
Thanks for the quick replies.
You said SQL 2000 could insert 400 rows per second. Using that figure and @@MAX_CONNECTIONS, how can I work out the maximum number of users who can insert into these tables concurrently?
I understand the bottleneck identification you suggested, and I will certainly look into it.
My primary goal is to find out how many users my database can support with all of them inserting records concurrently.
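One caution worth adding here (my note, not from the thread): @@MAX_CONNECTIONS reports the theoretical connection limit (32,767), not a throughput figure, so it cannot be combined with an inserts-per-second number to derive supportable users. A quick sketch of what it actually tells you, alongside the current session count:

```sql
-- @@MAX_CONNECTIONS is the theoretical maximum, not what the hardware can sustain.
SELECT @@MAX_CONNECTIONS AS max_allowed_connections;

-- Current user connections (sys.dm_exec_sessions is available on SQL 2005+).
SELECT COUNT(*) AS current_user_sessions
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;
```

The real capacity question is answered by measuring insert throughput and resource contention under load, as suggested above, not by the connection limit.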
January 25, 2011 at 5:24 am
Hi. If all users are only reading the data, then just create a snapshot of your DB; that will help you in a big way.
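For reference, a database snapshot is created like this (database names and file path below are hypothetical). Note one caveat relevant to this thread: on SQL Server 2008 R2, database snapshots are an Enterprise Edition feature, and the original poster mentioned running Standard Edition.

```sql
-- Hypothetical names; the logical file name must match the source database's
-- data file, and the .ss path must be on a volume with space for changed pages.
-- Requires Enterprise Edition on SQL Server 2008 R2.
CREATE DATABASE MyDb_Snapshot_20110125
ON ( NAME = MyDb_Data,
     FILENAME = 'D:\Snapshots\MyDb_Snapshot_20110125.ss' )
AS SNAPSHOT OF MyDb;
```

A snapshot is a read-only, point-in-time view of the source database; it can offload reporting reads but does nothing for a heavy-insert workload like the one described.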
NEVER NEVER EVER GIVE UP;-)
January 28, 2011 at 11:27 pm
You will need to collect statistics, as was suggested, using some tool. Perfmon is free and works. The only way to determine your bottleneck is to collect the data and look at the values to see which resources you have contention on, and then look closer at those resources. It could be one of many things; without data we would just be guessing.
Start at a high level and work down to a granular level rather than focusing on the identity property. Long term, you will want to baseline your system so you have values to compare against and can trend over time. As things change on your system, the maximum number of supportable users is a moving target that relates directly to resource performance.
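As a starting point for the baseline idea above, many of the Perfmon counters mentioned can also be read from inside SQL Server via sys.dm_os_performance_counters. A minimal sketch (counter selection is my assumption; log these on a schedule to build the baseline):

```sql
-- A few commonly baselined SQL Server counters, as exposed to Perfmon.
SELECT object_name, counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Batch Requests/sec',
                       'Page life expectancy',
                       'User Connections');
```

Capturing these at regular intervals into a table gives the trend data needed to tell normal load from a developing bottleneck.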