October 23, 2015 at 8:59 am
I have an application that generates numerous messages at each stage of a process. The application handles multiple customers, and there could be hundreds of messages generated per second per customer; in total, the application could potentially generate many thousands of messages per second.
In order to handle the high TPS, the application automatically creates a new table as each customer comes on board. So, in the end, if we have 500 customers we will have 500 copies of the same table. In addition, I want to create different filegroups on different spindles and distribute the 500 tables across those filegroups.
The reason for this design is to get the highest TPS we possibly can. If anyone has a better idea, please let me know.
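For what it's worth, here is a minimal sketch of what the per-customer setup could look like. Everything in it is assumed, not taken from your system: the filegroup names (FG1 through FG4), the Messages_&lt;CustomerId&gt; naming convention, the round-robin assignment, and the column list are all hypothetical placeholders you would replace with your own schema:

```sql
-- Assumed: filegroups FG1..FG4 already exist in the database,
-- one per spindle, created along these lines:
--   ALTER DATABASE MyDb ADD FILEGROUP FG1;
--   ALTER DATABASE MyDb ADD FILE
--       (NAME = 'MyDb_FG1', FILENAME = 'E:\Data\MyDb_FG1.ndf')
--       TO FILEGROUP FG1;

-- Hypothetical per-customer table, round-robined onto a filegroup.
DECLARE @CustomerId int = 42;
DECLARE @fg sysname = N'FG' + CAST((@CustomerId % 4) + 1 AS nvarchar(10));
DECLARE @sql nvarchar(max) = N'
CREATE TABLE dbo.Messages_' + CAST(@CustomerId AS nvarchar(10)) + N' (
    MessageId bigint IDENTITY(1,1) NOT NULL PRIMARY KEY,
    StageId   int           NOT NULL,
    CreatedAt datetime2(3)  NOT NULL DEFAULT SYSUTCDATETIME(),
    Payload   nvarchar(max) NULL
) ON ' + QUOTENAME(@fg) + N';';
EXEC sys.sp_executesql @sql;
```

The `@CustomerId % 4` round-robin is just one way to spread tables over spindles; a lookup table mapping customers to filegroups would let you rebalance later.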
I have two questions.
First, how do I dynamically analyse the performance/load of each of the filegroups? Second, if I am using just one stored procedure to populate all of the tables, will that become a bottleneck? Should I have a separate stored procedure for each of the 500 tables?
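On the first question, one way to watch per-filegroup load is the `sys.dm_io_virtual_file_stats` DMV joined up to `sys.filegroups`. A sketch (the counters are cumulative since the last restart, so sample twice and diff the two snapshots to get a rate):

```sql
-- Per-filegroup write load for the current database.
SELECT fg.name                        AS filegroup_name,
       SUM(vfs.num_of_writes)         AS writes,
       SUM(vfs.num_of_bytes_written)  AS bytes_written,
       SUM(vfs.io_stall_write_ms)     AS write_stall_ms
FROM sys.dm_io_virtual_file_stats(DB_ID(), NULL) AS vfs
JOIN sys.database_files AS df
  ON df.file_id = vfs.file_id
JOIN sys.filegroups AS fg
  ON fg.data_space_id = df.data_space_id
GROUP BY fg.name
ORDER BY write_stall_ms DESC;
```

High `io_stall_write_ms` relative to `num_of_writes` on one filegroup would suggest that spindle is the hot spot.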
Thanks for any help with this.
October 23, 2015 at 9:12 am
October 23, 2015 at 9:16 am
We are using message queues. There is concern the queues will get flooded.
October 24, 2015 at 5:27 am