March 6, 2008 at 12:47 pm
If the shopper database is small, there is no problem, but when there are more than 2 million shoppers, they put the data in an LDAP database, because retrieving the shopper info from SQL Server times out.
Remember, there is more than one shopper database on the server. They tried to squeeze in as many databases as possible.
Another problem is that the merchant can import and export data during the day. For example, when the merchant wants to update the product table, a text file is sent and the system loads it into the product table while online transactions are running. Likewise, the merchant can export the order table to get the current orders to process.
I suggested having a database snapshot to handle the export, so it would not affect online transactions. But my boss insists the import also needs to run concurrently.
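For reference, the snapshot idea would look something like this sketch (database and file names are hypothetical; database snapshots require SQL Server 2005 Enterprise Edition or later):

```sql
-- Create a read-only, point-in-time snapshot of the live database.
-- 'ShopDB' and the logical file name 'ShopDB_Data' are assumed names.
CREATE DATABASE ShopDB_ExportSnapshot
ON
( NAME = ShopDB_Data,
  FILENAME = 'D:\Snapshots\ShopDB_Export.ss' )
AS SNAPSHOT OF ShopDB;

-- The export then reads from the snapshot, so it never blocks
-- or is blocked by the online transactions in ShopDB itself.
SELECT order_id, shopper_id, order_date
FROM ShopDB_ExportSnapshot.dbo.orders;
```

Note this only helps the export side; a concurrent import still writes to the live database, as the boss requires.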
There is one query to retrieve the shoppers with paging, sorting and filtering.
It is written in dynamic SQL because some of the pieces (the sort column and the filter) are passed into the procedure:
SET @sql = N'WITH ent AS (
    SELECT shopper_id, first_name, last_name, email, phone, fax,
           ROW_NUMBER() OVER (ORDER BY ' + @sort_column + N') AS ROW
    FROM shopper
    WHERE ' + @filter + N')
SELECT shopper_id, first_name, last_name, email, phone, fax
FROM ent
WHERE ROW BETWEEN @start_row AND @start_row + @number_of_rows - 1'
EXEC sp_executesql @sql, N'@start_row int, @number_of_rows int', @start_row, @number_of_rows
March 6, 2008 at 1:38 pm
Oooohhh... I just ran into that paging problem... really bad case of parameter sniffing, and the traditional methods don't fix it... only full-blown dynamic SQL fixes it... lemme see if I can find the proof-of-principle examples I did that show the problem and how to fix it... you may not have to change anything to get the speed you want except maybe the paging routine.
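For context, the "full-blown dynamic SQL" fix usually looks something like this sketch (procedure, table, and parameter names are hypothetical): only the filters actually supplied are concatenated into the statement text, so each filter combination compiles and caches its own plan instead of reusing one plan sniffed from the first call.

```sql
CREATE PROCEDURE dbo.search_shoppers      -- hypothetical procedure
    @last_name varchar(50)  = NULL,
    @email     varchar(128) = NULL
AS
BEGIN
    DECLARE @sql nvarchar(max);
    SET @sql = N'SELECT shopper_id, first_name, last_name, email
                 FROM shopper WHERE 1 = 1';

    -- Only the predicates that were actually supplied end up in the
    -- statement text, so each combination gets (and reuses) its own plan.
    IF @last_name IS NOT NULL SET @sql = @sql + N' AND last_name = @last_name';
    IF @email     IS NOT NULL SET @sql = @sql + N' AND email = @email';

    -- Values are still passed as parameters, not concatenated,
    -- which keeps the statement safe from SQL injection.
    EXEC sp_executesql @sql,
         N'@last_name varchar(50), @email varchar(128)',
         @last_name, @email;
END
```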
--Jeff Moden
Change is inevitable... Change for the better is not.
March 6, 2008 at 2:27 pm
Sorry, but it is my professional opinion that hunting and pecking on this forum will never get you to 5000 hits/sec returning at 0.02 sec max per hit spread across hundreds or thousands of databases with thousands to 10s of millions of rows of data. That kind of scalability can only be achieved with ground-up design and testing from A-Z and back again. It is astounding that your manager has given you such an edict - complete disconnect with reality IMNSHO, especially given the existing data structures and system design.
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
March 6, 2008 at 2:33 pm
If it ain't broke, don't fix it, else throw more hardware instead of fixing the underlying problem. Seems like how most management approaches problems like this.
All we can do is try and help those who don't want the help (and Loner, I am NOT talking about you here, you are the one looking for help).
Best of luck in this endeavour! We will help you where we can!
😎
March 6, 2008 at 2:44 pm
Hardware can certainly help. I am regularly amazed at how well SQL Server and Windows perform on modern hardware - at least when SQL, databases, code and hardware are properly designed, configured, tuned and maintained!! But hardware alone ain't gonna solve Loner's problems, and neither are some indexes or a few spiffy code rewrites. Hopefully he will be lucky and the manager's requirements are pink-elephant numbers that don't match reality and are 1 or even 2 orders of magnitude too 'hopeful'. 🙂
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
March 6, 2008 at 2:48 pm
Loner (3/6/2008)
If the shopper database is small, there is no problem, but when there are more than 2 million shoppers, they put the data in an LDAP database, because retrieving the shopper info from SQL Server times out. Remember, there is more than one shopper database on the server. They tried to squeeze in as many databases as possible.
Another problem is that the merchant can import and export data during the day. For example, when the merchant wants to update the product table, a text file is sent and the system loads it into the product table while online transactions are running. Likewise, the merchant can export the order table to get the current orders to process.
I suggested having a database snapshot to handle the export, so it would not affect online transactions. But my boss insists the import also needs to run concurrently.
There is one query to retrieve the shoppers with paging, sorting and filtering.
It is written in dynamic SQL because some of the pieces (the sort column and the filter) are passed into the procedure:
SET @sql = N'WITH ent AS (
    SELECT shopper_id, first_name, last_name, email, phone, fax,
           ROW_NUMBER() OVER (ORDER BY ' + @sort_column + N') AS ROW
    FROM shopper
    WHERE ' + @filter + N')
SELECT shopper_id, first_name, last_name, email, phone, fax
FROM ent
WHERE ROW BETWEEN @start_row AND @start_row + @number_of_rows - 1'
EXEC sp_executesql @sql, N'@start_row int, @number_of_rows int', @start_row, @number_of_rows
Let's hope you don't need to do this very often. Without indexes to help you (since the sort order can change at any time), this one is likely to be a DOG.
You may also care to try it with TOP (N) as well - it may at times be faster than the ROW_NUMBER() scenario.
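One TOP (N) variant worth trying is keyset ("seek") paging - a sketch, assuming shopper_id is the sort key and ideally the clustered index key (the variable values are made up):

```sql
-- Keyset ("seek") paging: remember the last key of the previous page
-- and seek past it, instead of numbering every row in the table.
DECLARE @page_size    int;
DECLARE @last_seen_id int;
SET @page_size    = 50;      -- hypothetical page size
SET @last_seen_id = 12345;   -- last shopper_id shown on the previous page

SELECT TOP (@page_size)
       shopper_id, first_name, last_name, email, phone, fax
FROM shopper
WHERE shopper_id > @last_seen_id     -- index seek, no full numbering pass
ORDER BY shopper_id;
```

This only works cleanly when paging follows the index order; with arbitrary user-chosen sort columns and filters you are back to the ROW_NUMBER() style.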
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part... unless you're my manager... or a director and above... or a really loud-spoken end-user... All right - what was my emergency again?
March 6, 2008 at 4:52 pm
A major architectural problem!!!!!!
This is what you need:
x64 SQL Server 2005 Enterprise.
x64 Windows Server 2008.
4 or 8 7000-series quad-core Xeon processors (16 or 32 cores).
64 GB of RAM.
A massively fast SAN.
Table partitioning on the big tables.
You may have to federate the database or go with RAM/solid-state drives.
You have to prevent table scans and keep the tables in memory.
You would have one server and database just for customer information,
1 to 3 servers and databases for the product catalog (load balanced),
and one server and database for financial transactions.
And that is without any high availability yet.
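Table partitioning here would mean something like the following sketch (hypothetical shopper table and boundary values; SQL Server 2005 Enterprise syntax):

```sql
-- Split shoppers into ranges of shopper_id so scans and maintenance
-- touch one partition at a time. Boundary values are invented.
CREATE PARTITION FUNCTION pf_shopper_id (int)
AS RANGE RIGHT FOR VALUES (1000000, 2000000, 3000000);

-- Map every partition to a filegroup (all to PRIMARY in this sketch;
-- in practice you would spread them across filegroups/spindles).
CREATE PARTITION SCHEME ps_shopper_id
AS PARTITION pf_shopper_id ALL TO ([PRIMARY]);

CREATE TABLE shopper
( shopper_id int NOT NULL,
  first_name varchar(50),
  last_name  varchar(50),
  email      varchar(128),
  CONSTRAINT PK_shopper PRIMARY KEY (shopper_id)
) ON ps_shopper_id (shopper_id);
```

Queries that filter on shopper_id can then be eliminated down to a single partition by the optimizer.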