April 16, 2010 at 2:54 pm
I have a client that has 5000 customers, and in the database each customer has its own table:
dbo.Customer1
dbo.Customer2
...
dbo.Customer5000
I need an efficient way of reporting on these tables without looping through each table to get the information on each customer. I attempted creating a view, but a view with 5000 UNION ALL's is not efficient. I'm looking for suggestions.
Thanks in advance.
April 16, 2010 at 3:23 pm
How much data?
How often does it change?
Is a snapshot ok?
Are all the tables identical schema wise?
April 16, 2010 at 3:27 pm
Interesting situation. I can see that for every new customer a new table gets created... will that continue, or does it stop at the 5000th customer's table? And are you just trying to read data from these tables?
April 16, 2010 at 3:29 pm
sgj31 (4/16/2010)
I have a client that has 5000 customers, and in the database each customer has its own table:
dbo.Customer1
dbo.Customer2
...
dbo.Customer5000
I need an efficient way of reporting on these tables without looping through each table to get the information on each customer. I attempted creating a view, but a view with 5000 UNION ALL's is not efficient. I'm looking for suggestions.
Thanks in advance.
I don't believe that a view with 5000 UNION ALLs will actually work. IIRC, SQL Server 2000 had a limit of something like 243 UNION ALLs, and I don't know whether that's changed or not.
--Jeff Moden
Change is inevitable... Change for the better is not.
April 16, 2010 at 3:45 pm
Garadin (4/16/2010)
How much data?
How often does it change?
Is a snapshot ok?
Are all the tables identical schema wise?
Approx. 9000 rows in each table. All the tables are identical schema-wise. The data can change daily, and they keep adding a new table for every new customer. They get 1-3 new customers a month.
They are expecting on-demand reports, which I can provide, but the backend SQL takes anywhere from 10 to 30 minutes to loop through all the tables, and that is not acceptable to the users.
I have never seen a database designed like this, so I'm at a loss for how to handle the data. Most of the processes use dynamic SQL to decide which customer table to update, and I'm using the same logic for reports. I will suggest a snapshot of the data that refreshes once or twice a day and see if that is close enough to real-time for the users. I imagine it won't be... darn this world of instant gratification!
April 16, 2010 at 3:50 pm
All I can think of is to use the undocumented sp_MSforeachtable procedure (google for details), which really is just looping through the tables; there's a rough sketch at the end of this post.
I'm not sure, though, that this database design really is the best possible solution...
Especially if some of the data that needs to be queried relates to more than one customer.
I don't know enough about the business case, but I probably would have used a single master table holding all the customer data and added views per customer if needed.
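A minimal sketch of that approach, assuming the per-customer tables all match the name pattern Customer% and share an identical schema; the staging table dbo.CustomerReportStage and the use of dbo.Customer1 as the template are just examples, and since sp_MSforeachtable is undocumented, test it before relying on it:

-- Clone the structure of one customer table and add a column recording the source table.
-- (If the tables have an IDENTITY column you'll need to handle IDENTITY_INSERT separately.)
SELECT TOP 0 * INTO dbo.CustomerReportStage FROM dbo.Customer1;
ALTER TABLE dbo.CustomerReportStage ADD SourceTable sysname NULL;

-- '?' is replaced with each qualified table name; @whereand restricts the loop to the customer tables.
EXEC sp_MSforeachtable
     @command1 = N'INSERT dbo.CustomerReportStage SELECT *, ''?'' FROM ?',
     @whereand = N' AND o.name LIKE N''Customer%''';

It is still a loop under the covers, just one you don't have to write yourself.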
April 16, 2010 at 3:56 pm
Build another table. Copy the data from all the individual tables into the central table and add a customer id, if there isn't one already. Report from the combined table.
You could do this periodically, or put triggers on the individual tables that replicate the DML into a permanent combined table. (A rough sketch of the periodic load is at the end of this post.)
And change whatever goofy app they're using to work off the combined table.
And please let us know the name of the guy who designed this so no one makes the mistake of hiring him.
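A rough sketch of that load, assuming SQL Server 2005 or later, that the customer id can be derived from the table name (dbo.Customer<N>), and that dbo.CustomerCombined is the name picked for the combined table:

-- One-time: build the combined table with a CustomerID column in front of the shared columns.
-- (Again, an IDENTITY column in the source tables would need separate handling.)
SELECT TOP 0 CAST(0 AS int) AS CustomerID, * INTO dbo.CustomerCombined FROM dbo.Customer1;

-- Generate one INSERT ... SELECT per customer table and run the whole batch.
DECLARE @sql nvarchar(max);
SET @sql = N'';
SELECT @sql = @sql
    + N'INSERT dbo.CustomerCombined SELECT '
    + REPLACE(t.name, 'Customer', '') + N', * FROM dbo.' + QUOTENAME(t.name) + N';'
FROM sys.tables t
WHERE t.name LIKE N'Customer[0-9]%';

EXEC sys.sp_executesql @sql;

Schedule that (truncating the combined table first) as a SQL Agent job and point the reports at the one table.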
April 16, 2010 at 4:09 pm
David Webb-200187 (4/16/2010)
Build another table. Copy the data from all the individual tables into the central table and add a customer id, if there isn't one already. Report from the combined table.
You could do this periodically, or put triggers on the individual tables that replicate the DML into a permanent combined table.
And change whatever goofy app they're using to work off the combined table.
And please let us know the name of the guy who designed this so no one makes the mistake of hiring him.
Excellent suggestion. Thanks.
April 16, 2010 at 4:11 pm
You *may*, emphasis on *may*, get better performance by creating a bulk staging table that you copy all the data into... then create indexes on it and run your queries off of that. If they're OK with the data being a snapshot of the night before, you could populate this table overnight at a low-usage time and then run a much faster query off of the single table.
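For what it's worth, since TRUNCATE keeps indexes in place, the nightly job only needs to empty and reload the table. Something like this (the index, table, and column names are just placeholders, and CustomerID assumes the data has been consolidated as sketched earlier in the thread):

-- One-time: index the staging/combined table for the reporting queries.
CREATE CLUSTERED INDEX IX_CustomerCombined_CustomerID
    ON dbo.CustomerCombined (CustomerID);

-- Nightly SQL Agent step, run in the low-usage window:
TRUNCATE TABLE dbo.CustomerCombined;
-- ...then re-run the dynamic INSERT ... SELECT loop from the earlier post...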
April 16, 2010 at 4:37 pm
9000 * 5000 = 45,000,000 records, and growing if they keep adding new customers, hahaa. You should check what kind of performance you get with that many records in one table, assuming you have the HDD space...
and sure, I vote for this:
...And please let us know the name of the guy who designed this so no one makes the mistake of hiring him.
:w00t::w00t::w00t::w00t::w00t::w00t:
April 16, 2010 at 4:39 pm
I think the only way to do this is David's suggestion, which is essentially a data warehouse. Move the data over periodically, every hour or so, and build reports off something that makes sense.