February 23, 2012 at 8:01 am
Hello everybody,
I'm a bit new to SQL, and since yesterday I've been trying to make a report for somebody.
I have one DB with many tables, let's say 1000. Of these 1000 tables, only about 200 contain data relevant to my report. These tables all have the same columns, and let's say they are named table_1 through table_1000.
The table number is given by a special key from another, bigger table. By running a select against it I found about 200 keys, which correspond to about 200 tables (table_01, table_123, table_554, and so on) that contain the data I need.
My question is: what is the fastest way to extract the data from those 200 tables?
Your replies are much appreciated.
Regards,
February 23, 2012 at 8:19 am
Are you extracting the data from DB1 to DB2, or just out into a report directly from the base tables?
February 23, 2012 at 8:24 am
It's also not clear what you're getting from each table. Rather than try to describe this, why not post 3 tables that follow the pattern and explain what data you want to get out? "Extracting data" could mean a few things.
Also, a report typically doesn't have data from 200 tables. That doesn't really make sense, so it would help if you could clarify with 3-4 real tables (include CREATE TABLE statements) and some data.
February 23, 2012 at 8:31 am
You can use something like this to generate the SQL:

-- sample tables standing in for table_1 ... table_1000
create table table_01 (id int);
create table table_123 (id int);
create table table_554 (id int);
insert into table_01 values (2), (200);
insert into table_123 values (3), (300);
insert into table_554 values (4), (400);

-- the list of table numbers that contain the relevant data
create table keys (tablenum varchar(10));
insert into keys select '01' union select '123' union select '554';

-- build one big UNION ALL select over the relevant tables;
-- STUFF(..., 1, 10, '') strips the leading 'union all '
declare @sql varchar(max);
set @sql =
(select STUFF(
(select 'union all select * from table_' + tablenum + ' '
from keys
for xml path (''))
, 1, 10, '')
);
print @sql; -- inspect the generated statement; use exec (@sql) to run it
______________________________________________________________________________
How I want a drink, alcoholic of course, after the heavy lectures involving quantum mechanics.
February 23, 2012 at 9:05 am
The report is to analyze how many events clients play on one ticket in sports betting.
A ticket can contain from 1 to 12 events.
The report should look like:
January 2012
x tickets - 1 event
x tickets - 2 events
............
x tickets - 12 events
The tables contain info about the machine that sold the ticket, the ticket ID, the ticket price, and the number of events played per ticket.
I need the data from all those tables in one temporary table so I can execute a query that gives me the number of tickets grouped by events per ticket, something like the sketch below.
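Building on the example above, this is roughly what I have in mind. TicketID and EventsPlayed are only placeholder column names, not my real schema, and the keys table is the one from the earlier post:

-- temp table to collect the rows from all relevant tables
create table #AllTickets (TicketID int, EventsPlayed int);

declare @sql varchar(max);

-- reuse the keys / STUFF approach from the earlier post
set @sql =
(select STUFF(
(select 'union all select TicketID, EventsPlayed from table_' + tablenum + ' '
from keys
for xml path (''))
, 1, 10, '')
);

-- #AllTickets is created in this scope, so it is visible inside exec()
exec ('insert into #AllTickets (TicketID, EventsPlayed) ' + @sql);

-- number of tickets grouped by events played
select EventsPlayed, count(*) as Tickets
from #AllTickets
group by EventsPlayed
order by EventsPlayed;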
February 23, 2012 at 9:23 am
max.airz (2/23/2012)
My question is which is the fastest way that I can extract the data from those 200 tables.
Does the "report" need the data coming from the 200 tables at the same time? If this is the case, a view on the base tables appears to be a sensible solution; the view can be created on the fly, as in the rough sketch below.
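Just as a rough sketch of the idea, reusing the keys table from the earlier example (the view name vAllTickets is only a placeholder):

declare @sql varchar(max);

-- generate a view that unions the relevant tables together
set @sql = 'create view vAllTickets as ' + STUFF(
(select 'union all select * from table_' + tablenum + ' '
from keys
for xml path (''))
, 1, 10, '');

exec (@sql); -- create view needs its own batch, which exec() provides

-- the report can then treat the view as a single table:
-- select ... from vAllTickets group by ...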
_____________________________________
Pablo (Paul) Berzukov
Author of Understanding Database Administration available at Amazon and other bookstores.
Disclaimer: Advice is provided to the best of my knowledge but no implicit or explicit warranties are provided. Since the advisor explicitly encourages testing any and all suggestions on a test non-production environment, the advisor should not be held liable or responsible for any actions taken based on the given advice.
February 23, 2012 at 9:50 am
You still haven't provided any DDL or sample data that can help.
I assume that you have a number of tables that are all the same, but for different events - a very inefficient storage structure. Perhaps what you ought to consider is a central table for reporting (like a data mart), and ETL the data from the source tables into it, along the lines of the sketch below.
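As a very rough sketch of that idea, assuming columns like MachineID, TicketID, TicketPrice and EventsPlayed (placeholders, since we still don't have your DDL) and the keys table from the earlier post:

-- hypothetical central reporting table; adjust columns to the real schema
create table CentralTickets (
    MachineID int,
    TicketID int,
    TicketPrice money,
    EventsPlayed int
);

declare @sql varchar(max) = '';

-- generate one insert ... select per source table, driven by the keys table
select @sql = @sql +
    'insert into CentralTickets (MachineID, TicketID, TicketPrice, EventsPlayed) ' +
    'select MachineID, TicketID, TicketPrice, EventsPlayed from table_' + tablenum + '; '
from keys;

exec (@sql); -- run as a scheduled load rather than at report time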