June 1, 2011 at 12:10 am
Hi Guys,
We have a situation and we need some insight from experienced folks here...
We have a couple of databases that hold financial data. Data arrives every day in XML files that are flattened and loaded into tables in, say, DATABASE "A". Each of these tables (one per file) has a few billion records and is indexed as needed. The tables are partitioned on a date field that is often used by queries. Data from these tables is then loaded into a database used by front-end applications.
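For reference, the date-based partitioning described above looks roughly like this in SQL Server 2008 (the partition boundaries, table, and column names here are made up for illustration; our actual schema differs):

```sql
-- Hypothetical sketch of one of the large tables in DATABASE "A".
-- The partition function splits rows by month on the date column
-- that most queries filter on.
CREATE PARTITION FUNCTION pf_FeedDate (date)
AS RANGE RIGHT FOR VALUES ('2011-01-01', '2011-02-01', '2011-03-01');

-- Map every partition to a filegroup (all to PRIMARY in this sketch).
CREATE PARTITION SCHEME ps_FeedDate
AS PARTITION pf_FeedDate ALL TO ([PRIMARY]);

-- One table per daily XML file, created on the partition scheme so
-- rows land in the partition matching their FeedDate.
CREATE TABLE dbo.DailyFeed
(
    FeedDate  date           NOT NULL,
    AccountId bigint         NOT NULL,
    Amount    decimal(18, 2) NOT NULL
)
ON ps_FeedDate (FeedDate);
```

Queries that filter on FeedDate can then be eliminated down to the relevant partitions instead of scanning the whole table.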
A bunch of users have access to Database "A", which holds those huge tables. They run queries against the data and create data sets (export the data) for their analysis, or run their analysis directly against the data. What we need is a separate DATABASE "B" for them to use for that purpose. How should I create DATABASE "B" so that their queries run faster, and their analysis and exports are faster too? We are using SQL Server 2008 right now. Do we need a tool like Netezza? Any suggestion is appreciated.
Thanks a lot.
June 1, 2011 at 11:13 am
bumping up!