November 28, 2012 at 11:39 pm
We have a database 'testDb' with 5 tables,
mainly 'user_details_tab' and 'user_deatils_backup_tab'.
The database 'testDb' is filled with a large volume of data daily.
Each day, the data in 'user_details_tab' is moved to 'user_deatils_backup_tab' and 'user_details_tab' is cleared.
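Roughly, the daily job is equivalent to something like this (the column names and the INSERT ... SELECT followed by TRUNCATE are just an illustration of the pattern, not our exact code):

-- Copy the day's rows to the archive table, then clear the live table.
-- Column names here are placeholders.
BEGIN TRANSACTION;

INSERT INTO user_deatils_backup_tab (user_id, detail_data, created_on)
SELECT user_id, detail_data, created_on
FROM user_details_tab;

TRUNCATE TABLE user_details_tab;

COMMIT TRANSACTION;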
The problems are:
1. After a couple of weeks, the backup table 'user_deatils_backup_tab' contains millions of rows, and this affects querying of that table.
2. Millions of rows are inserted into this database daily. In this situation I can't imagine the application running continuously for a year or more.
Note: we also need to back up this database.
Can anyone suggest a solution to these problems?
November 29, 2012 at 1:20 am
sumith1andonly1 (11/28/2012)
Can anyone suggest a solution to these problems?
A couple of things suggest themselves offhand.
We have a database 'testDb' with 5 tables,
mainly 'user_details_tab' and 'user_deatils_backup_tab'.
The database 'testDb' is filled with a large volume of data daily.
Each day, the data in 'user_details_tab' is moved to 'user_deatils_backup_tab' and 'user_details_tab' is cleared.
5 tables for an entire system? These tables sound horribly wide and denormalized. I'd start there.
The problems are:
1. After a couple of weeks, the backup table 'user_deatils_backup_tab' contains millions of rows, and this affects querying of that table.
2. Millions of rows are inserted into this database daily. In this situation I can't imagine the application running continuously for a year or more.
Note: we also need to back up this database.
Do the user details actually change daily, or are you simply snapshotting a day's data every day and never removing the archive? If you're merely snapshotting, I'd recommend using a logging table instead of copying the data daily, which would let you know a user's details at any point in time.
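For example, a minimal sketch of the logging-table idea (the table and column names here are hypothetical):

-- Insert-only log: one row per change, written once and never moved.
CREATE TABLE user_details_log
(
    log_id      BIGINT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    user_id     INT           NOT NULL,
    detail_data NVARCHAR(400) NULL,
    logged_on   DATETIME      NOT NULL DEFAULT GETDATE()
);

-- Point-in-time lookup: a user's most recent details as of a given date.
-- @user_id and @as_of stand in for application parameters.
SELECT TOP (1) user_id, detail_data, logged_on
FROM user_details_log
WHERE user_id = @user_id
  AND logged_on <= @as_of
ORDER BY logged_on DESC;

Because nothing is copied or deleted daily, the table grows only with actual changes, and you can reconstruct a user's state at any date.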
We'd need some more information, however, if you really want us to be able to assist you: what the purpose of this data is, roughly how large it is, and so on. We'd also need to know exactly the reasoning behind trying to maintain daily copies of the data. That is usually what a backup is for.
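On the backup note: a plain full backup, plus log backups if the database is in FULL recovery, usually covers a "need to back up this database" requirement. A minimal sketch (the disk path is just an example):

-- Nightly full backup; the path is an example only.
BACKUP DATABASE testDb
TO DISK = N'D:\Backups\testDb_full.bak'
WITH INIT;

-- Frequent log backups, assuming the FULL recovery model.
BACKUP LOG testDb
TO DISK = N'D:\Backups\testDb_log.trn';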
Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.
For better assistance in answering your questions | Forum Netiquette
For index/tuning help, follow these directions. | Tally Tables
Twitter: @AnyWayDBA
November 29, 2012 at 5:24 am
Millions of rows are not necessarily a problem. It really depends on your indexes and statistics. If you have something like an identity column as the primary key, then you may be seeing statistics aging terribly over time because all the new data is inserted past the end of the stats. This leads to issues where the optimizer assumes a single row. So, you get around this by manually updating the stats as needed.
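A minimal sketch of that manual refresh (the table name is from your post; how often to run it, and whether FULLSCAN is affordable, are things you'd have to test):

-- Refresh statistics on the fast-growing archive table so the optimizer
-- sees the newly inserted range of key values.
UPDATE STATISTICS dbo.user_deatils_backup_tab WITH FULLSCAN;

-- Or, more coarsely, refresh statistics across the whole database:
EXEC sp_updatestats;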
Also, millions of rows doesn't necessarily mean a large database. You can have millions of rows and still have a small database, so backups won't be a problem.
With such vague requirements, it's hard to give you hard and fast suggestions for what to do about it. Will all data be needed forever? If you are in a situation where you can partition the data and move older parts out, you might want to look into partitioning as a data management solution (not a performance enhancement).
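As a rough sketch of what that could look like (monthly partitions on a date column; every name and boundary here is hypothetical, and note that table partitioning requires Enterprise Edition):

-- Hypothetical monthly partitioning of the archive table on its date column.
CREATE PARTITION FUNCTION pf_monthly (DATETIME)
AS RANGE RIGHT FOR VALUES ('2012-10-01', '2012-11-01', '2012-12-01');

CREATE PARTITION SCHEME ps_monthly
AS PARTITION pf_monthly ALL TO ([PRIMARY]);

-- The archive table would then be created (or rebuilt) on the scheme, e.g.:
-- CREATE TABLE user_deatils_backup_tab ( ..., created_on DATETIME ) ON ps_monthly (created_on);

-- Aged-out months can then be switched out almost instantly to a staging
-- table with an identical structure on the same filegroup:
-- ALTER TABLE user_deatils_backup_tab SWITCH PARTITION 1 TO user_details_archive_stage;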
Other than that, tune the queries, tune the structures, tune the indexes and maintain the statistics are the best things you can do in general.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning