February 11, 2015 at 5:22 am
Every night I do a full backup of my database, and it's starting to get too big for that. Additionally, from what I understand, if I do in fact lose a table somehow, then to get it back I have to have space somewhere to restore the full backup and re-create the whole database just to recover the one table I'm looking for. This seems a bit cumbersome to me, and either I don't fully understand it or there's got to be a better way.
When I used to use MySQL there was a command you'd run from DOS called mysqldump, and it would create a .sql file for every table in the database. These .sql files contained the CREATE statement, primary key, indexes, and INSERT statements for all of that table's data. I feel this would be easier to recover from, because I'd only have to open one file to recreate one table. Is it possible somehow to write a stored procedure in T-SQL that gives me at least the CREATE TABLE statement for each of my tables, which I could then output to a .sql file per table? Then maybe I could add the primary key and index creation, so at the very least I'd have a job I could run every week or two that scripts out my tables in case one gets corrupted.
February 11, 2015 at 5:44 am
If you need to recover a single table and want to minimize data loss, I would restore from the backup to a separate database, then use the restored copy to recreate the dropped table in your live database. Once that's done, you can drop the restored database. Of course, you'll have to deal with any foreign keys or other constraints on the table you restore.
By all means, test this on your own.
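Something like this is the general pattern. Treat it as a minimal sketch: the database, file, and table names are made up, and you'd need your backup's actual logical file names for the MOVE clauses.

-- 1. Restore last night's full backup to a scratch database alongside the live one.
RESTORE DATABASE MyDB_Restore
FROM DISK = N'D:\Backups\MyDB_Full.bak'
WITH MOVE N'MyDB'     TO N'D:\Data\MyDB_Restore.mdf',
     MOVE N'MyDB_log' TO N'D:\Data\MyDB_Restore_log.ldf',
     RECOVERY;

-- 2. Copy the lost table back into the live database.
--    SELECT...INTO copies only the data, so keys, indexes, and constraints
--    still have to be recreated on the copy afterwards.
SELECT *
INTO   MyDB.dbo.LostTable
FROM   MyDB_Restore.dbo.LostTable;

-- 3. Clean up the scratch database.
DROP DATABASE MyDB_Restore;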
February 11, 2015 at 5:46 am
While it's possible to export data (look up BCP), SQL Server doesn't script databases as a backup.
SQL Server's backups are far more efficient than scripting the database out would be. Could you imagine scripting out a 1 TB database? The files would end up many times the size of the DB due to all the extra characters in the INSERT statements. SQL backups, on the other hand, let you take full backups, easily get a differential backup that contains just the changes since the full, and then take log backups that allow restoring the database to any point in time required. Doing that with exports of the database would be impossible.
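Roughly, the pattern looks like this. Every name, path, and time below is a placeholder, and point-in-time restores assume the database is using the FULL recovery model.

-- Weekly full, nightly differential, frequent log backups.
BACKUP DATABASE MyDB TO DISK = N'D:\Backups\MyDB_Full.bak';
BACKUP DATABASE MyDB TO DISK = N'D:\Backups\MyDB_Diff.bak' WITH DIFFERENTIAL;
BACKUP LOG MyDB TO DISK = N'D:\Backups\MyDB_Log.trn';

-- To recover to a point in time: restore the full, then the latest differential,
-- then the log backups up to the moment you need.
RESTORE DATABASE MyDB FROM DISK = N'D:\Backups\MyDB_Full.bak' WITH NORECOVERY, REPLACE;
RESTORE DATABASE MyDB FROM DISK = N'D:\Backups\MyDB_Diff.bak' WITH NORECOVERY;
RESTORE LOG MyDB FROM DISK = N'D:\Backups\MyDB_Log.trn'
WITH STOPAT = '2015-02-11 05:00:00', RECOVERY;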
Also, you mentioned corruption: if a SQL Server database gets corrupted, it's not necessarily a single table. It could be the allocation structures, the system tables, or multiple tables, and serious cases necessitate a restore from backup.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
February 14, 2015 at 5:23 am
@ Ed Wagner
That's my point: it's too huge to restore to another database just for one table. It would take forever. For about 80% of my tables I just want to retain the structure and the stored procedure that loads the data from a flat file and re-creates the indexes and keys. I've found a way with the system tables to get the stored procedures, but I can't figure out how to write code that generates the CREATE TABLE statements for all 200 of my tables (since I don't want to right-click each one).
@ GilaMonster
I do still plan on backing up the entire database, just not necessarily every day. My situation is that about 80% of the tables don't need the INSERT statements because the data comes from a flat file, but the other 20% are user-interactive and we update them through a web interface. If that data got lost I'd be in real trouble.
Thank you both for your posts; it's helping me understand more about SQL Server.
I'd like to (if possible) export the stored procedures and CREATE TABLE statements for all of my tables, and then re-create INSERT statements, primary keys, indexes, etc. on select tables (the 20%).
I would have thought someone else would have done this somewhere on the internet, but I can't find anything anywhere. I was just going to install the SSMS Tools Pack because it looks like it may help with what I'm doing. Does anyone have experience with it? Can I automate what I'm looking to do with that tool? Thanks in advance!
February 14, 2015 at 8:59 am
DKY (2/14/2015)
...I can't figure out how to write code that generates the CREATE TABLE statements for all 200 of my tables (since I don't want to right-click each one).
You can do it manually by right-clicking the database, selecting "Tasks", and then "Generate Scripts". There's also an EXE buried somewhere that does this, but the name escapes me just now. I'm also not sure it's still available anymore.
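For the stored procedure side, the catalog views will hand you the full definitions. A rough sketch (nothing here is specific to your database):

-- List every procedure with its complete CREATE PROCEDURE text,
-- ready to be written out to one .sql file per procedure.
SELECT SCHEMA_NAME(p.schema_id)       AS SchemaName,
       p.name                         AS ProcedureName,
       OBJECT_DEFINITION(p.object_id) AS Definition
FROM   sys.procedures AS p
ORDER  BY SchemaName, ProcedureName;

There's no built-in function that returns a complete CREATE TABLE statement the same way, though; for the table DDL you're looking at Generate Scripts, SMO, or a third-party tool.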
Part of the reason I don't generate all the DDL and save it somewhere is that I already have all of it in source control. If you don't, I highly recommend that you do.
I also ran into the problem of backups and the related restores getting too big and taking too long. You might want to look into what is actually making your database so large. Chances are, the biggest tables are some form of mostly static history table, such as an Audit table or an InvoiceDetail table, where only the rows from the most recent month or two ever change.
I recently ran into just such an extreme case on one of my systems. The database is ~320GB and more than 319GB of that is in a single audit table (it's a phone system and we keep all the call recordings in the database). In the process of partitioning all that data by month into a separate file/filegroup per month, I also moved it into a different (new) database and set up a synonym in the main database to point to it, so I wouldn't have to make any front-end code changes. I can now restore the <1GB main database to a temporary database in seconds to get any important data, without touching the original and without it taking 6-10 hours (the backups are on slower NAS instead of SAN) and 320GB of space to do a restore.
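The synonym part is a one-liner. Names here are invented just for illustration; the big audit table now lives in a separate archive database, and the synonym keeps all the existing code in the main database working unchanged.

-- In the main database, point the old table name at the archive database.
CREATE SYNONYM dbo.AuditHistory
FOR AuditArchive.dbo.AuditHistory;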
Not everyone has that kind of ratio of large static historical data to essential data, but most people do have a couple of Audit or other large, mostly static tables, and putting them into a different database would be a great benefit for the type of thing you're talking about. I went even further with the partitioning and marked older months READ_ONLY so I don't have to keep backing up things that will never change. Like Grandma used to say, there's no sense in washing the clean clothes. 😛
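For the read-only part, once a monthly filegroup will never change again, you mark it read-only and back it up one last time. The database and filegroup names below are placeholders.

-- Mark the filegroup read-only so it never has to be backed up again...
ALTER DATABASE AuditArchive MODIFY FILEGROUP FG_2014_12 READ_ONLY;

-- ...after one final filegroup backup.
BACKUP DATABASE AuditArchive
FILEGROUP = N'FG_2014_12'
TO DISK = N'D:\Backups\AuditArchive_FG_2014_12.bak';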
--Jeff Moden
Change is inevitable... Change for the better is not.