June 25, 2007 at 10:06 am
We have a database with about 500 tables / roughly 5 GB. We would like to export it into an output file that would be compatible with other database engines (e.g. Oracle). I tried using ASCII text, but it appears you can only export one table at a time. I tried Excel, but it failed because the data was too big.
We could manually select the tables and split them into multiple Excel files, but is there an easier way?
This output file is to be kept for about 5-6 years and needs to be in a compatible format in case it needs to be accessed at any point during that time.
Thanks
June 25, 2007 at 11:50 pm
I too find no way other than using multiple text/Excel files. You say your data is about 5 GB, but Excel can handle only 64K rows per sheet, and there is a limit on the number of sheets as well. So use multiple Excel files.
Cheers,
Sugeshkumar Rajendran
SQL Server MVP
http://sugeshkr.blogspot.com
June 26, 2007 at 7:41 am
What about using bcp to dump each table out to a text file, and then zipping/tarring/raring all the files up into one file to be archived for your 5-6 year archive period? I would imagine you'd be able to script that into a job without a terribly large amount of work if it is something that will need to be done repeatedly.
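As a rough sketch of that idea, something like the script below would build one bcp export command per table and then bundle the output directory into a single archive. This is a dry run: the bcp commands are written to a file rather than executed, and the server name, database name, and table list are placeholder assumptions you'd replace with your own (in practice you'd generate the table list from INFORMATION_SCHEMA.TABLES).

```shell
#!/bin/sh
# Dry-run sketch of the bcp-per-table export. MYSERVER, MyDatabase, and the
# table names below are placeholders, not values from this thread.
SERVER="MYSERVER"
DB="MyDatabase"
OUTDIR="export"
mkdir -p "$OUTDIR"

# Hypothetical table list; generate this from INFORMATION_SCHEMA.TABLES
# in a real job.
printf 'Orders\nCustomers\n' > tables.txt

# Build one bcp command per table.
#   -c = character (plain-text) format, -T = trusted connection, -S = server
while read -r TABLE; do
  echo "bcp $DB.dbo.$TABLE out $OUTDIR/$TABLE.txt -c -T -S $SERVER"
done < tables.txt > bcp_commands.txt

# Bundle everything into a single archive for the retention period.
tar -czf export_archive.tar.gz -C "$OUTDIR" .
```

Plain-text (-c) bcp output is a reasonable choice for a 5-6 year archive, since character-mode files stay readable by any tool, whereas native-mode (-n) files are SQL Server specific.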
Check out the sp_MSforeachtable syntax to see if it might help you with your bcp commands, so that if you ever add a table over time it will get included in the export as well.
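A minimal sketch of that approach, assuming xp_cmdshell is enabled and using a placeholder server/database/path (sp_MSforeachtable is undocumented, so test this carefully; the `?` token expands to the bracketed two-part name like `[dbo].[Orders]`, which also ends up in the file name):

```sql
-- Sketch only: MYSERVER, MyDatabase, and C:\export are placeholders.
EXEC sp_MSforeachtable
    @command1 = 'EXEC master..xp_cmdshell ''bcp "MyDatabase.?" out "C:\export\?.txt" -c -T -S MYSERVER''';
```

Because it enumerates the tables at run time, any table added later is picked up automatically, which addresses the "if you ever need to add a table over time" point.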
-Luke.