August 28, 2008 at 3:55 pm
I have more than 30 tables to restore onto another reporting database. I have attached a spreadsheet of the sizes of the tables.
I want to do an initial load and then incremental loads based on MaxID.
What's the best way of approaching this? Should I do replication?
Or bcp out to text files and then import them into the other database? Is it possible to use a WHERE clause during bcp, select only certain columns, and extract only the incremental load?
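(It is: bcp's queryout mode accepts an arbitrary SELECT, so both the column list and a WHERE filter are possible. A minimal sketch, with hypothetical table, column, file-path, and server names, run through xp_cmdshell so the whole thing stays in T-SQL:)

```sql
-- Hypothetical names: dbo.Orders, OrderID, SourceDB, ReportingDB, Server1.
-- Find the high-water mark already loaded on the reporting side,
-- then export only the newer rows.
DECLARE @MaxID INT, @cmd VARCHAR(4000);

SELECT @MaxID = ISNULL(MAX(OrderID), 0) FROM ReportingDB.dbo.Orders;

SET @cmd = 'bcp "SELECT OrderID, CustomerID, OrderDate '
         + 'FROM SourceDB.dbo.Orders WHERE OrderID > '
         + CAST(@MaxID AS VARCHAR(20)) + '" '
         + 'queryout C:\extracts\Orders_delta.dat -n -T -S Server1';

EXEC master..xp_cmdshell @cmd;  -- requires xp_cmdshell to be enabled
```

The -n switch writes the file in native format, which is the fastest to reimport on the other side with `bcp ... in`.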
August 28, 2008 at 3:59 pm
There are 586 tables listed in your Excel file. Which are the 30 that you want to move?
-- RBarryYoung, (302)375-0451, blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc. "Performance is our middle name."
August 29, 2008 at 3:34 pm
I've attached the spreadsheet with the tables I want to bcp out.
August 29, 2008 at 4:01 pm
You've got some huge tables.
Where is your "reporting" database?
same NAS / SAN drive(s)?
different server/different drive(s)?
August 29, 2008 at 11:49 pm
The two databases are on separate servers and drives. So I have Server 1 as the source and Server 2 as the reporting server.
August 30, 2008 at 1:52 pm
raym (8/29/2008)
I've attached the spreadsheet with the tables I want to bcp out.
Your largest table contains 1.2 BILLION rows... BCP output in native format runs at probably around a MILLION rows per minute, depending on the width of the table. Do the math: it will take 20 hours just to export that table and another 20 to import it, for a total of 40 hours.
Copying the whole table is definitely NOT the way to do it. Instead, I recommend you make monthly summaries, except for the last 3 months, which should be kept in detail.
I'd also recommend that you create a "PreviousMonths" table and shrink the large tables. Archive all but the last 3 months to tape.
THEN worry about creating a reporting database.
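(A sketch of the kind of monthly summary described above, with made-up table and column names. The idea is to aggregate the detail older than the 3-month window once, instead of copying every row:)

```sql
-- Hypothetical schema: dbo.Orders(OrderID, CustomerID, OrderDate, Amount).
-- One summary row per customer per month for anything older than
-- the 3-month detail window.
SELECT CustomerID,
       DATEADD(mm, DATEDIFF(mm, 0, OrderDate), 0) AS OrderMonth,
       COUNT(*)    AS OrderCount,
       SUM(Amount) AS TotalAmount
  INTO dbo.OrdersMonthlySummary
  FROM dbo.Orders
 WHERE OrderDate < DATEADD(mm, -3, GETDATE())
 GROUP BY CustomerID, DATEADD(mm, DATEDIFF(mm, 0, OrderDate), 0);
```

The DATEADD/DATEDIFF pair truncates each OrderDate to the first of its month, which is what makes the GROUP BY produce one row per month.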
--Jeff Moden
Change is inevitable... Change for the better is not.
August 30, 2008 at 4:01 pm
raym:
You've got half a Terabyte of data here.
If you need just a single static copy, then I think that you have to go with building a SAN Mirror and then splitting it. If you need to refresh it daily, then although I don't say this often here: You need to hire an expert consultant. This is a major job and not to be undertaken lightly. I did this for a customer a couple of years ago and it took 90 days to design, develop, test and implement it, and their site was not nearly as big as yours.
There is no silver bullet answer here; it all depends on every little detail of your infrastructure and environment: the server configurations, the network configuration, your available storage configurations and resources, your timetable, your downtime windows, your budget, etc. There is just no way that you are going to get an adequate answer for free in any online forum. You would have to give us far more information than you can or should, and we would have to do far more work than we can or should.
-- RBarryYoung, (302)375-0451, blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc. "Performance is our middle name."
August 30, 2008 at 4:09 pm
I should add that my example above was the simplest one of these that I have done. I have been working with two other customers for over 20 months (each) on implementing a service like this (One for Reporting, the other for Disaster Recovery).
-- RBarryYoung, (302)375-0451, blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc. "Performance is our middle name."
September 7, 2008 at 12:16 pm
The tables have been reduced in size and I have listed the new tables. Can BCP work on these?
September 7, 2008 at 1:57 pm
It looks like you've made a 20-25% reduction, which is pretty good, but still probably not nearly enough to change the big issues here.
-- RBarryYoung, (302)375-0451, blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc. "Performance is our middle name."
September 7, 2008 at 3:36 pm
raym (9/7/2008)
The tables have been reduced in size and I have listed the new tables. Can BCP work on these?
Heh... sure... we've already established that it could... but you still have tables with 1.2 billion rows in them, so the problem hasn't changed. It's still going to take time to export and then reimport the tables.
If you have a SAN (and I'm no pro at it), my understanding is that there is something that will allow a full snapshot from one db to the other on huge databases in a matter of minutes. That's provided you want your reporting database to be read-only (and most do).
If you're really hung up on copying the tables one at a time, my recommendation stands... create an aggregate table (summary table, whatever) that pre-summarizes the data and copy that instead... I just can't see copying 1.2 billion rows of detail data for reporting purposes.
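(A sketch of that copy step, again with hypothetical table and server names: export the much smaller summary table in native format, then pull it into the reporting server.)

```sql
-- Hypothetical: dbo.OrdersMonthlySummary, servers Server1/Server2.
-- Native format (-n) out of the source, straight back in on the target.
EXEC master..xp_cmdshell
    'bcp SourceDB.dbo.OrdersMonthlySummary out C:\extracts\Summary.dat -n -T -S Server1';

EXEC master..xp_cmdshell
    'bcp ReportingDB.dbo.OrdersMonthlySummary in C:\extracts\Summary.dat -n -T -S Server2';
```

Because the summary is orders of magnitude smaller than the 1.2-billion-row detail table, this pair of commands runs in minutes rather than the 40 hours estimated earlier.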
--Jeff Moden
Change is inevitable... Change for the better is not.