March 12, 2014 at 8:24 am
As the title says, I want to transfer all the tables, along with their data, to a new database. In the source database the tables are in the dbo schema; in the new database they will be in a named schema. This is part of a consolidation of multiple databases into a single database, where each source database's tables will be copied to a named schema in the destination database. What's the best way or tool to do this? The source and destination servers are both 2008 R2.
March 12, 2014 at 8:31 am
buddy__a (3/12/2014)
As the title says, I want to transfer all the tables, along with their data, to a new database. In the source database the tables are in the dbo schema; in the new database they will be in a named schema. This is part of a consolidation of multiple databases into a single database, where each source database's tables will be copied to a named schema in the destination database. What's the best way or tool to do this? The source and destination servers are both 2008 R2.
Pretty simple to do this in SSMS. You can do it as an export or an import; it makes no difference. Just right-click the database -> Tasks -> Import Data (or Export Data), then select the appropriate source, destination, and objects.
_______________________________________________________________
Need help? Help us help you.
Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.
Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.
Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/
March 12, 2014 at 8:34 am
Generate the table scripts and change the schema name afterwards. Then use something like bcp, SSIS, or a straightforward INSERT ... SELECT to move the data.
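For the INSERT ... SELECT route, a minimal sketch might look like the following, assuming both databases are on the same instance (or the source is reachable through a linked server). DestDb, SourceDb, NewSchema, MyTable, and the column list are all placeholders:

-- Copy one table across databases, preserving identity values.
SET IDENTITY_INSERT DestDb.NewSchema.MyTable ON;  -- only needed if the table has an identity column

INSERT INTO DestDb.NewSchema.MyTable (Id, Col1, Col2)
SELECT Id, Col1, Col2
FROM SourceDb.dbo.MyTable;

SET IDENTITY_INSERT DestDb.NewSchema.MyTable OFF;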
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
March 21, 2014 at 9:28 am
I just wanted to follow up on this to let people know how it all worked out. What I needed to accomplish was to copy the tables from several SQL 2008 R2 databases into a new SQL 2008 R2 database containing a schema for each of the source databases, so the tables from each source database would be moved into a schema in the new database named for the database they came from. The two suggestions above were roughly what I had already found, but I was hoping to discover something I didn't already know about that might get the job done. In the end, my solution combined both of the above suggestions, with some other pieces added.
The first step was to use the Import Data wizard in SSMS to create an SSIS package for the data move. There are several problems with the generated package that had to be patched up. First, the package creates a data-preparation Execute SQL Task that creates the tables in the new database in the dbo schema (the schema they came from), but it does not create the tables with the identity property and does not set the Keep Identity property on the data flow destination, so identity values are not preserved. Second, it creates a truly horrific set of data flow tasks to actually move the data: a separate data flow task for every five tables. So if you had 500 tables you would get 100 data-preparation tasks that create the new tables and 100 data flow tasks that transfer the data. If all you were doing was this simple data transfer, no big deal, but when you want to add other tasks to the package, all that clutter slows down the solution and makes it hard to see the flow of what's going on.
I didn't have that many tables in my databases, so I manually combined the data flow tasks, and I created my own data-preparation task that created all the tables with the identity columns properly defined. FYI, I created this table script using PowerShell and SMO, since the normal SSMS scripting won't create the tables without all the indexes, keys, and such. The following PowerShell command is what I used to create the script ($sql here is an SMO Server object connected to the source instance):
$sql.Databases["MyDb"].Tables | % {$_.script()} | Out-File -FilePath c:\Mydir\myFileTableScript.sql
Once the data was transferred, I needed to add all the constraints back (e.g. indexes, primary keys, defaults, foreign keys). The following PowerShell commands were used to create the scripts, which were then put into Execute SQL Tasks in the package:
$sql.Databases["MyDb"].Tables | % {$_.Columns} | ? {$_.DefaultConstraint -ne $null} | % {$_.DefaultConstraint.script()}
$sql.Databases["MyDb"].Tables | % {$_.Indexes} | % {$_.script() } | Out-File -FilePath 'C:\users\anackerman\My Documents\eEmpCensus_Indexes.sql'
$sql.Databases["MyDb"].Tables | % {$_.ForeignKeys} | % {$_.script()}
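To give an idea of what comes out, the generated scripts are just plain DDL statements along these lines (all table, column, and constraint names here are hypothetical):

ALTER TABLE dbo.MyTable ADD CONSTRAINT DF_MyTable_Status DEFAULT (0) FOR Status;
CREATE NONCLUSTERED INDEX IX_MyTable_Status ON dbo.MyTable (Status);
ALTER TABLE dbo.MyChild ADD CONSTRAINT FK_MyChild_MyTable FOREIGN KEY (MyTableId) REFERENCES dbo.MyTable (Id);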
If the scripts these generate are more than 32,768 characters, you'll need to save the output to a file, as I demonstrated in the table-script command, and use the Browse button on the Execute SQL Task's edit dialog to load the script, since you cannot paste more than 32,768 characters into the SQL statement dialog. Also, I created a separate task for each of these sets of scripts (i.e. one for indexes, one for defaults, and one for foreign keys); you may need to add others if there are other objects you need, like triggers.
Once that was completed the last step was to move the objects into the appropriate schema. I generated the script for moving all the tables to the new schema using the following SQL statement run from the source database:
SELECT 'ALTER SCHEMA NewSchema TRANSFER ' + SysSchemas.Name + '.' + DbObjects.Name + ';'
FROM sys.Objects DbObjects
INNER JOIN sys.Schemas SysSchemas ON DbObjects.schema_id = SysSchemas.schema_id
WHERE SysSchemas.Name = 'dbo'
AND DbObjects.Type IN ('U')
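One thing to watch: ALTER SCHEMA TRANSFER fails if the target schema doesn't exist yet, so create it in the destination database before running the generated statements. A minimal sketch, where NewSchema is a placeholder:

-- Run once in the destination database before the transfers.
-- CREATE SCHEMA must be the only statement in its batch, hence the EXEC wrapper.
IF NOT EXISTS (SELECT 1 FROM sys.schemas WHERE name = 'NewSchema')
    EXEC('CREATE SCHEMA NewSchema');

The generated script then consists of statements like:

ALTER SCHEMA NewSchema TRANSFER dbo.MyTable;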
I created a package like this for each of the five databases I was moving. Once all five were created (in a single SSIS solution), I created a master package that executed them all.
The only thing I would still like to find is an easier way to combine data flow tasks. I see no reason to create dozens of them; it just clutters up the package.
I hope this is of some interest/help to someone and I would appreciate any comments on what I did and how I could have done it better.