January 7, 2013 at 10:58 am
Hi,
I am new to the AWS cloud. Can anyone tell me the steps to migrate SQL Server instances into AWS?
I already have a SQL instance created on the AWS cloud.
I am thinking of the steps below to move the databases:
1. Moving SQL jobs and Windows batch jobs
2. Moving SQL logins and permissions
3. Moving application roles
4. Does backup/restore of a database work from a SQL Server instance to an AWS SQL instance, or do I need to first create the schema and then export the data into the tables?
Has anyone done this migration before? Please share your experience.
Thanks.
January 7, 2013 at 11:29 am
1. Moving SQL jobs and Windows batch jobs
2. Moving SQL logins and permissions
3. Moving application roles
Yes, these need to be done before the move.
I used Integration Services from Visual Studio to transfer the logins and jobs.
4. Does backup/restore of a database work from a SQL Server instance to an AWS SQL instance, or do I need to first create the schema and then export the data into the tables?
No, backup/restore doesn't work.
You need to do it as you described: create the schema first, then export the data.
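For anyone wanting a script-based alternative to SSIS for the login transfer, a minimal T-SQL sketch is below. This simplified version emits CREATE LOGIN statements with a placeholder password; Microsoft's sp_help_revlogin script is the usual full solution because it preserves the original password hash and SID.

```sql
-- Sketch: generate CREATE LOGIN statements for SQL logins on the source
-- instance, to be replayed on the AWS instance.
-- Caution: the password here is a placeholder, not the original hash.
SELECT 'CREATE LOGIN ' + QUOTENAME(name) +
       ' WITH PASSWORD = ''ChangeMe123!'', CHECK_POLICY = OFF;'
FROM sys.server_principals
WHERE type = 'S'              -- SQL logins only
  AND name NOT LIKE '##%';    -- skip internal certificate-based logins
```

Run the generated statements on the target, then map database users to the new logins and re-apply permissions.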
January 7, 2013 at 12:48 pm
Thanks.
Do any particular ports need to be opened in order to import data?
I cannot connect to the AWS SQL instance from my local SSMS, so how can we import data into the tables?
January 7, 2013 at 3:10 pm
I have tried backup and restore from my local instance to the AWS SQL instance, and it worked.
Am I missing something here?
January 8, 2013 at 8:10 am
That's an interesting point, since AWS RDS doesn't have a visible file system behind it (of course it has one, but as end users we can't see it).
I guess you are backing up to a .bak file on your local machine, then opening an SSMS query window to run a RESTORE statement against that .bak file.
Is this what you tried? Could you please post the script you used to back up and restore to AWS?
thanks,
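For reference, the native backup/restore pair under discussion looks like the sketch below (database name and paths are placeholders). The distinction being made in this thread is that the RESTORE works on an EC2-hosted instance, where you control the file system, but has nowhere to read from on RDS.

```sql
-- On the source (local) instance: write a full backup to a local path.
BACKUP DATABASE MyDb
TO DISK = N'C:\Backups\MyDb.bak'
WITH INIT, STATS = 10;

-- On the target instance: restore from a file the server itself can see.
-- Works on EC2 (you control the disk); on RDS there is no accessible
-- file system for this statement to read from.
RESTORE DATABASE MyDb
FROM DISK = N'D:\Restore\MyDb.bak'
WITH MOVE N'MyDb' TO N'D:\Data\MyDb.mdf',
     MOVE N'MyDb_log' TO N'D:\Data\MyDb_log.ldf',
     STATS = 10;
```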
January 8, 2013 at 8:46 am
Perhaps I am misunderstanding what environment you have in AWS. Are you using an AWS EC2 instance to host your SQL Server?
If this is the case, then backup and restore should work.
January 9, 2013 at 8:32 am
Thanks for the reply. Can you tell me the difference between AWS EC2 and AWS RDS?
Right now I am using a bastion server to RDP to the SQL Server.
For backup and restore, I performed a SQL native full backup on my local SQL instance, moved the .bak file to the bastion server via the Remote Desktop shared local resource, and from the bastion server moved the file to the AWS SQL instance.
From there I was able to restore the database backup.
But moving the files to the bastion server takes a long time.
Is there any better way to move the files?
January 9, 2013 at 1:28 pm
Thanks for the reply. Can you tell me the difference between AWS EC2 and AWS RDS?
AWS RDS is a web service that lets you create a SQL Server database. It is designed to minimize administrative work; for instance, you don't have to create any maintenance plan to run full and log backups. For more details, see the RDS feature list on the AWS website.
Right now I am using a bastion server to RDP to the SQL Server.
For backup and restore, I performed a SQL native full backup on my local SQL instance, moved the .bak file to the bastion server via the Remote Desktop shared local resource, and from the bastion server moved the file to the AWS SQL instance.
From there I was able to restore the database backup.
But moving the files to the bastion server takes a long time.
Is there any better way to move the files?
Copying from local over RDP really depends on the network speed and the size of your .bak file. How big is your .bak file? Have you thought about compressing it before moving it to the AWS server?
I consider backup and restore the best way to move your local database to an AWS SQL Server.
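On the compression suggestion: if the source edition supports it (backup compression arrived in SQL Server 2008 Enterprise and 2008 R2 Standard), the backup itself can be compressed, which typically shrinks the .bak severalfold before it crosses the wire. A minimal example:

```sql
-- Compressed backup: usually much smaller than an uncompressed .bak,
-- so the copy to the bastion/AWS host is correspondingly faster.
BACKUP DATABASE MyDb
TO DISK = N'C:\Backups\MyDb_compressed.bak'
WITH COMPRESSION, INIT, STATS = 10;
```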
January 20, 2015 at 11:46 am
Have you done this? Can you explain further where I can put the database backup so that I can access it via SSMS to restore it? It was my understanding that one doesn't have access to the file system in Amazon.
Kurt
Kurt W. Zimmerman
SR DBA
Lefrak Organization
New York, NY
http://www.linkedin.com/in/kurtwzimmerman
February 4, 2015 at 2:21 pm
Greetings all;
Well, this past weekend I was able to successfully move a VLDB (very large database) from a VM environment to Amazon RDS.
To highlight the tool set, I tried to keep things really simple. I used BCP along with Red Gate's SQL Data Compare: BCP for tables with more than 1 million rows, and SQL Data Compare for the remaining tables. I also wrote custom MERGE scripts for tables with referential integrity that were cumbersome to strip down for loading the data.
The effort wasn't 100% flawless, requiring some quick thinking during the wee hours of the morning to move things along.
Looking back, I think Amazon should seriously consider opening up the file system on their SQL Server platform so that VLDB backups can be moved to the server and restored locally. That would save a lot of aggravation. Even temporary access to initialize the database would make this process much easier.
The DBA is left with finding creative ways to push the data to RDS. This process met with an initial failure: on our first attempt we ran out of time. I had developed a special sync process that pushed the data through a linked server. Although this worked, the process was very, very slow.
The other critical aspect was making sure I didn't run out of room, which happened to me twice on one of the largest tables in the database.
One thing I didn't think of until 2-3 am was to change the database recovery model to BULK_LOGGED instead of FULL.
All in all, I faced the challenge and accomplished the goal. It took about two days for me to catch up on sleep, but it was well worth the experience.
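The recovery-model switch mentioned above (the model is named BULK_LOGGED) can be scripted ahead of time rather than discovered in the small hours. A sketch, assuming the database name is a placeholder and that you switch back to FULL afterwards:

```sql
-- Minimally log bulk operations (BCP, BULK INSERT, SELECT INTO) during the load.
-- Note: RDS manages the recovery model itself when automated backups are
-- enabled and may revert this; on EC2 or on-premises it behaves normally.
ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;

-- ... run the bulk load here ...

-- Switch back, then take a log backup to restore point-in-time coverage.
ALTER DATABASE MyDb SET RECOVERY FULL;
BACKUP LOG MyDb TO DISK = N'C:\Backups\MyDb_postload.trn';
```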
All the best.
Kurt Zimmerman
February 6, 2015 at 2:36 am
Going back to the OP, what is the main reason for wanting to move to AWS?
Is this just for personal training, or is your organisation planning to move its production systems?
Back in 2012 the organisation I worked for moved its production infrastructure into AWS EC2. This move met the main business goal of improving infrastructure flexibility, and the minor goals of cost reduction and minimal or no interruption to service. We moved about 130 Windows instances, and the project was completed just ahead of time and within budget.
One of the techniques we used to minimise interruption was to use P2P replication between the old data centres and AWS. This allowed us to perform regression testing within AWS without affecting live users. At cutover we used AWS Route53 to move a portion of live traffic to AWS so we could gain confidence that all worked OK, then ramped up to 100% load in a controlled fashion.
One of the keys to success was having a small core team of 5 who were willing to work outside their main skill areas to make the project a success. We could obviously call on specialists outside the team as needed. We had also engaged AWS Premier Support, and their help kept the team and management happy we were on the right path.
At the end of the day, moving SQL databases is just a small part of the whole project. You need to look at all the business impacts as part of your planning.
Original author: https://github.com/SQL-FineBuild/Common/wiki/ 1-click install and best practice configuration of SQL Server 2019, 2017 2016, 2014, 2012, 2008 R2, 2008 and 2005.
When I give food to the poor they call me a saint. When I ask why they are poor they call me a communist - Archbishop Hélder Câmara
February 6, 2015 at 2:39 am
The DBA is left with finding creative ways to push the data to RDS
Kurt, I am assuming you have a VPN between your current data centres and AWS. The link speed is then the limiting factor in moving data. Also, if you are planning a real move rather than just a test, engage with AWS support. They are happy to have backups sent by courier to speed up the restore process.
March 3, 2015 at 11:50 am
EdVassie (2/6/2015)
The DBA is left with finding creative ways to push the data to RDS
Kurt, I am assuming you have a VPN between your current data centres and AWS. The link speed is then the limiting factor in moving data. Also, if you are planning a real move rather than just a test, engage with AWS support. They are happy to have backups sent by courier to speed up the restore process.
This was a real move, moving a production site to AWS. Once the database structure was in place, it was a matter of moving the data. I initialized the database using the Export feature of SSMS. This was fine except where there were very large tables; the initialization process took over 30 hours. The primary database remained online during the entire process.
Since there was known drift in the data, I looked for ways to sync up the two instances. The Idera SQL Data Compare worked great! I was able to get 90% of all tables in sync. For the remaining 10%, the larger tables, I chose to BCP into a staging table in the target database. With simple MERGE logic I was able to sync the remaining tables.
End to end, the data transfer process ran from Friday at midnight to Sunday morning, about 10:30 am.
The killer in the entire process was dealing with TEXT and NTEXT columns in a number of the very large tables. Even a simple BCP on those tables took hours to export/import.
I'm so glad I went through the effort. I learned a lot about what you can and cannot do. I will say this: I have some really neat procedures that managed the MERGE process, which I chose to abandon during the final push but which could certainly be used for small or medium-sized databases.
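The staging-table plus MERGE pattern described above can be sketched as follows. Table and column names here are hypothetical, and the bcp commands are run from a command prompt rather than SSMS:

```sql
-- 1. Export/import the big table via bcp (command line, not T-SQL):
--      bcp SourceDb.dbo.BigTable out BigTable.dat -n -S source_server -T
--      bcp TargetDb.dbo.BigTable_staging in BigTable.dat -n -S rds_endpoint -U user -P pwd
-- 2. Then reconcile the staging table into the live table:
MERGE dbo.BigTable AS tgt
USING dbo.BigTable_staging AS src
    ON tgt.Id = src.Id
WHEN MATCHED AND tgt.ModifiedDate <> src.ModifiedDate
    THEN UPDATE SET tgt.Payload = src.Payload,
                    tgt.ModifiedDate = src.ModifiedDate
WHEN NOT MATCHED BY TARGET
    THEN INSERT (Id, Payload, ModifiedDate)
         VALUES (src.Id, src.Payload, src.ModifiedDate);
```

Batching the MERGE (for example, by key range) keeps log growth and lock durations manageable on very large tables.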
Kurt
September 21, 2016 at 6:35 am
AWS Database Migration Service can migrate your data to and from most widely used commercial and open-source databases, such as Oracle, PostgreSQL, Microsoft SQL Server, Amazon Redshift, Amazon Aurora, MariaDB, and MySQL. The service supports homogeneous migrations such as Oracle to Oracle, as well as heterogeneous migrations between different database platforms, such as Oracle to MySQL or MySQL to Amazon Aurora. The source or target database must be on an AWS service.
Alternatively, follow these steps with the SQLAzureMW tool:
Step 1. Download the SQLAzureMW tool.
Step 2. Use SQL Server Management Studio to connect to your local SQL Server and your AWS RDS instance, and make sure you have proper connectivity to both servers.
Step 3. Start the migration: double-click SQLAzureMW.exe from the downloaded tool, select the Database option, and click Next.
Step 4. Source database tasks: enter the source SQL Server database connection details and click Connect.
Step 5. Select the database to be migrated to AWS RDS from the list of databases and click Next.
Step 6. Review the list of selected objects to be migrated and click Next.
Step 7. Destination database tasks: enter your RDS SQL Server connection details, click Connect, and select the target database to which you need to migrate your data.
Step 8. Verify data: use your local SQL Server Management Studio to verify whether all your data migrated successfully. If you get any errors, fix them manually.
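For the verification step, a quick sanity check is to compare row counts per table on both sides. This query (a count-level check only, not a content-level compare) can be run against the source and the RDS target and the results diffed:

```sql
-- Approximate row counts per table from partition metadata;
-- run on the source and on the RDS target, then compare the output.
SELECT s.name AS schema_name,
       t.name AS table_name,
       SUM(p.rows) AS row_count
FROM sys.tables t
JOIN sys.schemas s   ON s.schema_id = t.schema_id
JOIN sys.partitions p ON p.object_id = t.object_id
                     AND p.index_id IN (0, 1)  -- heap or clustered index
GROUP BY s.name, t.name
ORDER BY s.name, t.name;
```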
Good Luck !!!