May 9, 2012 at 12:00 am
Comments posted to this topic are about the item Setup SQL Server on Amazon RDS
May 9, 2012 at 4:36 am
Yesterday I was 'handed' two Amazon SQL Servers that were set up by someone else (not a DBA), and I hope you have a very strong sa password, or ideally have disabled the sa account, as I am seeing loads of failed sa logins from around the world. They started to appear 5-6 days after the servers were commissioned.
Security of the data must be a major concern!
May 9, 2012 at 5:16 am
Very good article!
May 9, 2012 at 7:55 am
Adrian Sims-154382 (5/9/2012)
Yesterday I was 'handed' two Amazon SQL Servers that were set up by someone else (not a DBA), and I hope you have a very strong sa password, or ideally have disabled the sa account, as I am seeing loads of failed sa logins from around the world. They started to appear 5-6 days after the servers were commissioned. Security of the data must be a major concern!
I see the same thing in many SQL Servers in DMZs if networking is not well configured. You can do some lockdown in AWS for this, and I'd recommend it.
Without a doubt, security is a primary concern, and it should be. All passwords should be at least 12 characters, and preferably a few more.
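If these are self-managed instances (EC2 rather than RDS, where you don't get sa at all), a minimal lockdown sketch, assuming another sysadmin login already exists, looks something like this:
-- Disable (and optionally rename) the sa login so the brute-force attempts go nowhere.
ALTER LOGIN [sa] DISABLE;
ALTER LOGIN [sa] WITH NAME = [sa_disabled];
GO
-- Review the recent failed logins recorded in the error log.
EXEC sys.xp_readerrorlog 0, 1, N'Login failed';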
May 9, 2012 at 3:21 pm
I'm using SQL Compare to deploy the database as per the article; at the moment I'm unsure of the limitations of the platform. Removing AUTHORIZATION [aspnet_Membership_FullAccess] allows it to create the schema, but that seems a bit odd to me.
GO
CREATE SCHEMA [aspnet_Membership_FullAccess]
AUTHORIZATION [aspnet_Membership_FullAccess]
GO
----
Msg 15151, Level 16, State 1, Line 2
Cannot find the user 'aspnet_Membership_FullAccess', because it does not exist or you do not have permission.
Msg 2759, Level 16, State 0, Line 2
CREATE SCHEMA failed due to previous errors.
Any ideas?
May 9, 2012 at 3:52 pm
You'd be better off posting this in the SQL Compare forums, but does this user exist in your instance?
SQL Compare looks at the users in your local database. If they don't exist in the new database on RDS, or you don't have permission to assign that authorization, you get an error.
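If it's just a question of creation order, the principal named in the AUTHORIZATION clause has to exist in the target database before the schema is created. In the stock ASP.NET membership scripts that principal is a database role, so a rough sketch (adjust if your source defines it differently) would be:
CREATE ROLE [aspnet_Membership_FullAccess]
GO
CREATE SCHEMA [aspnet_Membership_FullAccess]
AUTHORIZATION [aspnet_Membership_FullAccess]
GO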
May 9, 2012 at 4:02 pm
It's a great article, Steve. Thanks. I'm still trying to work out the advantages of Microsoft's cloud offering, SQL Azure (a deprecated name now!), versus native SQL Server. Amazon entering the market confuses things further! Any thoughts?
BTW, I agree the SQL Compare tools are brilliant; anyone interested should just get the 14-day free trial and will surely buy if allowed to!
.
May 9, 2012 at 4:08 pm
RDS gives you a database service that works like the 2008 R2 Dev edition. No Windows logins or host OS access, but if you work with the features and functions inside the database it's the same. Simpler.
Amazon EC2, or other cloud/VM providers, give you a virtual Windows host that you manage as you wish. They deal with power, hardware, etc., and leave you to worry about the Windows host and the software (like SQL Server) installed on it. If you don't have a data center or an easy way to host things, this is a nice option.
Azure gives you a service that is like SQL Server, but isn't the same as your local SQL Server. Some stuff works, some doesn't, and you have to work out whether that matters. Much of the base functionality works, but some features don't (like NEWSEQUENTIALID()). Some notes here, here, and here.
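As a quick illustration of that last point, a table definition along these lines (names made up) runs fine on a local instance, but the default would need to be switched to NEWID() for SQL Azure:
-- Works on a boxed SQL Server; SQL Azure rejects NEWSEQUENTIALID(),
-- so the default would have to become NEWID() instead.
CREATE TABLE dbo.Orders
(
    OrderID uniqueidentifier NOT NULL
        CONSTRAINT DF_Orders_OrderID DEFAULT NEWSEQUENTIALID(),
    OrderDate datetime2 NOT NULL,
    CONSTRAINT PK_Orders PRIMARY KEY (OrderID)
);
GO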
In general, I think if you need a database service, or a DB + web server, the cloud works well. It's not much different from the hosted offerings from MaximumASP, Rackspace, and other providers in many ways.
May 10, 2012 at 1:19 am
Steve Jones - SSC Editor (5/9/2012)
RDS gives you a database service that works like the 2008 R2 Dev edition....
Azure gives you a service that is like SQL Server, but isn't the same as your local SQL Server. Some stuff works, some doesn't, and you have to work out whether that matters. Much of the base functionality works, but some features don't (like NEWSEQUENTIALID()). Some notes here, here, and here.
Thanks Steve. We've used SQL Azure, and changed quite a lot to do so.
So I'm wondering how Microsoft persuades your average Joe that making potentially wholesale changes to an app to put it in SQL Azure is better than just uploading the app as it is to Amazon RDS...
Tim
.
May 10, 2012 at 9:21 am
Tim Walker. (5/10/2012)
Thanks Steve. We've used SQL Azure, and changed quite a lot to do so.
So I'm wondering how Microsoft persuades your average Joe that making potentially wholesale changes to an app to put it in SQL Azure is better than just uploading the app as it is to Amazon RDS...
Tim
Cost, loyalty, maybe something in Azure that isn't available in the local SQL Server? My suspicion is that eventually we'll see features tested in Azure first, then in the boxed product.
Corporate policy, or a contract signed by the CxO, may also influence things.
It's a big world, and no one will dominate. Lots of people will choose one or the other or neither for various reasons.
June 4, 2012 at 10:05 pm
This is a very awesome option. And it seems that Amazon is making it really easy to have a no-cost development DB server via this offer:
http://aws.amazon.com/rds/sqlserver/free/
Of course, that is for a "micro" instance which means "613 MB memory, Up to 2 ECU (for short periodic bursts), 64-bit platform, Low I/O Capacity, 20 GB of storage, and 20 GB of backup space". And it is only for Express Edition (SQL Server 2008 R2, as might have been mentioned before). But that is for an entire year. It certainly seems nice.
Also: "One ECU provides the equivalent CPU capacity of a 1.0-1.2 GHz 2007 Opteron or 2007 Xeon processor."
And after that year? From what I can tell, they offer a 3-year contract for $35 (if paid up front), which equates to $0.00133 per hour (Express Edition). They also offer a "Web Edition" license for $460 for 3 years (if paid up front), which equates to $0.0175 per hour. And that is for their "small" instance, which means "1.7 GB memory, 1 ECU (1 virtual core with 1 ECU), 64-bit platform, Moderate I/O Capacity".
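For anyone checking the math, the per-hour figures fall out of the up-front prices assuming 3 x 365 x 24 = 26,280 hours in the term (a throwaway calculation, nothing RDS-specific):
-- Back-of-the-envelope check of the quoted hourly rates.
SELECT CAST( 35.0 / (3 * 365 * 24) AS decimal(10, 5)) AS ExpressPerHour,    -- ~0.00133
       CAST(460.0 / (3 * 365 * 24) AS decimal(10, 5)) AS WebEditionPerHour; -- ~0.01750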
And the feature support is listed here:
http://docs.amazonwebservices.com/AmazonRDS/latest/UserGuide/Concepts.DBEngine.SQLServer.html
So while you don't get all of the control, the features, or even the latest version of SQL Server that you would get by setting up your own AWS instance with SQL Server 2012, this is definitely cheaper and requires less management.
Thanks for announcing this!!
Take care,
Solomon...
SQL# — https://SQLsharp.com/ (SQLCLR library of over 340 Functions and Procedures)
Sql Quantum Lift — https://SqlQuantumLift.com/ ( company )
Sql Quantum Leap — https://SqlQuantumLeap.com/ ( blog )
Info sites — Collations • Module Signing • SQLCLR
January 20, 2015 at 10:07 am
Great article, I should have come here first.
This thread may be a little old, and there may be new features added to Amazon RDS since, but here is my situation.
I have a VLD (very large database) I need to move to Amazon RDS. But let's start at the beginning.
I have an Amazon RDS instance set up. I scripted out the production VLD structure and was able to put it into place on Amazon RDS successfully. So far so good.
Now for the data. I have several tables that are over 10m rows each; one is about 80m rows. If I use Red Gate SQL Data Compare, it builds the deployment script locally on the C: drive, which causes SQL Data Compare to stop working and also causes problems on the local host I'm working on.
I wrote my own syncing process, which does work but is not the most efficient. It works great on the smaller tables, pushing the data changes out, table by table, to Amazon RDS. I have a process on the Amazon RDS side to load that data in. This process works without issue.
Now for my VLT (very large table). I've considered using BCP to export/import the VLT data using native format. I have a process that has automated this and it does work well. I'm just not sure how much time is needed to move all of the rows of data from these tables to Amazon.
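As a rough sketch of that BCP route (server, database, and credential names here are placeholders; the RDS side has to use SQL authentication rather than -T):
rem Export from the local production server in native format (-n) over a trusted connection (-T).
bcp ProdDb.dbo.BigTable out BigTable.dat -n -S LocalProdServer -T
rem Import into the RDS instance with SQL authentication, committing in batches (-b)
rem and requesting a table lock (-h TABLOCK) to speed up the load.
bcp RdsDb.dbo.BigTable in BigTable.dat -n -S my-rds-endpoint -U rds_user -P ******** -b 50000 -h "TABLOCK"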
In an ideal world I'd like to be able to push a button and have a complete replica of my production database on the Amazon RDS.
What are my options? Maybe there is a way I can reconfigure the Red Gate tool to store the deployment scripts somewhere other than the C: drive of the host?
Any input on this would be greatly appreciated.
Kurt Zimmerman
Kurt W. Zimmerman
SR DBA
Lefrak Organization
New York, NY
http://www.linkedin.com/in/kurtwzimmerman
January 20, 2015 at 10:45 am
In terms of moving data, I know this link is about Azure, but I'd think some of it would be similar for Amazon. I'll have to try it at some point: http://www.troyhunt.com/2013/12/working-with-154-million-records-on.html
Data Compare has issues with large tables, and it's not intended to do syncing. I think you'd need a sync service; MS has a few options, and there may be third-party tools. Really, you might be looking for a good ETL process to move changes.
In terms of the Compare storage, can you shoot a note to me or support at red-gate with details? I thought you could change this.
January 20, 2015 at 11:01 am
Thanks Steve.
Kurt
Kurt W. Zimmerman
SR DBA
Lefrak Organization
New York, NY
http://www.linkedin.com/in/kurtwzimmerman
January 20, 2015 at 12:03 pm
I do this kind of thing quite a lot now. We started using Azure extensively and are now back to managed hosting.
I moved a 30 GB database from Azure to a local instance last week, and here is what I do, if it helps (you are right that this is too much for SQL Data Compare). I don't think it really matters which direction you go in, or to what.
I use the Import/Export Wizard in SSIS to move the data. Be warned that it is a bit quirky! I haven't found a way to stop it erroring after just one fault, and it sometimes generates faults itself (in the table structure it scripts). By default it runs ten table copies in parallel. The important point is that the size of the table doesn't seem to matter. If it fails, just set it up to go again; you can multi-select the tables. I find it safer to tell it to allow identity insert if there is no reason not to.
I haven't created a package to do this; I watch it run through (whilst doing other things).
Then use SQL Compare to correct the rest of the structure, or copy the table contents into an already correctly structured database at the target.
I don't see why the same principles shouldn't work for Amazon RDS.
That's it!
Edit: Start with an empty target database and have the wizard create the tables.
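For anyone copying a table by hand rather than through the wizard, the "allow identity insert" option corresponds to something like this (table and database names are made up):
-- Equivalent of the wizard's "enable identity insert" setting when loading
-- rows, with their existing key values, into an empty target table.
SET IDENTITY_INSERT dbo.Customers ON;
INSERT INTO dbo.Customers (CustomerID, CustomerName, CreatedDate)
SELECT CustomerID, CustomerName, CreatedDate
FROM   SourceDb.dbo.Customers;  -- source staged however suits you
SET IDENTITY_INSERT dbo.Customers OFF;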
Best Regards
Tim Walker
.