November 23, 2017 at 10:34 pm
I have recently created a database in Azure SQL Database and I'm quite new to it.
Requirement - I have to export a table from an on-premises server, import it into the database hosted in Azure SQL Database, and automate this process.
What I have done so far - I have managed to complete the above process manually: I used BCP to export the table from the local server to a .dat file, used AzCopy to copy the .dat file to Azure Blob Storage, and then imported it using an EXTERNAL DATA SOURCE and the BULK INSERT command.
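For reference, the import side looked roughly like this. It's only a sketch - the storage account, container, credential, and table names are placeholders, and it assumes a database master key and a SAS token for the storage account already exist:

CREATE DATABASE SCOPED CREDENTIAL BlobCred
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS token, without the leading ?>';

CREATE EXTERNAL DATA SOURCE MyBlobStorage
WITH ( TYPE = BLOB_STORAGE,
       LOCATION = 'https://<storageaccount>.blob.core.windows.net/<container>',
       CREDENTIAL = BlobCred );

-- Load the .dat file that AzCopy pushed to the container
BULK INSERT dbo.MyTable
FROM 'MyTable.dat'
WITH ( DATA_SOURCE = 'MyBlobStorage',
       DATAFILETYPE = 'char',      -- matches a bcp -c (character mode) export
       FIELDTERMINATOR = '|',      -- whatever -t value was used on the bcp side
       ROWTERMINATOR = '\n' );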
Question - The challenge for me now is to automate this process every weekend. I am exploring different options and I can see there are numerous ways to do it - use the SQL Agent on an on-premises server, Azure Automation, Elastic Jobs, etc. I would like to use one of the newer methods, and I'm torn between Azure Automation and Elastic Jobs.
I am thinking of exploring Elastic Jobs and using it for the above purpose. Can anyone please confirm whether Elastic Jobs is the right option over Azure Automation for my requirement?
Note: We will gradually increase the number of tables involved in this data migration. I am also thinking of adding a few job steps, such as truncating or purging the table before importing the data - something along the lines of the sketch below. I hope I can achieve this using Elastic Jobs?
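This is a hypothetical wrapper procedure for what one of those job steps could look like (names are placeholders; it assumes the external data source above already exists):

CREATE OR ALTER PROCEDURE dbo.usp_ImportMyTable
AS
BEGIN
    SET NOCOUNT ON;

    -- Purge last week's data so the weekly load is repeatable
    TRUNCATE TABLE dbo.MyTable;

    -- Reload from the .dat file in blob storage
    BULK INSERT dbo.MyTable
    FROM 'MyTable.dat'
    WITH ( DATA_SOURCE = 'MyBlobStorage',
           DATAFILETYPE = 'char',
           FIELDTERMINATOR = '|',
           ROWTERMINATOR = '\n' );
END;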
November 24, 2017 at 1:20 pm
Unless you've set up and are using Azure Active Directory, you're going to have a hard time automating this process from Azure. This is because it won't be able to see into your network to pull the files up. Instead, I'd schedule it using whatever you use locally for scheduling (SQL Agent?) and then use the tools you already outlined along with PowerShell to make the whole thing happen.
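Something along these lines, as a rough sketch only - the job, step, and script names are placeholders, and the PowerShell script it calls is a hypothetical wrapper around bcp, AzCopy, and the BULK INSERT call:

EXEC msdb.dbo.sp_add_job
     @job_name = N'Weekly export to Azure SQL DB';

EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'Weekly export to Azure SQL DB',
     @step_name = N'bcp out, AzCopy up, BULK INSERT',
     @subsystem = N'PowerShell',
     @command   = N'& "C:\Jobs\Export-ToAzure.ps1"';  -- hypothetical script

EXEC msdb.dbo.sp_add_jobschedule
     @job_name  = N'Weekly export to Azure SQL DB',
     @name      = N'Every Saturday',
     @freq_type = 8,               -- weekly
     @freq_interval = 64,          -- Saturday
     @freq_recurrence_factor = 1,
     @active_start_time = 010000;  -- 01:00:00

EXEC msdb.dbo.sp_add_jobserver
     @job_name = N'Weekly export to Azure SQL DB';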
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
November 26, 2017 at 9:58 pm
@Grant Fritchey - Thanks for your reply. We do have Azure Active Directory set up, more or less. I am planning to do a BCP out to the local server for the export (.dat file), use AzCopy to copy the .dat file to Azure Blob Storage, and do the import into Azure SQL Database using the EXTERNAL DATA SOURCE and BULK INSERT commands. I'm thinking of creating three different stored procedures for the above operations and using Elastic Jobs (with PowerShell) to schedule them. Will this approach work? We can use AzCopy to copy the files. Am I thinking in the right direction?
November 27, 2017 at 6:10 am
Yeah, you're on the right track. I've used AzCopy for exactly this type of operation with Azure SQL Data Warehouse. I used Azure Automation instead of Elastic Jobs. I'm not sure what the difference is there, but if the one you choose works for you, stick with it.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning