Reusable code is key when you use Terraform to automate the deployment of complex database environments. As you add more resources, you must rethink the way you organize your scripts so you can manage them efficiently.
In my first article on this subject, I presented the basics of Terraform. Now we go a step further and use reusable modules in the deployment.
Expanding The Environment
This time we are going to implement a more robust infrastructure, including a longer list of resources:
Resource group
CosmosDB account and a database (SQL API)
Databricks workspace
SQL Server and SQL database
Storage Account and a container
Synapse Workspace, SQL Pool, and Data Lake GEN2 filesystem (used in the workspace)
I don't intend to create a real-world environment here. Let's say that, for all practical purposes, the goal is to create a set of functional resources that can be enhanced later.
Instead of using the basic approach of creating a few scripts, our code is going to use Terraform modules. The modules will handle the syntax to deploy each type of resource, while our code will basically pass values to those modules.
When you have to deal with deploying new environments, handling new versions of the AzureRM provider, or deploying several similar resources, modules will let you manage your code faster and more easily.
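As a small illustration of that last point, the AzureRM provider version is typically pinned once in the root configuration, so upgrading the provider means changing a single file. This is a generic sketch; the file name and version numbers are assumptions, not the exact setup used in this series.

# versions.tf -- provider pin kept in the root configuration (illustrative values)
terraform {
  required_version = ">= 1.3.0"

  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.0"
    }
  }
}

# Provider block shared by every module call in the root script
provider "azurerm" {
  features {}
}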
Terraform Modules
We create a root script ("<yourMainScript>.tf") that points to a library of modules. This root script selects which resources to deploy, while each module defines which parameters are required to deploy a given resource. We set the values for those parameters in the root script files ("variables.tf" or "<myName>.auto.tfvars"), so we can reuse the library as it is for any deployment we need.
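As a rough sketch of what such a module looks like (the module path, variable names, and resource choice are my own assumptions, not the exact library used here), a minimal resource group module could be:

# modules/resource_group/main.tf -- hypothetical module; names are illustrative
variable "name" {
  type        = string
  description = "Name of the resource group to create"
}

variable "location" {
  type        = string
  description = "Azure region where the resource group is deployed"
}

# The module owns the deployment syntax; callers only supply values.
resource "azurerm_resource_group" "this" {
  name     = var.name
  location = var.location
}

# Expose the name so other modules (storage, SQL, Synapse, etc.) can reference it.
output "name" {
  value = azurerm_resource_group.this.name
}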
Check the image below to see the syntax of the root script.
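To make the idea concrete, here is a minimal sketch of a root script and its value file; the module paths, variable names, and values are illustrative assumptions, not the exact library used in this series.

# main.tf -- root script (illustrative); it only selects modules and passes values
module "resource_group" {
  source   = "./modules/resource_group"
  name     = var.resource_group_name
  location = var.location
}

module "storage_account" {
  source              = "./modules/storage_account"
  name                = var.storage_account_name
  resource_group_name = module.resource_group.name
  location            = var.location
}

# variables.tf -- declarations for the values supplied below
variable "resource_group_name" { type = string }
variable "storage_account_name" { type = string }
variable "location" { type = string }

# environment.auto.tfvars -- values for one particular deployment (example only)
resource_group_name  = "rg-data-demo"
storage_account_name = "stdatademo001"
location             = "eastus"

With this split, deploying a new environment mostly means writing a new ".auto.tfvars" file, while the module library stays untouched.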