PROJECT_NAME
DESCRIPTION:
Bootstraps the infrastructure for {{ SELECT_APP_TYPE }}.
It is used within the provisioned pipeline for your application, depending on the options you chose.
The pipeline implementation for infrastructure relies on Terraform workspaces; you can pass in whichever workspace you want from the {{ SELECT_DEPLOYMENT_TYPE }} pipeline YAML.
PREREQUISITES:
- Azure Subscription
- SPN (service principal)
  - Terraform will use this to authenticate the API calls
  - you will need the `client_id`, `subscription_id`, `client_secret`, and `tenant_id`
- Terraform backend
  - resource group (can be created manually for the Terraform remote state)
  - Blob storage container for remote state management
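The backend resource group and storage container are typically created once, outside of Terraform. A minimal sketch using the Azure CLI, assuming names and location that are placeholders rather than values from this repository:

```bash
# Resource group to hold the Terraform remote state (placeholder names)
az group create --name amido-stacks-terraform-state --location uksouth

# Storage account for the state files
az storage account create \
  --name amidostacksstate \
  --resource-group amido-stacks-terraform-state \
  --sku Standard_LRS

# Blob container the azurerm backend will write state into
az storage container create \
  --name tfstate \
  --account-name amidostacksstate
```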
USAGE:
To activate the Terraform backend for running locally, initialise the SPN via environment variables. This ensures you are running the same way as the pipeline that will ultimately be applying any incremental changes.
```bash
docker run -it --rm -v $(pwd):/opt/tf-lib amidostacks/ci-tf:latest /bin/bash

export ARM_CLIENT_ID=xxxx \
  ARM_CLIENT_SECRET=yyyyy \
  ARM_SUBSCRIPTION_ID=yyyyy \
  ARM_TENANT_ID=yyyyy
```
Alternatively, you can run `az login`.
To get up and running locally you will want to create a `terraform.tfvars` file:

```bash
cat > terraform.tfvars <<EOF
vnet_id                 = "amido-stacks-vnet-uks-dev"
rg_name                 = "amido-stacks-rg-uks-dev"
resource_group_location = "uksouth"
name_company            = "amido"
name_project            = "stacks"
name_component          = "spa"
name_environment        = "dev"
EOF

terraform init -backend-config=./backend.local.tfvars
terraform workspace select dev || terraform workspace new dev
```
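The contents of `backend.local.tfvars` are not shown in this README. A minimal sketch, assuming an `azurerm` backend whose resource group, storage account, and container match whatever you created as prerequisites (all names below are placeholders):

```hcl
# backend.local.tfvars — partial configuration for `terraform init` (placeholder values)
resource_group_name  = "amido-stacks-terraform-state"
storage_account_name = "amidostacksstate"
container_name       = "tfstate"
key                  = "spa.tfstate"
```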
Requirements
| Name | Version |
|---|---|
| terraform | >= 0.13 |
| azurerm | ~> 3.0 |
Providers
| Name | Version |
|---|---|
| azurerm | ~> 3.0 |
Modules
No modules.
Resources
| Name | Type |
|---|---|
| azurerm_databricks_workspace.example | resource |
| azurerm_monitor_diagnostic_setting.databricks_log_analytics | resource |
| azurerm_client_config.current | data source |
| azurerm_monitor_diagnostic_categories.adb_log_analytics_categories | data source |
Inputs
| Name | Description | Type | Default | Required |
|---|---|---|---|---|
| data_platform_log_analytics_workspace_id | The Log Analytics Workspace used for the whole Data Platform. | `string` | `null` | no |
| databricks_sku | The SKU to use for the Databricks instance. | `string` | `"premium"` | no |
| databricksws_diagnostic_setting_name | The Databricks workspace diagnostic setting name. | `string` | `"Databricks to Log Analytics"` | no |
| enable_databricksws_diagnostic | Whether to enable diagnostic settings for the Azure Databricks workspace. | `bool` | `false` | no |
| resource_group_location | Location of the resource group. | `string` | `"uksouth"` | no |
| resource_group_name | Name of the resource group. | `string` | n/a | yes |
| resource_namer | User-defined naming convention applied to all resources created as part of this module. | `string` | n/a | yes |
| resource_tags | Map of tags to be applied to all resources created as part of this module. | `map(string)` | `{}` | no |
Outputs
| Name | Description |
|---|---|
| adb_databricks_id | n/a |
| databricks_hosturl | Azure Databricks HostUrl |
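Putting the inputs and outputs together, a hedged usage sketch of the module. The `source` path, block label, and input values are illustrative assumptions, not taken from this repository:

```hcl
module "adb" {
  # Path to this module; adjust for your repository layout
  source = "../.."

  # Required inputs
  resource_group_name = "amido-stacks-rg-uks-dev"
  resource_namer      = "amido-stacks-adb-dev"

  # Optional: ship workspace diagnostics to a central Log Analytics workspace
  enable_databricksws_diagnostic           = true
  data_platform_log_analytics_workspace_id = var.log_analytics_workspace_id

  resource_tags = {
    environment = "dev"
  }
}

# Expose the workspace URL from the module's outputs
output "databricks_hosturl" {
  value = module.adb.databricks_hosturl
}
```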
EXAMPLES:
There is an `examples` folder with possible usage patterns.