Feat: Adding new AWS modules to provision E2 workspaces using different configurations, and examples using these modules (#35)
* docs: Updating main README
* feat: Push the Terraform plan as a pull request comment in ADO
* Adding a Lakehouse platform module on Azure
* Feat: Adding a new module to provision Databricks on Azure with Private Link - Standard deployment, and an example using this module
* Fix: Removing the `route_table_id` variable as it is not used
* Feat: Adding a new module for exfiltration protection, and an example using this module
* Feat: Adding a new module for exfiltration protection with Azure Private Link, and an example using this module
* Feat: Adding new AWS modules to provision E2 workspaces using different configurations, and examples using these modules
| Cloud | Example | Description |
| --- | --- | --- |
| AWS | [aws-workspace-with-firewall](examples/aws-workspace-with-firewall/) | Provisioning AWS Databricks E2 with an AWS Firewall |
| AWS | [aws-exfiltration-protection](examples/aws-exfiltration-protection/) | An implementation of [Data Exfiltration Protection on AWS](https://www.databricks.com/blog/2021/02/02/data-exfiltration-protection-with-databricks-on-aws.html) |

The folder `modules` contains the following Terraform modules:

| Cloud | Module | Description |
| --- | --- | --- |
| Azure | [adb-with-private-link-standard](modules/adb-with-private-link-standard/) | Provisioning Databricks on Azure with Private Link - Standard deployment |
| Azure | [adb-exfiltration-protection](modules/adb-exfiltration-protection/) | A sample implementation of [Data Exfiltration Protection](https://www.databricks.com/blog/2020/03/27/data-exfiltration-protection-with-azure-databricks.html) |
| Azure | [adb-with-private-links-exfiltration-protection](modules/adb-with-private-links-exfiltration-protection/) | Provisioning Databricks on Azure with Private Link and [Data Exfiltration Protection](https://www.databricks.com/blog/2020/03/27/data-exfiltration-protection-with-azure-databricks.html) |
| AWS | [aws-workspace-with-firewall](modules/aws-workspace-with-firewall/) | Provisioning AWS Databricks E2 with an AWS Firewall |
| AWS | [aws-exfiltration-protection](modules/aws-exfiltration-protection/) | An implementation of [Data Exfiltration Protection on AWS](https://www.databricks.com/blog/2021/02/02/data-exfiltration-protection-with-databricks-on-aws.html) |
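As a quick illustration of how one of the modules above might be consumed, the sketch below references `adb-with-private-link-standard` by its local path. The commented inputs are hypothetical placeholders; each module defines its real inputs in its own `variables.tf`.

```hcl
# Minimal sketch: consume one of the modules above by local path.
module "adb_private_link_standard" {
  source = "./modules/adb-with-private-link-standard"

  # Inputs are module-specific and hypothetical here — see the module's
  # variables.tf for the actual input names.
  # location = "westeurope"
  # rg_name  = "example-rg"
}
```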
With this deployment, traffic from the user client to the web app (notebook UI) and backend traffic from the data plane to the control plane go through private endpoints. This Terraform sample will create:

* Resource group with random prefix

1. Update the `terraform.tfvars` file and provide values for each defined variable
2. (Optional) Configure your [remote backend](https://developer.hashicorp.com/terraform/language/settings/backends/azurerm) (a minimal sketch follows this list)
3. Run `terraform init` to initialize Terraform and get the providers ready.
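For the optional remote backend step, a minimal `azurerm` backend block might look like the following. All four values are assumptions for illustration and must point at a storage account and container that already exist.

```hcl
terraform {
  backend "azurerm" {
    # Hypothetical names — replace with an existing storage account you control.
    resource_group_name  = "tfstate-rg"
    storage_account_name = "tfstateaccount"
    container_name       = "tfstate"
    key                  = "adb-private-link-standard.tfstate"
  }
}
```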
# Provisioning an AWS Databricks workspace with a Hub & Spoke firewall for data exfiltration protection

This example uses the [aws-exfiltration-protection](../../modules/aws-exfiltration-protection) module.

This template provides an example deployment of an AWS Databricks E2 workspace with a Hub & Spoke firewall for data exfiltration protection. Details are described in [Data Exfiltration Protection With Databricks on AWS](https://www.databricks.com/blog/2021/02/02/data-exfiltration-protection-with-databricks-on-aws.html).

> If you are using AWS Firewall to block most traffic while allowing the URLs that Databricks needs to connect to, please update the configuration based on your region. You can find the configuration details for your region in the [Firewall Appliance](https://docs.databricks.com/administration-guide/cloud-configurations/aws/customer-managed-vpc.html#firewall-appliance-infrastructure) documentation.

1. Reference this module using one of the different [module source types](https://developer.hashicorp.com/terraform/language/modules/sources) (see the sketch after this list)
2. Add a `variables.tf` with the same content as [variables.tf](variables.tf)
3. Add a `terraform.tfvars` file and provide values for each defined variable
4. Configure the following environment variables:
   * `TF_VAR_databricks_account_username`, set to the value of your Databricks account-level admin username.
   * `TF_VAR_databricks_account_password`, set to the value of the password for your Databricks account-level admin user.
   * `TF_VAR_databricks_account_id`, set to the value of the ID of your Databricks account. You can find this value in the corner of your Databricks account console.
5. Add an `output.tf` file.
6. (Optional) Configure your [remote backend](https://developer.hashicorp.com/terraform/language/settings/backends/s3)
7. Run `terraform init` to initialize Terraform and get the providers ready.
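Putting steps 1 and 4 together, a minimal root configuration might look like the sketch below. The three variable declarations mirror the `TF_VAR_`-prefixed environment variables above; the pass-through of those variables into the module is an assumption, since the module's real input names live in its own `variables.tf`.

```hcl
# variables.tf — declarations that the TF_VAR_* environment variables populate.
variable "databricks_account_username" {
  description = "Databricks account-level admin username"
  type        = string
}

variable "databricks_account_password" {
  description = "Password for the Databricks account-level admin user"
  type        = string
  sensitive   = true
}

variable "databricks_account_id" {
  description = "Databricks account ID (shown in the account console)"
  type        = string
}

# main.tf — reference the module by local path (other source types work too).
module "aws_exfiltration_protection" {
  source = "../../modules/aws-exfiltration-protection"

  # Hypothetical pass-through of the account credentials; check the module's
  # variables.tf for the input names it actually expects.
  databricks_account_username = var.databricks_account_username
  databricks_account_password = var.databricks_account_password
  databricks_account_id       = var.databricks_account_id
}
```

With the environment variables exported, `terraform init` followed by `terraform plan` picks the values up from the environment, so the credentials never need to be written into `terraform.tfvars`.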