Added private link option to the exfiltration protection example (#100)
* Predefine regional settings
Corrected README
Simplify steps by providing outputs
Updated to the latest version of dbx and aws modules
Simplify configuration of examples
* reverted change
* refactor
* updated comment
* reformat code
* Added Private link between clusters on the data plane and core services on the control plane
Whitelist maven
* updated readme
* updated readme
* updated vars
* updated vars
* updated vars
* updated comments
* refactor for private link
* code refactor
* implemented code review feedback
* implemented code review feedback
* removed versions from examples
* code refactor
* added no public ip param
* updated main route table association name
* renamed resources
* renamed resources
* refactor
* refactor
* moved provider out of the module
* moved resource out of variables
* use default provider
`examples/aws-exfiltration-protection/README.md` (+8 −4)
```diff
@@ -12,6 +12,9 @@ This template provides an example deployment of AWS Databricks E2 workspace with
 
 > **Note**
 > If you are using AWS Firewall to block most traffic but allow the URLs that Databricks needs to connect to, please update the configuration based on your region. You can get the configuration details for your region from [Firewall Appliance](https://docs.databricks.com/administration-guide/cloud-configurations/aws/customer-managed-vpc.html#firewall-appliance-infrastructure) document.
+>
+> You can optionally enable Private Link in the variables. Enabling Private link on AWS requires Databricks "Enterprise" tier which is configured at the Databricks account level.
+
 
 1. Reference this module using one of the different [module source types](https://developer.hashicorp.com/terraform/language/modules/sources)
 2. Add a `variables.tf` with the same content in [variables.tf](variables.tf)
@@ -20,7 +23,8 @@ This template provides an example deployment of AWS Databricks E2 workspace with
 * TF_VAR_databricks_account_username, set to the value of your Databricks account-level admin username.
 * TF_VAR_databricks_account_password, set to the value of the password for your Databricks account-level admin user.
 * TF_VAR_databricks_account_id, set to the value of the ID of your Databricks account. You can find this value in the corner of your Databricks account console.
-5. Add a `output.tf` file.
-6. (Optional) Configure your [remote backend](https://developer.hashicorp.com/terraform/language/settings/backends/s3)
-7. Run `terraform init` to initialize terraform and get provider ready.
-8. Run `terraform apply` to create the resources.
+5. (Optional) Configure your [remote backend](https://developer.hashicorp.com/terraform/language/settings/backends/s3)
+6. Run `terraform init` to initialize terraform and get provider ready.
+7. Run `terraform plan` to validate and preview the deployment.
+8. Run `terraform apply` to create the resources.
+9. Run `terraform output -json` to print url (host) of the created Databricks workspace.
```
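Steps 1–4 of the updated example README can be sketched as a minimal root module. This is an illustrative sketch only: the `source` path and the `enable_private_link` variable name are assumptions for illustration, not taken from the repository.

```hcl
# main.tf — minimal sketch of referencing the example module.
# The source path and variable names below are hypothetical; check the
# module's variables.tf for the actual inputs it defines.
module "aws_exfiltration_protection" {
  # Step 1: a local path is one of several supported module source types
  source = "../../modules/aws-exfiltration-protection"

  # Step 3: values typically supplied through terraform.tfvars
  region = var.region

  # Optional Private Link toggle (assumed name); enabling it requires the
  # Databricks "Enterprise" tier, configured at the Databricks account level.
  enable_private_link = var.enable_private_link
}
```

The account-level credentials (step 4) are intentionally not set in the config; they are read from the `TF_VAR_databricks_account_*` environment variables so secrets stay out of version control.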
`modules/aws-exfiltration-protection/README.md` (+5 −4)
```diff
@@ -1,4 +1,4 @@
-# Provisioning AWS Databricks E2 with a Hub & Spoke firewall for data exfiltration protection
+# Provisioning AWS Databricks E2 workspace with a Hub & Spoke firewall for data exfiltration protection
 
 This template provides an example deployment of AWS Databricks E2 workspace with a Hub & Spoke firewall for data exfiltration protection. Details are described in [Data Exfiltration Protection With Databricks on AWS](https://www.databricks.com/blog/2021/02/02/data-exfiltration-protection-with-databricks-on-aws.html).
 
@@ -16,21 +16,22 @@ Resources to be created:
 * S3 Root bucket
 * Cross-account IAM role
 * Databricks E2 workspace
+* (Optional) Private link between clusters on the data plane and core services on the control plane
 
+Note that enabling Private link on AWS requires Databricks "Enterprise" tier. On AWS the tier is configured at the Databricks account level.
+If your Databricks account is using lower tier disable the private link in the variables (see below).
 
 ## How to use
 
 > **Note**
-> You can customize this module by adding, deleting or updating the Azure resources to adapt the module to your requirements.
+> You can customize this module by adding, deleting or updating the AWS resources to adapt the module to your requirements.
 > A deployment example using this module can be found in [examples/aws-exfiltration-protection](../../examples/aws-exfiltration-protection)
 > If you are using AWS Firewall to block most traffic but allow the URLs that Databricks needs to connect to, please update the configuration based on your region. You can get the configuration details for your region from [Firewall Appliance](https://docs.databricks.com/administration-guide/cloud-configurations/aws/customer-managed-vpc.html#firewall-appliance-infrastructure) document.
 
 1. Reference this module using one of the different [module source types](https://developer.hashicorp.com/terraform/language/modules/sources)
 2. Add a `variables.tf` with the same content in [variables.tf](variables.tf)
 3. Add a `terraform.tfvars` file and provide values to each defined variable
 4. Configure the following environment variables:
-* TF_VAR_databricks_account_username, set to the value of your Databricks account-level admin username.
-* TF_VAR_databricks_account_password, set to the value of the password for your Databricks account-level admin user.
 * TF_VAR_databricks_account_id, set to the value of the ID of your Databricks account. You can find this value in the corner of your Databricks account console.
 5. Add a `output.tf` file.
 6. (Optional) Configure your [remote backend](https://developer.hashicorp.com/terraform/language/settings/backends/s3)
```
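The `output.tf` file mentioned in the steps above, which `terraform output -json` reads from, might look like the following sketch. Both the output name `databricks_host` and the module attribute `workspace_url` are assumptions for illustration; the module's actual output names may differ.

```hcl
# output.tf — sketch only; names are hypothetical, check the module's outputs
output "databricks_host" {
  description = "URL (host) of the created Databricks workspace"
  value       = module.aws_exfiltration_protection.workspace_url
}
```

With such an output in place, `terraform output -json` prints the workspace URL as machine-readable JSON, convenient for piping into follow-up automation.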