Everything you need before running GenieRails.
| OS | Supported | Notes |
|---|---|---|
| Linux | Yes | Any modern distribution |
| macOS | Yes | Intel or Apple Silicon |
| Windows | Via WSL only | Requires Windows Subsystem for Linux (bash, sed, grep needed) |
| Tool | Version | Check | Install |
|---|---|---|---|
| Python | 3.9+ | `python3 --version` | python.org |
| Terraform | >= 1.0 | `terraform --version` | terraform.io |
| Git | Any | `git --version` | git-scm.com |
These are installed automatically when you first run `make generate` or `make apply`:
| Package | Purpose |
|---|---|
| `python-hcl2` | Parse Terraform HCL configurations |
| `databricks-sdk` | Databricks Python SDK |
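The auto-install behaviour amounts to "try the import, install only on failure". A minimal sketch of that pattern (our own helper for illustration, not the actual Makefile code):

```shell
# ensure_pkg <module> <pip-name>: import the Python module, pip-installing
# the package only when the import fails (install-on-first-use pattern).
ensure_pkg() {
  python3 -c "import $1" 2>/dev/null || python3 -m pip install "$2"
}

# e.g.: ensure_pkg hcl2 python-hcl2
#       ensure_pkg databricks.sdk databricks-sdk
```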
For integration testing (`make test-ci`), cloud-specific packages are also auto-installed:
| Package | Cloud | Purpose |
|---|---|---|
| `boto3` | AWS | S3 bucket and IAM role management |
| `azure-identity` | Azure | Service principal authentication |
| `azure-mgmt-storage` | Azure | Storage account management |
| `azure-mgmt-authorization` | Azure | RBAC role assignments |
| `azure-mgmt-databricks` | Azure | Workspace management |
Downloaded automatically on first `terraform init`:
| Provider | Version | Source |
|---|---|---|
| `databricks/databricks` | ~> 1.91.0 | registry.terraform.io |
| `hashicorp/null` | ~> 3.2 | registry.terraform.io |
| `hashicorp/time` | ~> 0.12 | registry.terraform.io |
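In Terraform, pins like these live in a `required_providers` block. The repo's actual block may differ, but given the versions above it would look roughly like:

```hcl
terraform {
  required_providers {
    databricks = {
      source  = "databricks/databricks"
      version = "~> 1.91.0"
    }
    null = {
      source  = "hashicorp/null"
      version = "~> 3.2"
    }
    time = {
      source  = "hashicorp/time"
      version = "~> 0.12"
    }
  }
}
```

The `~>` constraint allows only rightmost-digit upgrades, so `~> 1.91.0` permits 1.91.x but not 1.92.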
GenieRails requires outbound HTTPS (port 443) to:
| Endpoint | Purpose |
|---|---|
| registry.terraform.io | Download Terraform providers (first run only) |
| Your Databricks workspace URL | All API calls (generate, apply, verify) |
| accounts.cloud.databricks.com | AWS account API (group/tag policy management) |
| accounts.azuredatabricks.net | Azure account API (group/tag policy management) |
No VPN is required unless your Databricks workspace is on a private network.
- Unity Catalog — must be enabled on the target workspace
- SQL Warehouse — serverless (auto-created) or existing warehouse
- Genie Spaces — for the Genie Space governance workflow
Create a service principal (SP) in the Databricks Account Console with:
| Role | Scope | Required for |
|---|---|---|
| Account Admin | Account | Creating groups, tag policies |
| Workspace Admin | Target workspace | Deploying governance resources |
| Metastore Admin | Unity Catalog metastore | Managing catalogs, grants, FGAC policies |
Genie-only mode: If you only need Genie Spaces without ABAC governance, set `genie_only = true` in `env.auto.tfvars`. This requires only Workspace Admin (no Account Admin or Metastore Admin needed).
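In `env.auto.tfvars` that is a single line (variable name taken from the text above):

```hcl
# env.auto.tfvars — Genie-only mode: skip ABAC governance resources,
# so only Workspace Admin is required.
genie_only = true
```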
You'll need these values for `auth.auto.tfvars`:
| Credential | Where to find |
|---|---|
| `databricks_account_id` | Account Console → top-right profile menu |
| `databricks_account_host` | AWS: https://accounts.cloud.databricks.com / Azure: https://accounts.azuredatabricks.net |
| `databricks_client_id` | Account Console → User Management → Service Principals → Application ID |
| `databricks_client_secret` | Same SP → OAuth Secrets → Generate Secret |
| `databricks_workspace_id` | Account Console → Workspaces, or `?o=` parameter in workspace URL |
| `databricks_workspace_host` | Your workspace URL (e.g., https://dbc-xxx.cloud.databricks.com) |
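Put together, a filled-in `auth.auto.tfvars` looks like this; every value below is a placeholder (the variable names come from the table above):

```hcl
# auth.auto.tfvars — all values are placeholders, substitute your own
databricks_account_id     = "11111111-2222-3333-4444-555555555555"
databricks_account_host   = "https://accounts.cloud.databricks.com" # Azure: accounts.azuredatabricks.net
databricks_client_id      = "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"  # SP Application ID
databricks_client_secret  = "<generated-oauth-secret>"
databricks_workspace_id   = "1234567890123456"
databricks_workspace_host = "https://dbc-xxx.cloud.databricks.com"
```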
Credentials (one of):
- `AWS_PROFILE` environment variable pointing to a named profile in `~/.aws/credentials`
- `AWS_ACCESS_KEY_ID` + `AWS_SECRET_ACCESS_KEY` (+ optional `AWS_SESSION_TOKEN`)
- Default boto3 credential chain (instance profile, SSO, etc.)
IAM Permissions (for `make test-ci` provisioning only):
- `iam:CreateRole`, `iam:DeleteRole`, `iam:PutRolePolicy`, `iam:DeleteRolePolicy`
- `s3:CreateBucket`, `s3:DeleteBucket`, `s3:PutPublicAccessBlock`
- `sts:GetCallerIdentity`
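Those actions could be granted with an IAM policy along these lines; this is a sketch built only from the actions listed above, and in practice you should scope `Resource` more tightly than `*`:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "GenieRailsTestCi",
      "Effect": "Allow",
      "Action": [
        "iam:CreateRole",
        "iam:DeleteRole",
        "iam:PutRolePolicy",
        "iam:DeleteRolePolicy",
        "s3:CreateBucket",
        "s3:DeleteBucket",
        "s3:PutPublicAccessBlock",
        "sts:GetCallerIdentity"
      ],
      "Resource": "*"
    }
  ]
}
```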
Standard `make generate` + `make apply` usage does NOT require AWS IAM permissions — only a Databricks service principal.
Credentials (one of):
- Service principal: `AZURE_CLIENT_ID` + `AZURE_CLIENT_SECRET` + `AZURE_TENANT_ID`
- `DefaultAzureCredential` (Azure CLI login, managed identity, etc.)
Additional config (for `make test-ci` provisioning only):
- `AZURE_SUBSCRIPTION_ID`
- `AZURE_RESOURCE_GROUP`
- `AZURE_REGION` (e.g., `australiaeast`)
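For example, exported in your shell before running `make test-ci`; every value here is a placeholder, and the resource group name is hypothetical:

```shell
# Placeholder values — substitute your own before running make test-ci
export AZURE_CLIENT_ID="00000000-0000-0000-0000-000000000000"
export AZURE_CLIENT_SECRET="<sp-secret>"
export AZURE_TENANT_ID="00000000-0000-0000-0000-000000000000"
export AZURE_SUBSCRIPTION_ID="00000000-0000-0000-0000-000000000000"
export AZURE_RESOURCE_GROUP="genierails-test"   # hypothetical name
export AZURE_REGION="australiaeast"
```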
Azure RBAC Roles (for provisioning only):
- `Contributor` on resource group
- `Storage Blob Data Contributor`
- `User Access Administrator`
Standard `make generate` + `make apply` usage does NOT require Azure RBAC roles — only a Databricks service principal.
After installing Python and Terraform, verify your setup:
```bash
# Clone the repo
git clone https://github.com/databricks-solutions/genierails.git
cd genierails

# Pick your cloud
cd aws    # or: cd azure

# Copy and fill in credentials
cp shared/auth.auto.tfvars.example envs/dev/auth.auto.tfvars
# Edit envs/dev/auth.auto.tfvars with your credentials

# Verify connectivity
make setup ENV=dev
make validate ENV=dev
```

If `make validate` shows all [PASS] checks, you're ready to go.
See From UI to Production or Quickstart for next steps.