Merged
1 change: 1 addition & 0 deletions NEXT_CHANGELOG.md
Original file line number Diff line number Diff line change
@@ -16,6 +16,7 @@
* engine/direct: Fix bind and unbind for non-Terraform resources ([#4850](https://github.com/databricks/cli/pull/4850))
* engine/direct: Fix deploying removed principals ([#4824](https://github.com/databricks/cli/pull/4824))
* engine/direct: Fix secret scope permissions migration from Terraform to Direct engine ([#4866](https://github.com/databricks/cli/pull/4866))
* Fix `bundle deployment bind` to always pull remote state before modifying ([#4892](https://github.com/databricks/cli/pull/4892))

### Dependency updates

@@ -0,0 +1,7 @@
bundle:
name: stale_state_test

resources:
jobs:
job_1:
name: Job 1
@@ -0,0 +1,10 @@
bundle:
name: stale_state_test

resources:
jobs:
job_1:
name: Job 1

job_2:
name: Job 2
@@ -0,0 +1,9 @@

>>> errcode [CLI] bundle deployment bind job_2 [EXTERNAL_JOB_ID] --auto-approve
Error: Resource already managed

The bundle is already managing a resource for resources.jobs.job_2 with ID '[JOB_2_ID]'.
To bind to a different resource with ID '[EXTERNAL_JOB_ID]', you must first unbind the existing resource.


Exit code: 1
@@ -0,0 +1,14 @@

>>> errcode [CLI] bundle deployment bind job_2 [EXTERNAL_JOB_ID] --auto-approve
Error: terraform import: exit status 1

Error: Resource already managed by Terraform

Terraform is already managing a remote object for databricks_job.job_2. To
import to this address you must first remove the existing object from the
state.




Exit code: 1


21 changes: 21 additions & 0 deletions acceptance/bundle/deployment/bind/job/stale-state/output.txt
@@ -0,0 +1,21 @@

=== Step 1: Deploy with job_1 only
>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/stale_state_test/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!

=== Step 2: Save stale local state (has only job_1, serial=1)
=== Step 3: Add job_2 to config and deploy again
>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/stale_state_test/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!

=== Step 4: Record deployed job_2 ID from state
Deployed job_2 ID: [JOB_2_ID]

=== Step 5: Restore stale local state (only job_1, serial=1)
=== Step 6: Create external job and try to bind (should fail: remote state already has job_2)
External job ID: [EXTERNAL_JOB_ID]
27 changes: 27 additions & 0 deletions acceptance/bundle/deployment/bind/job/stale-state/script
@@ -0,0 +1,27 @@
if [ "$DATABRICKS_BUNDLE_ENGINE" = "direct" ]; then
state_file=".databricks/bundle/default/resources.json"
else
state_file=".databricks/bundle/default/terraform/terraform.tfstate"
fi

title "Step 1: Deploy with job_1 only"
trace $CLI bundle deploy

title "Step 2: Save stale local state (has only job_1, serial=1)"
cp "$state_file" stale_state.json

title "Step 3: Add job_2 to config and deploy again"
cp databricks_v2.yml databricks.yml
trace $CLI bundle deploy

title "Step 4: Record deployed job_2 ID from state"
echo "Deployed job_2 ID: $(read_id.py job_2)"

title "Step 5: Restore stale local state (only job_1, serial=1)"
cp stale_state.json "$state_file"

title "Step 6: Create external job and try to bind (should fail: remote state already has job_2)\n"
job_id=$($CLI jobs create --json '{"name": "External Job"}' | jq -r '.job_id')
add_repl.py "$job_id" EXTERNAL_JOB_ID
echo "External job ID: $job_id"
trace errcode $CLI bundle deployment bind job_2 $job_id --auto-approve &> out.bind.$DATABRICKS_BUNDLE_ENGINE.txt
9 changes: 9 additions & 0 deletions acceptance/bundle/deployment/bind/job/stale-state/test.toml
@@ -0,0 +1,9 @@
Cloud = false

Ignore = [
"databricks_v2.yml",
"stale_state.json",
]

[EnvMatrix]
DATABRICKS_BUNDLE_ENGINE = ["terraform", "direct"]
2 changes: 1 addition & 1 deletion cmd/apps/import.go
@@ -300,7 +300,7 @@ func runImport(ctx context.Context, w *databricks.WorkspaceClient, appName, outp
var err error
b, stateDesc, err = bundleutils.ProcessBundleRet(bindCmd, bundleutils.ProcessOptions{
SkipInitContext: true,
ReadState: true,
AlwaysPull: true,
InitFunc: func(b *bundle.Bundle) {
b.Config.Bundle.Deployment.Lock.Force = false
},
2 changes: 1 addition & 1 deletion cmd/bundle/deployment/bind_resource.go
@@ -18,7 +18,7 @@ import (
func BindResource(cmd *cobra.Command, resourceKey, resourceId string, autoApprove, forceLock, skipInitContext bool) error {
b, stateDesc, err := utils.ProcessBundleRet(cmd, utils.ProcessOptions{
SkipInitContext: skipInitContext,
ReadState: true,
AlwaysPull: true,
InitFunc: func(b *bundle.Bundle) {
b.Config.Bundle.Deployment.Lock.Force = forceLock
},
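The one-line change in both commands (`ReadState: true` → `AlwaysPull: true`) is the heart of the fix: bind now refreshes state from the workspace before deciding whether a resource key is free, instead of trusting whatever copy happens to be on disk. A toy Go model of why that matters — the `state` type and `bind` helper here are hypothetical illustrations, not the CLI's real implementation:

```go
package main

import "fmt"

// state maps a bundle resource key to the ID it is bound to.
// This is a stand-in for the deployment state file, not the CLI's real format.
type state map[string]string

// bind refuses to bind resourceKey when the state already manages it,
// mirroring the "Resource already managed" errors in the fixtures above.
func bind(s state, resourceKey, id string) error {
	if existing, ok := s[resourceKey]; ok {
		return fmt.Errorf("resource already managed: %s has ID %q", resourceKey, existing)
	}
	s[resourceKey] = id
	return nil
}

func main() {
	// Remote state after the second deploy: both jobs exist.
	remote := state{"jobs.job_1": "111", "jobs.job_2": "222"}
	// Stale local copy saved after the first deploy: job_2 is missing.
	stale := state{"jobs.job_1": "111"}

	// Old behavior: deciding against the stale copy lets the bind through.
	fmt.Println("stale bind error:", bind(stale, "jobs.job_2", "999"))

	// New behavior: pull remote state first, then decide; the conflict surfaces.
	pulled := state{}
	for k, v := range remote {
		pulled[k] = v
	}
	fmt.Println("pulled bind error:", bind(pulled, "jobs.job_2", "999"))
}
```

Against the stale copy the existing binding for `jobs.job_2` is invisible, which is exactly the scenario the stale-state acceptance test reproduces by saving and restoring the local state file between deploys.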