
Update product names: Workflows→Jobs, Delta Live Tables→Spark Declarative Pipelines#4967

Open
lennartkats-db wants to merge 7 commits into main from product-name-changes

Conversation

@lennartkats-db
Contributor

Changes

  • Rename "Databricks Workflows" / "Workflows" → "Databricks Jobs" / "Jobs" in CLI help text and templates
  • Rename "Delta Live Tables" → "Spark Declarative Pipelines" and "DLT" → "SDP" in descriptions, comments, and schema annotations
  • Rename template parameter `include_dlt` → `include_sdp` and file `dlt_pipeline.ipynb` → `sdp_pipeline.ipynb` in the experimental-jobs-as-code template
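
For reviewers trying the rename locally, a template config using the new parameter name might look like the following sketch (the file name and `project_name` value are illustrative; the only key renamed by this PR is `include_sdp`, formerly `include_dlt`):

```json
{
  "project_name": "my_jobs_as_code_project",
  "include_sdp": "yes"
}
```

Assuming the CLI's `--config-file` flag, this would be passed as `databricks bundle init experimental-jobs-as-code --config-file config.json`; configs still using the old `include_dlt` key would be silently ignored, as noted in the commit messages below.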

lennartkats-db and others added 7 commits April 14, 2026 12:12
Update product names: Workflows→Jobs, Delta Live Tables→Spark Declarative Pipelines

Update all non-generated references to retired product names:
- "Databricks Workflows" / "Workflows" → "Databricks Jobs" / "Jobs"
- "Delta Live Tables" → "Spark Declarative Pipelines"
- "DLT" → "SDP" (in comments/internal code)
- Template parameter `include_dlt` → `include_sdp`
- Template file `dlt_pipeline.ipynb` → `sdp_pipeline.ipynb`

Generated files (schema JSON, docsgen, acceptance test outputs, Python
models) are not updated here — regenerate with `make schema`, `make docs`,
`make test-update`, `make test-update-templates`, `make -C python codegen`
after the upstream proto changes land.

Co-authored-by: Isaac
With include_pipeline properly wired (was silently ignored as include_dlt),
PIPELINE=no now excludes the pipeline resource. With only a job resource,
dynamic_version causes 1 change and 0 unchanged, which is correct behavior.

Co-authored-by: Isaac
The template renamed include_dlt to include_pipeline in a prior PR, but
the combinations test intentionally still passes include_dlt (which gets
silently ignored, defaulting to yes). Renaming to include_pipeline makes
PIPELINE=no actually exclude pipelines, causing divergent output across
variants which the combinations framework doesn't support.

Co-authored-by: Isaac
The output was corrupted when running tests locally without terraform,
replacing the successful deployment output with terraform init errors.
Restores correct output from main and applies DLT→SDP string change.

Co-authored-by: Isaac
@lennartkats-db changed the title from "[draft] Update product names: Workflows→Jobs, Delta Live Tables→Spark Declarative Pipelines" to "Update product names: Workflows→Jobs, Delta Live Tables→Spark Declarative Pipelines" Apr 15, 2026
@lennartkats-db requested a review from denik April 15, 2026 08:13
Member

@simonfaltum left a comment


Review swarm: Isaac + Cursor (1 round, Cursor timed out)

0 Critical | 1 Major (Gap) | 1 Nit | 1 Suggestion

The rename changes look correct across the board. A few things worth addressing before merge, the main one being that jsonschema.json appears to be manually edited rather than regenerated from the source annotations. There are also a couple of downstream generated files (jsonschema_for_docs.json, docsgen/output/reference.md, docsgen/output/resources.md) that still contain the old product names.

See inline comments for specifics.

},
"additionalProperties": false,
"markdownDescription": "The pipeline resource allows you to create Delta Live Tables [pipelines](https://docs.databricks.com/api/workspace/pipelines/create). For information about pipelines, see [link](https://docs.databricks.com/dlt/index.html). For a tutorial that uses the Declarative Automation Bundles template to create a pipeline, see [link](https://docs.databricks.com/dev-tools/bundles/pipelines-tutorial.html)."
"markdownDescription": "The pipeline resource allows you to create Spark Declarative [Pipelines](https://docs.databricks.com/api/workspace/pipelines/create). For information about pipelines, see [link](https://docs.databricks.com/dlt/index.html). For a tutorial that uses the Declarative Automation Bundles template to create a pipeline, see [link](https://docs.databricks.com/dev-tools/bundles/pipelines-tutorial.html)."

[Gap (Major)] This file looks like it was manually edited rather than regenerated from the source annotations. That's fragile and can drift.

I think you should only edit the source files (annotations.yml, annotations_openapi_overrides.yml) and then run make schema && make schema-for-docs && make docs to regenerate everything consistently. The other generated files (jsonschema_for_docs.json, docsgen/output/reference.md, docsgen/output/resources.md) still contain the old product names and would get picked up by regeneration.

"_":
"markdown_description": |-
The pipeline resource allows you to create Delta Live Tables [pipelines](/api/workspace/pipelines/create). For information about pipelines, see [_](/dlt/index.md). For a tutorial that uses the Declarative Automation Bundles template to create a pipeline, see [_](/dev-tools/bundles/pipelines-tutorial.md).
The pipeline resource allows you to create Spark Declarative [Pipelines](/api/workspace/pipelines/create). For information about pipelines, see [_](/dlt/index.md). For a tutorial that uses the Declarative Automation Bundles template to create a pipeline, see [_](/dev-tools/bundles/pipelines-tutorial.md).

[Nit] The new text reads "create Spark Declarative [Pipelines](...)" which splits the product name across a link boundary. "Spark Declarative" on its own doesn't mean anything. The original worked because "Delta Live Tables" was a self-contained product name and "pipelines" was a generic noun.

Consider "create [Spark Declarative Pipelines](/api/workspace/pipelines/create)" to keep the full product name inside the link.

Comment thread bundle/phases/deploy.go

// One or more DLT pipelines is being recreated.
if len(dltActions) != 0 {
// One or more SDP pipelines is being recreated.

[Suggestion] Nice rename from dltActions to pipelineActions. The comment here could match: since the variable already says "pipeline", you could simplify to // One or more pipelines is being recreated. and drop the "SDP" abbreviation.

Collaborator

@juliacrawf-db left a comment


Similar to the comments I made on the DABs rename PR, from Dylan Vance:

  • In general, it's safer to say "Lakeflow Spark Declarative Pipelines" than just "Spark Declarative Pipelines", to avoid confusion.
  • The name of the feature is "Spark Declarative Pipelines". The resource that it creates is a "pipeline". Do not use "Spark Declarative Pipeline" or "Lakeflow Spark Declarative Pipeline", as this isn't a thing.
    It's very similar to DABs creating a bundle.

For example, the default template would deploy a job called
`[dev yourname] my_dbt_sql_job` to your workspace.
You can find that job by opening your workpace and clicking on **Workflows**.
You can find that job by opening your workpace and clicking on **Jobs**.

Suggested change
You can find that job by opening your workpace and clicking on **Jobs**.
You can find that job by opening your workpace and clicking on **Jobs & Pipelines**.

For example, the default template would deploy a job called
`[dev yourname] my_default_scala_job` to your workspace.
You can find that job by opening your workspace and clicking on **Workflows**.
You can find that job by opening your workspace and clicking on **Jobs**.

Suggested change
You can find that job by opening your workspace and clicking on **Jobs**.
You can find that job by opening your workspace and clicking on **Jobs & Pipelines**.

For example, the default template would deploy a job called
`[dev yourname] my_default_sql_job` to your workspace.
You can find that job by opening your workpace and clicking on **Workflows**.
You can find that job by opening your workpace and clicking on **Jobs**.

Suggested change
You can find that job by opening your workpace and clicking on **Jobs**.
You can find that job by opening your workpace and clicking on **Jobs & Pipelines**.

For example, the default template would deploy a job called
`[dev yourname] my_jobs_as_code_job` to your workspace.
You can find that job by opening your workspace and clicking on **Workflows**.
You can find that job by opening your workspace and clicking on **Jobs**.

Suggested change
You can find that job by opening your workspace and clicking on **Jobs**.
You can find that job by opening your workspace and clicking on **Jobs & Pipelines**.

For example, the default template would deploy a job called
`[dev yourname] {{.project_name}}_job` to your workspace.
You can find that job by opening your workpace and clicking on **Workflows**.
You can find that job by opening your workpace and clicking on **Jobs**.

Suggested change
You can find that job by opening your workpace and clicking on **Jobs**.
You can find that job by opening your workpace and clicking on **Jobs & Pipelines**.

For example, the default template would deploy a job called
`[dev yourname] {{.project_name}}_job` to your workspace.
You can find that job by opening your workspace and clicking on **Workflows**.
You can find that job by opening your workspace and clicking on **Jobs**.

Suggested change
You can find that job by opening your workspace and clicking on **Jobs**.
You can find that job by opening your workspace and clicking on **Jobs & Pipelines**.

For example, the default template would deploy a job called
`[dev yourname] {{.project_name}}_job` to your workspace.
You can find that job by opening your workpace and clicking on **Workflows**.
You can find that job by opening your workpace and clicking on **Jobs**.

Suggested change
You can find that job by opening your workpace and clicking on **Jobs**.
You can find that job by opening your workpace and clicking on **Jobs & Pipelines**.

For example, the default template would deploy a job called
`[dev yourname] {{.project_name}}_job` to your workspace.
You can find that job by opening your workspace and clicking on **Workflows**.
You can find that job by opening your workspace and clicking on **Jobs**.

Suggested change
You can find that job by opening your workspace and clicking on **Jobs**.
You can find that job by opening your workspace and clicking on **Jobs & Pipelines**.


4 participants