Merged
1 change: 1 addition & 0 deletions NEXT_CHANGELOG.md
@@ -16,5 +16,6 @@ See more details here: ([#3225](https://github.com/databricks/cli/pull/3225))

### Bundles
* [Breaking Change] Remove deprecated path fallback mechanism for jobs and pipelines ([#3225](https://github.com/databricks/cli/pull/3225))
* Rename Delta Live Tables to Lakeflow Declarative Pipelines in the default-python template ([#3476](https://github.com/databricks/cli/pull/3476)).

### API Changes
@@ -6,7 +6,7 @@ workspace:

variables:
catalog:
description: The catalog the DLT pipeline should use.
description: The catalog the Lakeflow Declarative Pipeline should use.
default: main

resources:
2 changes: 1 addition & 1 deletion acceptance/bundle/help/bundle-init/output.txt
@@ -3,7 +3,7 @@
Initialize using a bundle template to get started quickly.

TEMPLATE_PATH optionally specifies which template to use. It can be one of the following:
- default-python: The default Python template for Notebooks / Delta Live Tables / Workflows
- default-python: The default Python template for Notebooks / Lakeflow Declarative Pipelines / Workflows
- default-sql: The default SQL template for .sql files that run with Databricks SQL
- dbt-sql: The dbt SQL template (databricks.com/blog/delivering-cost-effective-data-real-time-dbt-and-databricks)
- mlops-stacks: The Databricks MLOps Stacks template (github.com/databricks/mlops-stacks)
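The help text above describes how TEMPLATE_PATH selects a built-in template. As a sketch of that usage (assuming the Databricks CLI is installed and authenticated; not part of this diff):

```shell
# Initialize a new bundle project from the default-python template,
# one of the template names listed in the help output above.
databricks bundle init default-python
```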
@@ -8,7 +8,7 @@ requires-python = ">= 3.11"
dev = [
"pytest",

# Code completion support for DLT, also install databricks-connect
# Code completion support for Lakeflow Declarative Pipelines, also install databricks-connect
"databricks-dlt",

# databricks-connect can be used to run parts of this project locally.
@@ -8,7 +8,7 @@ resources:
schema: my_default_python_${bundle.target}
libraries:
- notebook:
path: ../src/dlt_pipeline.ipynb
path: ../src/pipeline.ipynb

configuration:
bundle.sourcePath: ${workspace.file_path}/src
@@ -12,9 +12,9 @@
}
},
"source": [
"# DLT pipeline\n",
"# Lakeflow Declarative Pipeline\n",
"\n",
"This Delta Live Tables (DLT) definition is executed using a pipeline defined in resources/my_default_python.pipeline.yml."
"This Lakeflow Declarative Pipeline (LDP) definition is executed using a pipeline defined in resources/my_default_python.pipeline.yml."
]
},
{
@@ -72,7 +72,7 @@
"notebookMetadata": {
"pythonIndentUnit": 2
},
"notebookName": "dlt_pipeline",
"notebookName": "pipeline",
"widgets": {}
},
"kernelspec": {
@@ -137,7 +137,7 @@
"libraries": [
{
"notebook": {
"path": "/Workspace/Users/[USERNAME]/.bundle/project_name_[UNIQUE_NAME]/dev/files/src/dlt_pipeline"
"path": "/Workspace/Users/[USERNAME]/.bundle/project_name_[UNIQUE_NAME]/dev/files/src/pipeline"
}
}
],
@@ -211,8 +211,8 @@ Validation OK!
"libraries": [
{
"notebook": {
- "path": "/Workspace/Users/[USERNAME]/.bundle/project_name_[UNIQUE_NAME]/dev/files/src/dlt_pipeline"
+ "path": "/Workspace/Users/[USERNAME]/.bundle/project_name_[UNIQUE_NAME]/prod/files/src/dlt_pipeline"
- "path": "/Workspace/Users/[USERNAME]/.bundle/project_name_[UNIQUE_NAME]/dev/files/src/pipeline"
+ "path": "/Workspace/Users/[USERNAME]/.bundle/project_name_[UNIQUE_NAME]/prod/files/src/pipeline"
}
}
],
@@ -373,8 +373,8 @@ Resources:
@@ -141,14 +127,11 @@
{
"notebook": {
- "path": "/Workspace/Users/[USERNAME]/.bundle/project_name_[UNIQUE_NAME]/dev/files/src/dlt_pipeline"
+ "path": "/Workspace/Users/[USERNAME]/.bundle/project_name_[UNIQUE_NAME]/prod/files/src/dlt_pipeline"
- "path": "/Workspace/Users/[USERNAME]/.bundle/project_name_[UNIQUE_NAME]/dev/files/src/pipeline"
+ "path": "/Workspace/Users/[USERNAME]/.bundle/project_name_[UNIQUE_NAME]/prod/files/src/pipeline"
}
}
],
@@ -8,7 +8,7 @@ requires-python = ">= 3.11"
dev = [
"pytest",

# Code completion support for DLT, also install databricks-connect
# Code completion support for Lakeflow Declarative Pipelines, also install databricks-connect
"databricks-dlt",

# databricks-connect can be used to run parts of this project locally.
@@ -9,7 +9,7 @@ resources:
serverless: true
libraries:
- notebook:
path: ../src/dlt_pipeline.ipynb
path: ../src/pipeline.ipynb

configuration:
bundle.sourcePath: ${workspace.file_path}/src
@@ -12,9 +12,9 @@
}
},
"source": [
"# DLT pipeline\n",
"# Lakeflow Declarative Pipeline\n",
"\n",
"This Delta Live Tables (DLT) definition is executed using a pipeline defined in resources/my_default_python.pipeline.yml."
"This Lakeflow Declarative Pipeline (LDP) definition is executed using a pipeline defined in resources/my_default_python.pipeline.yml."
]
},
{
@@ -72,7 +72,7 @@
"notebookMetadata": {
"pythonIndentUnit": 2
},
"notebookName": "dlt_pipeline",
"notebookName": "pipeline",
"widgets": {}
},
"kernelspec": {
@@ -14,7 +14,7 @@
"source": [
"# DLT pipeline\n",
"\n",
"This Delta Live Tables (DLT) definition is executed using a pipeline defined in resources/my_jobs_as_code.pipeline.yml."
"This Lakeflow Declarative Pipeline (LDP) definition is executed using a pipeline defined in resources/my_jobs_as_code.pipeline.yml."
]
},
{
2 changes: 1 addition & 1 deletion libs/template/template.go
@@ -38,7 +38,7 @@ const (
var databricksTemplates = []Template{
{
name: DefaultPython,
description: "The default Python template for Notebooks / Delta Live Tables / Workflows",
description: "The default Python template for Notebooks / Lakeflow Declarative Pipelines / Workflows",
Contributor: This should say "Notebooks and Lakeflow"; Workflows was subsumed by Lakeflow.
Reader: &builtinReader{name: string(DefaultPython)},
Writer: &writerWithFullTelemetry{defaultWriter: defaultWriter{name: DefaultPython}},
},
4 changes: 2 additions & 2 deletions libs/template/template_test.go
@@ -8,7 +8,7 @@ import (
)

func TestTemplateHelpDescriptions(t *testing.T) {
expected := `- default-python: The default Python template for Notebooks / Delta Live Tables / Workflows
expected := `- default-python: The default Python template for Notebooks / Lakeflow Declarative Pipelines / Workflows
- default-sql: The default SQL template for .sql files that run with Databricks SQL
- dbt-sql: The dbt SQL template (databricks.com/blog/delivering-cost-effective-data-real-time-dbt-and-databricks)
- mlops-stacks: The Databricks MLOps Stacks template (github.com/databricks/mlops-stacks)
@@ -18,7 +18,7 @@ func TestTemplateHelpDescriptions(t *testing.T) {

func TestTemplateOptions(t *testing.T) {
expected := []cmdio.Tuple{
{Name: "default-python", Id: "The default Python template for Notebooks / Delta Live Tables / Workflows"},
{Name: "default-python", Id: "The default Python template for Notebooks / Lakeflow Declarative Pipelines / Workflows"},
{Name: "default-sql", Id: "The default SQL template for .sql files that run with Databricks SQL"},
{Name: "dbt-sql", Id: "The dbt SQL template (databricks.com/blog/delivering-cost-effective-data-real-time-dbt-and-databricks)"},
{Name: "mlops-stacks", Id: "The Databricks MLOps Stacks template (github.com/databricks/mlops-stacks)"},
@@ -20,7 +20,7 @@
"type": "string",
"default": "yes",
"enum": ["yes", "no"],
"description": "Include a stub (sample) Delta Live Tables pipeline in '{{.project_name}}{{path_separator}}src'",
"description": "Include a stub (sample) Lakeflow Declarative Pipeline in '{{.project_name}}{{path_separator}}src'",
"order": 3
},
"include_python": {
@@ -17,7 +17,7 @@ This file only contains template directives; it is skipped for the actual output.
{{end}}

{{if $notDLT}}
{{skip "{{.project_name}}/src/dlt_pipeline.ipynb"}}
{{skip "{{.project_name}}/src/pipeline.ipynb"}}
{{skip "{{.project_name}}/resources/{{.project_name}}.pipeline.yml"}}
{{end}}

@@ -8,7 +8,7 @@ requires-python = ">= 3.11"
dev = [
"pytest",

# Code completion support for DLT, also install databricks-connect
# Code completion support for Lakeflow Declarative Pipelines, also install databricks-connect
"databricks-dlt",

# databricks-connect can be used to run parts of this project locally.
@@ -1,6 +1,6 @@
# The main job for {{.project_name}}.

{{- /* Clarify what this job is for for DLT-only users. */}}
{{- /* Clarify what this job is for for users who only use Lakeflow Declarative Pipelines. */}}
{{if and (eq .include_dlt "yes") (and (eq .include_notebook "no") (eq .include_python "no")) -}}
# This job runs {{.project_name}}_pipeline on a schedule.
{{end -}}
@@ -19,7 +19,7 @@ resources:
{{- end}}
libraries:
- notebook:
path: ../src/dlt_pipeline.ipynb
path: ../src/pipeline.ipynb

configuration:
bundle.sourcePath: ${workspace.file_path}/src
@@ -12,9 +12,9 @@
}
},
"source": [
"# DLT pipeline\n",
"# Lakeflow Declarative Pipeline\n",
"\n",
"This Delta Live Tables (DLT) definition is executed using a pipeline defined in resources/{{.project_name}}.pipeline.yml."
"This Lakeflow Declarative Pipeline (LDP) definition is executed using a pipeline defined in resources/{{.project_name}}.pipeline.yml."
]
},
{
@@ -86,7 +86,7 @@
"notebookMetadata": {
"pythonIndentUnit": 2
},
"notebookName": "dlt_pipeline",
"notebookName": "pipeline",
"widgets": {}
},
"kernelspec": {
@@ -14,7 +14,7 @@
"source": [
"# DLT pipeline\n",
"\n",
"This Delta Live Tables (DLT) definition is executed using a pipeline defined in resources/{{.project_name}}.pipeline.yml."
"This Lakeflow Declarative Pipeline (LDP) definition is executed using a pipeline defined in resources/{{.project_name}}.pipeline.yml."
Contributor: Please remove this acronym, we don't generally use it.
]
},
{