Dagster CLI

dagster asset

Commands for working with Dagster assets.

dagster asset [OPTIONS] COMMAND [ARGS]...

Commands

list

List assets

materialize

Execute a run to materialize a selection…

wipe

Eliminate asset key indexes from event logs.
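
For example, assuming your assets are defined in a module named my_module (a placeholder name), you might list them and then materialize a selection:

  dagster asset list -m my_module

  dagster asset materialize -m my_module --select my_asset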

dagster debug

Commands for helping debug Dagster issues by dumping or loading artifacts from specific runs.

This can be used to send a file to someone like the Dagster team who doesn’t have direct access to your instance to allow them to view the events and details of a specific run.

Debug files can be viewed using the dagit-debug CLI. They can also be downloaded from Dagit.

dagster debug [OPTIONS] COMMAND [ARGS]...

Commands

export

Export the relevant artifacts for a job…

import

Import the relevant artifacts from debug…
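
For example, to export the artifacts of a specific run to a file that can be shared and loaded elsewhere (the run ID and file name below are placeholders):

  dagster debug export <run_id> my_run.gzip

  dagster debug import my_run.gzip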

dagster dev

Start a local deployment of Dagster, including dagit running on localhost and the dagster-daemon running in the background.

dagster dev [OPTIONS]

Options

-d, --working-directory <working_directory>

Specify working directory to use when loading the repository or job

-m, --module-name <module_name>

Specify module or modules (flag can be used multiple times) where dagster definitions reside as top-level symbols/variables and load each module as a code location in the current python environment.

-f, --python-file <python_file>

Specify python file or files (flag can be used multiple times) where dagster definitions reside as top-level symbols/variables and load each file as a code location in the current python environment.

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--code-server-log-level <code_server_log_level>

Set the log level for code servers spun up by dagster services.

Default:

warning

Options:

critical | error | warning | info | debug | trace

-p, --dagit-port <dagit_port>

Port to use for the Dagit UI.

-h, --dagit-host <dagit_host>

Host to use for the Dagit UI.

Environment variables

DAGSTER_WORKING_DIRECTORY

Provide a default for --working-directory

DAGSTER_MODULE_NAME

Provide a default for --module-name

DAGSTER_PYTHON_FILE

Provide a default for --python-file
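
For example, assuming your definitions live in a file named defs.py (a placeholder path), the following starts a local deployment with the Dagit UI on port 3000:

  dagster dev -f defs.py -p 3000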

dagster instance

Commands for working with the current Dagster instance.

dagster instance [OPTIONS] COMMAND [ARGS]...

Commands

info

List the information about the current…

migrate

Automatically migrate an out of date…

reindex

Rebuild index over historical runs for…
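
For example, after upgrading the dagster package you might inspect the current instance and then migrate its storage schema:

  dagster instance info

  dagster instance migrate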

dagster job

Commands for working with Dagster jobs.

dagster job [OPTIONS] COMMAND [ARGS]...

Commands

backfill

Backfill a partitioned job.

execute

Execute a job.

launch

Launch a job using the run launcher…

list

List the jobs in a repository.

list_versions

Display the freshness of memoized results…

print

Print a job.

scaffold_config

Scaffold the config for a job.
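
For example, assuming a file jobs.py that defines a job named my_job (both placeholders), you could list the available jobs and execute one in-process; the -j flag for selecting a job by name is shown as an illustration and may vary by Dagster version:

  dagster job list -f jobs.py

  dagster job execute -f jobs.py -j my_job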

dagster run

Commands for working with Dagster job runs.

dagster run [OPTIONS] COMMAND [ARGS]...

Commands

delete

Delete a run by id and its associated…

list

List the runs in the current Dagster…

migrate-repository

Migrate the run history for a job from a…

wipe

Eliminate all run history and event logs.
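
For example, to review the runs recorded on the current instance and delete one of them (the run ID is a placeholder):

  dagster run list

  dagster run delete <run_id>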

dagster schedule

Commands for working with Dagster schedules.

dagster schedule [OPTIONS] COMMAND [ARGS]...

Commands

debug

Debug information about the scheduler.

list

List all schedules that correspond to a…

logs

Get logs for a schedule.

preview

Preview changes that will be performed by…

restart

Restart a running schedule.

start

Start an existing schedule.

stop

Stop an existing schedule.

wipe

Delete the schedule history and turn off…
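
For example, assuming a workspace.yaml that loads a code location defining a schedule named my_schedule (both placeholders), you might list schedules and turn one on:

  dagster schedule list -w workspace.yaml

  dagster schedule start my_schedule -w workspace.yaml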

dagster sensor

Commands for working with Dagster sensors.

dagster sensor [OPTIONS] COMMAND [ARGS]...

Commands

cursor

Set the cursor value for an existing sensor.

list

List all sensors that correspond to a…

preview

Preview an existing sensor execution.

start

Start an existing sensor.

stop

Stop an existing sensor.
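
For example, assuming the same workspace.yaml and a sensor named my_sensor (placeholders), you might start the sensor and set its cursor; the --set flag is shown as an illustration of the cursor command:

  dagster sensor start my_sensor -w workspace.yaml

  dagster sensor cursor my_sensor --set <cursor_value> -w workspace.yaml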

dagster project

Commands for bootstrapping new Dagster projects and code locations.

dagster project [OPTIONS] COMMAND [ARGS]...

Commands

from-example

Download one of the official Dagster examples to the current directory. This CLI enables you to quickly bootstrap your project with an officially maintained example.

list-examples

List the examples that are available to bootstrap with.

scaffold

Create a folder structure with a single Dagster code location and other files such as pyproject.toml. This CLI enables you to quickly start building a new Dagster project with everything set up.

scaffold-code-location

Create a folder structure with a single Dagster code location, in the current directory. This CLI helps you to scaffold a new Dagster code location within a folder structure that includes multiple Dagster code locations.

scaffold-repository

(DEPRECATED; Use dagster project scaffold-code-location instead) Create a folder structure with a single Dagster repository, in the current directory. This CLI helps you to scaffold a new Dagster repository within a folder structure that includes multiple Dagster repositories.
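
For example, to scaffold a new project, or to bootstrap one from an official example (the project and example names below are placeholders; use list-examples to see valid example names):

  dagster project scaffold --name my-dagster-project

  dagster project from-example --name my-project --example <example_name>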

dagster-graphql

Run a GraphQL query against the dagster interface to a specified repository or pipeline/job.

Can only use ONE of --workspace/-w, --python-file/-f, --module-name/-m, --grpc-port, --grpc-socket.

Examples:

  1. dagster-graphql

  2. dagster-graphql -y path/to/workspace.yaml

  3. dagster-graphql -f path/to/file.py -a define_repo

  4. dagster-graphql -m some_module -a define_repo

  5. dagster-graphql -f path/to/file.py -a define_pipeline

  6. dagster-graphql -m some_module -a define_pipeline

dagster-graphql [OPTIONS]

Options

--version

Show the version and exit.

-t, --text <text>

GraphQL document to execute passed as a string

-f, --file <file>

GraphQL document to execute passed as a file

-p, --predefined <predefined>

GraphQL document to execute, from a predefined set provided by dagster-graphql.

Options:

launchPipelineExecution

-v, --variables <variables>

A JSON encoded string containing the variables for GraphQL execution.

-r, --remote <remote>

A URL for a remote instance running dagit server to send the GraphQL request to.

-o, --output <output>

A file path to store the GraphQL response to. This flag is useful when making pipeline/job execution queries, since pipeline/job execution causes logs to print to stdout and stderr.

--ephemeral-instance

Use an ephemeral DagsterInstance instead of resolving via DAGSTER_HOME

--empty-workspace

Allow an empty workspace

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

-d, --working-directory <working_directory>

Specify working directory to use when loading the repository or job

-f, --python-file <python_file>

Specify python file or files (flag can be used multiple times) where dagster definitions reside as top-level symbols/variables and load each file as a code location in the current python environment.

-m, --module-name <module_name>

Specify module or modules (flag can be used multiple times) where dagster definitions reside as top-level symbols/variables and load each module as a code location in the current python environment.

--package-name <package_name>

Specify Python package where repository or job function lives

-a, --attribute <attribute>

Attribute that is either 1) a repository or job or 2) a function that returns a repository or job

--grpc-port <grpc_port>

Port to use to connect to gRPC server

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--use-ssl

Use a secure channel when connecting to the gRPC server

Environment variables

DAGSTER_WORKING_DIRECTORY

Provide a default for --working-directory

DAGSTER_PYTHON_FILE

Provide a default for --python-file

DAGSTER_MODULE_NAME

Provide a default for --module-name

DAGSTER_PACKAGE_NAME

Provide a default for --package-name

DAGSTER_ATTRIBUTE

Provide a default for --attribute
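
For example, assuming a workspace.yaml in the current directory, the following executes a small inline query and writes the response to a file (the query is a minimal sketch; use Dagit's GraphiQL playground to explore the full schema):

  dagster-graphql -w workspace.yaml -t '{ version }' -o response.json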

dagit

Run dagit. Loads a repository or pipeline/job.

Can only use ONE of --workspace/-w, --python-file/-f, --module-name/-m, --grpc-port, --grpc-socket.

Examples:

  1. dagit (works if .workspace.yaml exists)

  2. dagit -w path/to/workspace.yaml

  3. dagit -f path/to/file.py

  4. dagit -f path/to/file.py -d path/to/working_directory

  5. dagit -m some_module

  6. dagit -f path/to/file.py -a define_repo

  7. dagit -m some_module -a define_repo

  8. dagit -p 3333

Options can also be provided via environment variables prefixed with DAGIT_

For example, DAGIT_PORT=3333 dagit

dagit [OPTIONS]

Options

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or job or 2) a function that returns a repository or job

--package-name <package_name>

Specify Python package where repository or job function lives

-m, --module-name <module_name>

Specify module or modules (flag can be used multiple times) where dagster definitions reside as top-level symbols/variables and load each module as a code location in the current python environment.

-f, --python-file <python_file>

Specify python file or files (flag can be used multiple times) where dagster definitions reside as top-level symbols/variables and load each file as a code location in the current python environment.

-d, --working-directory <working_directory>

Specify working directory to use when loading the repository or job

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

-h, --host <host>

Host to run server on

Default:

127.0.0.1

-p, --port <port>

Port to run server on - defaults to 3000

-l, --path-prefix <path_prefix>

The path prefix where Dagit will be hosted (e.g. /dagit)

Default:

(empty string)

--db-statement-timeout <db_statement_timeout>

The timeout in milliseconds to set on database statements sent to the DagsterInstance. Not respected in all configurations.

Default:

15000

--db-pool-recycle <db_pool_recycle>

The maximum age of a connection to use from the sqlalchemy pool without connection recycling. Set to -1 to disable. Not respected in all configurations.

Default:

3600

--read-only

Start Dagit in read-only mode, where all mutations such as launching runs and turning schedules on/off are turned off.

--suppress-warnings

Filter all warnings when hosting Dagit.

--log-level <log_level>

Set the log level for the uvicorn web server.

Default:

warning

Options:

critical | error | warning | info | debug | trace

--code-server-log-level <code_server_log_level>

Set the log level for any code servers spun up by dagit.

Default:

info

Options:

critical | error | warning | info | debug | trace

--version

Show the version and exit.

Environment variables

DAGSTER_ATTRIBUTE

Provide a default for --attribute

DAGSTER_PACKAGE_NAME

Provide a default for --package-name

DAGSTER_MODULE_NAME

Provide a default for --module-name

DAGSTER_PYTHON_FILE

Provide a default for --python-file

DAGSTER_WORKING_DIRECTORY

Provide a default for --working-directory
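
For example, to host a read-only Dagit behind a proxy under a path prefix (the workspace path is a placeholder):

  dagit -w path/to/workspace.yaml --read-only -l /dagit -p 3000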

dagster-daemon run

Run any daemons configured on the DagsterInstance.

dagster-daemon run [OPTIONS]

Options

--code-server-log-level <code_server_log_level>

Set the log level for any code servers spun up by the daemon.

Default:

warning

Options:

critical | error | warning | info | debug | trace

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or job or 2) a function that returns a repository or job

--package-name <package_name>

Specify Python package where repository or job function lives

-m, --module-name <module_name>

Specify module or modules (flag can be used multiple times) where dagster definitions reside as top-level symbols/variables and load each module as a code location in the current python environment.

-f, --python-file <python_file>

Specify python file or files (flag can be used multiple times) where dagster definitions reside as top-level symbols/variables and load each file as a code location in the current python environment.

-d, --working-directory <working_directory>

Specify working directory to use when loading the repository or job

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

Environment variables

DAGSTER_ATTRIBUTE

Provide a default for --attribute

DAGSTER_PACKAGE_NAME

Provide a default for --package-name

DAGSTER_MODULE_NAME

Provide a default for --module-name

DAGSTER_PYTHON_FILE

Provide a default for --python-file

DAGSTER_WORKING_DIRECTORY

Provide a default for --working-directory
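
For example, to run the daemon against the same workspace that Dagit serves (the workspace path is a placeholder), with DAGSTER_HOME pointing at the shared instance directory:

  dagster-daemon run -w path/to/workspace.yaml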

dagster-daemon wipe

Wipe all heartbeats from storage.

dagster-daemon wipe [OPTIONS]

dagster-daemon debug heartbeat-dump

Log all heartbeat statuses

dagster-daemon debug heartbeat-dump [OPTIONS]

dagster api grpc

Serve the Dagster inter-process API over GRPC

dagster api grpc [OPTIONS]

Options

-p, --port <port>

Port over which to serve. You must pass one and only one of --port/-p or --socket/-s.

-s, --socket <socket>

Serve over a UDS socket. You must pass one and only one of --port/-p or --socket/-s.

-h, --host <host>

Hostname at which to serve. Default is localhost.

-n, --max-workers, --max_workers <max_workers>

Maximum number of (threaded) workers to use in the GRPC server

--heartbeat

If set, the GRPC server will shut itself down when it fails to receive a heartbeat after a timeout configurable with --heartbeat-timeout.

--heartbeat-timeout <heartbeat_timeout>

Timeout after which to shutdown if --heartbeat is set and a heartbeat is not received

--lazy-load-user-code

Wait until the first LoadRepositories call to actually load the repositories, instead of waiting to load them when the server is launched. Useful for surfacing errors when the server is managed directly from Dagit

-a, --attribute <attribute>

Attribute that is either 1) a repository or job or 2) a function that returns a repository or job

--package-name <package_name>

Specify Python package where repository or job function lives

-m, --module-name <module_name>

Specify module where dagster definitions reside as top-level symbols/variables and load the module as a code location in the current python environment.

-f, --python-file <python_file>

Specify python file where dagster definitions reside as top-level symbols/variables and load the file as a code location in the current python environment.

-d, --working-directory <working_directory>

Specify working directory to use when loading the repository or job

--use-python-environment-entry-point

If this flag is set, the server will signal to clients that they should launch dagster commands using <this server’s python executable> -m dagster, instead of the default dagster entry point. This is useful when there are multiple Python environments running in the same machine, so a single dagster entry point is not enough to uniquely determine the environment.

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory as a default

--ipc-output-file <ipc_output_file>

[INTERNAL] This option should generally not be used by users. Internal param used by dagster when it automatically spawns gRPC servers to communicate the success or failure of the server launching.

--fixed-server-id <fixed_server_id>

[INTERNAL] This option should generally not be used by users. Internal param used by dagster to spawn a gRPC server with the specified server id.

--override-system-timezone <override_system_timezone>

[INTERNAL] This option should generally not be used by users. Override the system timezone for tests.

--log-level <log_level>

Level at which to log output from the gRPC server process

--container-image <container_image>

Container image to use to run code from this server.

--container-context <container_context>

Serialized JSON with configuration for any containers created to run the code from this server.

--inject-env-vars-from-instance

Whether to load env vars from the instance and inject them into the environment.

--location-name <location_name>

Name of the code location this server corresponds to.

--instance-ref <instance_ref>

[INTERNAL] Serialized InstanceRef to use for accessing the instance

Environment variables

DAGSTER_GRPC_PORT

Provide a default for --port

DAGSTER_GRPC_SOCKET

Provide a default for --socket

DAGSTER_GRPC_HOST

Provide a default for --host

DAGSTER_LAZY_LOAD_USER_CODE

Provide a default for --lazy-load-user-code

DAGSTER_ATTRIBUTE

Provide a default for --attribute

DAGSTER_PACKAGE_NAME

Provide a default for --package-name

DAGSTER_MODULE_NAME

Provide a default for --module-name

DAGSTER_PYTHON_FILE

Provide a default for --python-file

DAGSTER_WORKING_DIRECTORY

Provide a default for --working-directory

DAGSTER_USE_PYTHON_ENVIRONMENT_ENTRY_POINT

Provide a default for --use-python-environment-entry-point

DAGSTER_EMPTY_WORKING_DIRECTORY

Provide a default for --empty-working-directory

DAGSTER_CONTAINER_IMAGE

Provide a default for --container-image

DAGSTER_CONTAINER_CONTEXT

Provide a default for --container-context

DAGSTER_INJECT_ENV_VARS_FROM_INSTANCE

Provide a default for --inject-env-vars-from-instance

DAGSTER_LOCATION_NAME

Provide a default for --location-name

DAGSTER_INSTANCE_REF

Provide a default for --instance-ref
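
For example, to serve a code location from a module over gRPC on a fixed port (the module name and port are placeholders):

  dagster api grpc -m my_module -h 0.0.0.0 -p 4266

The running server can then be referenced from a workspace.yaml via a grpc_server entry that specifies the matching host, port (or socket), and an optional location_name.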