Commands for working with Dagster assets.
dagster asset [OPTIONS] COMMAND [ARGS]...
Commands
Eliminate asset key indexes from event logs.
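For example, wiping the index for a single asset key might look like the sketch below. The asset key is a hypothetical placeholder and the `wipe` subcommand name is an assumption based on the summary above; the command is printed rather than executed, since it modifies event-log storage.

```shell
# Sketch: printed, not executed. The asset key is hypothetical, and the
# `wipe` subcommand name is an assumption based on the summary above.
asset_cmds='dagster asset wipe my_asset_key'
echo "$asset_cmds"
```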
Commands for helping debug Dagster issues by dumping or loading artifacts from specific runs.
This can be used to send a file to someone, such as the Dagster team, who doesn't have direct access to your instance, so that they can view the events and details of a specific run.
Debug files can be viewed using the dagit-debug CLI. Debug files can also be downloaded from Dagit.
dagster debug [OPTIONS] COMMAND [ARGS]...
Commands
Export the relevant artifacts for a job…
Import the relevant artifacts from debug…
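A typical round trip with these commands might look like the sketch below. The run id and file name are hypothetical placeholders, and the positional-argument order is an assumption based on the summaries above; the commands are printed rather than executed, since they assume a live instance.

```shell
# Sketch: printed, not executed. Run id and file name are hypothetical.
debug_cmds=$(cat <<'CMDS'
dagster debug export 3f2e1a9b run_debug.gzip
dagster debug import run_debug.gzip
CMDS
)
echo "$debug_cmds"
```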
Commands for working with the current Dagster instance.
dagster instance [OPTIONS] COMMAND [ARGS]...
Commands
List the information about the current…
Automatically migrate an out of date…
Rebuild index over historical runs for…
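As a sketch, a routine instance check and upgrade could look like the following. The commands are printed rather than executed, since they act on the instance resolved from DAGSTER_HOME.

```shell
# Sketch: printed, not executed. These commands act on the instance
# resolved from DAGSTER_HOME.
instance_cmds=$(cat <<'CMDS'
dagster instance info
dagster instance migrate
dagster instance reindex
CMDS
)
echo "$instance_cmds"
```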
Commands for working with Dagster jobs.
dagster job [OPTIONS] COMMAND [ARGS]...
Commands
Backfill a partitioned job.
Execute a job.
Launch a job using the run launcher…
List the jobs in a repository.
Display the freshness of memoized results…
Print a job.
Scaffold the config for a job.
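Putting a few of these together, listing and then executing a job might look like the sketch below. The repository file and job name are hypothetical, and the -j flag for selecting the job by name is an assumption; the commands are printed rather than executed.

```shell
# Sketch: printed, not executed. repo.py and my_job are hypothetical
# names; -j as the job-selection flag is an assumption.
job_cmds=$(cat <<'CMDS'
dagster job list -f repo.py
dagster job execute -f repo.py -j my_job
CMDS
)
echo "$job_cmds"
```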
Commands for working with Dagster job runs.
dagster run [OPTIONS] COMMAND [ARGS]...
Commands
Delete a run by id and its associated…
List the runs in the current Dagster…
Migrate the run history for a job from a…
Eliminate all run history and event logs.
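For instance, listing runs and then deleting one by id might look like this sketch. The run id is a hypothetical placeholder; the commands are printed rather than executed, since deletion is destructive.

```shell
# Sketch: printed, not executed. The run id is a hypothetical placeholder.
run_cmds=$(cat <<'CMDS'
dagster run list
dagster run delete 3f2e1a9b
CMDS
)
echo "$run_cmds"
```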
Commands for working with Dagster schedules.
dagster schedule [OPTIONS] COMMAND [ARGS]...
Commands
Debug information about the scheduler.
List all schedules that correspond to a…
Get logs for a schedule.
Preview changes that will be performed by…
Restart a running schedule.
Start an existing schedule.
Stop an existing schedule.
Delete the schedule history and turn off…
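A typical schedule lifecycle using these commands might look like the sketch below. The repository file and schedule name are hypothetical; the commands are printed rather than executed, since they assume a configured scheduler.

```shell
# Sketch: printed, not executed. repo.py and my_schedule are hypothetical.
schedule_cmds=$(cat <<'CMDS'
dagster schedule list -f repo.py
dagster schedule start my_schedule
dagster schedule logs my_schedule
dagster schedule stop my_schedule
CMDS
)
echo "$schedule_cmds"
```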
Commands for working with Dagster sensors.
dagster sensor [OPTIONS] COMMAND [ARGS]...
Commands
Set the cursor value for an existing sensor.
List all sensors that correspond to a…
Preview an existing sensor execution.
Start an existing sensor.
Stop an existing sensor.
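As a sketch, starting a sensor, setting its cursor, and stopping it might look like the following. The sensor name and cursor value are hypothetical, and the --set flag for the cursor subcommand is an assumption; the commands are printed rather than executed.

```shell
# Sketch: printed, not executed. my_sensor and the cursor value are
# hypothetical; the --set flag is an assumption.
sensor_cmds=$(cat <<'CMDS'
dagster sensor start my_sensor
dagster sensor cursor my_sensor --set '2022-01-01'
dagster sensor stop my_sensor
CMDS
)
echo "$sensor_cmds"
```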
Commands for bootstrapping new Dagster projects and repositories.
dagster project [OPTIONS] COMMAND [ARGS]...
Commands
Download one of the official Dagster examples to the current directory. This CLI enables you to quickly bootstrap your project with an officially maintained example.
List the examples that are available to bootstrap with.
Create a folder structure with a single Dagster repository and other files such as workspace.yaml, in the current directory. This CLI enables you to quickly start building a new Dagster project with everything set up.
Create a folder structure with a single Dagster repository, in the current directory. This CLI helps you scaffold a new Dagster repository within a folder structure that includes multiple Dagster repositories.
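For example, scaffolding a fresh project or copying one of the official examples might look like the sketch below. The project and example names are hypothetical, and the from-example subcommand and its flags are assumptions based on the summaries above; the commands are printed rather than executed.

```shell
# Sketch: printed, not executed. Names are hypothetical; from-example
# and its flags are assumptions based on the summaries above.
project_cmds=$(cat <<'CMDS'
dagster project scaffold --name my-dagster-project
dagster project from-example --name my-copy --example some_example
CMDS
)
echo "$project_cmds"
```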
Run a GraphQL query against the dagster interface to a specified repository or pipeline/job.
Can only use ONE of --workspace/-w, --python-file/-f, --module-name/-m, --grpc-port, --grpc-socket.
Examples:
dagster-graphql
dagster-graphql -y path/to/workspace.yaml
dagster-graphql -f path/to/file.py -a define_repo
dagster-graphql -m some_module -a define_repo
dagster-graphql -f path/to/file.py -a define_pipeline
dagster-graphql -m some_module -a define_pipeline
dagster-graphql [OPTIONS]
Options
Show the version and exit.
GraphQL document to execute passed as a string
GraphQL document to execute passed as a file
GraphQL document to execute, from a predefined set provided by dagster-graphql. Default: launchPipelineExecution
A JSON encoded string containing the variables for GraphQL execution.
A URL for a remote instance running dagit server to send the GraphQL request to.
A file path to store the GraphQL response to. This flag is useful when making pipeline/job execution queries, since pipeline/job execution causes logs to print to stdout and stderr.
Use an ephemeral DagsterInstance instead of resolving via DAGSTER_HOME
Allow an empty workspace
Path to workspace file. Argument can be provided multiple times.
Specify working directory to use when loading the repository or job
Specify python file where repository or job function lives
Specify Python package where repository or job function lives
Specify module where repository or job function lives
Attribute that is either a 1) repository or job or 2) a function that returns a repository or job
Port to use to connect to gRPC server
Named socket to use to connect to gRPC server
Host to use to connect to gRPC server, defaults to localhost
Use a secure channel when connecting to the gRPC server
Environment variables
Provide a default for --working-directory
Provide a default for --python-file
Provide a default for --package-name
Provide a default for --module-name
Provide a default for --attribute
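Beyond the loading examples above, a query can also be passed inline. The sketch below assumes -t is the flag for passing the GraphQL document as a string (per the option description above), and the query body is purely illustrative; the command is printed rather than executed.

```shell
# Sketch: printed, not executed. Assumes -t passes the GraphQL document
# as a string; the query body is illustrative.
graphql_cmds="dagster-graphql -w path/to/workspace.yaml -t '{ repositoriesOrError { __typename } }'"
echo "$graphql_cmds"
```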
Run dagit. Loads a repository or pipeline/job.
Can only use ONE of --workspace/-w, --python-file/-f, --module-name/-m, --grpc-port, --grpc-socket.
Examples:
dagit (works if .workspace.yaml exists)
dagit -w path/to/workspace.yaml
dagit -f path/to/file.py
dagit -f path/to/file.py -d path/to/working_directory
dagit -m some_module
dagit -f path/to/file.py -a define_repo
dagit -m some_module -a define_repo
dagit -p 3333
Options can also provide arguments via environment variables prefixed with DAGIT
For example, DAGIT_PORT=3333 dagit
dagit [OPTIONS]
Options
Use a secure channel when connecting to the gRPC server
Host to use to connect to gRPC server, defaults to localhost
Named socket to use to connect to gRPC server
Port to use to connect to gRPC server
Attribute that is either a 1) repository or job or 2) a function that returns a repository or job
Specify module where repository or job function lives
Specify Python package where repository or job function lives
Specify python file where repository or job function lives
Specify working directory to use when loading the repository or job
Path to workspace file. Argument can be provided multiple times.
Allow an empty workspace
Host to run server on. Default: 127.0.0.1
Port to run server on. Default: 3000
The path prefix where Dagit will be hosted (e.g. /dagit)
The timeout in milliseconds to set on database statements sent to the DagsterInstance. Not respected in all configurations. Default: 15000
The maximum age of a connection to use from the sqlalchemy pool without connection recycling. Set to -1 to disable. Not respected in all configurations. Default: 3600
Start Dagit in read-only mode, where all mutations such as launching runs and turning schedules on/off are turned off.
Filter all warnings when hosting Dagit.
Set the log level for the uvicorn web server. Default: warning. Options: critical | error | warning | info | debug | trace
Show the version and exit.
Environment variables
Provide a default for --attribute
Provide a default for --module-name
Provide a default for --package-name
Provide a default for --python-file
Provide a default for --working-directory
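Combining several of the options above, a read-only deployment might be started as in this sketch. The flag names are taken from the option summaries above and the workspace path is hypothetical; the command is printed rather than executed, since it would start a long-running server.

```shell
# Sketch: printed, not executed. The workspace path is hypothetical.
dagit_cmds='dagit -w path/to/workspace.yaml -p 3333 --read-only --log-level debug'
echo "$dagit_cmds"
```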
Run any daemons configured on the DagsterInstance.
dagster-daemon run [OPTIONS]
Options
Use a secure channel when connecting to the gRPC server
Host to use to connect to gRPC server, defaults to localhost
Named socket to use to connect to gRPC server
Port to use to connect to gRPC server
Attribute that is either a 1) repository or job or 2) a function that returns a repository or job
Specify module where repository or job function lives
Specify Python package where repository or job function lives
Specify python file where repository or job function lives
Specify working directory to use when loading the repository or job
Path to workspace file. Argument can be provided multiple times.
Allow an empty workspace
Environment variables
Provide a default for --attribute
Provide a default for --module-name
Provide a default for --package-name
Provide a default for --python-file
Provide a default for --working-directory
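In a typical deployment the daemon runs alongside Dagit against the same instance. A sketch of starting it is shown below; DAGSTER_HOME and the workspace path are hypothetical, and the commands are printed rather than executed, since the daemon is a long-running process.

```shell
# Sketch: printed, not executed. DAGSTER_HOME and the workspace path are
# hypothetical; the daemon must share its instance with Dagit.
daemon_cmds=$(cat <<'CMDS'
export DAGSTER_HOME=/opt/dagster/dagster_home
dagster-daemon run -w path/to/workspace.yaml
CMDS
)
echo "$daemon_cmds"
```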
Wipe all heartbeats from storage.
dagster-daemon wipe [OPTIONS]
Log all heartbeat statuses
dagster-daemon debug heartbeat-dump [OPTIONS]
Serve the Dagster inter-process API over gRPC
dagster api grpc [OPTIONS]
Options
Port over which to serve. You must pass one and only one of --port/-p or --socket/-s.
Serve over a UDS socket. You must pass one and only one of --port/-p or --socket/-s.
Hostname at which to serve. Default is localhost.
Maximum number of (threaded) workers to use in the gRPC server
If set, the gRPC server will shut itself down when it fails to receive a heartbeat after a timeout configurable with --heartbeat-timeout.
Timeout after which to shut down if --heartbeat is set and a heartbeat is not received
Wait until the first LoadRepositories call to actually load the repositories, instead of waiting to load them when the server is launched. Useful for surfacing errors when the server is managed directly from Dagit
Attribute that is either a 1) repository or job or 2) a function that returns a repository or job
Specify module where repository or job function lives
Specify Python package where repository or job function lives
Specify python file where repository or job function lives
Specify working directory to use when loading the repository or job
If this flag is set, the server will signal to clients that they should launch dagster commands using <this server's python executable> -m dagster, instead of the default dagster entry point. This is useful when there are multiple Python environments running on the same machine, so a single dagster entry point is not enough to uniquely determine the environment.
Indicates that the working directory should be empty and should not default to the current directory
[INTERNAL] This option should generally not be used by users. Internal param used by dagster when it automatically spawns gRPC servers to communicate the success or failure of the server launching.
[INTERNAL] This option should generally not be used by users. Internal param used by dagster to spawn a gRPC server with the specified server id.
[INTERNAL] This option should generally not be used by users. Override the system timezone for tests.
Level at which to log output from the gRPC server process
Container image to use to run code from this server.
Serialized JSON with configuration for any containers created to run the code from this server.
Whether to load env vars from the instance and inject them into the environment.
Name of the code location this server corresponds to.
[INTERNAL] Serialized InstanceRef to use for accessing the instance
Environment variables
Provide a default for --port
Provide a default for --socket
Provide a default for --host
Provide a default for --lazy-load-user-code
Provide a default for --attribute
Provide a default for --module-name
Provide a default for --package-name
Provide a default for --python-file
Provide a default for --working-directory
Provide a default for --use-python-environment-entry-point
Provide a default for --empty-working-directory
Provide a default for --container-image
Provide a default for --container-context
Provide a default for --inject-env-vars-from-instance
Provide a default for --location-name
Provide a default for --instance-ref
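A minimal server start, using the port and Python-file options described above, might look like this sketch. The file, attribute, and port are hypothetical placeholders; the command is printed rather than executed, since it would start a long-running server.

```shell
# Sketch: printed, not executed. repo.py, my_repo, and the port are
# hypothetical placeholders.
grpc_cmds='dagster api grpc -p 4266 -f repo.py -a my_repo --host 0.0.0.0'
echo "$grpc_cmds"
```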