Definitions

class dagster.Definitions(assets=None, schedules=None, sensors=None, jobs=None, resources=None, executor=None, loggers=None)[source]

A set of definitions explicitly available and loadable by Dagster tools.

Parameters:
  • assets (Optional[Iterable[Union[AssetsDefinition, SourceAsset, CacheableAssetsDefinition]]]) – A list of assets. Assets can be created by annotating a function with @asset or @observable_source_asset, or by directly instantiating AssetsDefinition, SourceAsset, or CacheableAssetsDefinition.

  • schedules (Optional[Iterable[Union[ScheduleDefinition, UnresolvedPartitionedAssetScheduleDefinition]]]) – List of schedules.

  • sensors (Optional[Iterable[SensorDefinition]]) – List of sensors, typically created with @sensor.

  • jobs (Optional[Iterable[Union[JobDefinition, UnresolvedAssetJobDefinition]]]) – List of jobs. Typically created with define_asset_job or with @job for jobs defined in terms of ops directly. Jobs created with @job must already have resources bound at job creation time. They do not respect the resources argument here.

  • resources (Optional[Mapping[str, Any]]) – Dictionary of resources to bind to assets. The resources dictionary takes raw Python objects, not just instances of ResourceDefinition. If that raw object inherits from IOManager, it gets coerced to an IOManagerDefinition. Any other object is coerced to a ResourceDefinition. These resources will be automatically bound to any assets passed to this Definitions instance using with_resources. Assets passed to Definitions with resources already bound using with_resources will override this dictionary.

  • executor (Optional[Union[ExecutorDefinition, Executor]]) – Default executor for jobs. Individual jobs can override this and define their own executors by setting the executor on @job or define_asset_job explicitly. This executor will also be used for materializing assets directly outside of the context of jobs. If an Executor is passed, it is coerced into an ExecutorDefinition.

  • loggers (Optional[Mapping[str, LoggerDefinition]]) – Default loggers for jobs. Individual jobs can define their own loggers by setting them explicitly.

Example usage:

defs = Definitions(
    assets=[asset_one, asset_two],
    schedules=[a_schedule],
    sensors=[a_sensor],
    jobs=[a_job],
    resources={
        "a_resource": some_resource,
    }
)
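
The names referenced in this example are hypothetical; a minimal sketch of how they might be defined (with illustrative cron strings, selections, and resource contents) is:

from dagster import ScheduleDefinition, asset, define_asset_job, sensor

@asset
def asset_one():
    return 1

@asset
def asset_two(asset_one):
    return asset_one + 1

# define_asset_job returns an UnresolvedAssetJobDefinition; Definitions resolves it.
a_job = define_asset_job(name="a_job", selection="*")

a_schedule = ScheduleDefinition(job=a_job, cron_schedule="0 0 * * *")

@sensor(job=a_job)
def a_sensor():
    ...

# A plain Python object; Definitions coerces it to a ResourceDefinition.
class SomeResource:
    def get_value(self):
        return "hello"

some_resource = SomeResource()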

Dagster separates user-defined code from system tools such as the web server and the daemon. Rather than loading code directly into the tool's process, a tool such as the webserver interacts with user-defined code over a serialization boundary.

These tools must be able to locate and load this code when they start. Via CLI arguments or config, they specify a Python module to inspect.

A Python module is loadable by Dagster tools if there is a top-level variable that is an instance of Definitions.
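
For example, a code location could consist of a single module (the file name below is illustrative) whose top-level defs variable is the entry point:

# definitions.py
from dagster import Definitions, asset

@asset
def my_asset():
    return 1

# Top-level variable of type Definitions; this is what Dagster tools discover.
defs = Definitions(assets=[my_asset])

The module can then be supplied to a tool, e.g. via dagster dev -m definitions or through workspace configuration.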

Before the introduction of Definitions, @repository was the API for organizing definitions. Definitions provides a few conveniences for dealing with resources that do not apply to old-style @repository declarations; most notably, resources passed to Definitions are automatically bound to assets via with_resources, as described in the resources parameter above.

get_asset_value_loader(instance=None)[source]

Returns an object that can load the contents of assets as Python objects.

Invokes load_input on the IOManager associated with the assets. Avoids spinning up resources separately for each asset.

Usage:

with defs.get_asset_value_loader() as loader:
    asset1 = loader.load_asset_value("asset1")
    asset2 = loader.load_asset_value("asset2")

get_job_def(name)[source]

Get a job definition by name. If you passed in an UnresolvedAssetJobDefinition (the return value of define_asset_job()), it will be resolved to a JobDefinition when returned from this function.
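
For instance, with a hypothetical asset job named "all_assets_job":

from dagster import Definitions, asset, define_asset_job

@asset
def asset_one():
    return 1

all_assets_job = define_asset_job(name="all_assets_job", selection="*")

defs = Definitions(assets=[asset_one], jobs=[all_assets_job])

# Passed in as an UnresolvedAssetJobDefinition; returned fully resolved.
resolved_job = defs.get_job_def("all_assets_job")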

get_schedule_def(name)[source]

Get a schedule definition by name.

get_sensor_def(name)[source]

Get a sensor definition by name.

load_asset_value(asset_key, *, python_type=None, instance=None, partition_key=None)[source]

Load the contents of an asset as a Python object.

Invokes load_input on the IOManager associated with the asset.

If you want to load the values of multiple assets, it’s more efficient to use get_asset_value_loader(), which avoids spinning up resources separately for each asset.

Parameters:
  • asset_key (Union[AssetKey, Sequence[str], str]) – The key of the asset to load.

  • python_type (Optional[Type]) – The Python type to load the asset as. This is what will be returned inside load_input by context.dagster_type.typing_type.

  • partition_key (Optional[str]) – The partition of the asset to load.

Returns:

The contents of an asset as a Python object.
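
A minimal usage sketch (the asset key and type here are hypothetical):

import pandas as pd

# python_type is surfaced to the IOManager's load_input via
# context.dagster_type.typing_type.
orders = defs.load_asset_value("daily_orders", python_type=pd.DataFrame)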

dagster.create_repository_using_definitions_args(name, assets=None, schedules=None, sensors=None, jobs=None, resources=None, executor=None, loggers=None)[source]

Create a named repository using the same arguments as Definitions. In older versions of Dagster, repositories were the mechanism for organizing assets, schedules, sensors, and jobs. There could be many repositories per code location. This was a complicated ontology but gave users a way to organize code locations that contained large numbers of heterogeneous definitions.

This function is a stopgap for those who want to 1) use the new Definitions API but 2) still keep multiple logical groups of definitions in the same code location.

Example usage:

named_repo = create_repository_using_definitions_args(
    name="a_repo",
    assets=[asset_one, asset_two],
    schedules=[a_schedule],
    sensors=[a_sensor],
    jobs=[a_job],
    resources={
        "a_resource": some_resource,
    }
)
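
Because each call returns its own named repository, several logical groups can coexist at the top level of one module (the names below are hypothetical):

team_a_repo = create_repository_using_definitions_args(
    name="team_a_repo",
    assets=[asset_one],
)

team_b_repo = create_repository_using_definitions_args(
    name="team_b_repo",
    assets=[asset_two],
)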