Datastores

A Flow supports multiple backend datastores, which configure where the workflow's data is stored. Layers in the Flow read from and write to the datastore to pass artifacts between layers.

Local

By default, flows are configured to use the Local datastore, which writes Flow artifacts to a location on local disk.

from laminar import Flow
from laminar.configurations import datastores

class LocalFlow(Flow):
    ...

flow = LocalFlow(datastore=datastores.Local())
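As a rough mental model, a disk-backed datastore serializes each artifact to a file whose path is derived from its key. The sketch below is illustrative only, not laminar's implementation; the `LocalStore` class, its `write`/`read` methods, and the key layout are all hypothetical.

```python
import pickle
import tempfile
from pathlib import Path

# Hypothetical disk-backed store; laminar's Local datastore differs
# in path layout and serialization details.
class LocalStore:
    def __init__(self, root: str) -> None:
        self.root = Path(root)

    def write(self, key: str, value: object) -> None:
        # Pickle each artifact to a file derived from its key.
        path = self.root / key
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_bytes(pickle.dumps(value))

    def read(self, key: str) -> object:
        return pickle.loads((self.root / key).read_bytes())

with tempfile.TemporaryDirectory() as root:
    store = LocalStore(root)
    store.write("LocalFlow/First/message", "hello")
    artifact = store.read("LocalFlow/First/message")
```

Because artifacts live on disk, they survive the process that wrote them, which is what lets separate layer executions exchange data.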

Memory

The Memory datastore writes artifacts to an in-memory key/value store. This is very useful for testing.

from laminar import Flow
from laminar.configurations import datastores

class MemoryFlow(Flow):
    ...

flow = MemoryFlow(datastore=datastores.Memory())

Warning

The Memory datastore can only be used with the Thread executor, because only the main process can write to it.
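The restriction follows from how memory is shared: threads run inside the main process and see the same objects, while a separate process only gets its own copy. A minimal sketch (not laminar's implementation; `MemoryStore` is hypothetical) of a store shared across threads:

```python
import threading

# Hypothetical in-memory store; threads in one process share it,
# but a separate worker process would only see its own copy.
class MemoryStore:
    def __init__(self) -> None:
        self._data: dict = {}
        self._lock = threading.Lock()

    def write(self, key, value) -> None:
        with self._lock:
            self._data[key] = value

    def read(self, key):
        with self._lock:
            return self._data[key]

store = MemoryStore()

# Writes performed on worker threads are visible to the main thread
# because all threads share the process's memory.
threads = [
    threading.Thread(target=store.write, args=(f"artifact-{i}", i))
    for i in range(4)
]
for t in threads:
    t.start()
for t in threads:
    t.join()

values = [store.read(f"artifact-{i}") for i in range(4)]
```

Run the same writes from separate processes instead of threads and the main process's store would stay empty, which is why a process-based executor cannot use the Memory datastore.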

AWS.S3

Warning

AWS.S3 is experimental.

The AWS.S3 datastore writes artifacts to the AWS S3 object storage service.

from laminar import Flow
from laminar.configurations import datastores

class S3Flow(Flow):
    ...

flow = S3Flow(datastore=datastores.AWS.S3())