Core Concepts

Understanding these two concepts is key to building with Datallog.

Tasks

A Datallog automation is a workflow composed of one or more tasks.

  • Each task is a Python function decorated with either @core_task or @task.
  • The @core_task decorator marks the entry point of your automation.
  • Tasks pass data to each other by returning a JSON-serializable value. The output of one task becomes the input (the seed) of the next.
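The chaining described above can be sketched in plain Python. Note that the stand-in decorators below are assumptions that only mimic what Datallog's real @core_task and @task presumably do; the actual SDK import path and decorator behavior are not shown in this document, and the platform would normally chain the tasks for you.

```python
import json

def core_task(fn):
    # Stand-in for Datallog's @core_task (assumed): marks the entry point.
    fn.is_entry = True
    return fn

def task(fn):
    # Stand-in for Datallog's @task (assumed): a regular workflow step.
    return fn

@core_task
def fetch_user(seed):
    # Entry-point task: returns a JSON-serializable object.
    return {"user_id": seed["id"], "name": "Ada"}

@task
def greet(seed):
    # Receives the previous task's output as its seed.
    return {"message": f"Hello, {seed['name']}!"}

# Chain the tasks by hand to illustrate the data flow.
result = greet(fetch_user({"id": 1}))
json.dumps(result)  # would raise TypeError if the output were not serializable
print(result["message"])  # → Hello, Ada!
```

The JSON-serializability requirement is what lets the platform persist each task's output and hand it to the next task as a seed.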

Automatic Parallelization

Datallog can automatically parallelize your workflow.

  • If a task returns a list of items, Datallog will invoke the next task for each item in parallel.
  • This "forking" behavior allows you to process large datasets efficiently without writing any complex concurrency code.

To deploy your automation to the cloud, use the datallog login and datallog push commands. For details, check the built-in help: datallog <command> --help. Happy coding! ✨