Integrate with other orchestration tools

Alongside dbt Cloud, discover other ways to schedule and run your dbt jobs with the help of tools such as the ones described on this page.

Build and install these tools to automate your data workflows, trigger dbt jobs (including those hosted on dbt Cloud), and enjoy a hassle-free experience, saving time and increasing efficiency.

Airflow

If your organization uses Airflow, there are a number of ways you can run your dbt jobs, including:

Installing the dbt Cloud Provider to orchestrate dbt Cloud jobs. This package contains multiple Hooks, Operators, and Sensors to complete various actions within dbt Cloud.

Airflow DAG using DbtCloudRunJobOperator
dbt Cloud job triggered by Airflow

For more details on both of these methods, including example implementations, check out this guide.
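
As an illustrative sketch of the first approach, a minimal DAG that triggers a dbt Cloud job with the DbtCloudRunJobOperator might look like the following (the connection ID and job ID are placeholders):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.dbt.cloud.operators.dbt import DbtCloudRunJobOperator

with DAG(
    dag_id="trigger_dbt_cloud_job",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
):
    # Triggers the dbt Cloud job and, by default, waits for it to finish.
    DbtCloudRunJobOperator(
        task_id="run_dbt_cloud_job",
        dbt_cloud_conn_id="dbt_cloud",  # Airflow connection holding your dbt Cloud API token
        job_id=123456,                  # placeholder dbt Cloud job ID
        check_interval=60,              # poll the run status every 60 seconds
        timeout=3600,                   # fail if the run hasn't finished within an hour
    )
```

Because the operator waits for the triggered run to terminate by default, downstream tasks only start once the dbt Cloud job has completed.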

Automation servers

Automation servers (such as CodeDeploy, GitLab CI/CD, Bamboo, and Jenkins) can be used to schedule bash commands for dbt. They also provide a UI to view command line logs, and they integrate with your git repository.

Azure Data Factory

Integrate dbt Cloud and Azure Data Factory (ADF) for a smooth workflow from data ingestion to data transformation. You can seamlessly trigger dbt Cloud jobs upon completion of ingestion jobs by using the dbt API in ADF.

The following video provides a detailed overview of how to trigger a dbt Cloud job via the API in Azure Data Factory.

To use the dbt API to trigger a job in dbt Cloud through ADF (a sketch of the underlying API request follows these steps):

  1. In dbt Cloud, go to the job settings of the daily production job and turn off the scheduled run in the Trigger section.
  2. Create a pipeline in ADF to trigger the dbt Cloud job.
  3. Securely fetch the dbt Cloud service token from a key vault in ADF, using a web call as the first step in the pipeline.
  4. Set the parameters in the pipeline, including the dbt Cloud account ID and job ID, as well as the name of the key vault and secret that contains the service token.
    • You can find the dbt Cloud account and job IDs in the URL. For example, if your URL is https://cloud.getdbt.com/deploy/88888/projects/678910/jobs/123456, the account ID is 88888 and the job ID is 123456.
  5. Trigger the pipeline in ADF to start the dbt Cloud job and monitor the status of the dbt Cloud job in ADF.
  6. In dbt Cloud, check the status of the job and how it was triggered.
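
For reference, the web activity in the pipeline boils down to a single POST request against the dbt Cloud API. A minimal Python sketch of that request, using the placeholder account and job IDs from the example URL above:

```python
import requests

ACCOUNT_ID = 88888   # placeholder account ID from the dbt Cloud URL
JOB_ID = 123456      # placeholder job ID from the dbt Cloud URL
SERVICE_TOKEN = "<service-token-from-key-vault>"

# dbt Cloud API v2 endpoint for triggering a job run.
url = f"https://cloud.getdbt.com/api/v2/accounts/{ACCOUNT_ID}/jobs/{JOB_ID}/run/"

response = requests.post(
    url,
    headers={"Authorization": f"Token {SERVICE_TOKEN}"},
    json={"cause": "Triggered by Azure Data Factory"},
)
response.raise_for_status()

# The returned run ID can be polled to monitor the job's status.
run_id = response.json()["data"]["id"]
print(f"Started dbt Cloud run {run_id}")
```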

Cron

Cron is a simple way to schedule bash commands. However, while it may seem like an easy route, writing the code to handle everything a production deployment needs (such as logging, alerting, and retries) often makes this option more complex than the others listed here.

Dagster

If your organization uses Dagster, you can use the dagster_dbt library to integrate dbt commands into your pipelines. This library supports the execution of dbt through dbt Cloud or dbt Core. Running dbt from Dagster automatically aggregates metadata about your dbt runs. Refer to the example pipeline for details.
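
As a sketch of the dbt Core side, assuming a dbt project at a placeholder path that has already been compiled (so that target/manifest.json exists), dagster_dbt can load the project's models as Dagster assets:

```python
from pathlib import Path

from dagster import AssetExecutionContext, Definitions
from dagster_dbt import DbtCliResource, dbt_assets

# Placeholder path to a dbt project that has already been compiled.
DBT_PROJECT_DIR = Path("path/to/dbt_project")

@dbt_assets(manifest=DBT_PROJECT_DIR / "target" / "manifest.json")
def my_dbt_assets(context: AssetExecutionContext, dbt: DbtCliResource):
    # Run `dbt build` and stream events back to Dagster,
    # which records metadata for each materialized model.
    yield from dbt.cli(["build"], context=context).stream()

defs = Definitions(
    assets=[my_dbt_assets],
    resources={"dbt": DbtCliResource(project_dir=str(DBT_PROJECT_DIR))},
)
```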

Databricks workflows

Use Databricks workflows to call the dbt Cloud job API. This approach offers several benefits, such as integration with other ETL processes, access to dbt Cloud job features, separation of concerns, and job triggering based on custom conditions or logic. These advantages lead to more modularity, efficient debugging, and flexibility in scheduling dbt Cloud jobs.

For more info, refer to the guide on Databricks workflows and dbt Cloud jobs.
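
As a sketch, a Python notebook task in a Databricks workflow could trigger the dbt Cloud job and poll until it finishes; the account ID, job ID, and secret scope/key names below are placeholders:

```python
import time

import requests

ACCOUNT_ID = 88888  # placeholder
JOB_ID = 123456     # placeholder
# Assumes the service token is stored in a Databricks secret scope;
# dbutils is available inside Databricks notebooks.
TOKEN = dbutils.secrets.get("my-scope", "dbt-cloud-token")

BASE_URL = f"https://cloud.getdbt.com/api/v2/accounts/{ACCOUNT_ID}"
HEADERS = {"Authorization": f"Token {TOKEN}"}

# Trigger the dbt Cloud job.
run = requests.post(
    f"{BASE_URL}/jobs/{JOB_ID}/run/",
    headers=HEADERS,
    json={"cause": "Triggered from a Databricks workflow"},
).json()["data"]

# Poll until the run reaches a terminal status
# (10 = success, 20 = error, 30 = cancelled).
while run["status"] not in (10, 20, 30):
    time.sleep(30)
    run = requests.get(f"{BASE_URL}/runs/{run['id']}/", headers=HEADERS).json()["data"]

if run["status"] != 10:
    raise RuntimeError(f"dbt Cloud run {run['id']} finished with status {run['status']}")
```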

Kestra

If your organization uses Kestra, you can leverage the dbt plugin to orchestrate dbt Cloud and dbt Core jobs. Kestra's user interface (UI) has built-in Blueprints, providing ready-to-use workflows. Navigate to the Blueprints page in the left navigation menu and select the dbt tag to find several examples of scheduling dbt Core commands and dbt Cloud jobs as part of your data pipelines. After each scheduled or ad-hoc workflow execution, the Outputs tab in the Kestra UI allows you to download and preview all dbt build artifacts. The Gantt and Topology views additionally render the metadata to visualize dependencies and runtimes of your dbt models and tests. The dbt Cloud task provides convenient links to easily navigate between the Kestra and dbt Cloud UIs.

Orchestra

If your organization uses Orchestra, you can trigger dbt jobs using the dbt Cloud API. Create an API token from your dbt Cloud account and use this to authenticate Orchestra in the Orchestra Portal. For details, refer to the Orchestra docs on dbt Cloud.

Orchestra automatically collects metadata from your runs so you can view your dbt jobs in the context of the rest of your data stack.

The following is an example of the run details in dbt Cloud for a job triggered by Orchestra:

Example of Orchestra triggering a dbt job

The following is an example of viewing lineage in Orchestra for dbt jobs:

Example of a lineage view for dbt jobs in Orchestra

Prefect

If your organization uses Prefect, the way you will run your jobs depends on the Prefect version you're on and whether you're orchestrating dbt Cloud or dbt Core jobs. Refer to the following options:

Prefect DAG using a dbt Cloud job run flow

Prefect 2

dbt Cloud job triggered by Prefect
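
For Prefect 2, a minimal flow built on the prefect-dbt collection might look like this (the service token, account ID, and job ID are placeholders; in practice you'd load the credentials from a saved block):

```python
from prefect import flow
from prefect_dbt.cloud import DbtCloudCredentials
from prefect_dbt.cloud.jobs import trigger_dbt_cloud_job_run_and_wait_for_completion

@flow
def run_dbt_cloud_job():
    # Placeholder credentials; store these as a Prefect block in practice.
    credentials = DbtCloudCredentials(api_key="<service-token>", account_id=88888)
    # Trigger the dbt Cloud job and block until it completes.
    return trigger_dbt_cloud_job_run_and_wait_for_completion(
        dbt_cloud_credentials=credentials,
        job_id=123456,
    )

if __name__ == "__main__":
    run_dbt_cloud_job()
```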

Prefect 1

  • Trigger dbt Cloud jobs with the DbtCloudRunJob task.
  • Running this task will generate a markdown artifact viewable in the Prefect UI.
  • The artifact will contain links to the dbt artifacts generated as a result of the job run.