dbt

About

Our demo dbt repo currently offers two quickstarts: one that runs a simple dbt pipeline and one that runs a dbt pipeline from an external repo. At this time, running dbt in Mage requires Docker, so both of our demos are Docker Compose based.

Prerequisites

  1. Docker
  2. Git

Configuration

The following command clones the demo repo, copies dev.env to .env, and runs docker compose up to start Mage and a Postgres database.
git clone https://github.com/mage-ai/dbt-quickstart mage-dbt-quickstart \
&& cd mage-dbt-quickstart \
&& cp dev.env .env && rm dev.env \
&& docker compose up
After running the above command, you’ll see the Mage overview page. Click the pipelines icon on the left to enter the pipelines overview.
Open Pipelines

Tutorial

First, select the simple_dbt_python_pipeline by double-clicking its row in the list of pipelines; this will take you directly to the editor. You can also single-click the row, then select the “code” icon from the side nav. Scroll to the bottom-most cell and click Execute with all upstream blocks.
Run simple pipeline
You just:
  • Seeded a dbt model
  • Performed a dbt transformation
  • Took the transformed data and performed a Python transformation
  • Wrote the data to a Postgres database
To see the output, connect a querying tool (like DataGrip or psql) to the locally hosted Postgres database and query public.analytics.cur_customers.
| customer_id | first_name | last_name | letters_first_name | is_alliterative |
|-------------|------------|-----------|--------------------|-----------------|
| 1           | Michael    | P.        | 7                  | false           |
| 2           | Shawn      | M.        | 5                  | false           |
| 3           | Kathleen   | P.        | 8                  | false           |
| 4           | Jimmy      | C.        | 5                  | false           |
| 5           | Katherine  | R.        | 9                  | false           |
| 6           | Sarah      | R.        | 5                  | false           |
| 7           | Martin     | M.        | 6                  | true            |
| 8           | Frank      | R.        | 5                  | false           |
| 9           | Jennifer   | F.        | 8                  | false           |
| 10          | Henry      | W.        | 5                  | false           |
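The Python transformation step behind these columns can be sketched in plain Python. The function and field names below are assumptions for illustration, not the demo repo's actual code: it derives letters_first_name (the length of the first name) and is_alliterative (whether first and last name start with the same letter).

```python
# Hypothetical sketch of the Python transformation in the demo pipeline.
# Field names mirror the output table; the real block operates on a DataFrame.
def transform(row: dict) -> dict:
    first = row["first_name"]
    last = row["last_name"]
    return {
        **row,
        # Count of letters in the first name.
        "letters_first_name": len(first),
        # Alliterative if first and last name share an initial (case-insensitive).
        "is_alliterative": first[0].lower() == last[0].lower(),
    }

print(transform({"customer_id": 7, "first_name": "Martin", "last_name": "M."}))
```

Applied to customer 7 above, this yields letters_first_name of 6 and is_alliterative of true, matching the table.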
Next, select the dynamic_dbt_pipeline by double-clicking its row in the list of pipelines; this will take you directly to the editor. You can also single-click the row, then select the “code” icon from the side nav. Scroll to the bottom-most cell and click Execute and run all upstream blocks.
Compile external pipeline
You just:
  • Pulled a GitHub repo and wrote it to your local Mage directory
  • Wrote a demo profiles.yml file that interpolated environment variables from your instance
  • Ran dbt build to write data to your local Postgres database
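The generated profile can be pictured as an ordinary dbt profiles.yml that pulls its connection settings from the environment. This is a hypothetical sketch, not the file the pipeline actually writes; the profile name and variable names are assumptions, while env_var() and the as_number filter are standard dbt features:

```yaml
# Hypothetical profiles.yml sketch: connection settings are
# interpolated from environment variables via dbt's env_var().
demo:
  target: dev
  outputs:
    dev:
      type: postgres
      host: "{{ env_var('POSTGRES_HOST') }}"
      port: "{{ env_var('POSTGRES_PORT') | as_number }}"
      user: "{{ env_var('POSTGRES_USER') }}"
      password: "{{ env_var('POSTGRES_PASSWORD') }}"
      dbname: "{{ env_var('POSTGRES_DB') }}"
      schema: analytics
```

Keeping credentials in environment variables (here, the values loaded from your .env file) means the profile itself contains no secrets.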
To run this pipeline, click the name of the pipeline to open the Triggers page and select Run @once to trigger a pipeline run. Select the logs icon to see the run in real time!
Trigger dbt
Now, in addition to pulling the external Jaffle Shop demo and writing the necessary profiles, you’ll perform dbt build, which will: run models, test tests, snapshot snapshots, and seed seeds. 🥳 To see the output, connect a querying tool (like DataGrip or psql) to the locally hosted Postgres database. To learn more about the Jaffle Shop demo, check out the repo.