How to Create an ELT Pipeline Using Airflow, Snowflake, and dbt!

  • Published: 24 Aug 2024
  • In this video, I'll go through how you can create an ELT pipeline using Airflow, Snowflake, and dbt, with cosmos to visualize your dbt workflows! Check out the repo below!
    github.com/ast...

Comments • 20

  • @sainithinreddy6633 11 months ago +3

    Very clear explanation 👍

  • @maxpatrickoliviermorin2489 10 months ago +1

    Thank you!
    Would you mind making a much more elaborate version please?

  • @shubhamkawade7351 5 months ago +1

    Nice explanation! For a specific DAG, how do I run a specific dbt command? E.g., how would you execute 'dbt run --select +third_day_avg_cost_run' for the project in the video?

    • @thedataguygeorge 5 months ago

      You could use the cosmos filtering mechanism to filter for just that specific step (see the sketch below), but by default cosmos will automatically render each individual dbt model as its own task in the DAG.
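
      For reference, here's a minimal sketch of how that filtering could look with astronomer-cosmos and a Snowflake profile. The dag_id, connection ID, project path, and database/schema names are illustrative placeholders, not taken from the video's repo:

      # A hedged sketch, not the repo's actual DAG.
      from datetime import datetime

      from cosmos import DbtDag, ProfileConfig, ProjectConfig, RenderConfig
      from cosmos.profiles import SnowflakeUserPasswordProfileMapping

      profile_config = ProfileConfig(
          profile_name="default",
          target_name="dev",
          profile_mapping=SnowflakeUserPasswordProfileMapping(
              conn_id="snowflake_conn",  # hypothetical Airflow connection ID
              profile_args={"database": "MY_DB", "schema": "MY_SCHEMA"},
          ),
      )

      # RenderConfig's select takes dbt node selectors, so "+third_day_avg_cost_run"
      # renders only that model and its upstream dependencies as Airflow tasks.
      dag = DbtDag(
          dag_id="third_day_avg_cost_only",
          project_config=ProjectConfig("/usr/local/airflow/dbt/my_project"),
          profile_config=profile_config,
          render_config=RenderConfig(select=["+third_day_avg_cost_run"]),
          schedule="@daily",
          start_date=datetime(2024, 1, 1),
      )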

  • @WiseSteps.D 5 months ago +1

    All good, but where can we see the Snowflake connection details?

    • @thedataguygeorge 5 months ago

      Go to the connection management UI and select the Snowflake connection there!

    • @Rajdeep6452 3 months ago

      In Airflow, go to Admin > Connections, open your Snowflake connection, and fill in the Extra field with JSON like:
      {
        "account": "-",
        "warehouse": "",
        "database": "",
        "role": "",
        "insecure_mode": false
      }
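
      If you'd rather define that connection in code than in the UI, Airflow also resolves environment variables named AIRFLOW_CONN_<CONN_ID> as connections. A minimal sketch with placeholder credentials, building the URI from a Connection object:

      import json

      from airflow.models.connection import Connection

      # Placeholder credentials; the extra dict mirrors the JSON fields above.
      conn = Connection(
          conn_id="snowflake_conn",
          conn_type="snowflake",
          login="my_user",
          password="my_password",
          extra=json.dumps({
              "account": "my_account",
              "warehouse": "MY_WH",
              "database": "MY_DB",
              "role": "MY_ROLE",
              "insecure_mode": False,
          }),
      )

      # Export this instead of clicking through Admin > Connections.
      print(f"AIRFLOW_CONN_{conn.conn_id.upper()}={conn.get_uri()}")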

  • @ameyajoshi6588 2 months ago

    Can we have a cyclic pre-hook applied for a model? If so, how do we achieve it using dbt and Airflow?

    • @thedataguygeorge 2 months ago

      Yes, definitely. If you have the pre-hook applied as part of your dbt model build process, it should still work! (See the sketch below.)
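
      For context, a dbt pre-hook lives in the model's config block and runs before the model builds on every invocation, whether triggered from the dbt CLI or from a cosmos task in Airflow. A minimal sketch of a model file; the file name and hook SQL are illustrative:

      -- models/my_model.sql (illustrative)
      -- The pre_hook statement runs against Snowflake before the model builds.
      {{ config(
          materialized='table',
          pre_hook=["alter session set timezone = 'UTC'"]
      ) }}

      select current_timestamp as built_at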

  • @vijayjoshi-mw8cr 2 months ago

    Hello, I have built an ETL pipeline using Python, pandas, Airflow, and Snowflake, but the problem is that when I run the task, it does not load the data into Snowflake. Please can you help us!

    • @thedataguygeorge 2 months ago

      What errors are you getting?

    • @VijayJoshi-eg2zq 2 months ago

      @thedataguygeorge When I run the task, it gets a success message, but when I check the Snowflake warehouse, I am not able to see the table.
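
      A common cause of "the task succeeds but no table appears" is loading into a different database/schema than the one being inspected. A minimal load-and-verify sketch with placeholder credentials and names, assuming snowflake-connector-python and pandas:

      import pandas as pd
      import snowflake.connector
      from snowflake.connector.pandas_tools import write_pandas

      # Placeholder connection details; note the explicit database and schema.
      conn = snowflake.connector.connect(
          account="my_account",
          user="my_user",
          password="my_password",
          warehouse="MY_WH",
          database="MY_DB",    # load into...
          schema="MY_SCHEMA",  # ...exactly the context you later inspect
      )

      df = pd.DataFrame({"id": [1, 2], "value": ["a", "b"]})

      # write_pandas returns (success, num_chunks, num_rows, output).
      success, _, n_rows, _ = write_pandas(
          conn, df, table_name="MY_TABLE", auto_create_table=True
      )
      print(success, n_rows)

      # Verify with a fully qualified name so a mismatched default
      # database/schema can't hide the result.
      cur = conn.cursor()
      cur.execute("select count(*) from MY_DB.MY_SCHEMA.MY_TABLE")
      print(cur.fetchone())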

  • @Rajdeep6452 2 months ago +1

    So we are supposed to create the dbt init project first and then create the databases in Snowflake? And what schema are you using?

    • @thedataguygeorge 2 months ago

      The dbt init that's part of the project should create the databases for you, as long as you have the proper permissions/setup (see the sketch below).
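
      For reference, creating databases only works if the role dbt runs as has the right account-level privilege. A hedged Snowflake sketch; the role, warehouse, and user names are placeholders:

      -- Run as a privileged role; all names here are placeholders.
      use role accountadmin;
      create role if not exists dbt_role;
      grant create database on account to role dbt_role;
      grant usage on warehouse my_wh to role dbt_role;
      grant role dbt_role to user my_user;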