Orchestrating Data Pipelines with Snowpark dbt Python Models and Airflow

  • Published: 2 Oct 2024
  • A common Airflow use case is orchestrating Snowflake queries as part of a data pipeline. Such pipelines have traditionally been SQL-based, but many data engineers would rather express their transformations in Python. With the release of dbt version 1.3, it is now possible to create both SQL- and Python-based models in dbt; the Python-based dbt models are made possible by Snowflake's native Python support and the Snowpark API for Python. With Airflow, you can orchestrate these Snowpark dbt Python models to transform and manage your data (a minimal sketch follows the links below). Adrian Lee, Solutions Engineer at Snowflake, gives a technical deep dive into building more fluid data pipelines with Snowpark for Python, dbt, and Airflow.
    Try Snowflake for Python for free
    Test drive the Snowflake platform with our 30-day free trial
    → signup.snowfla...
    Learn how to build applications with Python in 10 minutes!
    → tinyurl.com/mr...
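
    To make the description concrete, here is a minimal sketch of a Snowpark-backed dbt Python model, written against the documented dbt 1.3 Python model interface. The source model name (raw_orders), the column names, and the filter logic are hypothetical placeholders, not taken from the video.

        # models/orders_enriched.py -- a dbt Python model (dbt >= 1.3, Snowflake target)
        import snowflake.snowpark.functions as F

        def model(dbt, session):
            # Python models must materialize as a table (or incrementally).
            dbt.config(materialized="table")

            # On a Snowflake target, dbt.ref() returns a Snowpark DataFrame,
            # so the transformation below executes inside Snowflake.
            orders = dbt.ref("raw_orders")  # hypothetical upstream model

            enriched = (
                orders
                .with_column("order_total", F.col("quantity") * F.col("unit_price"))
                .filter(F.col("status") == "completed")
            )

            # The returned DataFrame is written out as the model's table.
            return enriched

    And a minimal Airflow 2.x DAG that could orchestrate the dbt project; the project path, schedule, and task layout are likewise assumptions, shown only to illustrate the pattern:

        # dags/dbt_snowpark_pipeline.py -- runs, then tests, the dbt project daily
        from datetime import datetime
        from airflow import DAG
        from airflow.operators.bash import BashOperator

        DBT_DIR = "/opt/airflow/dbt"  # hypothetical dbt project location

        with DAG(
            dag_id="dbt_snowpark_pipeline",
            start_date=datetime(2024, 10, 1),
            schedule="@daily",
            catchup=False,
        ) as dag:
            dbt_run = BashOperator(
                task_id="dbt_run",
                bash_command=f"dbt run --project-dir {DBT_DIR} --profiles-dir {DBT_DIR}",
            )
            dbt_test = BashOperator(
                task_id="dbt_test",
                bash_command=f"dbt test --project-dir {DBT_DIR} --profiles-dir {DBT_DIR}",
            )
            dbt_run >> dbt_test  # run the models, then run dbt tests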

Comments • 4

  • @FabioOrtiz-j5w · 7 months ago · +3

    Can we get a link to this project on GitHub or any other repo?

  • @rusttaf · 5 months ago

    What is the advantage of using this configuration (Airflow, dbt Core, Snowpark) instead of using dbt Cloud?