Snowflake Sales Engineer explains how to use Dynamic Tables and Snowpipe Streaming | Real-world demo

  • Published: 3 Aug 2024
  • In this video, we discuss some newer features of Snowflake called Snowpipe Streaming and Dynamic Tables.
    ➡️ Snowflake enables organizations to be data-driven by offering an expansive set of features for creating performant, scalable, and reliable data pipelines that feed dashboards, machine learning models, and applications.
    ➡️ But before data can be transformed and served or shared, it must be ingested from source systems. The volume of data generated in real time from application databases, sensors, and mobile devices continues to grow exponentially. While data generated in real time is valuable, it is more valuable when paired with historical data that provides context.
    ➡️ That proves to be a difficult task for data engineering teams that have to manage separate infrastructure for batch data and streaming data.
    ➡️ Streaming ingestion is not meant to replace file-based ingestion, but rather to augment it for data loading scenarios where it makes sense (a short code sketch follows this list), such as:
    ►Low-latency telemetry analytics of user-application interactions for clickstream recommendations
    ►Identification of security issues in real-time streaming log analytics to isolate threats
    ►Stream processing of information from IoT devices to monitor critical assets
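    A minimal sketch of how the two features fit together (the table, column, and warehouse names here are hypothetical, not from the video): Snowpipe Streaming lands rows into a raw table with low latency, and a dynamic table declares the transformed result and keeps it within a target lag of that source.

    -- Hypothetical landing table, populated row by row by Snowpipe Streaming
    -- (for example via the Snowflake Connector for Kafka or the Java ingest SDK).
    CREATE OR REPLACE TABLE raw_clickstream (
        event_ts TIMESTAMP_NTZ,
        user_id  VARCHAR,
        payload  VARIANT
    );

    -- A dynamic table that keeps a transformed view of the stream
    -- no more than one minute behind the landing table.
    CREATE OR REPLACE DYNAMIC TABLE clickstream_enriched
      TARGET_LAG = '1 minute'
      WAREHOUSE  = transform_wh
    AS
    SELECT
        event_ts,
        user_id,
        payload:page::VARCHAR   AS page,
        payload:action::VARCHAR AS action
    FROM raw_clickstream;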
    🖥 Visit my website here:
    www.masteringsnowflake.com/
    💬 Which new Snowflake feature do you want to hear about? Let me know in the comment section below!
    ✅ Subscribe to the channel here:
    www.youtube.com/@mastering_sn...
    ---------------
    ❄️Want to SUPERCHARGE your career and become an EXPERT in Snowflake??❄️
    Mastering Snowflake is accepting applications now to work with us in a small group. Serious inquiries only please!
    forms.gle/WBqadnG7Y4tNe1wt8
    ❄️Order my LATEST book: SnowPro Certification Study Guide HERE: ❄️
    Amazon US - www.amazon.com/dp/B0BHPJXLD1
    Amazon AU - www.amazon.com.au/dp/B0BHPJXLD1
    Amazon UK - www.amazon.co.uk/dp/B0BHPJXLD1
    ❄️Order my book: Mastering Snowflake Solutions HERE: ❄️
    Amazon UK - www.amazon.co.uk/Mastering-Sn...
    Amazon US - www.amazon.com/Mastering-Snow...
    Amazon AUS - www.amazon.com.au/Mastering-S...
    Amazon IND - www.amazon.in/Mastering-Snowf...
    ❄️Get my Free SnowPro core guide HERE: ❄️
    program.masteringsnowflake.co...
    ❄️Become a student on my course: ❄️
    Snowflake Practice Questions - SnowPro Core Certified Udemy Course www.udemy.com/course/snowflak...
    Get your Matillion Associate Certification FAST!
    www.udemy.com/course/matillio...
    -----------
    Here, we're all about helping you extract the maximum amount of business value from your data. As a data engineer and solution architect, I understand the struggles of finding practical and high-value advice on Snowflake capabilities.
    On this channel, I share my personal experiences, proven approaches, and best practices for designing and implementing Snowflake's capabilities. I also provide case studies and examples to show how it works for you.
    ➡️ My aim is to help you become a Snowflake expert and avoid costly pitfalls along the way. I'm all about providing you with robust, real-world advice based on my unique experiences in the field.
    If you're looking to expand your career prospects and command a significantly higher salary, consider joining our exclusive program.
    We'll guide you through a journey of transformation and provide you with our Everest roadmap to become an expert in the Snowflake data platform.
    So, if you're ready to take your data engineering and solution architecting skills to the next level, hit that subscribe button, and join our community by hitting the bell icon 🔔
    ---------------
    📲 Follow Adam Morton on social media:
    LinkedIn ▶️ / adammorton121
    Instagram ▶️ / mastering_snowflake
    ---------------
    📚 Disclaimer:
    We are not affiliated with or authorised by Snowflake in any way.
    ------------------
    #AdamMorton #MasteringSnowflake #AWS3 #Snowflake #Matillion #DataIntegration #CloudComputing #DataManagement #ETL #FileTransfer #DataWarehouse #DataEngineering #DataProcessing #AWS #CloudMigration #DataMigration #BigData #DataAnalytics
  • Science

Comments • 18

  • @TusharKale9 • 1 year ago

    Thank you for sharing; it is very useful.

  • @jayrizzo1454 • 1 year ago

    Are we going to get LATERAL FLATTEN JSON ability during Snowpipe creation?
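    As far as I know, Snowpipe's COPY transformations don't support FLATTEN, so a common workaround is to land the raw JSON as a VARIANT column and flatten it downstream, for instance in a dynamic table. A minimal sketch, with hypothetical table and field names:

    -- One output row per element of the hypothetical payload:items array.
    CREATE OR REPLACE DYNAMIC TABLE events_flat
      TARGET_LAG = '5 minutes'
      WAREHOUSE  = transform_wh
    AS
    SELECT
        r.payload:event_id::VARCHAR AS event_id,
        i.value:name::VARCHAR       AS item_name
    FROM raw_events r,
         LATERAL FLATTEN(input => r.payload:items) i;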

  • @user-ds5gh2me4c • 1 year ago

    Thanks for sharing. When you create a dynamic table on top of a query, is it going to execute for all the records in the source tables every time, or ingest only new records into the dynamic table when it is refreshed?

    • @mastering_snowflake • 1 year ago

      This article should help clarify things: medium.com/snowflake/slowly-changing-dimensions-with-dynamic-tables-d0d76582ff31
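      In short: a dynamic table refreshes incrementally, processing only changes since the last refresh, when its defining query supports it; otherwise it falls back to a full re-computation. A minimal sketch (names hypothetical) that pins the refresh mode so the behaviour is explicit:

      -- With REFRESH_MODE = INCREMENTAL, creation fails if the query
      -- cannot be refreshed incrementally, so "changes only" is guaranteed.
      CREATE OR REPLACE DYNAMIC TABLE orders_summary
        TARGET_LAG   = '5 minutes'
        WAREHOUSE    = transform_wh
        REFRESH_MODE = INCREMENTAL
      AS
      SELECT order_id, MAX(status) AS latest_status
      FROM raw_orders
      GROUP BY order_id;

      -- Check which refresh mode Snowflake actually chose.
      SHOW DYNAMIC TABLES LIKE 'ORDERS_SUMMARY';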

  • @barandeepsingh • 1 year ago

    Thanks, very useful. Does Snowpipe Streaming work with Kafka using the Kafka connector only, or can it integrate with Google Pub/Sub as well using a similar connector?

  • @saumyakapoor6772 • 9 months ago

    Will it be a good use case to use Snowpipe Streaming when my source is an on-premises Oracle database but I want low-latency data ingestion into Snowflake tables? Or do you recommend going the traditional way of using Snowpipe and queues in Azure?

  • @surendralamichhane8429 • 11 months ago (+1)

    @mastering_snowflake I have a specific use case where, every time there is a change in the raw table, I run a MERGE statement that merges on multiple columns. If a record from the SELECT on the raw table matches the target table on all of those columns (each combination of column values in the merge condition is unique), I update the record; otherwise I insert a new one. How is this possible in a dynamic table? In other words, how can I specify multiple columns as the merge condition in a dynamic table?

    • @mastering_snowflake • 11 months ago

      Why would you want to update the record if all column values match?
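      For the merge-style maintenance itself, one declarative pattern (a minimal sketch; the table and column names are hypothetical) is to express "latest row per composite key" with QUALIFY and let the dynamic table maintain the result on each refresh:

      -- The PARTITION BY list plays the role of the multi-column
      -- merge condition; the newest version of each key combination wins.
      CREATE OR REPLACE DYNAMIC TABLE target_latest
        TARGET_LAG = '10 minutes'
        WAREHOUSE  = transform_wh
      AS
      SELECT *
      FROM raw_table
      QUALIFY ROW_NUMBER() OVER (
          PARTITION BY key_col_1, key_col_2, key_col_3
          ORDER BY updated_at DESC
      ) = 1;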

  • @cristinaperez5757 • 1 year ago (+1)

    Would it be possible to share tables between different databases with a dynamic table? I usually copy a table to a different database using a view, but it's quite painful when the schema changes.

    • @roopad8742 • 1 year ago (+1)

      Cloning?

    • @cristinaperez5757 • 1 year ago (+1)

      @roopad8742 Cloning creates a copy of the table at that instant; if the original table changes, the clone doesn't update.

    • @roopad8742 • 1 year ago (+1)

      @cristinaperez5757 Yes, but adding a step to clone rather than copying via a view is much more efficient, isn't it?

    • @cristinaperez5757 • 1 year ago (+1)

      @roopad8742 OK, thanks for replying!
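      On the original question: a dynamic table can live in a different database from its source, because the defining query can use fully qualified names. A minimal sketch (database, schema, and warehouse names are hypothetical):

      -- Consumers in reporting_db stay within TARGET_LAG of the source,
      -- with no manual copy step.
      CREATE OR REPLACE DYNAMIC TABLE reporting_db.public.customers_copy
        TARGET_LAG = '1 hour'
        WAREHOUSE  = transform_wh
      AS
      SELECT *
      FROM source_db.public.customers;

      -- Caveat: as far as I know, SELECT * is resolved when the dynamic
      -- table is created, so source schema changes may still require
      -- recreating it.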

  • @vio4jesus • 1 year ago

    Hmmm, can the dynamic table query or pull only data that hasn't been pulled since last time?
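    That is essentially what an incremental refresh does: when the defining query supports it, only changes since the previous refresh are processed. One way to verify (a sketch; the dynamic table name is the hypothetical one from the earlier example):

    -- REFRESH_ACTION shows whether each refresh ran as INCREMENTAL
    -- (changes only), FULL, or NO_DATA.
    SELECT name, refresh_action, refresh_trigger,
           refresh_start_time, refresh_end_time
    FROM TABLE(INFORMATION_SCHEMA.DYNAMIC_TABLE_REFRESH_HISTORY(
        NAME => 'CLICKSTREAM_ENRICHED'
    ))
    ORDER BY refresh_start_time DESC;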