Historical Data Validator Report - Automated Power BI Dataset Snapshots

  • Published: 31 Jul 2024
  • In this video, we're diving deep into the world of Historical Data Validator Reports, showcasing how to effortlessly create automated snapshots of your datasets.
    📊 Say goodbye to manual data validation headaches! This step-by-step guide will empower you to set up a dynamic reporting system that captures your data's evolution over time.
    Learn to automate dataset snapshots for seamless historical analysis.
    Gain insights into tracking data changes and trends effortlessly.
    Elevate your decision-making with accurate, reliable historical reports.
    💡 Ready to unlock the full potential of your data? Hit play now and embark on a journey towards smarter, data-driven decisions!
    Don't forget to like, share, and subscribe for more exciting Power BI tutorials. Let's revolutionize the way you handle historical data together! 📈💪
    Keep automating,
    Roland
    💪 New videos coming every week 🤹‍♂️
    🤘 New short videos coming every week 👨‍🏫
    🔗LINKS:
    Power BI Announcement - powerbi.microsoft.com/en-us/b...
    🎬 MORE VIDEOS:
    Power BI & Power Automate - How to automate reporting process -
    • Power BI & Power Autom...
    How To Import Latest File From A Folder Using Power Query - Power BI Trick - • How To Import Latest F...
    Dynamic Row-level Security 🔐 - Based on Dimension Tables - • Dynamic Row-level Secu...
    Chapters:
    00:00 Topic Of The Day - Data Validator
    00:54 Intro
    01:02 BI-Lingual Analytics
    01:21 Data Snapshots
    01:35 Data Validation - Reasons, Threshold, etc
    02:17 Report v1 - Matrices
    04:29 Next Steps - DataViz & Automation
    05:04 Report v2 - Combo Chart + Small Multiples
    06:22 Report v2 - Matrix + Card
    07:00 Enhance Previous Automation
    08:35 Test Setup - Data Acquisition
    09:25 Summary
    10:15 LIVE version
    10:36 Questions-Comments
    10:53 End
    ▼▼▼▼▼▼▼▼▼
    Make sure to hit the 👍 button and ❗❗ SUBSCRIBE ❗❗ to my channel.
    If you have any questions, just let me know in the comments down below 👇 or send me an e-mail 📩.
    ▲▲▲▲▲▲▲▲▲
    🤝 HOW TO CONNECT 🤝
    LinkedIn - / bilingualanalytics
    Twitter - / bilanalytics
    Subscribe - / @bilingualanalytics
    ABOUT ME: bilingualanalytics.com.au/abo...
    📍 - Sydney, Australia
    📽 GEAR:
    📷 - Sony ZV-E10
    🎙 - Audio Technica AT2020
    🔦 - Neewer 18” Ring Light, Elgato Key Light
    🎧 - BOSE QC35
    🖥 - ASUS VC279H + Xiaomi Mi Curved 34"
    💻 - Intel i7-12700K, Gigabyte Aorus Master 3080 12GB, Corsair Vengeance LPX 64GB, Samsung 980Pro 1TB
    #PowerBI #PowerAutomate #DataAnalytics

Comments • 9

  • @BILingualAnalytics
    @BILingualAnalytics  10 months ago +1

    You can find the first part of this video here:
    ruclips.net/video/8KyF-wMP0O4/видео.html

  • @anithanarayan4628
    @anithanarayan4628 8 days ago +1

    Is it possible to take snapshots of the graphs from Power BI? It needs to trigger an email, and the snapshots need to change dynamically every time it triggers. Can all of this be done using Power Automate?

    • @BILingualAnalytics
      @BILingualAnalytics  8 days ago

      Not sure what you mean by this. You can set up paginated reports, which sounds like the closest thing to sending out charts via email.

  • @user-rf4ix2qj4d
    @user-rf4ix2qj4d 6 months ago +1

    Historical data can quickly become an enormous amount of data. If you don't just have 20 or 50 files of historical data (each with 8 columns and 100 rows), but 2,000 files with 30 columns and 25,000 rows each, you could run into a performance problem (e.g. when you monitor machines in a plant). Can you make a video showing how to compress this data in Power BI by deleting all rows that did not change? The challenge is to create a report that takes into account that on some days it has to take the value of the last change.

    • @BILingualAnalytics
      @BILingualAnalytics  6 months ago

      I would probably explore a solution in Power Automate for that.
      When a new file is exported to SharePoint, if the data is the same as the previous version, archive the previous version - I'm not a huge fan of deleting data.
      If the data is different, keep it. You might still end up with lots of files; however, you wouldn't keep two files with the same values.
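      The archive-on-duplicate idea in this reply can be sketched outside Power Automate too. Below is a minimal Python sketch of the same logic, assuming timestamped CSV snapshots whose filenames sort chronologically; the folder names, the CSV extension, and the function name are my own illustration, not anything from the video (a real flow would use SharePoint connector actions instead):

```python
import hashlib
import shutil
from pathlib import Path

def keep_if_changed(new_file: Path, keep_dir: Path, archive_dir: Path) -> bool:
    """Keep new_file in keep_dir only if its content differs from the most
    recently kept snapshot; otherwise move it to archive_dir instead of
    deleting it. Returns True if the file was kept.
    Assumes snapshot filenames sort chronologically (e.g. timestamped)."""
    keep_dir.mkdir(parents=True, exist_ok=True)
    archive_dir.mkdir(parents=True, exist_ok=True)

    new_hash = hashlib.sha256(new_file.read_bytes()).hexdigest()
    kept = sorted(keep_dir.glob("*.csv"))  # newest kept snapshot is last
    if kept:
        last_hash = hashlib.sha256(kept[-1].read_bytes()).hexdigest()
        if last_hash == new_hash:
            # Identical data: archive rather than keep a duplicate
            shutil.move(str(new_file), archive_dir / new_file.name)
            return False
    shutil.move(str(new_file), keep_dir / new_file.name)
    return True
```

      Hashing the file contents avoids loading both snapshots into memory for a row-by-row comparison, which matters once the files get large.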

  • @DanielWeikert
    @DanielWeikert 10 months ago +1

    How do you get the query for Power Automate without dragging each column of the table into a Power BI visual and copying its refresh query?
    There must be a better way.
    thx

    • @BILingualAnalytics
      @BILingualAnalytics  10 months ago

      You can write the DAX query "by hand" as well - I just found it easier and much quicker to do it this way.
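      For anyone taking the hand-written route: the query that Power Automate's "Run a query against a dataset" action expects is just a DAX `EVALUATE` statement. A small Python helper can assemble one for the columns you want to snapshot; the table and column names below are hypothetical placeholders, not names from the video:

```python
def build_snapshot_query(table: str, columns: list[str]) -> str:
    """Build a DAX EVALUATE query that groups by the given columns,
    suitable for pasting into Power Automate's 'Run a query against
    a dataset' action instead of copying a visual's query."""
    cols = ",\n    ".join(f"'{table}'[{c}]" for c in columns)
    return (
        "EVALUATE\n"
        "SUMMARIZECOLUMNS(\n"
        f"    {cols}\n"
        ")"
    )

# Hypothetical table/column names for illustration:
print(build_snapshot_query("Sales", ["Region", "Amount"]))
```

      The printed string is the full query text; paste it into the action's query field (and add measures to the `SUMMARIZECOLUMNS` call as needed).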

  • @DanielWeikert
    @DanielWeikert 10 months ago +1

    Do you have best practices / tips / videos you can make or share for data historization with Power BI? I'm thinking of bigger datasets here, so a Power Automate snapshot is probably not a good solution.
    br

    • @BILingualAnalytics
      @BILingualAnalytics  10 months ago

      We actually decided to go down the daily snapshot route, but limit the number to 2-3 days per week. Also, we are not keen on grabbing transactional details and validating each and every transaction. If you want to go down that path you really need a more robust solution (i.e. a proper database structure).
      Please bear in mind, a high-level summary like this is enough to flag whether there was any change. After that you always have the option to drill into the details and identify the issue.
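      The "high-level summary is enough to flag a change" idea maps naturally onto a threshold check between two snapshot summaries, as touched on in the video's chapter on validation thresholds. A hedged Python sketch; the category names and the 5% default threshold are made up for illustration:

```python
def flag_changes(previous: dict[str, float], current: dict[str, float],
                 threshold: float = 0.05) -> dict[str, float]:
    """Compare two high-level snapshot summaries (e.g. totals or row
    counts per category) and return the keys whose relative change
    exceeds the threshold, mapped to that relative change."""
    flags = {}
    for key, new in current.items():
        old = previous.get(key)
        if old is None:
            flags[key] = float("inf")  # category absent before: always flag
        elif old == 0:
            if new != 0:
                flags[key] = float("inf")  # avoid division by zero
        else:
            change = abs(new - old) / abs(old)
            if change > threshold:
                flags[key] = change
    return flags
```

      Anything flagged here is a prompt to drill into the detailed data, which matches the workflow described in the reply above.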