Dataflows end-to-end project (Microsoft Fabric) + Lakehouse + Power BI

  • Published: 31 Dec 2024

Comments • 20

  • @LearnMicrosoftFabric • 1 year ago +3

    Hey all! How's it going with Dataflows and Fabric? Have you started using either yet?! Share your experiences below 👇😃

  • @AmritaOSullivan • 1 year ago +1

    Thanks for this fab video. It's so easy to understand and so useful to actually go through use cases end to end!

    • @LearnMicrosoftFabric • 1 year ago

      Hey thanks for watching and for the feedback! I’ll be doing more end-to-end projects in the future for sure ☺️💪

  • @rizvinazish • 7 months ago

    Very nice, easy way you explained it, very helpful!!

  • @benshi1975 • 4 months ago +1

    Hey, awesome vid! Thanks, I follow you!
    With a dataflow, can I target a specific schema on the warehouse destination, or should I combine a pipeline and a dataflow for that? Thanks

  • @LaZyBuM999 • 9 months ago

    Thanks for the wonderful videos on Microsoft Fabric.
    I see that all the data imports are "one-time activities" from a source.
    How can we get the delta data (e.g. new records, deleted records) in the source synced with the lakehouse periodically?
    i.e., what about the continuous CRUD operations being done on the source (e.g. a SQL DB)? How can that be synced with the data in the lakehouse?

    • @LearnMicrosoftFabric • 9 months ago +1

      Thanks for your comment! Yes, you're right, they are one-time activities (though you can schedule them to run every day/hour etc.), but it's never going to be 'real-time'.
      The feature you are describing is more like OneLake shortcuts, which let you create a real-time link to FILES in external locations (ADLS and Amazon S3).
      Microsoft is also soon releasing a feature called Database Mirroring, which will do the same thing but for databases (Snowflake, CosmosDB, Azure SQL, etc.). This feature is currently in private preview; I believe they will release it for public preview soon!
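      Until Database Mirroring is available, a common workaround for the delta-sync question above is a watermark pattern: persist the latest modified timestamp you loaded, and on each scheduled run pull only rows changed since then (deletes usually require a soft-delete flag in the source, since a watermark alone can't see hard deletes). A minimal conceptual sketch in plain Python (not Fabric-specific; the table, column names, and data are made up for illustration):

      ```python
      from datetime import datetime

      # Hypothetical source rows: each carries a last_modified timestamp and a
      # soft-delete flag. Hard deletes cannot be detected by a watermark alone.
      source_rows = [
          {"id": 1, "name": "alpha", "last_modified": datetime(2024, 1, 1), "is_deleted": False},
          {"id": 2, "name": "beta",  "last_modified": datetime(2024, 6, 1), "is_deleted": False},
          {"id": 3, "name": "gamma", "last_modified": datetime(2024, 6, 2), "is_deleted": True},
      ]

      def incremental_sync(target, watermark):
          """Merge rows changed since `watermark` into `target` (a dict keyed by id).

          Returns the new watermark to persist for the next scheduled run.
          """
          changed = [r for r in source_rows if r["last_modified"] > watermark]
          for row in changed:
              if row["is_deleted"]:
                  target.pop(row["id"], None)  # propagate the delete
              else:
                  target[row["id"]] = {"id": row["id"], "name": row["name"]}  # upsert
          # Advance the watermark only as far as the rows actually processed.
          return max((r["last_modified"] for r in changed), default=watermark)

      # Suppose the first run loaded everything up to 2024-03-01;
      # this run picks up only the deltas since then.
      lakehouse_table = {1: {"id": 1, "name": "alpha"}}
      new_watermark = incremental_sync(lakehouse_table, datetime(2024, 3, 1))
      ```

      The same shape works in a scheduled Dataflow or notebook: filter the source query on the watermark column, upsert into the lakehouse table, then store the new watermark for the next run.
      
      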

  • @sanishthomas2858 • 8 months ago

    Nice. Quick question: why don't we have the dataflow UI that we had in ADF and Synapse?

  • @eniolaadekoya5623 • 2 months ago

    Hi Will, for the report can I import a background design, maybe from Figma or Canva?

  • @scarabic5937 • 1 year ago

    Great video, thanks

  • @arnabgupta4391 • 1 year ago

    Thanks for the awesome video. How do you add the folder names like BronzeLayer/SilverLayer? It all got created into the same workspace for me.

    • @LearnMicrosoftFabric • 1 year ago

      Hey thanks for the question, which part of the video are you referring to?

    • @rdbdebeer9085 • 1 year ago

      The folders refer to data lakes: he has a data lake for Bronze data and a data lake for Silver data.

  • @jevonzhu • 1 year ago

    Dataflow Gen2 cannot be deployed in a Fabric pipeline on my side, why is that?

    • @LearnMicrosoftFabric • 1 year ago

      Hey! Could be a number of reasons! Does the dataflow run OK outside of the pipeline? I would look in the Monitoring Hub and analyze the error message. Good luck!

  • @pphong • 1 year ago

    Thank you for sharing! I didn't load my data into the Azure storage account; I used DFg2 to read/upload the CSV file.
    Do you guys experience slow Delta writing to the lakehouse? Can I do something to speed it up?

    • @LearnMicrosoftFabric • 1 year ago +2

      Hi, yes, in general DFg2 is quite slow at the moment during the public preview. I'm sure the write speeds will increase as we move closer to GA (general availability) of Fabric.