Getting Started with Dataflow in Microsoft Fabric Data Factory

  • Published: 29 Sep 2024

Comments • 33

  • @Jojms123
    @Jojms123 7 months ago +1

    First of all thanks for the video. Suggestion : It would be great to have the links for your other videos appearing as you speak or in the description below.

  • @mounikajuttiga3936
    @mounikajuttiga3936 3 months ago

    Can we schedule the dataset to refresh every 15 minutes in Fabric (scheduled refresh)?

  • @shafa7668
    @shafa7668 1 year ago +1

    I wanted to get started with Fabric from day one of the announcement, literally. So thank you for starting this series. You have given us a head start!! Cheers

    • @RADACAD
      @RADACAD  1 year ago

      Always glad to help :)

  • @ruru1419
    @ruru1419 1 year ago

    Thanks Reza great video as usual!
    We're trying some PoC with Fabric Warehouse (not Lakehouse) for our SQL user community. Although I have no issues loading small files with Dataflow Gen2, when trying to load on-premises data through our gateway (which works fine for refreshing Power BI datasets) I always get this error:
    "An exception occurred: Microsoft SQL: This function doesn't support the query option 'EnableCrossDatabaseFolding' with value 'true'."
    I cannot find anything related to this... any clue? I wonder if many have tried to implement a "true" business scenario and not just some Excel samples... for this we need to pull data through the gateway. Thanks!

  • @RajeshGopu-x7m
    @RajeshGopu-x7m 1 year ago

    Very good video, and easy for beginners to understand and explore further...

  • @adamsabourin9416
    @adamsabourin9416 1 year ago

    Reza, if we choose append instead of replace, is it going to keep duplicates? If so, how can we save as “append and remove duplicates”?

  • @raviv5109
    @raviv5109 1 year ago

    Good video, thanks for creating and sharing. It would be interesting to know how it performs on real-world large datasets.

  • @debasisrana6437
    @debasisrana6437 5 months ago

    Thanks for the video

  • @kapiljadaun7264
    @kapiljadaun7264 1 year ago

    Hi,
    Your way of explaining is great.
    I would request you to make a video going from the start all the way to making reports in Power BI, with a demo. It would be very helpful.
    Thank you

    • @RADACAD
      @RADACAD  1 year ago

      We are glad it is helpful

  • @tea0819
    @tea0819 1 year ago

    Excellent video. Thank you for sharing. I am new to your channel but enjoying all of the content. I recently started a YT channel as well, focused on Azure data, and I was just curious: what software are you using for drawing red boxes around items and zooming in on your video?

    • @RADACAD
      @RADACAD  1 year ago

      Best of luck, and thanks!
      I use ZoomIt

  • @Milhouse77BS
    @Milhouse77BS 1 year ago

    Thanks. Seems like there should be a "Publish & Refresh" option?

    • @RADACAD
      @RADACAD  1 year ago

      I agree :) would be helpful

  • @barttrudeau9237
    @barttrudeau9237 1 year ago

    Reza, your videos are amazing. You stay razor-focused and on subject. I'm really enthused about Fabric but concerned about licensing. I don't want to try a bunch of new things for a month only to find out I can't afford them once the trial period is over. We have E5 licensing and I'm not sure what that's going to cover when the trial period is over. Any chance you could update the licensing video you did a while back to help us understand the cost implications of using Fabric?

    • @RADACAD
      @RADACAD  1 year ago

      Thanks Bart
      I will have a new video on Microsoft Fabric licensing soon. It is slightly different from how Power BI licensing works, but follows similar principles.

  • @yoismelperez2744
    @yoismelperez2744 1 year ago

    Thanks for sharing, Reza. I like how you are taking the lead to go over Microsoft Fabric products. One question I may have missed: will replace do an update on existing records and inserts for new ones, or just replace the entire dataset? Being familiar with PBI Dataflows, I think the answer is it will replace everything, but I just want to confirm.

    • @yoismelperez2744
      @yoismelperez2744 1 year ago

      Reza, confirmed: you mentioned it in this video ruclips.net/video/qNoOQzMjrfk/видео.html, it will replace whatever exists 👍

    • @RADACAD
      @RADACAD  1 year ago

      Thanks :)
      Replace will wipe out the existing data and insert the new data, whereas append will add it to the existing data.
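To make the two load modes concrete (and to sketch an answer to the "append and remove duplicates" question asked earlier), here is a hypothetical local stand-in using stdlib sqlite3. Note this is only an illustration: the Dataflow Gen2 UI itself offers just "replace" and "append", so a deduplicating append is something you would model with a key constraint on the destination table, not a built-in option. All table and column names here are made up.

```python
import sqlite3

# In-memory table standing in for a warehouse destination (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")

existing = [(1, 10.0), (2, 20.0)]
incoming = [(2, 20.0), (3, 30.0)]  # id 2 already exists in the destination

# "Replace": wipe the table, then load only the incoming rows.
conn.executemany("INSERT INTO sales VALUES (?, ?)", existing)
conn.execute("DELETE FROM sales")
conn.executemany("INSERT INTO sales VALUES (?, ?)", incoming)
replace_ids = [r[0] for r in conn.execute("SELECT id FROM sales ORDER BY id")]

# "Append without duplicates": a plain append would store id 2 twice;
# INSERT OR IGNORE skips rows whose primary key already exists.
conn.execute("DELETE FROM sales")
conn.executemany("INSERT INTO sales VALUES (?, ?)", existing)
conn.executemany("INSERT OR IGNORE INTO sales VALUES (?, ?)", incoming)
append_ids = [r[0] for r in conn.execute("SELECT id FROM sales ORDER BY id")]

print(replace_ids)  # [2, 3]
print(append_ids)   # [1, 2, 3]
```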

  • @decentmendreams
    @decentmendreams 1 year ago

    Hi Reza, these are all good, but what has dawned on me is that with Premium Per User licensing, Fabric means squat. It feels like a rich man has moved into your neighborhood and you are watching all his fancy toys as the movers unload. I actually went ahead and turned off the trial version as it seems to overcrowd my Service page. Am I far off here?

    • @barttrudeau9237
      @barttrudeau9237 1 year ago +1

      I share similar concerns

    • @RADACAD
      @RADACAD  1 year ago +1

      I feel your concerns.
      And to be honest, if you want to just purely use Power BI, you won't need Fabric.
      For example, a small business with a data analyst and a few users analyzing data from some Excel files using Power BI works best as a pure Power BI solution.
      However, in larger scenarios you get more done with the other elements. Large organizations need storage for structured and unstructured data, a staging environment for the data, then a data warehouse, a fully automated ETL mechanism to load the data in, then model it, visualize it, etc. Power BI is only part of the picture. Fabric enables organizations to achieve more in the data analytics space.
      It might look like a very huge product (which it is), but remember how you eat an elephant? One bite at a time :D

    • @decentmendreams
      @decentmendreams 1 year ago

      @@RADACAD Hi Reza, you are right, Fabric will be overkill for most of my needs except for the Direct Lake connector, which, if I understood it correctly, gives blazingly fast data refreshes. My files are so large (>100 MB per day) and I need to keep as many of them as I can.
      One bright spot about the introduction of Fabric is that it has made me curious about file compression. For example, I learned that if I convert my CSV files to Parquet files (never knew about them until this week) I can reduce their size by 75%, which is so awesome.
      Thank you for everything.
      A person in Phoenix, Arizona.
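On the CSV-to-Parquet point above: with pandas and pyarrow installed, the conversion is a one-liner, e.g. `pd.read_csv("sales.csv").to_parquet("sales.parquet")` (hypothetical file names). As a dependency-free illustration of why the savings can be so large, plain gzip on a typical repetitive CSV already shrinks it dramatically; Parquet's columnar encoding plus compression typically does at least as well on data like this.

```python
import csv
import gzip
import io

# Build a synthetic CSV with the kind of repetition real exports have
# (repeated dates, a handful of store names, similar amounts).
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["date", "store", "amount"])
for i in range(10_000):
    writer.writerow(["2024-01-01", f"store_{i % 5}", "19.99"])

raw = buf.getvalue().encode("utf-8")
packed = gzip.compress(raw)

# Repetitive row-oriented text compresses far below 25% of its raw size.
ratio = len(packed) / len(raw)
print(f"{len(raw)} -> {len(packed)} bytes ({ratio:.0%})")
```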

  • @antonyliokaizer
    @antonyliokaizer 1 year ago

    I'm wondering why the public preview doesn't have the "Add data destination" button shown at 10:16 after I upload a CSV file as a table? Thank you.

    • @antonyliokaizer
      @antonyliokaizer 1 year ago

      Without the button, I cannot send data to the lakehouse or the warehouse...

    • @RADACAD
      @RADACAD  1 year ago +1

      Are you creating a Dataflow Gen2? Gen1 doesn't have this option.

    • @antonyliokaizer
      @antonyliokaizer 1 year ago

      @@RADACAD In the public preview, I don't see any entry for creating a Gen1 dataflow...
      Thanks, let me double-check again and again

    • @antonyliokaizer
      @antonyliokaizer 1 year ago

      @@RADACAD I checked, and you're correct. Thank you.
      I guess the Gen1 dataflow was created in a pipeline.
      On the Data Factory page, there's only "Dataflow Gen2", not "Gen1".
      Thank you again and again.

  • @AbhishekYadav-rb4bi
    @AbhishekYadav-rb4bi 1 year ago

    Thank you🙌

    • @RADACAD
      @RADACAD  1 year ago

      You're welcome 😊

  • @mjbah
    @mjbah 1 year ago

    Hi Reza.
    Many thanks for the video. As always, your videos are helping a lot.
    I got a question about "adding data to a destination". I was just wondering if you must add each table separately. If you have many tables and want to add them all to the same destination, can't you do it all at once?

    • @RADACAD
      @RADACAD  1 year ago

      Hi Mohamed,
      That is totally my question too; why shouldn't I be able to add one destination for multiple queries? Let's hope that when the preview is done and it is generally available, we get a feature like that :)