Migrate your Power BI Semantic Models to Direct Lake

  • Published: 22 Jan 2025

Comments • 16

  • @DanielWeikert
    1 month ago +8

    The question would be when this makes sense. You certainly do not want to have all your models in Direct Lake mode, as it requires too much Fabric compute. An in-depth recommendation guide on that would be helpful.

  • @anasjabrane894
    1 month ago +1

    Shoutout to this genius solution! Making difficult things easy is extremely difficult, but someone has to do it for us all🎉😂

  • @bhagisrivally
    13 days ago

    Could you clarify how to handle multiple date tables in Direct Lake when hierarchies are disabled?

  • @sandeepbarge4699
    1 month ago +3

    Is my understanding correct that it introduces new tables in the Lakehouse? Is it worth migrating existing models to Direct Lake in this manner, considering the data redundancy it creates and the overhead of building additional ETL jobs to ingest data into the new tables? Wouldn't it be better to manually change the initial few steps of each table in the report to point to Lakehouse tables in Direct Lake mode?

  • @karanpandya8417
    13 days ago

    I followed the same process, but my model is creating two .pqt files instead of one. Any thoughts on how I can handle that?

  • @sumitnangare2584
    1 month ago

    Wow! This is a lifesaver ❤

  • @ProMeiFilmsOfficial
    1 month ago

    Question:
    we use the DEV - STG/UAT - PROD lifecycle,
    and we have a DEV LH, STG LH, and PRD LH.
    Is it possible to redirect a semantic model created on STG to PRD?

  • @josecardenas2736
    1 month ago

    If I have multiple tables imported and other tables coming from another semantic model, will this work? I have a shared semantic model that is used by multiple dashboards, but I'm connecting to an external dataset to get some tables.

  • @noahhadro8213
    1 month ago

    Thanks for the video. Love Guy in a Cube content. My concern is that when I do this, my dataflow takes more resources to refresh than my original import semantic model (using the Fabric capacity tracker dashboard from Microsoft). Using more of the capacity means I pay more. In addition, the dataflow refresh takes about 10 to 30 minutes longer, depending on the data. In testing, I compared a Dataflow import of one table vs. importing the table with exactly the same PQ code. Dataflows Gen2 are slower. I've followed all the best practices from you guys and Alex Powers. Dataflows Gen2 is slower.

  • @franciscoclaudio4818
    12 days ago

    To this day I haven't seen any use for anything in Fabric!

  • @Tony-dp1rl
    1 month ago +1

    Can Microsoft stop changing technologies every 3-5 years, please? It's getting to the point now where adopting anything in the Azure data-related product line is more of a hindrance than an advantage.

  • @DuAuskenner
    1 month ago

    Coolio

  • @jasestu
    1 month ago

    Come on guys, you're a huge company; get better audio.

  • @juanpablorvvv
    1 month ago +2

    They are just trying to sell Fabric. It's too expensive, even for companies.

    • @TafadzwaMundida
      1 month ago +2

      Are the alternatives to Fabric cheaper?