Understanding memory used by Power BI - Unplugged #7

  • Published: 19 Feb 2021
  • How does Power BI consume memory? How do you read the right memory numbers?
    Learn how to use Task Manager and how to correctly interpret the numbers provided by the several processes used by Power BI when you open a PBIX file. In order to do that, we have to introduce a few concepts like process and virtual memory in Windows - just the minimum required to understand what is going on.
    This way, you can understand whether you have a memory issue on your PC or not, and what is causing it! A scripted way of reading the same per-process numbers is sketched right after this description.
    Read more about the "unplugged" format: www.sqlbi.com/blog/marco/2021...
    #unplugged #dax #powerbi
  • Science

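As a rough companion to the description above, here is a minimal Python sketch (an assumption-based illustration, not material from the video): it assumes the third-party psutil package and the typical Power BI Desktop process names (PBIDesktop, msmdsrv, Microsoft.Mashup.Container, CefSharp.BrowserSubprocess), which may differ on your machine, and prints the working set and commit size of each process, roughly the numbers the video reads from Task Manager.

```python
# Minimal sketch: list the memory used by the processes Power BI Desktop
# typically spawns. Requires the third-party "psutil" package (pip install psutil).
# The process-name prefixes below are assumptions; verify them in Task Manager.
import psutil

# Assumed process names:
#   PBIDesktop                 - the Power BI Desktop user interface
#   msmdsrv                    - the local Analysis Services (VertiPaq) engine
#   Microsoft.Mashup.Container - Power Query / mashup evaluation containers
#   CefSharp.BrowserSubprocess - the embedded browser that renders report visuals
PREFIXES = ("PBIDesktop", "msmdsrv", "Microsoft.Mashup", "CefSharp")

for proc in psutil.process_iter(["name", "memory_info"]):
    name = proc.info["name"] or ""
    if name.startswith(PREFIXES):
        mem = proc.info["memory_info"]
        # On Windows, psutil reports rss as the working set and vms as the
        # commit size, which is closer to the figures discussed in the video.
        print(f"{name:<45} working set = {mem.rss / 2**20:8.1f} MB   "
              f"commit = {mem.vms / 2**20:8.1f} MB")
```
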
Comments • 44

  • @DIGITAL_COOKING · 3 years ago · +3

    Great video as usual. What I like about this channel is that the subjects are chosen from the problems we face in practice.

  • @sue_bayes · 3 years ago · +1

    Absolutely fascinating and answered many questions that I hadn't even internalised yet. I had to watch certain parts a couple of times, and have no doubt I'll refer to this again. It will be interesting to see the results when using Teams and working on BI at the same time. It also really helped break down the different parts of Power BI to understand usage. Thank you. Loving the unplugged format.

  • @yippiekyoo · 3 years ago · +1

    Really awesome. Now I understand better how the background processes work, which will help me improve my dashboards.

  • @babagio · 2 years ago · +1

    Ciao Marco, I just discovered your channel and subscribed immediately. Great one! Thank you both for the knowledge you spread ("already digested, I don't know how to put it"). This #unplugged playlist is the right thing at the right moment. A big hug and stay safe!

  • @abhijeetghosh27 · 3 years ago · +1

    Very informative. It will always be useful during optimization. Thanks, Marco, for sharing this 👍☺️

  • @roadtech6472 · 3 years ago · +1

    Amazing explanation! Thanks for sharing knowledge!

  • @bandishgupta2046 · 2 years ago · +1

    Amazing Video... Thanks for all the information Marco !!

  • @youneshamza381 · 1 year ago · +1

    You're great, period!
    A star!

  • @danthompson8309 · 3 years ago · +1

    Fantastic discussion. Thank you

  • @saharlatifi3510 · 2 years ago

    Great video! Very informative. Thanks Marco.

  • @JovianoSilveira · 3 years ago · +1

    You are THE BEST!

  • @kondziunia · 3 years ago · +1

    Awesome. Great video. Rgds

  • @maciejkesy6869 · 3 years ago · +1

    Great job, thank you :-)

  • @hasnainhaider8191 · 3 years ago · +1

    Great stuff.

  • @anilredddykanthala · 11 months ago · +1

    Really a must-know subject for Power BI users.

  • @JovianoSilveira · 3 years ago · +1

    I'm your fan here in Brazil.

  • @hectorpatinosanchez96 · 2 years ago · +2

    Hi, Marco! Thank you for all the videos you and your team make. It's really incredible that such quality content is available for everybody every week. I really appreciate the way you focus on problems we are all facing in daily work.
    I'm having real issues dealing with memory usage in the Power Query Editor and, even worse, in the Advanced Editor. I usually write some M to build a clean model from data sources that are not well structured. I observe that with only a few data sources (10 is enough), performance becomes incredibly low, making the editor almost unusable. I know your research and teaching efforts are more oriented to DAX than M, but sometimes (maybe always) some M is necessary to build a clean model (this can be a discussion point, I know; the trade-off between DAX and M is a difficult topic for me and other developers).
    Have you found the origin of this memory usage? If so, have you found some strategies to avoid it?
    Thank you in advance, and thank you again for your incredible work.

    • @SQLBI · 2 years ago

      You should check this blog post and related articles: blog.crossjoin.co.uk/2021/06/06/speed-up-power-query-in-power-bi-desktop-by-allocating-more-memory-to-evaluation-containers/

  • @Dias-fp5uo · 3 years ago · +1

    Thank you for a great video, Marco! Whenever I refresh my dataset, the Analysis Services engine consumes around 2 GB (10 times more memory than the mashup engine). I thought it should be the other way around, since Power Query is mainly being used (I have around 40+ tables and my PBIX file size is around 50 MB). Do you know if this is normal, and do you have a video or an article that explains how to deal with it? Thank you!

    • @SQLBI · 3 years ago

      Analysis Services loads data uncompressed in memory and compresses it in segments of 1M rows each in Power BI. If you have tables with many columns and long strings, it's easy to allocate a temporary area of 1/2 GB just to load 1M rows from a table. Once compressed it's another story, but compression must read the uncompressed data first.
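Not part of the reply above, just a back-of-the-envelope sketch of why loading uncompressed rows gets expensive. The 1M-row segment size comes from the reply; the per-column byte sizes are illustrative assumptions, not exact engine figures.

```python
# Rough estimate of the temporary, uncompressed memory needed to load one
# segment during refresh. Illustrative assumptions: a segment holds 1 million
# rows, numeric columns take ~8 bytes uncompressed, and string columns take
# roughly their average length in bytes.
SEGMENT_ROWS = 1_000_000

def uncompressed_segment_mb(numeric_columns: int,
                            string_columns: int,
                            avg_string_bytes: int) -> float:
    """Approximate uncompressed size (MB) of one segment before compression."""
    bytes_per_row = numeric_columns * 8 + string_columns * avg_string_bytes
    return SEGMENT_ROWS * bytes_per_row / 2**20

# Example: 10 numeric columns plus 20 string columns averaging ~20 characters
# already needs roughly 460 MB of temporary memory for a single segment.
print(f"{uncompressed_segment_mb(10, 20, 20):.0f} MB")
```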

  • @workstuff5253 · 3 years ago · +1

    With me at the helm? Short answer: all of it ;)

  • @diogobueno176 · 2 years ago

    Thanks for the video!!
    What happens if I have only simple measures like CALCULATE(SUM()), and even then the table in my published dashboard breaks because of memory?

  • @gustavobarbosa2526 · 3 years ago · +1

    Very enlightening, Marco! Thank you!
    However, I have been facing a problem that seems quite weird to me. I have a model with a fact table of around 1M rows. When I create a new measure in another table where I just store measures, Power BI takes a long time between the moment I press Enter and the moment it accepts new input again. I cannot understand why this happens, because at this point I have not included the measure in any visualization (I just created it), and I would expect that simply creating a measure should have no impact on processing time. I don't know if you have ever faced this issue; if you have, maybe you could cover it in a video about how Power BI stores and handles measures.
    Thank you so much for all the valuable information yours and Alberto's videos provide to the Power BI community!!

    • @marcorusso7472 · 3 years ago

      It's simply the validation of the measure that takes time - you probably have many other measures in the same model.
      Just another reason to use Tabular Editor (which is much faster, among other things)!

    • @dattrieuds · 3 years ago

      @marcorusso7472 Again, thanks very much for your and Alberto's enlightening videos! I have a follow-up question about Tabular Editor (TE) here.
      I find TE very powerful and useful, especially for calculation groups (CG) (by the way, I love your article series about CG on sqlbi, although I need to read them again and again to really understand them!). But whenever I create a CG in TE, the formats no longer seem to be editable in Power BI Desktop, and more importantly, you can't even drag those measures into new visualizations, which is very annoying. I had to create a completely new file, stop using TE, and go back to the old workaround (DAX SWITCH plus a disconnected table) to mimic the CG behaviour; only then could I drag and drop measures freely (not only measures created by the CG, but also measures created in Power BI Desktop before) to build the visualizations.
      So my questions are: (1) Do you know a way/workaround/best practice to create CGs in TE while avoiding the above-mentioned formatting issue? And (2) if I run into that issue and cannot switch to the workaround described above, since it would take a lot of time to rebuild everything, is there any way to fix it?
      Thanks again! I wish you and Alberto great health, especially in this Covid time, to continue contributing to our BI community!

    • @SQLBI · 3 years ago · +1

      It depends on the visual. Make sure you have the latest version of Power BI Desktop and check whether all the visuals are affected or not. For example, Matrix should work well on any Power BI version supporting calculation groups. The latest version of Power BI (December 2020) still does not support them in custom visuals and a few standard visuals. Hopefully, this will be fixed in coming releases.

  • @akhileshgupta1971 · 1 year ago

    Hi Marco, good morning first of all! I have a measure and I want to compute the average of that same measure, but I couldn't do it in Power BI using DAX; an error keeps coming up, and I may be missing some background here. Overall, I'd like your suggestion on how to do it. I have logic to get a similar result another way: the sum of all jobs done divided by the number of users. But I need exactly the average of the production they did, averaging each row of that table. I'm eagerly waiting for your response.

  • @alexkim7270 · 2 years ago

    Thanks for sharing, Marco. Can you share your computer's specs with us? I am amazed that you're opening a 1 GB file and your CPU and memory usage is still very low.
    Mine, on the other hand, consumes 83% CPU and 72% memory when I open a 100 KB PBIX. I am guessing it has very complex queries, but that's beside the point. What are your specs?

    • @SQLBI · 2 years ago · +1

      The PC used in the demo has an Intel i7 3.2GHz and 32GB of RAM.

  • @elrevesyelderecho · 3 years ago

    12:00 Quick question, Marco: how much RAM does your PC have? What about the processor: how many cores, and what speed? 5 cores and 2-something GHz? Just to have the full scope of the exercise. Thanks.

    • @elrevesyelderecho · 3 years ago

      OK, at 12:59: 3.5 GHz and 32 GB RAM. How many cores?

    • @elrevesyelderecho · 3 years ago

      6 or 8 cores? 🤔 I am guessing 8.

    • @SQLBI · 3 years ago · +1

      i7-7820HQ 2.9 GHz, 32 GB RAM - 4 cores (8 virtual cores with Hyper-Threading).

  • @seez8164 · 11 months ago

    Awesome. TL;DR: I am stuck with 16 GB (LPDDR soldered onboard). Does increasing the swap file manually in Windows 10 make sense for Power BI?

    • @SQLBI · 11 months ago

      Not much - you can try and get some benefit from paging other applications, though.

  • @andreiamador4894 · 1 year ago

    What happens if the issue is with the Power BI Desktop process (the yellow one)?

    • @SQLBI · 1 year ago

      Strange, it should be investigated more. Usually that process is not involved in large memory operations. It could be that you receive a large amount of data from a query that has to be transferred to a visual, but usually the number of data points displayed is limited.

    • @andreiamador4894 · 1 year ago · +1

      @SQLBI Thank you for the answer, Marco/Alberto.
      We definitely have something more going on: I have a customer where it's taking up to 5 GB.
      Opening the Power Query Editor loads more DLLs in memory and increases the memory consumption.
      I did some testing, but nothing conclusive; I can't explain why that memory could be so high, and I'm looking for answers.
      Your video explained the rest of the processes, which are very clear and documented, but how the Power BI Desktop process consumes memory is still a mystery. 😂

  • @kadirkkkk · 2 years ago

    34:30: Every visual allocates memory in RAM, and the Charticulator visual really eats it up. Ha!

  • @cherifiabdelmajid7888 · 2 years ago · +1

    You are THE BEST!