2 ways to reduce your Power BI dataset size and speed up refresh

  • Published: 16 Nov 2024

Comments • 209

  • @alt-enter237
    @alt-enter237 4 years ago +12

    Just used this as a step by step to analyze a model that I am working on. I knew I had to get rid of columns (and I had) but now I am pruning even more ruthlessly. And one thing I would add--don't be afraid to remove columns--you can always add what you need back. So what I do is select JUST the columns I know I want, and then use REMOVE OTHER COLUMNS. Then, if I find that there is a column I DO need, I go back to that REMOVE OTHER COLUMNS step, and modify the command by adding the name of the column I need back in. Super easy, super quick.

    • @GuyInACube
      @GuyInACube  4 years ago +1

      Love it! It is definitely something folks should be looking at.
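
  A minimal Power Query M sketch of the "keep only what you need" pattern described in this thread (the server, database, and column names here are illustrative assumptions, not taken from the video):

      let
          // Placeholder source; ContosoRetailDW is the demo database mentioned later in the thread
          Source = Sql.Database("myserver", "ContosoRetailDW"),
          Sales  = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
          // "Remove Other Columns" in the UI generates Table.SelectColumns;
          // to bring a column back later, just add its name to this list
          Pruned = Table.SelectColumns(
              Sales,
              {"OrderDate", "StoreKey", "ProductKey", "SalesAmount"}
          )
      in
          Pruned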

  • @sandykashyap
    @sandykashyap 3 years ago +6

    By unchecking auto date/time, I brought my data model size down by 22 MB! I am so happy I tried this. You do a fab job, keep it coming!

  • @dangelo90
    @dangelo90 4 years ago +20

    I have recently started expanding my knowledge with PBI and your channel has amazing information, examples and tips. I appreciate your work very much! Thank you for your efforts!

  • @Randyminder
    @Randyminder 5 years ago +10

    I completely agree with removing columns that you don't need. But, I think we need to be careful that we don't remove so many columns that we can no longer guarantee uniqueness in the table. When a context transition occurs, the table is iterated and if we have duplicate rows, they will get double (or triple etc.) processed causing very hard to catch (and resolve) bugs.

  • @Slyder9278
    @Slyder9278 5 years ago +8

    Excellent video! Just reduced my PBIX file from 136MB to 34MB. Goes to show how little I know about how data is stored. I had several tables with a couple of unique key columns.

    • @GuyInACube
      @GuyInACube  5 years ago +1

      WOW! That's amazing. So happy that this helped you out. That's pretty incredible. 👊

  • @SandeepPawar1
    @SandeepPawar1 5 years ago +19

    Great tips. I always use Remove Other Columns to make sure I only keep the columns I need. Always get rid of the columns as a first step, not after you have done a bunch of transformations. Plus always, always reduce date-time to date if you don't need the time; time adds a lot of bulk to the size (I guess because of high cardinality). Hopefully the Power BI team will add a VertiPaq Analyzer-like tool to the performance analyzer.

    • @GuyInACube
      @GuyInACube  5 years ago +3

      Totally agree! Date-time to date is definitely something we recommend. If you don't need time, get rid of it. If you do need it, split it out.
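
  A hedged M sketch of that split, assuming a hypothetical Sales query with a high-cardinality OrderTimestamp column:

      let
          Source = Sales,
          // A date column and a time column each compress far better than one datetime column
          AddDate = Table.AddColumn(Source, "OrderDate", each DateTime.Date([OrderTimestamp]), type date),
          AddTime = Table.AddColumn(AddDate, "OrderTime", each DateTime.Time([OrderTimestamp]), type time),
          // Drop the original high-cardinality column once it has been split
          Split   = Table.RemoveColumns(AddTime, {"OrderTimestamp"})
      in
          Split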

  • @meg11c
    @meg11c 3 years ago +1

    Aaaahhhhh where would I be without Guy in a Cube? As always, fantastic info.

  • @chiligarden
    @chiligarden 3 years ago +1

    Because of this video, I was able to reduce the size of a Power BI report that includes a customized calendar dimension from 500 MB to 2 MB, just by turning off the Time Intelligence feature. This is so unreal that I got my coworker to reproduce the size reduction. In hindsight, it makes so much sense based on how Time Intelligence works. Thank you so much!

  • @bernadettearaea6976
    @bernadettearaea6976 3 years ago +3

    Never really thought columns had an effect, thank you so much for this!

  • @bilalazim1901
    @bilalazim1901 5 years ago +4

    Good techniques.
    What I normally do is take out as many columns as possible with select columns, and you can always bring them back if at some stage you need a previously removed column.
    Just disable data type detection and select data types as your last step.

    • @GuyInACube
      @GuyInACube  5 years ago +2

      Yup, not a bad approach to pull things in later when you need them. Can you explain more on the data type point?

  • @meetdenis82
    @meetdenis82 4 years ago +3

    Brilliant! As someone used to working with tabular data, I inherently knew removing unwanted columns takes a huge load off the schema. I am a newbie to Power BI and was looking for ways to reduce the model size on my projects, and your video just proves how simple it is to cut down the size if you are really *clear* about your data. Thanks for highlighting that part so well, Adam!

  • @navjeet41
    @navjeet41 4 years ago +1

    Nice video. I like that it's not just repetitive basics; it's optimization based on very real-life scenarios.

  • @dylandelport6497
    @dylandelport6497 4 years ago +8

    Adam, this is an incredible video, thank you. It makes so much sense now that you have explained.

  • @curious_yang
    @curious_yang 3 years ago

    My file size was relatively small (c. 4 MB), but visuals failed to load in the Power BI service. This helped me optimise how the table loads, and it is now working! It did not reduce my file size significantly (now c. 3.5 MB), but that's not the point anyway. Thanks Adam

  • @avecNava
    @avecNava 5 years ago +2

    Loads of love for this optimization technique. It felt like the PBIX file was suffocating under unrelated columns.

  • @PowerProd
    @PowerProd 4 years ago +3

    Excellent video! I will use COUNTROWS from now on and ditch unique IDs.

    • @GuyInACube
      @GuyInACube  4 years ago +1

      Awesome! If you have the time, always be sure to test things as well. Things may work differently with your data. Always good to validate.
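
  In DAX terms the swap might look like this, a sketch assuming a fact table named Sales where each row is one order (as the reply says, validate against your own data first):

      // Before: a high-cardinality OrderID column imported only to count orders
      // Order Count = DISTINCTCOUNT ( Sales[OrderID] )

      // After: if one row = one order, the ID column can be dropped from the model entirely
      Order Count = COUNTROWS ( Sales )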

  • @MartinKuzmicz
    @MartinKuzmicz 3 years ago

    Ya, I have a model which takes over an hour to refresh every time. I'm using a dataflow as the source and it's still taking a long time. So, my next step will be to check if I need all the columns :) Thanks for the video.

  • @likhui
    @likhui 5 years ago +4

    Hi Adam & Patrick, thank you guys so much for posting awesome content, as always :)
    One thing I would like to point out is the shout-out for DAX Studio. I have to admit I was a little surprised that Darren Gosbell wasn't mentioned, as he's the creator and main contributor of DAX Studio. Yes, no doubt Marco and Alberto (I have huge respect for them) have contributed to some of the coding; Marco has also mentioned a few times that people have mistaken him for its creator and had to clarify that he contributed approx. 5-10% of it. So I'm not sure whether that's the case here.
    Once again, thanks for the awesome content and keep being awesome!

    • @ynwtint
      @ynwtint 5 years ago +2

      Thanks for bringing this up. I had the same impression that the two DAX gurus from SQLBI were the creators of DAX Studio. Now I know the big man behind this very useful tool is Darren Gosbell. (mvp.microsoft.com/en-us/PublicProfile/35889?fullName=Darren%20Gosbell)

    • @likhui
      @likhui 5 years ago +1

      @@ynwtint You're welcome. Cheers.

    • @GuyInACube
      @GuyInACube  5 years ago +1

      We have a lot of love for Darren! It is a SQLBI tool though, and that was the intent. Apologies for giving the wrong impression about the actual development time. That wasn't what we were going for.

    • @likhui
      @likhui 5 years ago

      @@GuyInACube Don't be sorry and totally understood :) I'm looking forward to your next video already. Cheers!

  • @pierresonkeng6827
    @pierresonkeng6827 4 years ago +4

    Really amazing. I reduced one of my pbix files from 256 MB to 182 MB. I also discovered a lot of options to optimize my dataset.
    Thank you

  • @denglishbi
    @denglishbi 5 years ago +2

    Someone else already mentioned the datetime fields to watch out for and another one is calculated columns. Great job as always 👍

    • @GuyInACube
      @GuyInACube  5 years ago +1

      Yup, sooo many things. We have some other videos coming on data model optimizations. Great callouts though 👊

    •  5 years ago +1

      Hi Dan, why watch out for calculated columns? Could you clarify?

  • @scooterza
    @scooterza 2 years ago

    Thanks Patrick! Amazing impact that losing a few redundant columns has! 🐱‍👤🐱‍👤

  • @ShabnamKhan-vk7fj
    @ShabnamKhan-vk7fj 5 years ago +6

    Thanks so much; As always, it has been super helpful! We greatly appreciate you guys giving back to the community this way. Keep up the good work!!

    • @GuyInACube
      @GuyInACube  5 years ago +1

      That means a lot Shabnam! Thank you so much 👊

    • @ShabnamKhan-vk7fj
      @ShabnamKhan-vk7fj 5 years ago

      @@GuyInACube 👊 anytime!

  • @SolutionsAbroad
    @SolutionsAbroad 4 years ago

    Seeing that file size go down from 600 MB to 74 MB just made my jaw drop! Thanks for this!

  • @PabloBadenas
    @PabloBadenas 1 year ago

    Amazing... just removing the Auto date/time reduced a pbix file from 20 MB to 2 MB. Loved it!!!

  • @gkool4655
    @gkool4655 3 years ago +2

    Hello fellow devs, *Please Note*:
    The process for loading your model into VertiPaq Analyzer has changed.
    ✅ Now export a VPAX file first from DAX Studio,
    ✅ Then load THAT into the Excel analyzer.
    Instructions are on the first page of the new VertiPaq Analyzer ✅

  • @donaldscott8782
    @donaldscott8782 3 years ago

    Thanks Adam. I unchecked Auto date/time and my PBI file dropped from 80 MB to 2.4 MB!!!!!

  • @arunasubin8965
    @arunasubin8965 4 years ago +1

    My file size has reduced a lot. Thank you so much

  • @JasonRidenour
    @JasonRidenour 3 years ago

    Oh man... I really would love to show you what we're working on. I'm in healthcare data analysis. Healthcare data is legit big, and we're doing everything we can think of to reduce our data size. Our latest project's PBI file saves at 6 GB!

  • @jamesharvey1979
    @jamesharvey1979 4 years ago +1

    First: great video. Second: I love how you say to "jump over to Premium to give you some breathing room". Power BI Premium sits at a price point that only large corporations can afford. I would love to jump to it for the use of computed tables inside dataflows, but can't get it into the budget till next year.

  • @dianamgdata
    @dianamgdata 2 years ago

    Amazing! Turning off the date/time configuration you mention in the video reduced my report size by 8MB!

  • @ishasakalley4254
    @ishasakalley4254 5 years ago +5

    I am a recent subscriber to your channel and must say I love it! Thank you for putting in effort and time and sharing your knowledge.

  • @PedroCabraldaCamara
    @PedroCabraldaCamara 4 years ago +1

    If only I had seen this video in time, like for example last year... awesome video guys

    • @GuyInACube
      @GuyInACube  4 years ago

      Thanks for watching! 👊

  • @roseventura1711
    @roseventura1711 3 years ago

    You guys are the bomb! Thanks for the tips. That VertiPaq Analyzer thing? Holy crap! That's a gold mine! It shows all my measures! I've been looking for something like this for forever!

  • @wizaphiri19
    @wizaphiri19 4 years ago +3

    Great optimization tips, thank you

    • @GuyInACube
      @GuyInACube  4 years ago

      Most welcome! Thanks for watching 👊

  • @Milhouse77BS
    @Milhouse77BS 5 years ago +2

    Good examples. I've got a team with an S1 AAS with a ginormous composite transaction key that needs to die. It would save money to get it down to an S0.

    • @GuyInACube
      @GuyInACube  5 years ago

      Yeah it is amazing what exists in a model.

  • @juanlauroaguirre5646
    @juanlauroaguirre5646 2 years ago

    Hi Adam, great talk. However, what are your thoughts about the usual practice of creating huge / heavy / slow multipurpose "golden" datasets, which intend to solve the "several sources of truth" problem by putting everything and the kitchen sink in a single dataset file serving dozens of reports?

  • @etherlords88
    @etherlords88 5 years ago +2

    3:15 Yup, I approve that! 😅 I worked with a table of around 12 million records, and not only did it take about 2+ hours to fetch on the desktop, it ate up all the RAM, making the PC almost unusable!!!

    • @GuyInACube
      @GuyInACube  5 years ago

      Very easy to get into that spot. Crazy stuff. 👊

  • @jhewitthunt
    @jhewitthunt 3 years ago

    Very helpful Adam - thanks for doing the video

  • @skumars78
    @skumars78 1 year ago

    Hello Patrick - Thanks. This is a great tutorial on the usage of DAX Studio and the VertiPaq analyzer. I have tried using it for my Power BI report, which is built on the SAP BW Application Server connector. However, I do not see an SSAS connection to update to localhost and analyze. Could you please help me understand how I can create it?
    Thanks,
    PS

  • @majdyazigi8185
    @majdyazigi8185 5 years ago +2

    ExcelIsFun has an outstanding video on the same topic.

  • @RenatoHaddadMVP
    @RenatoHaddadMVP 2 years ago

    Most Power BI devs use all the columns from the source and then don't understand why the project runs slow. The best solution is to PREPARE your data source first. In SQL Server, create queries with only the columns you have to use in Power BI. This process is much faster than the alternatives, and in Power BI you then have only the columns you will actually use.
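
  One hedged way to express that source-side pruning, if a view can't be created on the server, is a native query from M (server, database, and column names are placeholders; note that a hand-written query can limit query folding for later steps):

      let
          Source = Value.NativeQuery(
              Sql.Database("myserver", "ContosoRetailDW"),
              "SELECT OrderDate, StoreKey, ProductKey, SalesAmount FROM dbo.FactSales",
              null,
              [EnableFolding = true]  // allows later steps to fold where supported
          )
      in
          Source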

  • @debbieedwards7267
    @debbieedwards7267 4 years ago

    Love this. I was wondering: if you have a surrogate key and a business ID (which would be high cardinality) and you join the tables by the key, could you actually remove the business keys from the model, or should you always leave those in? For example Product Key 1, Product ID 35335? I'm thinking in terms of the fact table AND the dimension if you have gone for a STAR schema.

    • @GuyInACube
      @GuyInACube  4 years ago

      Debbie, you will need the surrogate keys for the relationships, but if you are not using the Business ID I would definitely remove it from the model. The only time we suggest keeping any type of ID is if it is needed for reporting. Great point!

  • @juanlopez4033
    @juanlopez4033 4 years ago +1

    Do you normally create a backup of the PBI data before you remove columns? Is that as easy as just copying the PBIX file? Just in case we remove columns we should not have. If so, what is the proper way to back up before we start removing columns, to retain the original the client provided?

  • @RajanieshKaushikk
    @RajanieshKaushikk 4 years ago +1

    You present very well!!

  • @0kazaki
    @0kazaki 5 years ago +8

    A tool that detects unused Tables & Columns would be great!

    • @yannickfranckum6589
      @yannickfranckum6589 5 years ago

      Hi @Kazaki you can try Power BI Helper radacad.com/power-bi-helper

    • @GuyInACube
      @GuyInACube  5 years ago +1

      Agreed, that would be a great tool.

    • @moizsherwani8651
      @moizsherwani8651 5 years ago +1

      Although it doesn't show which aren't used, Power BI Helper (radacad.com/power-bi-helper) by RADACAD does show the columns which are used. I just put Power BI Helper on one screen and the Power BI file on the other, and remove the unused measures and columns.

    • @atomek1000
      @atomek1000 5 years ago

      Power BI Sentinel does it

  • @chamilam
    @chamilam 5 years ago +1

    Great ideas presented to reduce the dataset.

    • @GuyInACube
      @GuyInACube  5 years ago +1

      Appreciate that! We have some more videos coming on data model optimizations as well. 👊

    • @chamilam
      @chamilam 5 years ago

      @@GuyInACube Super !!! looking forward to those videos.

  • @alfredlear4141
    @alfredlear4141 5 years ago +3

    Thanks for what you guys do.
    Seriously, it's so practical and easy to absorb; your channel is very undersubscribed.

    • @GuyInACube
      @GuyInACube  5 years ago +1

      Much appreciated Alfred! 👊

  • @subusahu69
    @subusahu69 1 year ago

    15:30 you selected a few columns and hit apply and load. When we publish this from dev to prod, do you think it will create a problem if the columns mismatch across dev, test, and prod?

  • @rickuijlen4790
    @rickuijlen4790 2 years ago

    Thank you man! The time intelligence change alone reduced my file size from 52 MB to 22 MB :D

  • @bwaughevents
    @bwaughevents 4 years ago +4

    OMG! Is that a Lone Star State on the Millennium Falcon? LEGIT!

  • @karlnorberg7768
    @karlnorberg7768 3 years ago

    Great stuff. Got rid of 500 MB worth of LocalDate_tables o/. Also found that in one report we have 22 million rows where one column contains numbers but is stored as a string. Wrong on so many levels :) It's not even used in the report! 700 MB saved in a few seconds. This will come in handy when setting up guidelines for building Power BI reports in our organization. Thanks!

  • @shafialameri8363
    @shafialameri8363 4 years ago

    Thanks for this information. Also, I think even if the organization says "we may need this column later on", it is easy to get the column back again; not a big deal.

  • @richardostrea7842
    @richardostrea7842 2 years ago

    You guys should cover the Inforiver visual 🙏

  • @markharris7325
    @markharris7325 4 years ago

    Thanks for the info in this video; impressed with how much this decreases the size of the Power BI file! Would a similar approach work if you are suffering from the "visual has exceeded the available resources" error in the service when linking to a Power BI dataset?

  • @brypie04
    @brypie04 2 years ago

    Just stumbled across this video - some good tips. Couple of questions though:
    1. Unchecking the auto-date/time setting stops me from being able to show a nice hierarchical date slicer (Year->Qtr->Month->Day) - How could I still have one or more of those with the setting disabled?
    2. For reducing the number of columns in the dataset, wouldn't it be better to edit the initial source query to only get the columns you need from source?
    Otherwise, you are telling Power BI to pull in all the columns (and have to handle them all), just to then say "now forget about half the columns I just told you to import"
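
  On question 1, the usual answer is an explicit date table marked as the model's date table; a minimal DAX sketch (the date range and column names are illustrative):

      Date =
      ADDCOLUMNS (
          CALENDAR ( DATE ( 2018, 1, 1 ), DATE ( 2025, 12, 31 ) ),
          "Year", YEAR ( [Date] ),
          "Quarter", "Q" & QUARTER ( [Date] ),
          "Month", FORMAT ( [Date], "MMM" ),
          "MonthNo", MONTH ( [Date] )
      )

  A Year > Quarter > Month > Date hierarchy built on this one table replaces the auto date/time hierarchies without one hidden table per date column.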

  • @neverGrowup1224
    @neverGrowup1224 3 years ago

    Hi Adam, very nice video! Thanks a lot, and just wondering: what screen recording application do you use to record your work in Power BI? I would really appreciate a reply!

  • @prakash4190
    @prakash4190 5 years ago

    This is really great! Thanks Adam! Would there be any negative impact(s) if we disable the time intelligence for an existing report that has datetime columns?

    • @GuyInACube
      @GuyInACube  5 years ago

      Absolutely not, unless you are using them in the report. We actually recommend disabling time intelligence if you have your own date table.

  • @sunilg7648
    @sunilg7648 3 years ago

    What does the "Evaluating" step do when refreshing the report? I am using a SharePoint folder with JSON as my data source. It is very slow when refreshing; most of the time is spent in evaluation.

  • @1BlackSwordsman1
    @1BlackSwordsman1 5 years ago +1

    A quick question about dataflows and the Power BI service: would it make sense to load large dimensions into dataflows and then only reference the dataflow in reports, or could that approach cause issues in the long run?

    • @claytillman2227
      @claytillman2227 5 years ago

      I wonder the same thing. If the dataflow is being refreshed, how can I access that without refreshing my model? Maybe this is similar to a DirectQuery against the dataflow. I don't want to refresh data; I just want whatever is stored in the dataflow.

  • @eljangoolak
    @eljangoolak 1 year ago

    Very good video. This needs an update though, as I cannot follow along; the options are not all the same any more.

  • @jennethtaja7458
    @jennethtaja7458 4 years ago

    Very helpful, Adam. Thanks a lot.
    LOL at 'just like on a cooking show', liked that too!

  • @manikiran5902
    @manikiran5902 5 years ago

    Hi Adam, I am a very big fan of your Power BI videos.
    I have a small question about how to validate the reports that are developed in Power BI Desktop.
    Thanks in advance.

  • @arahuac0
    @arahuac0 5 years ago +1

    Hi Adam, love your videos. What do you guys use to zoom in and out on the screen? I also saw it at the MS Biz App Summit. Thanks!

    • @denglishbi
      @denglishbi 5 years ago +1

      ZoomIt docs.microsoft.com/en-us/sysinternals/downloads/zoomit

    • @GuyInACube
      @GuyInACube  5 years ago

      I broke my rule in this video. I actually was surprised when I saw it in the editing; I was on autopilot. I used ZoomIt in the video at one point, but that's honestly the first time - in a long time - I've done that in the videos. Normally all of the zoom and highlight stuff I do in post. But when presenting in person I absolutely use ZoomIt. Every presenter should have it! Or something similar like it.

  • @huuya
    @huuya 1 year ago

    So I have a question, since I am confused. Based on another video as well, I wonder: should I use SQL to pre-filter when loading data, or should I just load everything and remove the data and columns I don't need using transformations? You pointed out in another video that transformations will be blocked if you use SQL to pre-select data. Or is that only the case if you have used select all and not selected specific columns?
    Secondly, would it improve speed to create one query which links all the IDs to each other in one table, and can you increment only that table and refresh the lookup tables whenever you need them? Basically, is the refresh of a model done on the whole model or specifically on certain tables?

  • @joaquinmaverick82
    @joaquinmaverick82 4 years ago

    Great video, I just saw the other one from Aug 2020 :) about disabling Auto date/time.

  • @arklur3193
    @arklur3193 5 years ago +3

    Great video as always, you guys are great!

    • @GuyInACube
      @GuyInACube  5 years ago

      Appreciate that 🙏 Thanks for watching 👊

  • @FredLorrain
    @FredLorrain 4 years ago

    I also always check the column data type; 'any' should be banned. When possible, relationships should be based on text fields. These are my usual tricks.

  • @evangelinekiku7380
    @evangelinekiku7380 3 years ago

    Great video. Thanks a lot. I just learned some new tips.

  • @dreamofyou00
    @dreamofyou00 3 years ago

    Hello. Is there any difference, from a performance perspective, between removing columns and not loading all the columns from the file in the first place?

  • @tanyacraig2672
    @tanyacraig2672 2 years ago

    I initially add just the fields I can filter on (market, customer type, etc.), together with one fact (e.g. order quantity), then I filter, then I add all the other columns required. The only annoying thing is that once you change a column data type, you can't add any more columns from the data tables (at least on import).

  • @4eyesleo
    @4eyesleo 4 years ago

    Does it make any sense to group and summarise the remaining columns after deleting unnecessary IDs? Would it increase performance, given that Power BI has very intelligent "packing" abilities?

  • @gokukanishka
    @gokukanishka 3 years ago

    Appreciate this kind of video.

  • @C15-k4d
    @C15-k4d 5 months ago

    Is the Excel file auto-generated, or should we connect it as a data source, or is it a SQL Server data source? The Excel sheet to data model step is not clear.

  • @HarishS12137
    @HarishS12137 5 years ago

    After choosing the columns to keep, will the data refresh the same, or will it throw an error?

  • @curiousmind9825
    @curiousmind9825 5 years ago

    Thank you so much for your tutorial. I am using an SSAS multidimensional live connection. I am trying to create a stacked column chart which shows month on the x-axis and value on the y-axis, and also shows the month-wise top-selling store. But when I try to top-N filter by store, that filter shows the highest sales per year rather than filtering month-wise sales. Please help me figure out how to solve this.

  • @NC-un7tr
    @NC-un7tr 4 years ago

    Hello, may I know why there is SSAS? Is it the data source of the pbix?

  • @osamaasif9601
    @osamaasif9601 5 years ago +2

    Guys you are amazing, keep it up

    • @GuyInACube
      @GuyInACube  5 years ago

      Thank you so much! Really appreciate that. 👊

  • @DEMONTmx
    @DEMONTmx 3 years ago

    I don't see the SSAS connection option when using VertiPaq Analyzer. I'm using a connection to a SQL DB with Azure Active Directory for Power BI.

  • @georgib0y
    @georgib0y 5 years ago

    Thanks! I use VertiPaq regularly, but I forgot how important it is to delete unused columns (I cut close to 100 MB from 170 MB). Adam, do you know a way to reduce the evaluation process of a table? That always takes so much time. I optimize my SQL so it folds and is fast, but some tables take ages to evaluate during the refresh.

  • @TorgeirLognvik
    @TorgeirLognvik 4 years ago +1

    Extremely helpful :)

    • @GuyInACube
      @GuyInACube  4 years ago +1

      Appreciate that Torgeir! 👊

  • @GeekSP1
    @GeekSP1 5 years ago

    Thanks for the video. This message appears when I edit the connection and refresh the data for it: "We couldn't refresh the connection. Please go to existing connections and verify they connect to the file or server". What should I do?

  • @mrrobert2008
    @mrrobert2008 5 years ago

    Hello, here from Chile, and I am a fan of your YouTube channel. I would like to get the files that you show in the video to be able to practice and follow your steps. I would be grateful if you could share them. Regards!!!

    • @GuyInACube
      @GuyInACube  5 years ago

      Unfortunately, they are pretty big. It is just the ContosoRetailDW database, modified with extra rows. I also added a custom column for that OrderID column to simulate it :) I did that at the SQL level using a view, and also to flatten out the data.

  • @rezaafkhamnia9183
    @rezaafkhamnia9183 5 years ago +2

    It was so great. Could I have the source file shown in the video?

    • @GuyInACube
      @GuyInACube  5 years ago

      Unfortunately no. It is based on the ContosoRetailDW sample database, but we increased the number of rows. I had 25 million rows in that file. It is pretty big.

  • @alejandrogonzalezbueno8044
    @alejandrogonzalezbueno8044 5 years ago +5

    Hi! Thank you very much for sharing such interesting videos! Could you share the Excel file you use in the video, and could you explain a little more about the use of DAX Studio?
    Thanks!

    • @Gustavo-Santana
      @Gustavo-Santana 5 years ago +2

      I agree, it would be great if we could have some more details about how to use DAX Studio. Thanks, Adam, for your great explanation as ever!

    • @GuyInACube
      @GuyInACube  5 years ago +7

      The Excel spreadsheet is VertiPaq Analyzer - which you can get from sqlbi.com - www.sqlbi.com/tools/vertipaq-analyzer/. Also Marco, from sqlbi.com, has a longer recording on what to do with a slow report, which goes into detail on DAX Studio: www.sqlbi.com/tv/my-power-bi-report-is-slow-what-should-i-do-2/. Also, we will be looking at doing more videos on these topics as well.

  • @DAngeloSilvestre
    @DAngeloSilvestre 4 years ago

    Yoooo ....
    Sometimes a refresh process that typically lasts 10-15 min hits a problem and doesn't finish successfully. Meanwhile, that scheduled refresh keeps its status as "in progress" for 2 hours, and I cannot start a manual refresh while the scheduled refresh hasn't finished.
    How can I manually stop a refresh that is currently running?

  • @MuhammadBerki
    @MuhammadBerki 5 years ago +2

    Wow awesome tips

  • @Drengen10
    @Drengen10 3 years ago

    SSAS doesn't show up in the "existing connections" inside Power Pivot?

  • @AbinashPhuel
    @AbinashPhuel 5 years ago +1

    Much-needed techniques!

    • @GuyInACube
      @GuyInACube  5 years ago +1

      Agreed :) Thanks for watching Abinash! 👊

  • @1yyymmmddd
    @1yyymmmddd 4 years ago

    One little thing though - if you disable auto date/time, many of your quick measures won't work any longer, as only Power BI-provided date hierarchies are supported.

  • @TheVamos777
    @TheVamos777 5 years ago +6

    One thing I always do is make sure I get rid of datetime fields, especially if the time is very precise. Set it as date, or if you need the time, extract it into another column. A datetime has high cardinality, but a date and a time in separate fields are low. If I don't need to be so precise, I could just keep the minute of the day or the hour. I also add any custom columns in M/Power Query rather than DAX, as you can get better compression, or I use a measure if I can.

    • @GuyInACube
      @GuyInACube  5 years ago +1

      Totally agree! There are so many things that could be listed here. The video was really long though :| we have more data model optimization videos coming. 👊

  • @premprakash334
    @premprakash334 4 years ago

    I am trying to run an already-created Microsoft report file, "Customer Service Analytics for Dynamics 365.pbix", against my Dynamics 365 instance, but it fails every time while loading with the error "The refresh operation failed because it took more than 120 minutes to complete. Consider reducing the size of your dataset or breaking it up into smaller datasets". Now I guess I can only do all this optimization once my data is loaded into the .pbix file. What to do if the report doesn't load in the first place?

  • @Ritunjan
    @Ritunjan 4 years ago

    I'm using a huge dataset, around 2B rows, and using a Python query to pull the data from Mongo... does incremental refresh help in this scenario???? Please do help!

  • @kristinamelnichenko5775
    @kristinamelnichenko5775 3 years ago

    Awesome, thank you!

  • @annamalaithirumalraj3787
    @annamalaithirumalraj3787 5 years ago

    Hi, I need to analyze multiple CSV files of 1 MB each. How many files can I connect?

  • @kennethstephani692
    @kennethstephani692 3 years ago

    Great video!

  • @wilmanjoelvasquezatoche4197
    @wilmanjoelvasquezatoche4197 3 years ago

    This actually works.

  • @yaaboii34
    @yaaboii34 3 years ago

    Can you do this in Power Query?

  • @DuncanFairweather
    @DuncanFairweather 2 years ago

    Dead in the water after the VertiPaq Analyzer segment. I have no SSAS connection to edit. Could I please get a workaround?

  • @happyheart9431
    @happyheart9431 1 year ago

    A million thanks!