Data modeling best practices - Part 1 - in Power BI and Analysis Services

  • Published: 12 Jan 2025

Comments • 202

  • @bcippitelli
    @bcippitelli 5 years ago +11

    Thanks for that Patrick. I have been using star schema and some good data modeling practices and I have to say: "This is the way". Not only is performance better: design, end-user understanding and maintenance are much, much easier as well. The good practices with Power BI make us Super Power Workers! Thanks again!

  • @jeremyanderson2374
    @jeremyanderson2374 4 years ago +17

    Dude, your videos are incredibly helpful. I've learned a ton from you over the past few weeks!

  • @felipebizarre
    @felipebizarre 3 years ago +1

    NO WAY YOU CAN HAVE TWO DATA TABLE DISPOSITIONSSS, my mind just exploded ugh Patrick you're an angel sent from MS Heaven

  • @Real28
    @Real28 5 years ago +15

    This is 100% basics right here. It's all review for me, but I'm watching to support this great channel and to simply keep this fresh in my brain. A STAR schema is a must in most cases.

    • @GuyInACube
      @GuyInACube  5 years ago +2

      Love it! And love the support! 👊 Sometimes going back to basics is a good thing.

    • @Real28
      @Real28 5 years ago +1

      @@GuyInACube I'm just now getting into PBI after years in SQL. So designing the back end is my specialty. But the same rules apply in a lot of situations because the concepts are the same.
      Many times I have to remind someone "do we need the information in two places or can we table these out and relate them?"

    • @phanirj
      @phanirj 5 years ago

      I am looking for a job... are there any vacancies, please?

  • @dangelo90
    @dangelo90 5 years ago +2

    I have recently started expanding my knowledge with PBI and your channel has amazing information, examples and tips. I appreciate your work very much! Thank you for your efforts!

  • @KuKuTV108
    @KuKuTV108 4 years ago

    You are one of the best teachers I have come across (one of the few among the 1500+ teachers I've had worldwide).

  • @hectoralvarorojas1918
    @hectoralvarorojas1918 5 years ago +5

    Patrick:
    How about adding the links for what you talk about here:
    1) Star Schema Overview (you talk about it at 6:39)
    2) "Flat tables and create data model" (you talk about it at 6:48)
    It would be great to have these!
    I will be waiting for your answer.
    Best regards!

    • @pacoaus
      @pacoaus 5 years ago +1

      Agree! Here is the link to 1) Star Schema Overview -- docs.microsoft.com/en-us/power-bi/guidance/star-schema

  • @patshanz
    @patshanz 4 years ago +1

    Great video. Smart. Great pace. You very quickly set up the various problem statements, explained the options, and demo'd. You're a great instructor. Thank you!

  • @marthasanchez-avila9043
    @marthasanchez-avila9043 5 years ago +6

    Love it! I've leveraged this process for all sorts of reports, everything from property and date tables, to organize my various sources of data. Thanks for the deeper dive.

    • @GuyInACube
      @GuyInACube  5 years ago

      Great to hear! Thanks for watching 👊

  • @hectoralvarorojas1918
    @hectoralvarorojas1918 5 years ago +1

    Great topic you have chosen to talk about here.
    AWESOME video Patrick!
    Now I will be waiting for the Part 2.
    Congratulations!

  • @someshkhatawe3404
    @someshkhatawe3404 5 years ago +6

    Adam's dataset video actually worked for my reports. Thank you both 😀😀😀😀

    • @GuyInACube
      @GuyInACube  5 years ago

      Woot! That is awesome to hear. 👊

  • @bcippitelli
    @bcippitelli 5 years ago

    Thanks for that Patrick. I have been using star schema and some good data modeling practices and I have to say: "This is the way". Not only is performance better; design is easier, understanding is easier, and the maintenance is much, much easier. The good practices with Power BI make us Power

    • @GuyInACube
      @GuyInACube  5 years ago +1

      Bruno Cippitelli Yes! Just makes life easier.

  • @Randyminder
    @Randyminder 5 years ago +7

    There is another drawback to wide tables I think you could have mentioned. When a context transition occurs, the entire table being iterated is brought into memory, uncompressed. For wide tables with lots of rows, this will ravage memory even more.
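
A hedged DAX illustration of the pattern being described (table and column names are hypothetical): the CALCULATE inside the iterator forces a context transition on every row of the wide table, which is exactly the situation where much of that table can end up being materialized, while the second version does the same arithmetic without the per-row transition.

    Margin (per-row transition) =
    SUMX (
        'WideSales',
        CALCULATE ( SUM ( 'WideSales'[SalesAmount] ) - SUM ( 'WideSales'[Cost] ) )
    )

    Margin (no transition) =
    SUMX (
        'WideSales',
        'WideSales'[SalesAmount] - 'WideSales'[Cost]
    )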

  • @PeterHolcroft
    @PeterHolcroft 4 years ago +32

    When is Part 2 coming?

    • @gyorgy8911
      @gyorgy8911 4 years ago +2

      I guess it is this one: Where to create your columns in Power BI | Data Modeling Best Practices
      ruclips.net/video/ZSUCmi6h5SY/видео.html

  • @innocentiuslacrim2290
    @innocentiuslacrim2290 4 years ago

    Greetings from Finland. I find your videos really beneficial for getting better at the fundamentals when working with Power BI. Earlier I was working on a project with Cognos where the databases were already in place and I could only work in the UI, then hand tasks to the people with backend access on what to do there. That was pretty terrible, as we were always facing performance issues to the point where individual KPIs were just failing randomly (timeouts). Efficient data modelling is really (REALLY) important when we want people to actually use the tools we create.

  • @amarkhaliq641
    @amarkhaliq641 3 years ago +1

    I had the same problem last week: two fact tables where I couldn't use a slicer on a field in both tables correctly. The slicer would filter one table but not the other.

  • @789riteshclement
    @789riteshclement 5 years ago +2

    This is so helpful. Can't wait for the other parts to come out.

    • @GuyInACube
      @GuyInACube  5 years ago

      Thanks for watching, and stay tuned, it will be released soon!

  • @kel78v2
    @kel78v2 6 months ago

    My experience with the star schema is mixed, especially with independent product tables, where it limits your ability to analyse to a certain extent. Cross tables were one of those cases.

  • @nelsonma4711
    @nelsonma4711 5 years ago +3

    Patrick, as always, you know what I think, AWESOME video. Cheers!

    • @GuyInACube
      @GuyInACube  5 years ago +1

      Thank you Nelson! And thanks for watching. 👊

  • @marcocardicchi1
    @marcocardicchi1 4 years ago

    You guys are by far the best at generating good quality content.....from inside the Cube! 👍🤟 Keep these videos coming!! Thanks

  • @marcosoliveira8731
    @marcosoliveira8731 5 years ago

    Your report is fast as long as your data model is concise.
    Great explanation.

  • @Pri78G
    @Pri78G 2 years ago

    Excellent video. Very well explained. Thanks. I would like to see more videos like these around the topic of BI. Thanks once again.

  • @andr135
    @andr135 8 months ago

    Thanks!

  • @sugartraders
    @sugartraders 4 years ago +1

    Awesome, Patrick, you make this stuff fun - thank you!

  • @Cassiethecat
    @Cassiethecat 5 months ago

    Great lecture! Can I ask where the part 2 video is, please?

  • @KirtC
    @KirtC 3 years ago

    Thanks for your videos. I always enjoy them. I understand star schemas. Been using them for decades. What causes my hang-ups is filter direction in the model. Hopefully you have a video discussing that topic.

  • @AnupamKumar-sc5jx
    @AnupamKumar-sc5jx 4 years ago

    Great video. At 7:30, are we creating lookup tables from the large table in Power BI itself? Is there any other video on how to do it?
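
The video points to a separate walkthrough for that shaping step, but purely as a sketch of the idea: a DAX calculated table can derive a lookup (dimension) table from the distinct values of a few columns in the wide table. Table and column names below are assumptions.

    DimProduct =
    DISTINCT (
        SELECTCOLUMNS (
            'FlatSales',
            "ProductKey", 'FlatSales'[ProductKey],
            "Product Name", 'FlatSales'[ProductName],
            "Category", 'FlatSales'[Category]
        )
    )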

  • @laszlokatai-pal5685
    @laszlokatai-pal5685 4 years ago +9

    "I wrote a video" - that's some badass PowerBi skills, mate :D

  • @kimglick2494
    @kimglick2494 4 years ago +2

    Thanks Patrick! This was extremely helpful and easy to understand!

  • @matthewtupuola6091
    @matthewtupuola6091 3 years ago

    New sub! Awesome vid and down to earth presentation with great humour and knowledge of the topic!

  • @syek3470
    @syek3470 3 years ago

    You are a rockstar. Really informative, great job. Keep it up and going!

  • @petecardona8203
    @petecardona8203 2 years ago

    Always enjoy your videos -thank you 🙏

  • @mchopra1989
    @mchopra1989 3 years ago

    Awesome tips. Where could I see the part 2?

  • @Sarah-bb6nv
    @Sarah-bb6nv 5 years ago +3

    Hello, I'm new to Power BI. I'm used to using Power Pivot in Excel and joining tables through SQLServer to create queries (sometimes many many tables). Occasionally creating relationships to other queries within the back end. I was just curious if you guys had any recommendations on when to join tables through relationships or when to merge tables in Query Editor? Coming from Excel, I thought the best practices were to merge tables with joins, but soon found out that wasn't optimal in many cases. So any tips? Thank you!

  • @StanMoong
    @StanMoong 5 years ago +4

    Thanks for the great tips.
    Currently, I have the following requirement: a dashboard showing monthly comparisons for the whole year, as well as year-to-year comparisons, with the ability to drill down to transactions.
    There are six Excel source reports each month, from different sources. These reports need to be consolidated after complex transformation and additional classifications from user mappings.
    My initial thought was to make it into one giant dataset for user convenience.
    After watching your video, it seems I need to redo this from scratch and build the star schema, with additional mappings. The reason is that there are no standard key fields across the different sources. Imagine you have 3 sales reports for the same products, but each report uses a different product code.
    Would this be the right approach?

  • @jessemacdonough2458
    @jessemacdonough2458 4 years ago

    Wow, very good information in here about optimizing tables. Thanks for the post as always, Patrick!

  • @uncle_eddies_adventures
    @uncle_eddies_adventures 4 years ago +2

    Hey Patrick, I'm not seeing the links to the STAR schema you referred to and I would also be interested to watch the video you did with your daughter on the topic.

    • @GuyInACube
      @GuyInACube  4 years ago +2

      Apologies for that. Here is the link - docs.microsoft.com/power-bi/guidance/star-schema

  • @MrTC-rv3jo
    @MrTC-rv3jo 2 years ago

    Hi Patrick, Great Video! But where is part 2 of this video as the title suggests? Thanks!

  • @AFellowGentleman
    @AFellowGentleman 4 years ago +1

    Break everything out into dimensions and facts and you will avoid ultra-wide tables, and it also lets you re-use a lot of these dimensions.
    Sometimes doing this seems to suck because you have to do a lot of joins on data which feels like it belongs in the same table, but in general you save pain later on by doing this categorization at an early stage. The most important thing is to break out the "conformed dimensions", because these can be re-used again and again and again in your fact tables.

  • @mantistoboggan3384
    @mantistoboggan3384 2 years ago

    Great video! Thank you for the very easy to digest examples

  • @oscarzamorano6901
    @oscarzamorano6901 5 years ago +2

    Awesome advice!! Many thanks Patrick... Hello from Colombia, South America...

    • @GuyInACube
      @GuyInACube  5 years ago +1

      Most welcome! Thanks for watching! all the way in South America 👊🙏

  • @bongrobs
    @bongrobs 1 year ago

    awesome tips patrick, love it!

  • @frankw3101
    @frankw3101 3 years ago

    Hello, I would say that the schema you are showing at 8:30 is not a star schema but a snowflake schema. Normally in a star schema you would only have fact tables "in the middle" and then dim tables connected to the fact table, but no dim table should have other dim tables connected to it. If you have dim tables connected to other dim tables, then this is called a snowflake schema, and this structure will reduce performance because not just one filter but several filters must be applied to get the result. To get an ideal star schema it is then necessary to have redundant data in the dim tables, and that is the biggest difference from a database schema. In a database schema you will try to avoid redundant data by normalising the database tables. But this normalisation is not usable for creating Power BI reports with optimal performance.
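
As a small DAX sketch of the denormalisation being described (AdventureWorks-style table and column names are assumptions), calculated columns on the product dimension can pull the snowflaked attributes inward, so the chained dimension tables can be hidden or removed and the model becomes a plain star:

    -- calculated columns on DimProduct
    Subcategory = RELATED ( DimProductSubcategory[EnglishProductSubcategoryName] )
    Category    = RELATED ( DimProductCategory[EnglishProductCategoryName] )

RELATED walks the chain of many-to-one relationships, so both columns resolve even though DimProductCategory is two hops away from DimProduct.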

  • @robertoajolfi7782
    @robertoajolfi7782 5 years ago

    Great video as usual Patrick! At 6:50 you mentioned a previous video on how to create narrow tables starting from a wide one.
    I haven't been able to find it; can you please provide the url?

    • @GuyInACube
      @GuyInACube  5 years ago +1

      Thanks for watching. Here is the link ruclips.net/video/vjBprojOCzU/видео.html to the video.

    • @robertoajolfi7782
      @robertoajolfi7782 5 years ago

      @@GuyInACube Great! Thanks a lot, a very interesting video, especially from a trainee perspective.
      Sometimes I have a hard time with people who do not understand why we talk about modelling, normalization and relationships.

  • @antique-bs8bb
    @antique-bs8bb 2 years ago

    Did a Part 2 ever emerge?
    Loved this one

  • @phungduong4468
    @phungduong4468 4 years ago

    OMG, I love the way he says "wiiiiide table". Give you a heart :)

  • @anaisalmasi3184
    @anaisalmasi3184 4 years ago +1

    Great channel! And thank you for what you do! I have a question: sometimes when I put a value in my table it shows me just the first value from my database, and I can't choose any form of calculation (sum, average and co...). What can I do to change this? Thank you again.

  • @pipertripp
    @pipertripp 9 months ago

    Also, great channel. Cheers to you and Adam!

  • @peterh7842
    @peterh7842 4 years ago

    Great vid. Question - how do you deal with data that contains different types of markers, please? i.e. ".." for missing data, "..." for suppressed data, "." for data from a different time period?

  • @hc9987
    @hc9987 3 years ago

    Dang...where's the Power Bi 101 video! I just finished a Power BI course and it was great theory and cool to see all the tool can do, but I'm lost at where to begin! I need some basic report building exercises to show me step by step what to do. Any such resources?

  • @sravankumar1767
    @sravankumar1767 3 years ago

    Nice, Patrick. Here multiple fact tables are available. How can we handle the reports and DAX queries?
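
The usual pattern with multiple fact tables is one set of measures per fact table, with the shared (conformed) dimensions doing the slicing; no DAX has to join the fact tables to each other. A minimal sketch, assuming the InternetSales and ResellerSales tables discussed elsewhere in this thread:

    Internet Sales Amount = SUM ( InternetSales[SalesAmount] )
    Reseller Sales Amount = SUM ( ResellerSales[SalesAmount] )
    Total Sales Amount    = [Internet Sales Amount] + [Reseller Sales Amount]

Both base measures can then sit side by side in one visual, sliced by Date, Product or any other dimension related to both facts.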

  • @vicoherndon
    @vicoherndon 4 years ago +1

    Great channel! Cheers from Rio de Janeiro, Brazil!

    • @GuyInACube
      @GuyInACube  4 years ago

      Appreciate that Vinicius! Hope you are doing well down there 👊

  • @Alpacastan21m
    @Alpacastan21m 3 years ago

    Do you guys have a video on how to correctly handle Many to Many?

  • @kristinamelnichenko5775
    @kristinamelnichenko5775 3 years ago

    Thanks for waiting that was a great video

  • @topjimmy44
    @topjimmy44 4 years ago

    Any tips on books or other resources for data modeling? I commonly find times where the data is set up in a way where creating summary tables using Power query would be necessary, or cases where I need to look for instances where a date in one table is less than or greater than a date in another table.

  • @EsdrasMellemberg
    @EsdrasMellemberg 4 years ago

    Very good ... Looking forward to part 2!

  • @EverythingDataWithSravani
    @EverythingDataWithSravani 4 years ago

    Your videos are really helpful. Can you please do a video on using multiple fact tables with different granularities.

  • @gopigadde6752
    @gopigadde6752 3 years ago

    Hi Patrick,
    I need to understand how you are not getting inactive relationships when you are dealing with multiple fact tables and multiple conformed dimension tables. Is there any setting you are changing in the Power Query editor, etc.?

  • @romerofamily2453
    @romerofamily2453 6 months ago

    Hello Guy in a Cube, can you recommend a reputable Power BI expert? I need support integrating Power BI with forms, and unfortunately the videos are a bit fast.

  • @jesslovejoy1533
    @jesslovejoy1533 4 years ago +5

    Awesome!! I love watching your vids instead of going through formal Power BI training because I understand SQL and data analytics very well and want to jump right into using this tool which is new to me. Power BI is incredibly powerful and it's actually easy enough to get started in, but as things get complex it's not always intuitive. So thank you for all the tips, info and best practices! :) I do have a question though, and maybe you have already made a video on this but I'm new to the channel so here I go! Can you talk about the difference between star schemas, Merge with a JOIN, and simply setting up a relationship between 2 tables via the Manage Relationships button (which is maybe the same thing as star schema but I didn't realize that?) ? And when it is best to use which? I find myself wanting to use Merge b/c it's the most like writing in SQL, but I want to use more tools within Power BI, and let it do the work for me!

    • @eagillum
      @eagillum 4 years ago

      Yes, I also want more relationship/join/star schema videos. But even more beginner than you've done.

  • @chanchalarya983
    @chanchalarya983 2 years ago

    How can I work with unstructured data? What process should I use to turn it into structured data?

  • @Jack2of3
    @Jack2of3 5 years ago +1

    The issue I would like solved, and you mentioned, is the memory usage. PBI works great with Excel, obviously, because Excel dataset size is limited. Hook it up to a database with hundreds of millions of rows, regardless of how well modeled, partitioned and indexed, and it just goes to sleep, as in task manager, end process, sleep. I've cancelled the table analysis, joined the tables, published, and the service can't handle it either. I can't decrease the data size any more than I already have. We get millions of transactions every day. The fact table has 10 columns: 2 dates, 4 numbers and four joinable dimension keys. The dimensions are created from their own tables. Everything that can be turned off in options is. Any thoughts?

    • @GuyInACube
      @GuyInACube  5 years ago +1

      Working with large data can be problematic. That's where features such as incremental refresh and Aggregations really become powerful and I've seen data models with billions of rows work using those items. The struggle is real though.
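
For anyone wanting to experiment with the manual flavour of that idea, a minimal DAX sketch of a pre-aggregated table is below. This is not the built-in Aggregations feature (which is configured on tables, typically over DirectQuery sources); it is just a hidden summary table that measures can target when the detail grain isn't needed. Table and column names are assumptions.

    Sales Agg =
    SUMMARIZECOLUMNS (
        'Date'[Date],
        'Product'[ProductKey],
        "Sales Amount", SUM ( Sales[SalesAmount] )
    )

A measure such as Total Sales = SUM ( 'Sales Agg'[Sales Amount] ) can then serve most visuals, with the detail fact table reserved for drill-through.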

  • @pipertripp
    @pipertripp 9 months ago

    So basically, normalise your data if you have redundant data in tables to avoid duplication and the hideous many-to-many relationships and pivot wide tables to long format where that makes sense (normalisation will help with the width as well)?

  • @mohamadsaifjamadar1912
    @mohamadsaifjamadar1912 3 years ago +1

    Thanks Patrick! Awesome advice. I am a great fan of both of you guys. Just a question: is it good practice to use SQL DB views instead of DB tables?

  • @mutrax
    @mutrax 4 years ago

    Hi Patrick, Do you know if there's a way to export the data model to Visio Pro ? I work a lot with the modeling tool but I find that's very limited. I'd like to customize the background color, assign different colors to different objects, etc. Thanks !!

  • @karihosny9420
    @karihosny9420 2 years ago

    How many columns would a table need to have before you would consider it a wide table?

  • @afsanarabeya4417
    @afsanarabeya4417 2 years ago

    How can I narrow my table, which has a column with comma-separated string values (almost 20 of them)? And I really have to work with those values.
    Really need the help. TIA

  • @CrazySw3de
    @CrazySw3de 1 year ago

    Would love to see more videos like this that I can share with my team.
    In particular I've been running into issues where we have a general policy that all transformations/data modeling are to be done on the back end, with I think the intention being that the logic can be accessed by other sources if needed and not just locked inside Power BI.
    At the same time though, it just seems like this approach cripples what Power BI is capable of and leads to so many other issues. Especially when I see things like essentially multiple flat-file like views being pulled into a model, with relationships just being created in a really ambiguous way that has caused all kinds of problems that I inevitably end up needing to fix by creating a dimensional model.
    Would love to see a video going more in depth on concepts like when it is right to do a calculated column in SQL vs. having a measure in DAX, or elaborating on why it's better to have a flexible dimensional model instead of applying all your filtering logic etc. to a view and then using that to drive a single report page.
    I've created really clean, robust dimensional models in the past so this sort of thing is infuriating to see, but at the same time I can't seem to break this misconception with people that creating a dimensional model takes too long compared to making a SQL view, or that having a pre-joined view in SQL isn't really a good replacement for a solid data model.
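
On the calculated-column-in-SQL vs. measure-in-DAX point raised above, one rough rule of thumb: attributes that are fixed per row (categories, flags, keys) can live as columns in the source view, while anything that must react to slicers and filter context belongs in a measure. A minimal sketch, with hypothetical table and column names:

    Gross Margin % =
    DIVIDE (
        SUM ( FactSales[SalesAmount] ) - SUM ( FactSales[Cost] ),
        SUM ( FactSales[SalesAmount] )
    )

Pre-computing this ratio as a column (in SQL or DAX) would bake in one grain and stop it re-aggregating correctly when the user filters by date, product or region.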

  • @AweshBhornya-ExcelforNewbies
    @AweshBhornya-ExcelforNewbies 5 years ago +1

    That was a quick and effective video on data modelling. I need help modelling the data for one of my clients; maybe this will be of some help. Can I connect with you directly somehow if I need some help with the data?

    • @GuyInACube
      @GuyInACube  5 years ago

      Great to hear Awesh! I'd recommend engaging community.powerbi.com for longer form questions. Tons of folks there that can assist.

  • @SolutionsAbroad
    @SolutionsAbroad 4 years ago +1

    Great video Patrick, as always. Is it more performant to have a relationship, or to merge the columns in Power Query?

  • @sureshramadurgam3730
    @sureshramadurgam3730 4 years ago

    Hey Patrick, I like your videos; they are really helpful for a newbie like me. I have a question: can I use SQL tables directly in Power BI, or do I need to create views, SQL queries or SSAS cubes first? What happens if I use SQL tables directly in Power BI in import mode? The SQL tables are relatively small.

  • @mjustesen
    @mjustesen 5 years ago

    Awesome video! I have a question for your model though - as it's shown at 08:35.
    You have a Geography table that's related to your Customer table, so you can analyze InternetSales by customer geography. Let's say you would like to use the same Geography table to connect to SalesTerritory, in order to see reseller sales by postcode or state (which don't exist in SalesTerritory).
    How would you go about that, as you cannot create a relationship directly between SalesTerritory and Geography (as it would introduce ambiguous relationships to the model).
    Is the solution to just duplicate the Geography table and connect it directly to ResellerSales..?
    Once again thanks for the great content, please keep it up :)

    • @GuyInACube
      @GuyInACube  5 years ago

      Great question and even better observation. Honestly, in that case I would need to reshape the model. Since my Geography table would be on the many side of the relationship, I would need to move it between ResellerSales and SalesTerritory. This is to ensure that all directions point to ResellerSales from the outermost dimension in the snowflake. With this change, I could reuse the same Geography table and avoid any ambiguous relationships. This is all theoretical since I don't have the data to validate it.

  • @DarkOri3nt
    @DarkOri3nt 5 years ago

    Imagine there is no DW, but numerous summarised tables built with numerous dimension tables in a relational data model. Would the best way to tackle this be building that denormalised table structure into a star schema model? Our company has numerous models with different levels of summarisation/aggregation in the same model. Would you create the lowest level of data and summarise using measures, aggregating to different levels through the dimension relationships?

  • @00EagleEye00
    @00EagleEye00 3 years ago

    Good day sir.
    I would like to ask how to relate a dimension with a hierarchical design (recursion) to a fact table.
    Kindly set an example and provide explanation if possible.
    Thank you.

  • @pranayreddy7303
    @pranayreddy7303 3 years ago

    Hey Patrick, please do a video on how to avoid bidirectional relationships and many-to-many relationships.

  • @jourdango2615
    @jourdango2615 4 years ago

    I have an important question! What is the difference between data modeling (like star schema) and database design/database schema? How do we design a database so that it flows nicely into a data model? Can't the database schema already just be a star schema so that they are one and the same? But how can we expect users to update data in star schema form when everything is so fragmented and it uses integer IDs everywhere?
    Thanks!!!!

  • @sagnikmukherjee6313
    @sagnikmukherjee6313 4 years ago +5

    Hello Patrick - Thanks for the videos. Learnt a lot through your videos and your unique teaching style :) I have a situation and I would like your input on it. I have a fact table with 3 different dates: Order Date, Shipped Date and Delivered Date. I have a custom Date dimension table that I created in my data model using the CALENDARAUTO() DAX function. Now I need to use that Date dimension table as a role-playing dimension, because for some queries I need to leverage the join on Order Date, for some on Shipped Date and for some on Delivered Date. However, while building the relationships I can only have one active relationship between the fact table and the Date dimension table. I can think of 2 different possibilities based on the knowledge I have gained.
    Number 1: I know that I have the option to create 3 versions of each measure using the USERELATIONSHIP DAX function to force which relationship is used while calculating the measure. But as I mentioned, that would mean I need to create 3 versions of each measure, e.g. Sales by Order Date, Sales by Shipped Date, Sales by Delivered Date, etc.
    Number 2: Otherwise I think I have to have 3 separate Date Dimension Tables like Order Date Dimension, Shipped Date Dimension and Delivered Date Dimension.
    Can you please suggest what should be the right and appropriate way to solve this model design situation? Looking forward to your help.
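
A minimal DAX sketch of option 1 above (USERELATIONSHIP over the inactive relationships). Column names such as FactSales[ShippedDate] and 'Date'[Date] are assumptions; the base measure uses the active Order Date relationship.

    Sales Amount = SUM ( FactSales[SalesAmount] )

    Sales by Shipped Date =
    CALCULATE (
        [Sales Amount],
        USERELATIONSHIP ( FactSales[ShippedDate], 'Date'[Date] )
    )

    Sales by Delivered Date =
    CALCULATE (
        [Sales Amount],
        USERELATIONSHIP ( FactSales[DeliveredDate], 'Date'[Date] )
    )

Option 2 (separate Order/Shipped/Delivered date tables) avoids the extra measure variants at the cost of more tables and slicers; which trade-off fits depends on how often users need to switch the date being analysed.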

  • @tldrwithabiramisukumaran1334
    @tldrwithabiramisukumaran1334 4 years ago

    Nice video. The demonstration of the data model seems to me like it's a snowflake. I know snowflake is a variant of the star schema, but it varies at one critical point: a star schema is allowed to be denormalized (which is the opposite of what you are recommending in terms of narrowing the table), as it has to contain the attributes of the dimension, whereas snowflake is strictly normalized. Could you please clarify how your example defies the variation and conforms to a star schema?

  • @mranimal2711
    @mranimal2711 5 years ago

    Nice video Patrick! What about a wide fact table? I have a wide consolidated fact table (60 measure columns that are derived from 5 detail fact tables, each covering a business process - Sales, Marketing, etc.) that is rolled up w.r.t. the conformed dimensions from the detail fact tables. This wide consolidated fact table is a physical table (with 60 measure columns). Is this better than creating the 60 measures in DAX on top of the detail fact tables?

    • @GuyInACube
      @GuyInACube  5 years ago +1

      There are a lot of factors that go into that. Assuming these are integer columns and not strings as you referred to them as measures. Depends how many rows in the table, etc. Best bet is to determine what your performance baseline is and then if you can optimize that at all. If the baseline is pretty fast, you may not need to do anything. Also, if you are using all the columns then you need them. If you aren't using any of them, then can we get rid of them?

  • @WyneeKariuki
    @WyneeKariuki 5 years ago

    I have a question not related to the content in this tutorial. The "fill map" visuals available in the Power BI marketplace do not have a "conditional format" feature that allows users to define "data colors". This function was previously available in one of the earlier versions of Power BI, but when I go to the "Data color" options within the fill map visual there is no longer a provision for "conditional formatting". The choropleths were also able to automatically adjust the color schemes using filters in that version of Power BI. Is it possible for you to demonstrate how one could utilize the currently available choropleths to achieve the same (conditional formatting of data color for the fill map visual, which can respond to filters)? It would also be wonderful to see how to upload and use shape files (.shp).

  • @rowanschoultz1022
    @rowanschoultz1022 3 years ago

    Hi Patrick, my RMS creates a 24-digit primary key. Power BI displays these in scientific notation, which, of course, creates a ton of duplicates. We have tried to change the data type without success. We are forced to make the data type Text, which of course is a pig when it comes to compression. Is there a way we can have Power BI display the full 24-digit primary key?

  • @gkool4655
    @gkool4655 3 years ago

    You guys' channel is literally saving thousands of people from losing their jobs.

  • @TRINI123A
    @TRINI123A 4 years ago

    Nice videos. Question that is driving me nuts: When I unselect all options in a filter in Power BI, the visualization shows as if all options were selected. It should logically show nothing. Thoughts?

  • @SamuelRoshan
    @SamuelRoshan 5 years ago

    @Patrick I have a scenario where I've got 3 million distinct products linked to a Sales table. However, only about 1 million products have been sold so far. I really don't need the unsold products to be loaded into my model. Is there a way Power BI can delete unused records in a relationship? I could use a subquery on my Product table to check whether a product exists in Sales, but I feel it will get expensive as my Sales data increases. Any suggestions?
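
A sketch of one way to trim the dimension inside the model with a DAX calculated table (table and column names are assumptions). In practice it is usually better to push this filter into Power Query or the source query so the unused rows never get loaded at all, since a calculated table does not remove the original Product table from memory.

    DimProductSold =
    FILTER (
        DimProduct,
        DimProduct[ProductKey] IN VALUES ( Sales[ProductKey] )
    )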

  • @lucernec3101
    @lucernec3101 2 years ago

    Can you tell me if we can do ad hoc analytics in Power BI?

  • @RaphaelSantosNet
    @RaphaelSantosNet 5 years ago

    Great video Patrick. So, I do have a question though: is creating conditional columns in the query editor better than creating custom columns in the data model? Does that make a real difference in terms of performance?
    I have a few reports in which I used to create custom columns, but I am seriously thinking of recreating them directly in the query editor.
    Thanks for all you do!

    • @lokeshramaraj2800
      @lokeshramaraj2800 5 years ago

      A conditional column in the query editor would help. Bonus if query folding is enabled.

  • @onurturna404
    @onurturna404 4 years ago

    Hi,
    I am working at a corporate company. We are using the Power BI on-premises version. I want to build a data model to use in my dashboards. Which tools do I need to build a data model in the on-premises version?
    Thanks

  • @patriciocahue3771
    @patriciocahue3771 5 years ago

    Yooooo! wassup! If you have SSAS capabilities, one should use this option for creating the model and then connecting to that source from Power BI. I find this to be a more efficient way to work with data coming from disparate sources.
    When things get too complex, it is always good to return to the basics. Good simple video with powerful information.

  • @joelatino3748
    @joelatino3748 4 years ago

    Great video, thanks for putting this together.

  • @EffnShaShinko
    @EffnShaShinko 5 years ago

    This is pure gold.

  • @johngay1981
    @johngay1981 3 years ago

    Good content Patrick.

  • @tucavaleu
    @tucavaleu 5 years ago

    Hey, Patrick! First of all, congrats on your tips. So... could you post here the link where you talk about "Avoid many-to-many relationships"? I didn't find it. Thanks, bro

  • @ThePalmefamily
    @ThePalmefamily 4 years ago

    Do you have any tutorials with follow along for beginners?

  • @sid0000009
    @sid0000009 5 years ago

    Hey, I need some help. Is there a way we can 1) create calculated columns based on 2 different tables, and 2) create a data preview from the model I create in Visual Studio? I would eventually use a LIVE connection due to the data volume in Power BI, so the modelling part needs to be finalized in Analysis Services only, as we hit limitations in Power BI. Thank you!

  • @lisbongraffiti242
    @lisbongraffiti242 1 year ago

    Hi everyone!
    Maybe somebody here could help me with this issue:
    I have a FactTable with CustomerID and ProductID as ForeignKeys.
    The CustomerID also has some columns with information about territory, such as City, Region and Country.
    I tried to combine those columns with the fact table, create the new Territory table related to those columns and then delete the columns from the fact table and the Products table with Power Query, but if I delete any of those columns, all their fields become null.
    As the columns should not repeat in a star schema, is there any way that I can separate those columns into a different dimension table and add this new Territory dimension, with its specific Territory key inserted into the original fact table?
    Thank you very much!
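
A DAX sketch of one way to do that split inside the model (table and column names are assumptions): build the Territory dimension from the distinct City/Region/Country combinations, add the same composite key as a calculated column on the fact table, and relate the two on that key. Doing the same thing in Power Query (reference the fact query, keep the three columns, remove duplicates, merge the key back) keeps the fact table genuinely narrower, since in DAX the original columns still have to stay in the model to build the key.

    DimTerritory =
    ADDCOLUMNS (
        DISTINCT (
            SELECTCOLUMNS (
                FactTable,
                "City", FactTable[City],
                "Region", FactTable[Region],
                "Country", FactTable[Country]
            )
        ),
        "TerritoryKey", COMBINEVALUES ( "|", [City], [Region], [Country] )
    )

    -- calculated column on the fact table, used as the relationship key
    TerritoryKey = COMBINEVALUES ( "|", FactTable[City], FactTable[Region], FactTable[Country] )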

  • @powerbinoob8535
    @powerbinoob8535 2 years ago

    How did you create the multiple tables?

  • @ankurgupta842
    @ankurgupta842 5 years ago

    Hello Patrick/Adam, thanks for all the amazing videos.
    I have published a table visual on a DirectQuery model. Can you please suggest any way to download a result set of around 1 million records from the Power BI service? There is a limitation of 150K records or a 16MB file.

    • @GuyInACube
      @GuyInACube  5 years ago +1

      There currently isn't a way. However, we are working on a workaround and a video will be released very soon.

    • @ankurgupta842
      @ankurgupta842 5 years ago

      @@GuyInACube Thank you for the reply. Looking forward to that workaround video.

  • @sauravsinha6939
    @sauravsinha6939 3 years ago

    Why many-to-many? Both have the same key, so one can be the primary key and the other the foreign key?

  • @karlaksixtos6294
    @karlaksixtos6294 4 years ago

    Is it possible to add an already existing pivot table to a data model? If not, do you have any other options? Thanks!

    • @GuyInACube
      @GuyInACube  4 years ago +1

      You could get the data from the Excel sheet. It would just be a static pull and you would lose the functionality of the PivotTable.

  • @charliehj6064
    @charliehj6064 3 months ago

    Is there a part 2?