How Real-Time Computer Graphics and Rasterization Work

  • Published: 19 Jun 2024
  • Patreon: / floatymonkey
    Discord: floatymonkey.com/discord
    Instagram: / laurooyen
    #math #computergraphics

Comments • 84

  • @FloatyMonkey
    @FloatyMonkey  4 years ago +61

    Should I make an in-depth video about the math of the rasterizer?
    Otherwise the next video will be about matrices.

    • @CosmicComputer
      @CosmicComputer 4 years ago +6

      FloatyMonkey, either would be fascinating, so do whichever interests you more; we will show up for both!

    • @paulhax
      @paulhax 3 years ago +7

      Math of the rasterizer! It's hard to find "advanced" 3D graphics math clearly explained on YouTube. Good stuff here!

    • @saeedmahmoodi7211
      @saeedmahmoodi7211 3 years ago +1

      yes please

    • @anirudh351
      @anirudh351 1 year ago

      yes

    • @vignarajahkm4462
      @vignarajahkm4462 2 months ago

      Yea.

  • @brohogany9920
    @brohogany9920 3 years ago +37

    "But who will ever read this code anyway" lol

  • @Saas_1
    @Saas_1 3 years ago +24

    I watched so many videos about this topic, but this one just nailed it! Finally I got a deeper understanding of how a rasterizer works! Thank you!

  • @Lavamar
    @Lavamar 2 years ago +24

    I have not heard anyone explain computer science topics better than you. Hope you keep making videos!

    • @FloatyMonkey
      @FloatyMonkey  2 years ago +4

      Hi, thanks! I haven't uploaded in a while but if you want to know what I'm up to, you might want to visit the Discord ;)

  • @Mike360infinity
    @Mike360infinity 2 years ago +1

    Can't believe the quality of your videos! You make complex stuff easier to understand, really appreciate your effort :)

  • @roxferesr
    @roxferesr 4 years ago +11

    I have just discovered your channel and love your content! Keep it up!

  • @paulamorillasalonso1927
    @paulamorillasalonso1927 2 months ago

    Thank you so much! I feel like I finally understand the process behind rendering. Keep up the good work 😄😄😄

  • @umaradam2260
    @umaradam2260 2 years ago

    I've watched so many videos to get an understanding of the pipeline but none come close to this one. Thanks a lot!!

  • @alirezaakhavi9943
    @alirezaakhavi9943 1 year ago

    Such a comprehensive and interesting video, thank you so much for sharing! :) Subbed

  • @EvgenyMeshkov
    @EvgenyMeshkov 3 years ago

    This was very helpful and informative! Thank you))

  • @ichbinbilal
    @ichbinbilal 1 year ago

    I just found your video because I am working with OpenGL and I really like it! Wish there was more in-depth content about the different shaders and everything. Nice work!

  • @berthold64
    @berthold64 3 years ago

    Just subbed. Great content!

  • @elturco9573
    @elturco9573 3 years ago +1

    Good channel. You deserve more subs. Subscribed

  • @fatihyavuz505
    @fatihyavuz505 1 year ago

    This is a very clean tutorial, easy to understand. Thanks

  • @ClutchGen
    @ClutchGen 3 years ago

    Thank you for this!!

  • @Krblshna
    @Krblshna 1 year ago

    Nice work!

  • @germanhoyos4422
    @germanhoyos4422 1 month ago

    instant sub. well done

  • @yousefsayed6380
    @yousefsayed6380 10 months ago

    this is the best explanation of the graphics pipeline I've ever seen

  • @xanderlinhares
    @xanderlinhares 3 years ago +6

    This is a top notch production! Well done, my students will appreciate this.

    • @FloatyMonkey
      @FloatyMonkey  3 years ago +2

      Thanks! What course do you teach if I may ask?

    • @xanderlinhares
      @xanderlinhares 3 years ago +2

      @@FloatyMonkey I teach Intro to Computer Graphics and Graphics Programming. How do you make your videos? If you had a course on that I would gladly pay for it.

    • @FloatyMonkey
      @FloatyMonkey  3 years ago +2

      The visualizations are made in PowerPoint. It sounds silly, but the idea behind it is that you can use them in front of a live audience. I do not have a course on it but all the PowerPoints are available to Patrons for only €8/month. Unfortunately my Patreon is subscription based but you can always download the files and cancel the subscription immediately afterwards.

    • @xanderlinhares
      @xanderlinhares 3 years ago +4

      @@FloatyMonkey I’m happy to contribute $8 and maybe some BAT if you take that. I really appreciate you taking the time to thoughtfully respond. Your PowerPoint-Fu is strong.

    • @FloatyMonkey
      @FloatyMonkey  3 years ago +2

      Haha thanks, and no problem. I love to explain and engage with my audience in what I'm passionate about. Just becoming (my first) Patron would already mean a lot, especially since you're a teacher/professor. Knowing my videos could be useful to students is absolutely delightful.

  • @elyas6395
    @elyas6395 3 years ago +1

    I won't believe your C++ code anymore hahaha, dividing the normal by the ID XD. I love your content.

  • @gutzimmumdo4910
    @gutzimmumdo4910 2 years ago +1

    damn brah, you are quite versed in CG, I'm glad I found your channel.

  • @chicapercebe
    @chicapercebe 3 years ago +2

    Thank you! Also thanks for speaking English so clearly, I could understand everything xD

  • @8BitsdeNostalgia
    @8BitsdeNostalgia 2 years ago +1

    So well explained! I've just started a youtube channel about retro technologies and systems, for Brazilians, and I would love to know which software you used to animate those arrows ;). Thx

    • @FloatyMonkey
      @FloatyMonkey  2 years ago +1

      Thanks. It's all made with PowerPoint.

  • @Mrchingchingdingding
    @Mrchingchingdingding 3 months ago

    Do you tutor? I like how detailed you are but there were some areas that were vague for me, like the fact that the output vectors of a vertex shader on some input vertex would fall within the range -1

  • @enthusiasticsimple901
    @enthusiasticsimple901 2 years ago +1

    Amazing tutorial for beginners, much better explanation than my college professor's😅

  • @shahroozleon9098
    @shahroozleon9098 2 years ago

    Dude, why are you inactive on YouTube? You really explain things very well. Thank you for this great video

    • @FloatyMonkey
      @FloatyMonkey  2 years ago

      Thanks. The inactivity is just due to a lack of time. By this summer, I hope to make some videos about my optical motion capture system, a project I've been working on since October 2020. I decided to post updates about it on the Discord, go have a look if you're interested.

  • @nathanbanks2354
    @nathanbanks2354 1 year ago +1

    I've been playing with particle shaders, and didn't quite understand the colour output until I read the comment at 4:40

  • @Jesse-xq7jb
    @Jesse-xq7jb 2 years ago

    Where did you go man!? This channel has the juice

    • @FloatyMonkey
      @FloatyMonkey  2 years ago

      Thanks. Between a full-time job, writing a game engine, and building a DIY light stage and motion capture system, I haven't had much time to make videos this past year, although I now have a lot of interesting topics to cover. I definitely want to get back into it, hopefully by the end of this year ...

  • @EclecticVibe
    @EclecticVibe 3 years ago +1

    Hello FloatyMonkey, your channel is really great and I found the videos very informative. Thank you. I have a question about tessellation. Why is tessellation required here, since the vertex shader already gets data that are vertices of triangles?

    • @FloatyMonkey
      @FloatyMonkey  3 years ago +5

      That's a great question. The advantage of tessellation is that it runs entirely on the GPU. This means you don't have to transfer the extra vertices from the CPU to the GPU over their relatively slow PCIe connection. On top of that, there's less GPU memory required to store the model since the result of the tessellation is immediately discarded after a specific model has been drawn. It obviously introduces some computational overhead but the advantages far outweigh the disadvantages. These days, however, we can migrate away from the tessellation and geometry shaders with a new shader stage called Mesh Shaders, something I will probably talk about in the future.
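
      For reference, a minimal sketch of what a GPU-side tessellation stage can look like in HLSL: a hull shader plus a domain shader with hypothetical names and a fixed tessellation factor (not code from the video).

        struct ControlPoint
        {
            float3 positionWS : POSITION;   // world-space corner of the original triangle
        };

        struct PatchConstants
        {
            float edges[3] : SV_TessFactor;        // how finely each edge is subdivided
            float inside   : SV_InsideTessFactor;  // how finely the interior is subdivided
        };

        PatchConstants PatchConstantsFunc(InputPatch<ControlPoint, 3> patch)
        {
            PatchConstants pc;
            pc.edges[0] = 4.0;  // fixed factors for brevity; a real shader would
            pc.edges[1] = 4.0;  // pick them per patch, e.g. based on camera distance
            pc.edges[2] = 4.0;
            pc.inside   = 4.0;
            return pc;
        }

        [domain("tri")]
        [partitioning("fractional_odd")]
        [outputtopology("triangle_cw")]
        [outputcontrolpoints(3)]
        [patchconstantfunc("PatchConstantsFunc")]
        ControlPoint HullMain(InputPatch<ControlPoint, 3> patch, uint i : SV_OutputControlPointID)
        {
            return patch[i];  // pass the original corners through unchanged
        }

        [domain("tri")]
        float4 DomainMain(PatchConstants pc, float3 bary : SV_DomainLocation,
                          const OutputPatch<ControlPoint, 3> patch) : SV_Position
        {
            // Every generated vertex is a barycentric blend of the original corners.
            float3 p = bary.x * patch[0].positionWS
                     + bary.y * patch[1].positionWS
                     + bary.z * patch[2].positionWS;
            // A real shader would displace p here (e.g. with a height map) and
            // multiply by the view-projection matrix before returning it.
            return float4(p, 1.0);
        }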

    • @EclecticVibe
      @EclecticVibe 3 years ago +2

      @@FloatyMonkey Thank you for a great explanation. In another of your videos, titled "how triangles make up 3D models", you talk about 'Triangulation'. How is triangulation different from tessellation?

    • @FloatyMonkey
      @FloatyMonkey  3 years ago +6

      Triangulation is the process of taking an arbitrary polygon (quad, octagon, ...) and turning it into the minimal number of triangles that represents that polygon; this is required since GPUs prefer to work with triangles only. Tessellation is the process of taking a single triangle and dividing it into a number of smaller triangles which can be displaced in 3D space to add extra geometric detail.
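
      As a concrete illustration (a hypothetical HLSL helper, not code from the video): fan triangulation of a convex polygon with n vertices produces n - 2 triangles that all share vertex 0.

        // Triangle i of the fan uses vertex indices (0, i + 1, i + 2).
        uint3 FanTriangle(uint i)
        {
            return uint3(0, i + 1, i + 2);
        }

        // Example: a quad with vertices 0, 1, 2, 3 yields
        // FanTriangle(0) = (0, 1, 2) and FanTriangle(1) = (0, 2, 3),
        // i.e. two triangles, the minimum needed to cover a quad.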

  • @iparadoxg
    @iparadoxg 4 years ago

    I can't figure out how the output merger works, can you please give a visual example of how it works?

  • @hyjarion6972
    @hyjarion6972 3 years ago

    Hello! It's a great video and I have a question. Where are all the advanced lighting techniques processed? Is that in the pixel shader too?
    Thank you

    • @FloatyMonkey
      @FloatyMonkey  3 years ago

      Usually in the pixel shader. When using a deferred pipeline, however, it's also possible to run them in a compute shader.
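
      For illustration, a minimal sketch of "lighting in the pixel shader" in HLSL: a single directional light with Lambert diffuse only. The names and constant-buffer layout are assumptions, not the video's code.

        cbuffer Light : register(b0)
        {
            float3 lightDirection;  // unit vector pointing towards the light
            float3 lightColor;
        };

        struct PixelInput
        {
            float4 position : SV_Position;
            float3 normal   : NORMAL;
            float3 albedo   : COLOR;
        };

        float4 PixelMain(PixelInput input) : SV_Target
        {
            // Lambert: surfaces facing the light are lit, surfaces facing away are dark.
            float ndotl = saturate(dot(normalize(input.normal), lightDirection));
            return float4(input.albedo * lightColor * ndotl, 1.0);
        }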

    • @hyjarion6972
      @hyjarion6972 3 years ago +1

      @@FloatyMonkey Thank you for your answer!

  • @zozeme9218
    @zozeme9218 2 years ago +1

    I read your code, and it's a division by ID to get the color attributes 😉

  • @dustinthompson8600
    @dustinthompson8600 3 years ago

    I have a couple questions, hopefully it's clear without visualization here:
    1. Does the rasterizer produce multiple fragments/pixels with the same position? For example, where the blue and red triangles overlap, would there be a fragment with the blue triangle attributes and one with the red triangle attributes in the same overlapping position? i.e. {position = (2, 3), color = blue, depth = 0.1}, {position = (2, 3), color = red, depth = 0.5}
    2. If there are fragments with duplicate positions, does this mean that the fragment/pixel shader will run on the same fragment position multiple times even though in the following stage (output merger) the result of the red fragment being shaded will be thrown away (because it's in the back)?
    Thank you

    • @FloatyMonkey
      @FloatyMonkey  3 years ago +2

      Great questions. The pixel (fragment) shader can run multiple times for a single pixel, obviously each time calculating the color of a different primitive. These colors either get blended or discarded by the output merger. The latter isn't very efficient in the case of overlapping opaque geometry since a lot of work gets thrown away. That's one of the reasons why we use a depth buffer, which can be used to determine whether a pixel shader invocation will get discarded before it actually runs. A final method to reduce overdraw (the name for this phenomenon) is to render geometry from front to back. This makes sure foreground geometry writes to the depth buffer first so that geometry behind it gets discarded before rendering. Hope this makes sense ;)
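
      A rough sketch of the depth test described here. The test itself is fixed-function hardware in the output merger (shown as pseudocode in the comments), and the [earlydepthstencil] attribute is how HLSL can force it to run before the pixel shader; the struct and function names are assumptions.

        // Conceptually, for every fragment the output merger does:
        //
        //   if (fragment.depth < depthBuffer[x][y])   // closer than anything drawn so far?
        //   {
        //       depthBuffer[x][y] = fragment.depth;   // remember the new closest depth
        //       colorBuffer[x][y] = fragment.color;   // keep this fragment's color
        //   }
        //   // otherwise the fragment is discarded; that wasted work is the overdraw

        struct PixelInput
        {
            float4 position : SV_Position;
            float3 color    : COLOR;
        };

        // Forces the depth/stencil test to run *before* this shader, so occluded
        // fragments never execute it (drivers often do this automatically when safe).
        [earlydepthstencil]
        float4 PixelMain(PixelInput input) : SV_Target
        {
            return float4(input.color, 1.0);
        }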

  • @papercolor8259
    @papercolor8259 1 year ago

    How do you make the video? PowerPoint animation?

  • @andrewkardenetz259
    @andrewkardenetz259 1 year ago

    subscribed, if not for dark theme videos alone

  • @PortalGenerator
    @PortalGenerator 4 years ago +6

    4:43 I saw what you did ;)

    • @FloatyMonkey
      @FloatyMonkey  4 years ago +1

      Lol, guess someone did read my code

  • @pat917
    @pat917 2 years ago

    Probably something I missed, but where does the perspective projection take place? Wherever you mentioned the vertices, they were described with x, y, z, not x, y

    • @FloatyMonkey
      @FloatyMonkey  2 years ago

      It happens in the vertex shader (shown at 4:40 on the first line of the main function). We actually do pass the entire result of Vertex * ViewProjectionMatrix to the GPU. In that case x, y describe the vertex position in normalized coordinates in the range (-1, 1) and z describes the depth in the range (0, 1). This depth value is still needed to determine if the shaded fragment is visible.
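
      A minimal sketch of that projection step in HLSL (matrix, buffer, and semantic names are assumptions, not the video's exact code):

        cbuffer Camera : register(b0)
        {
            float4x4 viewProjectionMatrix;
        };

        struct VertexInput  { float3 position : POSITION; };
        struct VertexOutput { float4 position : SV_Position; };

        VertexOutput VertexMain(VertexInput input)
        {
            VertexOutput output;
            // Clip-space position: after the GPU divides by w, x and y land in (-1, 1)
            // and z (the depth used later by the depth test) lands in (0, 1).
            output.position = mul(float4(input.position, 1.0), viewProjectionMatrix);
            return output;
        }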

    • @pat917
      @pat917 2 years ago

      @@FloatyMonkey Awesome thanks. I thought so but wasn’t sure. Great video

  • @malharjajoo7393
    @malharjajoo7393 2 years ago

    More documentation can be seen in MSFT's Direct3D pipeline: docs.microsoft.com/en-us/windows/win32/direct3d11/overviews-direct3d-11-graphics-pipeline

  • @OZTVjjang
    @OZTVjjang 1 year ago

    I love your video! I want to use some screenshots in my lecture. Of course I will cite the source. Can I?

    • @FloatyMonkey
      @FloatyMonkey  1 year ago +1

      Sure, no problem. Out of curiosity what kind of lecture?

    • @OZTVjjang
      @OZTVjjang 1 year ago

      @@FloatyMonkey Thanks a lot! I am creating various lecture content. First of all, I will conduct a webinar on shader development. ruclips.net/video/r_eatgPFQYg/видео.html

    • @FloatyMonkey
      @FloatyMonkey  1 year ago

      I don't speak Korean but that looked great! Going on the visuals, you managed to touch upon all the preliminaries of shader development in a very short time.

  • @prashantdhawase8057
    @prashantdhawase8057 2 years ago

    I was reading that code, why did u divide the normal by the vertex ID 🤣🤣🤣🤣

  • @liangqixuan9001
    @liangqixuan9001 2 years ago

    What kind of code is the code in the video??

    • @FloatyMonkey
      @FloatyMonkey  2 years ago

      It's HLSL shader code which can run on a GPU.
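
      For anyone curious what HLSL looks like, a deliberately tiny vertex + pixel shader pair (not the video's code) that draws every covered pixel red:

        float4 VertexMain(float3 position : POSITION) : SV_Position
        {
            return float4(position, 1.0);  // pass the vertex position straight through
        }

        float4 PixelMain() : SV_Target
        {
            return float4(1.0, 0.0, 0.0, 1.0);  // solid red for every covered pixel
        }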

  • @roeelazar7438
    @roeelazar7438 2 years ago +1

    What an accent! Where are you from? I can't figure it out.

    • @FloatyMonkey
      @FloatyMonkey  2 years ago +2

      I'm from Belgium, my mother tongue is Dutch.

  • @akeylawhite9217
    @akeylawhite9217 10 months ago

    I read the code, that's how you learn

  • @npip99
    @npip99 7 months ago +1

    7:37 That's not really a sufficient explanation. Generating the locations of the vertices of your blender mesh is also not possible on a graphics card, but the CPU sets it up for the GPU; just because the GPU can't do something doesn't explain why it's not done. So, why doesn't the CPU just sort the triangles? The real one-sentence explanation is "Unfortunately, we can't solve this problem by changing the order in which we draw the triangles, *since it's way too expensive to sort hundreds of thousands of triangles, every single frame*. The solution to our problem...".
    As a secondary issue, if you have a triplet of long rectangles where A covers B, B covers C, and C covers A, that triplet can't be resolved without a Z-buffer either. But that's a rare case and generally not the "real" reason, it's just too expensive to compare overlapping triangles and re-sort them all every single frame.

  • @PixelsLaboratory
    @PixelsLaboratory 1 year ago

    Amazing explanation!!! ♥ But... 09:59... what?! 😅

    • @FloatyMonkey
      @FloatyMonkey  1 year ago

      Lol, I just rewatched that part, should have made it a bit clearer 😅.

    • @SixStringOverdose
      @SixStringOverdose 1 year ago

      @@FloatyMonkey the point was about the "pink" color vs. the actual color you used 😁 loved the video anyway!

    • @FloatyMonkey
      @FloatyMonkey  1 year ago +1

      I'm colorblind, for me it's pink 🤣.

    • @SixStringOverdose
      @SixStringOverdose 1 year ago

      @@FloatyMonkey ahh didn't know, so sorry! It's a dark blue with a shade of grey, but more blue than that. Anyhow, thanks for your videos, I added about 5 of them to a curriculum playlist I'm making for some future newcomers at my workplace!

  • @ChopinDolphy
    @ChopinDolphy 11 months ago

    I think the hardest part of this video to understand is how that color is supposed to be “pink” 😂

    • @FloatyMonkey
      @FloatyMonkey  11 months ago +1

      I'll take that as a compliment, but yeah I'm colorblind 😉

    • @ChopinDolphy
      @ChopinDolphy 11 months ago +1

      @@FloatyMonkey It is! (and I'm even a big noob to this stuff) Looking forward to watching the rest of your videos!

  • @jasonboyd782
    @jasonboyd782 1 year ago

    Where is that accent from?

    • @FloatyMonkey
      @FloatyMonkey  1 year ago

      Belgium (the Dutch speaking half), but I might have picked it up from across the internet ;)

  • @grantexploit5903
    @grantexploit5903 1 year ago

    0:50 LOL software rendering go brrr
    3:10 If 16 bits is conventionally used... how do modern 3D graphics work? I mean, games have exceeded 65,536 vertices per frame since the early 2000s, and 65,536 vertices per object for somewhat less time. Do engines swap out vertices mid-frame like was done for sprites and palettes for ambitious games on sprite-based hardware?
    4:35 ...why? Doesn't sound like a very useful range for display purposes at all.
    6:25 You _could_ use Bresenham's line algorithm or something instead of constructing triangles for that purpose.
    7:50 Huh... So the RTX 4090-or for a video-contemporary example, the Titan RTX-is less capable in a respect than, say, the Atari VCS/2600's TIA?! (Sprite-based graphics hardware allows for direct draw-ordering of sprites, AFAIK without exception because without vertex-level depth information, there's really no other method to handle depth. This was also done for polygons on the first generation of console "3D" video processors {that is, the 3DO Interactive Multiplayer's Clio/CEL Engine, the Sega Saturn's VDP1, and the PlayStation's Sony GPU†}, as they lacked per-pixel depth calculation and perspective-correct texture mapping. Oh, and speaking about the 3DO and Saturn, they actually _directly_ used quadrilaterals as their base geometric primitive rather than triangles {causing quite the headache for their devs}, so yeah.)
    7:55 Can you explain why a depth buffer is beneficial? Seems to me that the very processes needed to create one eliminate the need for one and _in fact make creating one wasteful_ . I mean, here's essentially what it _seems_ you're doing to create a depth buffer:
    1. Use vertex shader output coordinates to identify the polygons behind the pixel of screen coordinate (x, y).
    2. " identify what _part_ (i.e. polygon-specific coordinates) of the polygons are behind the pixel of screen coordinate (x, y).
    3. " identify the _depth_ of the part of the polygons behind the pixel of screen coordinate (x, y).
    4. Sort (polygon, depth) pairs from least to greatest depth.
    5. Identify which polygon is at least depth.
    6. Store least result as pixel in depth buffer.
    7. Move on to next pixel.
    Thing is, if you know what polygon is closest to the pixel of a screen coordinate, and you know where on the polygon that is, then you already have ALL the information you need to start the texture-filtering/pixel-shader process for that pixel. (And indeed, those steps 1-5 and 7 are _required_ for perspective-correct texture mapping AFAIK, so it's not like that process itself is wasteful.) _So, why waste cycles and memory in storing the depth in a buffer_ ? After all, if you're going to come back to it later for any future texture-filtering or pixel-shading-related use, you're _also_ going to have to store a "polygon buffer" or else wastefully redo steps 1-5 and 7 (or at least 1-3, 5, and 7) in order to re-determine what face that depth value actually belongs to.
    †Though not necessarily the Atari Jaguar's Tom, which actually was able to handle vertex depth information during rasterization, being perhaps the first consumer-grade 3D (no quotations) hardware capable of that. However, the complexity, bugs, and bottlenecks of the Jaguar's architecture rendered its software generally unimpressive compared to its competition which lacked such a feature, and certain games like Club Drive show evidence of depth-ordered polygons.
    Overall, excellent short video, though. ;)