
Radiance Cascades Rendered Directly

  • Published: 18 Aug 2023
  • In this video we explore data stored in radiance cascades by observing it directly. This is equivalent to precalculating a scene, storing a cross-section of its radiance field and then rendering it from any viewpoint and any angle in O(1).

Comments • 211

  • @Alexander_Sannikov
    @Alexander_Sannikov  1 month ago +113

    How did y'all find this video? I'm seeing a lot of Half-Life themed visitors coming from somewhere, and I have no idea where from.

    • @porttek0oficial
      @porttek0oficial 1 month ago +47

      This was on my recommended page and had interesting thumbnail. I watched it until the end and I still don't know what I am looking at.

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 month ago +14

      @@porttek0oficial sorry for that..

    • @valshaped
      @valshaped 1 month ago +27

      Don't be sorry for it! This was a wonderful demo, and you don't need to be a graphics programmer to find it interesting, fun, enjoyable, and visually appealing!

    • @Deathbynature89
      @Deathbynature89 1 month ago +39

      Radiance Cascade sounds like Resonance Cascade; combined with a cool thumbnail, it is intriguing.
      "What is a Radiance Cascade? I wanna know."
      This was recommended to me on my homepage. I have watched videos on radiance fields and tutorials on Gaussian Splatting. The thumbnail also looks like something that could be rendered in Source 2.

    • @SuboptimalEng
      @SuboptimalEng 1 month ago +25

      Probably getting recommended because of SimonDev’s recent video on Radiance Cascades heh.

  • @astr0_th3_man84
    @astr0_th3_man84 1 month ago +398

    My dumbass thinking this was a half life video

    • @Lexie_T
      @Lexie_T 1 month ago +2

      There's no chance of a half life video.

    • @elpanatv2537
      @elpanatv2537 1 month ago +15

      resonance cascade🔥🔥🔥🔥🔥🔥🗣🗣🗣🗣

    • @noisetide
      @noisetide 1 month ago +4

      @@elpanatv2537 It's time to choose Mr. Freeman...

    • @makeandbreakgames1791
      @makeandbreakgames1791 1 month ago +1

      same lol i thought it was a high quality render or something

    • @freakmoler5563
      @freakmoler5563 1 month ago

      It was the weird thumbnail that tricked me

  • @OtterCynical
    @OtterCynical 1 month ago +520

    Gordon doesn't need to hear all of this, he's a highly trained professional.

    • @minecraftermad
      @minecraftermad 1 month ago +15

      No no that's resonance cascade.

    • @Etka06
      @Etka06 1 month ago +1

      @@minecraftermad same thing

    • @Gelatinocyte2
      @Gelatinocyte2 1 month ago +8

      Gordon doesn't need to *see* all of this, he's a highly trained *game dev.*

  • @K4leidos
    @K4leidos 1 month ago +157

    Never change your mic. It's somehow almost perfect

    • @derc4N
      @derc4N 1 month ago

    • @atlas_19
      @atlas_19 1 month ago +16

      It's like a voice recording/voice report from games and movies.

    • @liegon
      @liegon 1 month ago +4

      It has a lofi feel, like a cassette recording.

    • @antonio_carvalho
      @antonio_carvalho 1 month ago +3

      The slightly nasal voice and matter-of-fact intonation also contribute to the effect

    • @Jam_Axo
      @Jam_Axo 13 days ago

      its a mad scientist mic in the making

  • @Klayperson
    @Klayperson 1 month ago +32

    this is a sickass way to render cosmic shadow people

  • @DilettanteProjects
    @DilettanteProjects 1 month ago +146

    I never thought I'd live to see a radiance cascade

    • @Kavukamari
      @Kavukamari 1 month ago +35

      let alone create one...

    • @Masonova1
      @Masonova1 29 days ago

      ☝️🤓 actually it was a resonance cascade

    • @DilettanteProjects
      @DilettanteProjects 25 days ago +3

      @@Masonova1 I like to do this thing sometimes where I notice that one word sounds a bit like another and then I make a joke out of that

    • @Synthonym
      @Synthonym 9 days ago

      @@Masonova1 whoooosh

  • @Alexander_Sannikov
    @Alexander_Sannikov  1 year ago +212

    One thing that I forgot to mention in the video is the, um, sparkles? These are path tracing fireflies that made their way into the radiance fields -- they go away the more time you give path tracing to converge, but I did not bother waiting more than half an hour, and I thought they look kind of cool anyway. They don't exist when using this data structure for calculating actual global illumination, because it needs much lower resolution to be resolved and so it converges much faster.

    • @johnsherfey3675
      @johnsherfey3675 1 month ago

      Would denoising help?

    • @Mr.LeoNov
      @Mr.LeoNov 1 month ago +1

      @@johnsherfey3675 I don't think so, given how little resolution there is

    • @ThePrimaFacie
      @ThePrimaFacie 1 month ago +2

      You should pin this as the top comment

    • @monx
      @monx 1 month ago +5

      thanks. the video is missing a brief assurance that this "ghost skull" asset is presented exactly as it would appear in a game. It is not a human head in debug mode.

    • @draco6349
      @draco6349 1 month ago +2

      it's a fascinating effect, actually- a volume that sparkles isn't something i've seen much before. i wonder if it has any actual use.

  • @valshaped
    @valshaped 1 month ago +44

    The cascades look like the raw output from a light field camera! Very cool!

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 month ago +21

      There's only so many ways you can encode a light field..

    • @johndawson6057
      @johndawson6057 1 month ago +2

      @@Alexander_Sannikov so it is used in light field cameras then?

  • @longjohn7992
    @longjohn7992 1 month ago +29

    “Carmack doesn’t need to hear all this he’s a highly trained professional”

  • @Vulcorio
    @Vulcorio 1 month ago +36

    No idea what I've just listened to but the imagery is very fascinating.

  • @ThePrimaFacie
    @ThePrimaFacie 1 month ago +21

    My brain watching this and hearing how it's done is doing the sparkly bits of the model. Thanks for the vid

  • @euphemia3092
    @euphemia3092 1 year ago +23

    Absolutely blown away by all of your work. Thank you for sharing!!

  • @bebroid420
    @bebroid420 1 year ago +11

    2 years ago I was experimenting with directional lightmaps, trying to achieve both diffuse and sharp specular lighting. I'd been messing with "plenoptic textures" that look very similar to this demo. It's really interesting how one can come up with a similar concept while trying to achieve a different goal. The whole idea with the cascades and the usage of this technique to calculate screen space global illumination... Just wow!

  • @aburak621
    @aburak621 1 year ago +11

    Thanks Alexander for your ExileCon presentation! It was a joy to watch and learn.

  • @footspring94
    @footspring94 1 month ago +3

    like a digital hologram. crazy that this has been doable for such a long time and has only now been discovered. And just by someone on YT.

  • @4.0.4
    @4.0.4 1 month ago +15

    As Wave Function Collapse terrain generation has proven, cool names inspired by physics are generally better for game development.

    • @fritt_wastaken
      @fritt_wastaken 26 days ago +2

      But wave function collapse is an abysmally terrible name. It has nothing to do with waves or functions. It gives a wrong impression of both the essence and the complexity of the algorithm

  • @melody3741
    @melody3741 18 days ago

    This is the first time I have ever comprehended these, because everyone else basically just called it a black box. Thank you!!!

  • @codymitchell4114
    @codymitchell4114 19 days ago

    Absolutely gorgeous rendering

  • @perkele1989
    @perkele1989 16 days ago

    The halflife refs are clearly due to your phenomenal HL voice and production quality

  • @jeremyashcraft2053
    @jeremyashcraft2053 7 days ago

    Really clever stuff! I know PoE 2 will be shipping with this lighting technique, and I can't wait to see it in action!

  • @TheNSJaws
    @TheNSJaws 1 year ago +36

    would you mind doing more of these? Not just for Cascade rendering, but in general.
    I quite appreciate your PoE presentations, and every time I rewatch them, I wish they gave you more time.

  • @CharlesVanNoland
    @CharlesVanNoland 1 month ago +3

    This is very reminiscent of lightfield rendering (originally "image based rendering" 20 years ago) of the sort that OTOY and Lytro were working on a decade ago, except here you have multiple resolutions for multiple depths? I'll have to look at your paper to understand the cascade aspect.

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 month ago +2

      each cascade is encoded in a way that's similar to the good ol' image based rendering. but the most powerful property of RC is how information is distributed across multiple cascades.

  • @DabidarZ
    @DabidarZ 1 month ago +3

    im so confuseddddd BUT THAT LOOKS cool and i love seeing new stuff!!!

  • @footspring94
    @footspring94 1 month ago +1

    just think how this could replace individual props like tables in corners or show massive events like large cut scenes or animated backgrounds. I also think it will work very well as a way to manage surface texturing.

  • @Vegan_Kebab_In_My_Hand
    @Vegan_Kebab_In_My_Hand 1 month ago +1

    Nice video, and great explanation of the cross-sections and their relationship to the spatial and angular resolutions!

  • @f7029
    @f7029 1 month ago

    This is amazing!!!!! Really excited to see what comes out of this

  • @TheEagleFace
    @TheEagleFace 1 month ago +3

    babe wake up they just dropped virtual holograms

  • @Merthalophor
    @Merthalophor 3 days ago

    What I gather:
    We're "storing" in some way what the 3D model looks like when viewed through a plane from any angle, without keeping the 3D model. The most obvious way to do that is to store, for each pixel of the plane, what the pixel would look like when viewed from every angle. We could store, say, 200 different angles, which would mean that we have to store 200 different colors for _every_ pixel. Then, when rendering, we could check at what angle we're looking at a pixel, then linearly interpolate between the colors associated with similar angles.
    What this paper shows is that this is not necessary, and we can make the image still look decent while storing much less information. The key idea is that while we want to store a _few_ angles for _every_ pixel, we only need a _few_ pixels that store a _lot_ of angles. So for example, we could store 4 angles for _every_ pixel - then have a separate map that stores 8 angles for every 16th pixel - then another map that stores 16 angles but only for every 36th pixel, and so on. By cleverly interpolating this information, we get a really life-like image, while only storing a fraction of the information (and, conversely, while only having to _calculate_ a fraction of the information, if running in real time).
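    The tradeoff described above can be sketched numerically. This is only an illustration; the concrete probe spacing, direction counts, and growth factors below are assumptions, not parameters from the actual paper:

```python
# Illustrative sketch of the spatial/angular tradeoff described above.
# Each successive cascade halves probe density per axis (4x fewer probes
# on a 2D plane) while quadrupling angular resolution, so every cascade
# stores the same total number of radiance values.

def cascade_budget(base_probes_per_axis=256, base_dirs=4, n_cascades=4):
    """Return (probe_count, dirs_per_probe, stored_values) per cascade."""
    levels = []
    for i in range(n_cascades):
        probes_per_axis = base_probes_per_axis // (2 ** i)  # sparser probes
        dirs = base_dirs * (4 ** i)                          # denser angles
        levels.append((probes_per_axis ** 2, dirs, probes_per_axis ** 2 * dirs))
    return levels

for i, (probes, dirs, total) in enumerate(cascade_budget()):
    print(f"cascade {i}: {probes:6d} probes x {dirs:4d} dirs = {total} values")
```

    Every cascade ends up with the same storage cost under this scheme, which is why adding cascades extends angular (and depth) coverage without blowing up memory.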

    • @Alexander_Sannikov
      @Alexander_Sannikov  3 days ago

      the really important part is that the tradeoff of spatial:angular density is only possible for a given distance range. that's why RC stores radiance intervals, because they capture light coming from a certain distance range.

    • @Merthalophor
      @Merthalophor 3 hours ago

      @@Alexander_Sannikov Distance... from the object being rendered? If you moved away from the plane the quality would decrease?

  • @user-mx7mc7sv2q
    @user-mx7mc7sv2q 1 month ago

    It came from recommendations and I have no idea what this is. But it's damn cool! Now I'll go deeper down that rabbit hole to find out what it is and how you managed to make it work

  • @pravkdey
    @pravkdey 1 month ago

    Looks pretty! If it's too memory intense I can imagine it being used sparingly, like for smoke grenades or effects on weapons

  • @DJDDstrich
    @DJDDstrich 1 year ago +11

    Hey Alex thanks for the video - Really cool to see a little more insight into this Radiance Cascades tech you've developed! Must be fun to play around with various little projects like this now that the tech has proven itself! How are you going with potentially open sourcing some form of it in the near future? Is it still a possibility? It seems like the only viable GI option for real-time procedural games like ARPGs or sim/management games, and, well I want to get my greedy little mitts on it! :D

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago +26

      I'm close to publishing an article first, then I'm going to open-source it.

    • @DJDDstrich
      @DJDDstrich 1 year ago +4

      @@Alexander_Sannikov amazing mate! I'm super excited and hope you get the recognition from it! Where might the article land and how do I get notified?

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago +12

      @@DJDDstrich I'm probably going to make a short announcement video on this channel with a link to something like google drive pdf before sending it to a journal.

  • @LukeGeaney
    @LukeGeaney 27 days ago

    Subbed - found your channel via the SimonDev video on radiance cascades then your ExileCon video :)

  • @photometricman
    @photometricman 9 months ago +4

    A lot of similarities in the atlas to lightfield photography. Highly interesting work. Thank you!

    • @Alexander_Sannikov
      @Alexander_Sannikov  9 months ago +2

      it is capturing the same data, so yeah

    • @Tr33fiddy
      @Tr33fiddy 7 months ago +1

      Good thought. Like the Lytro light field camera.

  • @chris-hayes
    @chris-hayes 1 month ago

    How.. illuminating. Very cool.

  • @blakes8901
    @blakes8901 1 month ago

    this genuinely might change everything. good luck. I hope this actually pays off for you monetarily.

  • @LoraHannike
    @LoraHannike 26 days ago

    digital hologram, finally
    great work! keep it up

    • @Alexander_Sannikov
      @Alexander_Sannikov  26 days ago +1

      i have a video on actual simulated holograms. check my community page to see what it looks like.

    • @LoraHannike
      @LoraHannike 26 days ago

      @@Alexander_Sannikov I will

  • @Visigoth_
    @Visigoth_ 18 days ago

    Cool video (Game Dev)... YT sees my interest in "digital volumes" and recommended this.

  • @satibel
    @satibel 11 months ago +3

    have you tried that with a billboard explosion? it may allow high speed rendering with some semi-3d effect.

  • @sublucid
    @sublucid 1 year ago +1

    Super cool! Show us what this same model looks like illuminated by the final technique!

  • @ellescer
    @ellescer 1 month ago +1

    no idea what you're talking about or what any of this means but, looks wicked cool dude

  • @manapotion1594
    @manapotion1594 1 year ago +3

    Cool research. So, does that mean the angle of view is quite limited? Does the quality degrade with the "depth" of the object from the camera?

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago

      the angle is not limited, but the quality does degrade with depth with this encoding

  • @iestynne
    @iestynne 1 month ago

    Thanks for showing the atlas view.
    I'm curious why you chose parabolic encoding?

  • @Deathend
    @Deathend 1 month ago

    Totally a coincidence that it looks like an interdimensional ghost celebrating Cinco de Mayo.

  • @UncoveredTruths
    @UncoveredTruths 1 year ago +1

    such a cool technique

  • @kabu8341
    @kabu8341 1 month ago

    If you ever happen to have enough free time on your hands, I would love to have a video explanation for your radiance cascade for people who know not much about graphics programming :D SimonDev's Video was a good start for it tho, now I understand the things you do a bit more. Very fascinating. I still can't really imagine what those probes look like. As far as I understand now it sounds like you have those probes with different properties in the whole... err.. view? volume? of the camera and they are scanning everything and sending the averaged data to the camera and this is what you see in the end?

  • @Jianju69
    @Jianju69 1 month ago

    Interesting effect, somewhat like glitter immersed in plastic. How much memory does this demo use? Seems too resource intensive for complex real-time scenes.

  • @harry1010
    @harry1010 1 month ago

    Never seen radiance fields rendered so fast - on the CPU, too! Quick q - do you have angular resolution inconsistencies at different layers due to how the resolution changes at each depth layer?

  • @chapterone1162
    @chapterone1162 1 month ago

    Really cool stuff man good work

  • @simplegamer6660
    @simplegamer6660 1 month ago

    Damn, what mic are You using? It sounds rad af. I seriously wanna know
    P.S. Why do I have a feeling that I'm close to being the only one who came here *not* because I thought the video was somehow involved with Half-Life? :D

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 month ago +1

      Boys, looks like we have a non-half-life person here. I repeat, a person who didn't come to joke about resonance cascades.

  • @rapideye101
    @rapideye101 1 month ago +1

    how does the sampling work then with the cascades? can you also make a video on that? or is there a paper?

  • @pravkdey
    @pravkdey 1 month ago

    The title sounds like a sleeper agent phrase haha

  • @attashemk8985
    @attashemk8985 1 year ago

    Looks cool, it will be interesting to see if NeRFs could be baked into this cascade technique

  • @skicreature
    @skicreature 24 days ago

    Your mention of the accuracy of penumbras makes me wonder if this same technique could be adapted for radiation dose calculation in radiation therapy. Basically we do a bunch of complex ray tracing or kernel convolutions to perform dose calculations, computing the interactions based on fluence maps from particular angles. For us, however, reflectivity isn't so important. Mostly we just have attenuation (a simple calculation) and a couple of different types of scattering interactions (much more complex, requiring Monte Carlo simulation). However, our accuracy in dose calculation tends to decrease at the edge of beams (at the penumbra), where the calculation becomes most difficult as scattering interactions begin to dominate over attenuation.

    • @Alexander_Sannikov
      @Alexander_Sannikov  24 days ago

      @@skicreature people are already applying RC to a bunch of non-light radiative transfer processes. any process where energy is propagated in rays is suitable.

  • @firstclass3223
    @firstclass3223 1 year ago

    Awesome stuff, well explained.

  • @AtomicBl453
    @AtomicBl453 1 month ago

    i'm curious, would this help with more realistic sun sparkles on a body of water?

  • @JamesAidanKing
    @JamesAidanKing 17 days ago

    yea i dont know what any of this is but it looks cool

  • @hnlo
    @hnlo 6 days ago

    Great work! I'm really interested to find out how to encode the radiance cascade data into a texture, can you give me some pointers?

  • @thewaysh
    @thewaysh 1 year ago +10

    Great video, thank you. Please upgrade your microphone!

    • @iamvfx
      @iamvfx 1 year ago +9

      I like it, it sounds like a recording from 1960s 😁

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago +16

      Sorry for that. That's why I usually don't voice my videos. I should get an actual mic, but I normally much prefer to program something than to waste time setting it up.
      UPD: ok, anyway, ordered a mic. Again, sorry for the quality on this one.

  • @RPG_Guy-fx8ns
    @RPG_Guy-fx8ns 1 month ago

    so basically you bake point clouds to an atlas of cube maps and use it to render imposter pixels

  • @AlexDicy
    @AlexDicy 1 month ago

    How do you style ImGui like that? Is there a theme or did you manually change backgrounds and border radius?

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 month ago +2

      it's open source, you can check: LegitEngine by Raikiri

    • @AlexDicy
      @AlexDicy 1 month ago

      @@Alexander_Sannikov Thanks!

  • @seccentral
    @seccentral 1 month ago

    the profiler is legit. 👍

  • @oldlifeorig5028
    @oldlifeorig5028 1 month ago

    Does it mean that we can look into 3D models and see them more realistically from the inside? Not like backfaces, but the volume of the mesh, idk

  • @grimtin10
    @grimtin10 1 month ago

    yo this is really cool

  • @simonl1938
    @simonl1938 1 month ago

    this is so cool

  •  1 month ago

    is that gauss splattering?

  • @themerpheus
    @themerpheus 1 month ago

    so it's like deep shadow maps but for irradiance?

  • @perialis2970
    @perialis2970 1 month ago

    cool, this is a super cool video (not understanding one thing)

  • @PanzerschrekCN
    @PanzerschrekCN 1 year ago +1

    Looks great!
    But it is hard to understand how it works.
    As I understand it, you just render the scene into a tiny cubemap for each point on a regular grid. You have 4 variants of this grid (cascades), from less spatial detail to more: the less detailed grid contains more detailed cubemaps, and the more detailed grid contains less detailed cubemaps. It is still unclear to me how fetching and mixing from these arrays of cubemaps works.

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago +4

      That's right, the blending part is kind of hard to explain in the video, which is why I'm writing an article with a proper explanation. However, the idea is actually very simple: each cubemap simply has an alpha channel, and since each cascade encodes its own depth range, the cascades are always sorted front-to-back, so you just blend them using their alpha channels.
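      A minimal sketch of that front-to-back alpha blend, assuming a conventional RGBA premultiply-style compositing loop with one sample per cascade for a single pixel, nearest depth range first:

```python
# Front-to-back compositing of per-cascade RGBA samples, as described above.
# `samples` holds one (r, g, b, a) cubemap lookup per cascade, sorted by the
# depth range each cascade covers (nearest first); `transmittance` tracks how
# much light from farther cascades is still visible.

def merge_front_to_back(samples):
    out = [0.0, 0.0, 0.0]
    transmittance = 1.0
    for r, g, b, a in samples:
        for c, v in enumerate((r, g, b)):
            out[c] += transmittance * a * v
        transmittance *= 1.0 - a
    return tuple(out)

# An opaque near sample (a=1) completely hides the cascades behind it:
print(merge_front_to_back([(1, 0, 0, 1.0), (0, 1, 0, 1.0)]))  # (1.0, 0.0, 0.0)
```

      Because the cascades' depth ranges do not overlap, no sorting is needed at runtime; the cascade index itself is the depth order.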

    • @JannikVogel
      @JannikVogel 1 year ago

      @@Alexander_Sannikov How do you choose these depth ranges? Are they manually chosen or do they "emerge" from the data (due to information not being representable at some angular resolution for example)?

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago

      @@JannikVogel the discretization scheme is completely scene-independent. That means the exact same radiance cascades can be used to capture the radiance of an arbitrary scene. In the paper I explain in great detail why depth ranges need to increase exponentially for subsequent cascades. If you're on the "graphics programming" discord, you can have a look at the draft of my paper that people are reviewing right now.
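      The exponentially growing, scene-independent depth ranges can be sketched like this; the starting interval length and growth factor here are assumptions for illustration, not values from the paper:

```python
# Scene-independent radiance intervals: cascade i captures light only within
# its own [start, end) ray-distance range, and each range is `factor` times
# longer than the previous one, matching the growth in angular resolution.

def cascade_intervals(first_length=1.0, factor=4.0, n_cascades=4):
    """Return contiguous [start, end) distance intervals, one per cascade."""
    intervals, start, length = [], 0.0, first_length
    for _ in range(n_cascades):
        intervals.append((start, start + length))
        start += length
        length *= factor
    return intervals

print(cascade_intervals())  # [(0.0, 1.0), (1.0, 5.0), (5.0, 21.0), (21.0, 85.0)]
```

      The intervals tile the ray without gaps, so summing (blending) the per-cascade results reconstructs the full radiance along the ray.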

  • @iamvfx
    @iamvfx 1 year ago

    Cool stuff! Needs more high quality examples

  • @Skythedragon
    @Skythedragon 1 year ago

    So from what I understand from this and the presentation, you take an array of textures, and for each pixel in that texture, look at its 8 neighbors and do some short-distance (1px) raytracing, and repeat that for the number of cascades you have?
    Then to get the final value, you read the pixels in the final cascade for the direction you want to trace in?
    Wouldn't this make it O(n) in the number of cascades instead of having some fixed upper bound?

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago +6

      Nope, after the precalculation is done, there's no raycasting in this demo at all. Rendering the radiance field of one cascade in this case is equivalent to just reading a cubemap texture for every pixel, which is just a hardware bilinear interpolation. So, each of the 4 cascades is looked up this way and then they are merged. See 4:23: each cascade literally stores tiny cubemaps; you don't raymarch them, you interpolate them.

  • @Lucsueus
    @Lucsueus 1 month ago

    Saw this video on my homepage recommendations, no relation to Half-life at all (haven't looked at source/half-life content in forever).
    To be honest I have no clue what this even means, but I'm very intrigued and I often enjoy watching people passionately explain something that to me is very niche.
    Thank you for sharing!

  • @Danuxsy
    @Danuxsy 1 month ago +1

    So what would be the use case of this?

    • @NightmareCourtPictures
      @NightmareCourtPictures 1 month ago

      Use case of global illumination? With O(1) time complexity? Practically all applications for every graphical representation for the foreseeable future.

    • @Danuxsy
      @Danuxsy 1 month ago

      Well, this is different because it is NOT real-time like the path tracing we see in games; this is computed offline and so cannot change afterwards. It's similar to splatting: looks like real life but not very usable in games (as of now anyway), me sleep.

    • @NightmareCourtPictures
      @NightmareCourtPictures 1 month ago

      @@Danuxsy it's a stretch to say path tracing is real-time.
      i also don't know where or why you have the impression this isn't real-time, nor why it wouldn't be real-time, given how efficient it is.

    • @Danuxsy
      @Danuxsy 1 month ago

      @@NightmareCourtPictures I watched the ExileCon 2023 talk about their GI implementation using radiance cascades, so I see that it does have good use cases, yes, it's cool!

    • @caffiend81
      @caffiend81 1 month ago

      @@Danuxsy it's usable in real time. Path of Exile 2 is using Radiance Cascades. IIRC their dev team published a white paper on the technique.

  • @gaia35
    @gaia35 1 month ago

    This is why computers were built.

  • @TavishMcEwen
    @TavishMcEwen 1 month ago

    so cool!!

  • @nates9778
    @nates9778 1 year ago

    This is cool!

  • @anipodat394
    @anipodat394 1 month ago

    Half-Life + Hollow Knight = Radiance Cascade

  • @JannikVogel
    @JannikVogel 1 year ago

    Can you share those 8k images you have used in this demo for people who want to reproduce this (without having to capture their own scene first), or could you even upload the entire demo code somewhere?

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago +2

      If you want to try this, I really recommend replacing the volume rendering part with some really simple SDF fractal raymarcher or a sphere. The only reason I used volume data is that it makes it obvious I'm not rendering it in real time (that'd be much slower).
      That being said, at some point I will publish the sources.

  • @gadirom
    @gadirom 1 year ago

    Ah! So there are no spherical harmonics, you used cube maps. Still seems like trickery. I'm looking forward to the paper. Really, O(1) looks like magic.

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago +4

      I use spherical harmonics to gather diffuse GI (not in this demo). This demo does not need gathering irradiance, so it only uses parabolic mapping (basically, cubemaps).

    • @gadirom
      @gadirom 1 year ago

      @@Alexander_Sannikov I see. Thank you.

  • @homematvej
    @homematvej 1 year ago +1

    Is there a paper or something?

  • @addmix
    @addmix 1 month ago

    Sparkles

  • @Terszel
    @Terszel 1 month ago

    yeah

  • @aladorn
    @aladorn 1 year ago

    thank you for the video... veeery impressive... please upgrade your mic

  • @eucenor4171
    @eucenor4171 1 month ago

    0:07 global elimination technique 💀

  • @perkele1989
    @perkele1989 16 days ago

    you didn't even touch the roughness slider :( please, go into more technical detail about how this works! the graphics programmer in me is amazed

    • @Alexander_Sannikov
      @Alexander_Sannikov  16 days ago +1

      now I also wonder what that slider did, because it makes no sense :D

  • @oBdurate
    @oBdurate 1 month ago

    Bro sounds like Posy

  • @ABWABWABWABWABWABWABWABWABWA
    @ABWABWABWABWABWABWABWABWABWA 1 year ago

    hell yeah

  • @NeoShameMan
    @NeoShameMan 11 months ago

    It's funny, I do something like that to render GI on a Mali 400 GPU in real time, lol, but instead of storing the results, I store the UVs of the points, which allows real-time updates by simple texture feedback, i.e. the texture samples itself to resolve GI. I had the cascade idea but didn't implement it; I didn't know it would be that efficient. It's on Unity's forum under the name "exploration of custom diffuse RTGI"; the technique is called MAGIC, for Mapping Approximation of GI Compute 😂. I resolve slowly because I haven't tested how much I can render per frame, so it's one ray per pixel per frame. My computer is dead now, lol, it hasn't finished.

  • @stimpyfeelinit
    @stimpyfeelinit 1 year ago

    neat!!!!!

  • @lemonjumpsofficial
    @lemonjumpsofficial 28 days ago

    IT'S A HOLOGRAM!!!!!

  • @user-nm4mi2sq1o
    @user-nm4mi2sq1o 6 months ago

    Oh, turns out this was a teaser for Affliction in PoE )

    • @Alexander_Sannikov
      @Alexander_Sannikov  6 months ago

      How did you even manage to connect the two?

    • @user-nm4mi2sq1o
      @user-nm4mi2sq1o 6 months ago

      ​@@Alexander_Sannikov Answering that question will embarrass me. I understand very little about engine development or object design/modeling myself. But I watch your ExileCon panels and these videos purely out of curiosity. The presentation of the material is excellent.
      Here I saw a similar model of particles layering onto an object, if you can put it that way, as in the current league in the game... even the colors match )
      I came here after your podcast with CARDIFF. By the way, it would be great if you could periodically organize such podcasts with him or other streamers, at least once every six months.
      The foreign audience has Chris and Jonathan, and we'll have you.

  • @k-vandan4289
    @k-vandan4289 1 year ago

    cool

  • @vanillagorilla8696
    @vanillagorilla8696 1 month ago

    A digital hologram.

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 month ago

      @@vanillagorilla8696 i have a video about actual digital wavefield holograms if you're interested in that

    • @vanillagorilla8696
      @vanillagorilla8696 1 month ago

      @@Alexander_Sannikov I'd love that.

  • @wonkaytry
    @wonkaytry 1 month ago

    mirrors

  • @oldlifeorig5028
    @oldlifeorig5028 1 month ago

    Could you please explain it all in Fortnite terms?

  • @paranoidPhantom
    @paranoidPhantom 1 month ago +2

    Didn't understand a damn thing, but very interesting )

  • @macratak
    @macratak 1 year ago

    wheres the siggraph paper my man. im tryna read that!!!

  • @icaroamorim3123
    @icaroamorim3123 3 months ago

    I'm still a bit too stupid to understand but I will keep trying until I implement it

    • @icaroamorim3123
      @icaroamorim3123 3 months ago

      Just read the full paper again, I can understand it much better now.

  • @computerghost596
    @computerghost596 9 days ago

    I dont think this is half life guys

  • @dougbeard7624
    @dougbeard7624 29 days ago

    Audio is pretty bad. You need to compress.

  • @neon_Nomad
    @neon_Nomad 1 month ago

    Ghost skeletons