So I Tried To Learn Shaders...
- Published: Dec 19, 2024
- Twitch / theprimeagen
Discord / discord
Become Backend Dev: boot.dev/prime
(plus i make courses for them)
This is also the best way to support me: support yourself by becoming a better backend engineer.
LINKS
thebookofshade...
By: Patricio Gonzalez Vivo | x.com/patriciogv
Jen Lowe | x.com/_jenlowe_
Great News? Want me to research and create video????: / theprimeagen
Kinesis Advantage 360: bit.ly/Prime-K...
Get production ready SQLite with Turso: turso.tech/dee...
But Acerola,
real
this comment has sound..
Words you can hear
man of culture
almost heard this sound when clicked on the video
i'm just waiting for Prime gamedev arc where he writes his own 3d engine in c++
rust*
C* @@adityabanka_iso
he has too much webdev brainrot for that
Don't you mean JAI?
@@921Ether bruh
Prime’s gamedev arc continues. We’ll get Prime Game before GTA VI
Going to be an expert before GTA 6
@@ThePrimeTimeagen but will it be before HL3
It's not gonna be Theft, it will be Prime.
GPA being VI is impressive.
Seriously, how I learn how to do shaders:
1. Watch a Blender tutorial for the effect I'm going for.
2. Dissect the Blender source code for how the nodes worked together to create the effect.
3. It's now OUR code.
4. Success.
For anyone new to shaders, I recommend starting with Godot's shader language, it's by far the easiest way to get started. There's a lot of high level language features and parameter handling that makes things much easier starting out.
You are goated if you managed to do this
Adapting graphs from technical artists to make cool effects is a great thing.
I went from making shader node trees in blender to rendering particles in webgpu to rendering objects in webgpu. And now I'm stuck at trying to implement screenspace shadows in webgpu.
OUR software
At about 2:15 in the video, when you're asking about shaders and how the pixels are being manipulated and transformed: you have to remember that the graphics pipeline has stages that are fixed, such as the swap chain, the present buffers, etc., while other stages are programmable and fully customizable. The fully customizable ones are your vertex shaders, your pixel or fragment shaders, your geometry shaders if used, and your compute shaders if used. By convention and under the hood they do operate differently. These shaders are also compiled translation units, similar to C/C++. The main difference is that a C/C++ translation unit compiles to instructions for the CPU, while shaders, whether HLSL, GLSL, SPIR-V or some other language, compile to instructions for your GPU.
There is some setup work within the application for the graphics pipeline, because we don't want to be sending these instructions across the FSB (Front Side Bus), which is quite slow compared to the direct connection between the CPU and the GPU through the PCI Express lanes. Because of these timing differences, it's better to send the instructions and data in batches instead of every single frame. All of these timings have to be synchronized with your target frame rate (FPS) as well as your physics and animation engines, and the cost of transmitting data over either the FSB or the PCI Express lanes has to be accounted for.
Yet the shaders themselves are fully customizable. You can have one section of your "screen buffer" render or update while other sections are completely discarded. There are also many techniques that involve multiple buffers to create various effects, from Phong shading to bump maps, HDR maps, fog, bloom, etc.
You're also able to create kernels that perform bit manipulation (bit twiddling) within a fixed window. Some of these are easy to see in image-manipulation programs, such as taking an image and inverting all of its colors: you apply a kernel to the image, and as it walks or traverses the image data (an array or vector), the kernel inverts all of the color information. It's not an exact analogy, but we can look at a 3D graphics rendering engine, with its underlying programmable graphics pipeline, as kind of an octopus: intricate and fairly complex in its overall API design, yet very flexible and versatile in what it can do and in how you can customize portions of it.
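The color-inversion example above can be sketched in plain Python, with a CPU loop standing in for what the GPU would do to every pixel in parallel:

```python
# Invert an RGB image stored as a flat list of 0-255 channel values.
# A fragment shader would process every pixel in parallel on the GPU;
# here a plain loop walks the buffer the way the comment describes.
def invert_colors(pixels):
    return [255 - channel for channel in pixels]

image = [0, 128, 255,   10, 20, 30]   # two RGB pixels
print(invert_colors(image))           # [255, 127, 0, 245, 235, 225]
```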
funny words, magic man
@@lex224ification Not magic. Many years of research, trial and error, and just doing it. I learned as I went. I started doing this as far back as around '03 with DirectX 9.0c. Then I jumped into legacy OpenGL and DX 10, then DX 11 and modern OpenGL, and eventually Vulkan.
I am purely self-taught and self-motivated; it has been a passion and hobby of mine. It's not magic. It's 99.99% a combination of mathematics and logic.
Jesus bro, I'm only 2 minutes into the clip and you drop a flashback.
@@stefanvukasinovic9234 Why not? If others can learn from it, take from it, gain anything from it, why not?
Awesome overview of the topic, I'm taking notes. I work on rigging and deformation and I'm excited to get into other parts of the pipeline to do some shader based effects.
You gotta Bring Cherno on your stream, would be awesome collab and can exchange knowledge as well.
Shaders continue to break my head every time, and I write them every day.
Look at the JavaScript interpreters written in GLSL?
Loving the smoothstep exploration
are you a game developer now?
seeing someone in the same pain that I was a year ago is satisfying lol. This language is painful man
1:30 Gaussian blur, as far as I know, is done with a convolution with a Gaussian kernel. That means pretty much what Prime was saying: you take the kernel, which is a grid of weights you apply to the center and neighboring pixels, add them up, and put the result in the new image at the location where the center pixel was. We had to make a hardware accelerator for this in a digital design class using an FPGA.
This is very likely not the only way to do it, but it's pretty much what you think it is: kind of "averaging" each pixel with its neighbors, but with a higher weight for the center pixel, getting smaller as you go toward the edges of the kernel/mask.
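A minimal Python sketch of the convolution just described, assuming a grayscale image stored as a list of rows and skipping border handling; the 3x3 integer kernel is a common small Gaussian approximation:

```python
# 3x3 Gaussian-like kernel; the weights sum to 16, so we divide by 16.
KERNEL = [[1, 2, 1],
          [2, 4, 2],
          [1, 2, 1]]

def blur_pixel(img, x, y):
    """Weighted average of a pixel and its 8 neighbors (no border handling)."""
    acc = 0
    for ky in range(3):
        for kx in range(3):
            acc += KERNEL[ky][kx] * img[y + ky - 1][x + kx - 1]
    return acc // 16

img = [[0, 0, 0],
       [0, 160, 0],
       [0, 0, 0]]
print(blur_pixel(img, 1, 1))  # 40: the lone bright pixel keeps 4/16 of its value
```

Running this per pixel over the whole image is exactly the "grid of weights" pass the comment describes; a GPU just does every pixel at once.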
When you have a position vector of vec4 type, the last component, w, is used in vec4 and 4x4 matrix operations. This doesn't even account for the use of quaternions or other Hamiltonians where projections and rotations in higher dimensions are needed. The w component can have different meanings depending on context. In some contexts it distinguishes whether the vec4 is a point in space or a directional vector. In other contexts it is used in conjunction with the depth buffer, where it helps determine the z-divide. This is useful when transforming a 3D object within a 3D environment based on the properties of perspective; it gets into perspective geometry and projections. Most times the w component is used together with the z component for the depth buffer, where the concern is the perspective divide.
When you get into 4x4 matrix multiplication, the order of your matrix operations matters, especially when you are working with transformations within a scene. Most programmable pipelines and shaders use 4x4 matrix operations, and commonly you will see or hear of an MVP matrix. An MVP matrix is simply the result of multiplying three matrices: Model, View, and Projection. They have to be multiplied in that order, otherwise you will get distortions, artifacts, etc. in the rendered scene, because matrix multiplication is not commutative. Also, not every matrix has an inverse.
You might want to freshen up on your linear algebra, because it is 100% required here. We are dealing with positional and relational data, transformations, color manipulation, and, last but not least, lighting and shading operations. It requires a lot of higher-dimensional mathematics. You might want to brush up on your trigonometry too.
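The non-commutativity point can be shown in a few lines of plain Python; the matrices below are toy translate/scale transforms, not a real Model-View-Projection setup:

```python
def matmul(a, b):
    """Multiply two square matrices given as lists of rows."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Toy homogeneous 4x4 transforms (illustrative, not a real MVP set):
translate = [[1, 0, 0, 2],   # move +2 along x
             [0, 1, 0, 0],
             [0, 0, 1, 0],
             [0, 0, 0, 1]]
scale = [[3, 0, 0, 0],       # uniform scale by 3
         [0, 3, 0, 0],
         [0, 0, 3, 0],
         [0, 0, 0, 1]]

ts = matmul(translate, scale)  # scale first, then translate
st = matmul(scale, translate)  # translate first, then scale
print(ts == st)                # False: order matters
print(ts[0][3], st[0][3])      # 2 6 -- in st the translation itself got scaled
```

Swapping the order doesn't error out; it silently produces a different transform, which is exactly why a scrambled M*V*P order shows up as distortions rather than a crash.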
Next up, Shadertoy.
shoulda started with it imo
@@bluein_nah book of shaders is the best way to start
@@HarpreetSingh-xg2zm book of shaders mvp
My friends Patricio and Jen made book of shaders - you should have them on your show. They are awesome!
been looking for someone to cover this topic in detail thank you for everything you do!
So I skipped to the comments to find out if he really did learn shaders.
Same. Clearly we need to come back in 3 hours and check again 😅
To be fair, with the amount of pausing to go off on a tangent he did within 17 minutes, I'm tempted to do that as well.
@@Jmvars Dude can turn a 5 minute source video into a 45 minute long rant.
you can't spell shaders without adhd
idk who you are or why the algorithm thinks I need to see this. BUt i love it.
I'm coming from Marine Corps I.T.
That meant either doing the vaguely computer-related bitch work OR setting up blades on a server rack. So once a month I'd get to do advanced, worth-bragging-about work that taught me a lot... The remaining jobs of a given month could be summed up as: "image thin client -> deploy thin client -> dodge annoying questions from Marine coworkers -> repeat."
(One time a dude tried to MAKE me do something that would get me in a lot of trouble. You tell that guy no, and 3 days later your boss is calling you a disrespectful fuck. Happened all the time lol, so much so it became accurate. Like, "no SSgt, I can't give you a fancy new monitor, go do the paperwork.")
So it goes without saying that, as a civilian, my job title would slot in somewhere in 'IT'; however, I cringe a little since I feel like I've got MAYBE a 45% understanding of things overall.
Meaning, how could I be hireable down the line when I feel I don't know enough? (I'm disabled, waiting on a surgery, yayyyyy VA, so it's not like I'm gonna be looking for a job anytime soon.)
Since my hobbies are nerdy and there's some cross-pollination between my job skills AND hobbies, I've been adding tools to my tool belt by accident. Like setting up an Ubuntu server with KVM/QEMU to then waffle around with a bunch of different VM configs -- eventually I made a game server for the bois. The reason I bring this up is that I accidentally walked away with the ability to confidently use CLIs, at least on Ubuntu, and to generally understand bash/Linux-ish commands, AND I realized I was lacking in basic IP understanding (sad, I know; the IP config YAML file broke me because I got lazy with research... syntax BS, ugh). Through this afterthought of a project I legit felt like I was actually filling the shoes of an IT guy.
I look forward to more stuff from ya, Mr. Prime. I don't normally like streams, but NERDY streams??? I need to get into one of these days lmao
daaaamn you went off tangenting haha
hey, if you ever apply somewhere they'll at least think you're organized cause you served
add some confidence to that and I think you got it!
for the other stuff there's always co-workers (like, in the 1st-2nd month, just ask questions non-stop) and then there is Google
Dude, I've been a software dev for almost a decade, and while I've had multiple bumps of knowledge, at the moment, I feel like I know just a fraction. The classic "the more you know, the more you realize you don't know". Don't let that demotivate you. Rather to the contrary. It just means we can become better. That's exciting! Never let the feeling of not knowing enough bring you down, rather let it serve as motivation to become better. If you have grit, perseverance, a good attitude and of course some fundamentals to built upon, you are definitely hire-able in my opinion. Keep at it, keep practicing, keep learning and you'll have a good chance once you start looking for jobs again. Good luck on the surgery as well, hope it all works out.
i wish i was at that stage again xD nothing exciting to learn atm besides research papers
You a 71? If so, hell yeah. I was at a comm co for most of my time in the fleet, except for 6 months when I was in an actual helpdesk position. Being in a comm co was nice because I got to experience more aspects of the Marine Corps than I would have had time for in a helpdesk position.
@@Ikxi If they're in the MOS that I think they are, then this only scratches the surface of the tangent-ing to be had lol.
38:42 - "Materials" are just shaders with some per-shader-instance parameters applied to them, and tend to be part of a 3D graphics package or game engine.
They don't necessarily have a perfectly corresponding type at the actual graphics API level. They're sort of a higher level concept.
(and yeah, mat2, mat3, and mat4 are not materials. They're matrices. As he said in the video)
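To illustrate that "material = shader + per-instance parameters" idea, here is a hypothetical sketch; the class and field names are made up, not any real engine's API:

```python
from dataclasses import dataclass, field

@dataclass
class Material:
    """A shader program plus the per-instance uniform values fed to it."""
    shader_name: str
    uniforms: dict = field(default_factory=dict)

# Two materials can share one shader and differ only in their parameters.
red_plastic  = Material("phong", {"base_color": (1.0, 0.0, 0.0), "shininess": 32})
blue_plastic = Material("phong", {"base_color": (0.0, 0.0, 1.0), "shininess": 32})
print(red_plastic.shader_name == blue_plastic.shader_name)  # True
```

At draw time an engine binds the shared shader once and just uploads each material's uniform dict, which is why materials live above the raw graphics API.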
Oof he's really starting from scratch
6:35 speaking only for myself, when I'm riddled with stress and anxiety from (possibly self induced) deadlines and PTSD, any little interruption can be completely derailing. 😢 But you're right, if you're healthy and have a healthy mindset and stay on task, interruptions don't need to be disruptive at all.
The GPU has hardware-level support for dealing with floats without the need for conversion, stacking, or splitting. It's built into the hardware, unlike CPUs, where it's a software operation in most cases. This is why float precision actually changes rendering speed, unlike CPU-bound operations.
It's happening in parallel in the hardware rather than being scheduled. GPU cores can just pull and push specialized data so damn fast. The 4090 can push 1,000 GB/s through its compute units in total; pair that with the fact that all the native operations for GLSL and Vulkan are built directly into the silicon, and most single GPU operations process about as fast as the speed of electrons in the medium (perceivably instant in copper and aluminium), as they're already complete before the software touches the data. An AMD Ryzen 9 7950X3D has internal data rates up to 512 GB/s, and that's further bottlenecked by the fact that the architecture is designed for random, broad compute tasks rather than having actual silicon dedicated to the specific GL calls and matrix math. Some GL calls are in silicon on the GPU, so they run extremely fast.
Svelte Society uploaded a video a few days ago about components rendered on the GPU, which is when I first heard about shaders and found this book, and now Prime is learning it. Interesting.
6:47
"Stop stopping"
lmao, man, I laughed so hard
yoooo lets go something that I actually know more about than prime?? XD
just teasing -- i am super interested to see what you do!
Game dev arc is here!
Dude, I tried to learn GLSL shaders in Pure Data and holy shit, it was the most difficult thing I've ever undertaken. It took me forever to even find the few actual existing examples, and even then they had so little explanation. I made some really tight stuff with it, but omg, the difficulty can only be explained in metaphor: it was like walking day after day past a building with no doors that I wanted to get onto the roof of, until eventually I spotted a fire escape that had been there the whole time, then managed to grab the bottom rung, and eventually got both feet up onto it.
"tickle my nut-ric equations"
take that, every random person yelling at game devs "why don't you just do ?!?!"
Those people should take a 3-year C++ course, learn OpenGL 2.1 on their own, and THEN show me what they can do with it meaningfully (without super-comfy high-level libs/frameworks) in like a year+ ...
This is comedy Armageddon! You're slaughtering my seriousness, and I'm dying from laughter-induced asphyxiation!
Shaders are pretty super duper cool! Lots of math on it. Makes math way more fun than what they teach in classes.
HLSL and GLSL go as far back as at least the early 2000s. HLSL was available with at least DirectX 9.0c as far as I know; I'm not sure about DirectX 7 or 8, but at least with 9.0c, and definitely DX 10 and 11. GLSL was not available with legacy OpenGL, but it did appear with OpenGL 2.x, which was the transition into more modern OpenGL; it wasn't until at least OpenGL 3.x+ that it became much easier to work with and to understand how the graphics pipeline is structured, handled, and implemented. A little later, once OpenGL 4.x started to arrive, is when multi-core CPUs became commonplace. From there, Vulkan (which grew out of AMD's Mantle) became a necessity. Vulkan uses SPIR-V, but one can still easily write GLSL and have it compiled to SPIR-V without having to worry about SPIR-V itself. My first experience working with shaders goes back to the 2003-05 era, which was the transition from DirectX 9.0c to DirectX 10, as well as the transition from legacy OpenGL to more modern OpenGL.
I started writing shaders a year ago to mod games, I still don't know what I'm doing, but it's a lot of fun haha. I would love to share the journey, with y'all someday.
Prime is finally Transposing into the World of 3D Computations! Love it!
You know, if you really don't like macros, GLSL provides an alternative: literally change the shader. It's just a string, after all, and you're not using C, so you can easily just chop it up and piece it together the way you want! :)
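As a sketch of that idea (the host language doesn't matter; the GLSL snippets and function names here are illustrative), you can splice a variant into the source string before handing it to whatever compile call your API uses:

```python
# Build a fragment shader by choosing a color function at runtime,
# instead of toggling variants with preprocessor macros.
COLOR_FUNCS = {
    "red":  "vec3 color() { return vec3(1.0, 0.0, 0.0); }",
    "gray": "vec3 color() { return vec3(0.5); }",
}

def build_fragment_shader(variant):
    """Assemble full GLSL source with the chosen color() implementation."""
    return "\n".join([
        "#version 330 core",
        "out vec4 fragColor;",
        COLOR_FUNCS[variant],
        "void main() { fragColor = vec4(color(), 1.0); }",
    ])

src = build_fragment_shader("red")
print("vec3(1.0, 0.0, 0.0)" in src)  # True: the chosen variant was spliced in
```

Since the driver only ever sees the final string, this is functionally equivalent to an #ifdef per variant, just assembled on the CPU side.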
You can debug shaders, and you can do it through console logging, but it's not implemented by default. You have to set up additional tooling or build your own tools to do so.
No, no you can't. A shader does not have access to IO, and you can't monkeypatch that in, in any language. Shaders are not general purpose; the closest you can get to GP programming on a GPU is CUDA or ROCm. What you could technically do is software-render, but that's not something anyone does.
@@RealRatchet I've done it before. When you have the code that reads in your shader source to be compiled into a shader within your application, there are tools you can use to debug them. For example, within Vulkan you have the Vulkan Validation Layers. Go and build a 3D graphics rendering / game engine from scratch in C/C++, setting up and initializing every portion of the graphics pipeline. I've done it. So don't try to tell me that you cannot do it.
As for the console logging, you have to get the result strings and pass them to your I/O streams. You have to set up how you want the text formatted, etc. It's not there by default; you have to set it up and even implement portions of it yourself. But you can still debug your shaders. And if you're crafty enough, at least within Visual Studio, you can install addons or plugins that give you shader syntax highlighting and autocomplete, and you can even attach them to Visual Studio's debugger.
@@RealRatchet I don't know about OpenGL but Vulkan lets you use debugPrintfEXT to do logging from GLSL shaders, so it is definitely possible.
As a shader programmer with 10+ years of experience, I'm surprised by how entertaining it turned out to watch this.
Can't wait to see his reaction when he opens any modern shader from a real PC/console game (I bet his mind will be blown the moment he finds out about RenderDoc and discovers a whole new world).
Prime, I took an OpenGL class recently (we didn't make our own shapes, the prof did). I've learned more from this vid than I did in an 8-week class XD
I'm at a point where I join executive meetings just to tune them out so I can focus. I don't know when I got so broken, but context breaking is now a part of my flow state.
YES! MORE OF THIS LEARN SHADER AS A SERIES PLEASE!
01:40:09 Flip, please don't zoom in 😭. I wonder what that music must be.
Did you end up doing any elixir aoc?
You should put the link to the book in the description
You look good Prime!
BTW I heard this as I learned how memory barriers work in C++,
1:03:00 watching the dyslexic guy try to draw weird coordinates while his camera is flipped is gold
"i thought i had something there"
You're a real shader programmer now.
Prime starts writing shaders and has to deal with both web debugging and shader debugging at the same time... I don't want no cache when I'm working with shaders damn it.
The way the tutorial's step 2 jumps right in with smoothstep is a bit much in my mind, but it seems like a decent tutorial. Also, why did they write out a mix() function?
Smoothstep is a heavy function, but useful; no way I'd tell someone about it so early in learning GLSL.
Definitely check out Inigo Quilez's videos if you are continuing, like the tutorial mentioned.
FYI, if you talk to shader devs for mobile/xr, they try to stay away from excessive smoothstep cause it's a heavy gl function.
Same thing with division, log, and sqrt: instead of `#/2.0` it's `#*0.5`, and for sqrt() we'll use `pow(#,.5)`.
And mix() instead of `if` statements, since it prevents branching code. (if + uniform/directives == good, if + variables == bad)
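For reference, mix() and smoothstep() can be written out in a few lines; this Python version mirrors the standard GLSL definitions (including smoothstep's clamp):

```python
def mix(a, b, t):
    """GLSL mix(): linear interpolation; a when t=0, b when t=1."""
    return a * (1.0 - t) + b * t

def smoothstep(edge0, edge1, x):
    """GLSL smoothstep(): 0 below edge0, 1 above edge1, smooth Hermite curve between."""
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

print(mix(0.0, 10.0, 0.25))       # 2.5
print(smoothstep(0.0, 1.0, 0.5))  # 0.5
```

Seeing the cubic in smoothstep also makes the performance comments above concrete: it's a divide plus several multiplies per call, versus one multiply-add for mix.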
I'd hope a good compilation pipeline could do most of these for you, but it's been weirdly hard to ahead-of-time compile shaders, even source-to-source, for a *long* time. We're just now starting to see people take that seriously with shoving everything through SPIR-V, the wgpu naga transpiler, and LLVM adding a SPIR-V backend. That last one seems like it might be amazing if the right things fall into place!
I think the ability to handle distractions is also a function of memory. When I come back to a problem the next day, for example, I have to spend some time reacquainting myself with it.
everything converges to shaders in the end!
"averaging" with different pronunciation
this guy is a genius.
Good advice: when you get distracted, just don't get distracted :D Got it, ADHD solved. If I were able to not get distracted, I would not be watching this video...
"That's not a scuffed render loop."
*Draws arrays before updating anything*
Reads first sentence. "I don't understand!" I feel that.
Next make a path tracer. And a physics sim framework.
Hey guys can you tell me, from where prime reads the articles i.e sites that he visits to read them.
Or any suggestions where can i find some tech or programming articles (experience, news, stories) etc.
Yes i remember when learning XNA all of the examples had cornflower blue backgrounds. Every example ever..
Long form content baby let’s go
I used to work as a shader writer, and a common way to debug was writing values out to buffers and inspecting that the pixel values are what you expect.
There are only two ways to debug a shader: one with color buffers, the other with capturing draw calls on the GPU. The latter is a very old open source project that's no longer maintained and doesn't even work with JS.
@@RealRatchet are you talking about RenderDoc? That's still perfectly cromulent and up to date, but it probably won't work with e.g. Chrome out of the box very well, due to it doing all its rendering out of process.
You can also use Nvidia NSight to get driver level tracing which is handy, and I'm sure the other vendors have something too.
bro using 497382L of fresh water to cool daddy Altman's data banks for a function interface instead of reading 3 lines of documentation:
Shaders are fun ways to express mathematical ideas. :) It's so much fun.
5:15, some laptops use this resolution (e.g. my HP x360 13).
Better aspect ratio (16:10) in terms of productivity :3
Tho resolution-wise it is definitely yikes
Edit: HP Envy x360 13-bf0xxx, OLED version. Super weird to see that resolution out in the wild :P
Great video Prime! I really like longer videos. Cuz i miss the streams and wanna hang out too :) ❤
you gotta interview @thecherno
this episode should be called: "happy little triangles"
Actually, it's the core. Over time the audience comes to understand who you are, and if your core is good, as it is, you will no doubt succeed. I like both you and Thor. I think his principle is really simple: be himself. He's a good guy and that comes through. I think similarly of you.
good that u released this today, I have a presentation about shaders in 40 min :)
Totally bombed it :(
Primeagen: Just don't go on Twitter, that's where you get screwed.
Me: oh, I'll just play this YouTube video in the background.
a few moments later:
Me: "I need context as to what's happening in the video, so maybe we'll just picture-in-picture while working"
a few moments later:
YouTube has completely taken over my screen
first time watching this dude, and seeing him realize how a lerp works is weirdly entertaining.
Shaders made me use C in JavaScript
Shaders are an addictive rabbit hole. Enjoy the adventure!
Is i3 available on Wayland?
Sway is a Wayland clone of i3; it works extremely well
why did I sit and watch this for 3 hours straight
3 hours of Prime on learning Shaders??? F*ck yeah.
chop chop shaderboy
the chat during the XY coordinate part was killing me
It would be nice to have a link to the article.
I am really happy to have contributed to the direction... and now I hope it continues. But you got stuck for like 30 minutes trying to figure out code the book didn't want you to look at just yet. I guess that's what I would end up running into too.
I just can’t keep doing these side interruptions. It’s so annoying.
I liked the moment when he finally understood the LERP
36:52 Sounds like Portal 2 flashbacks
I love hwo prime's brain consisntently cannot parse semicolongs.
GLSL has a built in normalize function to normalize vectors.
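That built-in just divides a vector by its Euclidean length; a quick Python equivalent, with plain lists standing in for GLSL vectors:

```python
import math

def normalize(v):
    """Scale a vector to unit length, like GLSL's normalize()."""
    length = math.sqrt(sum(c * c for c in v))
    return [c / length for c in v]

print(normalize([3.0, 4.0]))  # [0.6, 0.8] -- length is now 1
```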
Missing semi-colon in shader. Classic!
0:29 me in 2025
I miss the videos of no interruptions
this is top tier content. showing us the real.
12:30 For the life of me I could not get Go to import other files from my package. Everything on the internet drove me crazy and I just could not get it.
Then one day it hit me: WTF, why do I not just look at a repository on GitHub?
Oh my, it was so simple XD.
29:26 zig comptime is the exact same as preprocessor macros
DefaultCube does a lot of shader-type stuff mathematically with nodes in Blender on his YT channel
May there be grace upon you child.
yup, this is normal, and then you go into raylib and Y is flipped for reasons
This is such a coincidence. I just rewrote an old project of mine for rendering a frag shader on the terminal, then suddenly got recommended to this 🗿
why was the toilet scene important prime
Blur = low pass filter. Edge detection = high pass filter
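To make that filter framing concrete, here's a tiny 1-D Python sketch (not shader code; signal edges are ignored): a blur kernel averages, keeping low frequencies, while a difference kernel outputs zero on flat regions and fires only at a step.

```python
def convolve1d(signal, kernel):
    """Valid-mode 1-D sliding-window filter (correlation, no kernel flip)."""
    k = len(kernel)
    return [sum(kernel[j] * signal[i + j] for j in range(k))
            for i in range(len(signal) - k + 1)]

flat = [5, 5, 5, 5, 5]
step = [0, 0, 5, 5, 5]

blur = [1/3, 1/3, 1/3]   # low-pass: smooths out variation
edge = [-1, 2, -1]       # high-pass: responds only to differences

print(convolve1d(flat, edge))  # [0, 0, 0] -- nothing to detect in a flat signal
print(convolve1d(step, edge))  # [-5, 5, 0] -- fires right at the step
```

The 2-D Gaussian kernel discussed earlier in the thread is the low-pass case of the same idea.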
My dudes. if you're new to shaders, I made this playlist for you ruclips.net/video/3sI9uip7QS0/видео.html
OMG, finally something I can relate to
50 shaders of Prime
Maybe it’s time to jump into matrix multiplication? I could use some explanation there. Probably not useful for a 2D game
Hell yea a long ass video!!
Prime hates on JavaScript, yet uses it anywhere he can
prime going through midlife crises
You got reasonably close to cornflower blue.. You had rgb(106, 161, 255) according to the colors in the video.
All roads lead to game development
Acerola collab soon?
Macros, CMake, Makefiles... not running on all architectures, missing "defer" and optionals... Now I am learning Zig and it is really fun.