30:16 "stb is a like a library of various c++ helper classes, well not classes but you know what I mean", so a collection of C libraries, got it.
oh, I didn't get it. Why did he say it in such an implicit way =,= bruh
I know this video was uploaded like 4 years ago, but I still want to say that it feels like you are progressively aging with every video — that's how difficult it is to develop the engine and explain how it works to all of us! Thank you so much for this series, it's really interesting to learn how a game engine works!
😁
The footage of Hazel's private development branch at 51:05 looks really amazing, and it really motivates me to work on my stupid engine.
JUST DO IT!
37:40 - In my version of glad, GLsizei is actually a typedef for a regular signed integer.
the dev build is looking good, it's a lot further along than I thought it was, geez
I'm glad I've studied 3D and learned how to make assets from scratch, because I would never have been able to understand this otherwise, lol. It's fun knowing these sorts of things, as it really helps you appreciate video games even more :)
6 views, 10 likes, gotta be good
It's nice to see sneak peeks at the future of the game engine, since tutorials sometimes end abruptly (I'm not trying to blame them). Seeing ahead shows us how much more work is ahead of us.
Hey Cherno could you upload your newest changes to git?
The last commit was 15 days ago, so it's a bit outdated.
Tip: with your checkerboard texture, you can fill the 4 corner squares with black, red, green, and yellow to match the texture coords in the vertex buffer you set up earlier, to verify their order and position once the texture is applied to the geometry!
@3:20 You can access textures from any type of shader, not just fragment shaders. That means you can use textures as scalar-field input to vertex shaders, geometry shaders, tessellation shaders, etc.

For instance, I have a parallax occlusion shader that I wrote from scratch that uses a simple, regularly subdivided low-poly grid mesh from a VBO and a low-res downscaled version of the depthmap that will be raymarched — downscaled using a maximum filter (i.e. normal minification will just average each group of 2x2 texels, but my downscale outputs the largest of all values that overlap each output pixel, so if you downscaled a black-to-white gradient to a 1x1 pixel it would be white instead of gray). This low-poly mesh is offset by the max-downscaled depthmap along its normal to serve as "proxy geometry" off which the actual final fragment shader rays originate and march into the full-resolution depthmap.

The proxy geometry loosely approximates the underlying depthmap so as to eliminate a bunch of unnecessary raymarching steps that would result from just using a large flat polygon — but instead of having to generate uniquely shaped proxy geometry for each surface, I can have one single VBO of a grid of triangles and offset it using the max-downscaled version of the depthmap. The maximum compare against all overlapped texels guarantees that none of the proxy geometry ends up beneath any part of the full-resolution depthmap. The goal is proxy geometry that loosely conforms to the outer side of the depthmap; a typical downscale would leave some of the proxy geometry initiating rays from underneath sharp peaks in the full-res depthmap, preventing those peaks from ever being rendered.

Just one instance of using textures in shaders outside of fragment shaders.
I really feel that the programmable render pipeline is a mechanism that has yet to be fully explored. Conventional PBR rendering barely scratches the surface of the tip of an iceberg of what amazing stuff can be done with programmable rendering pipelines.
If anyone is getting weird distortion on your image: try making the pixel width and height the same and a power of 2. Or, if you really want that image, make the image format GL_RGBA8 and GL_RGBA, and make the actual format of your image RGBA by resaving it in Photoshop or something.
So much fun watching your content over the years. Keep it up man!
I love these long videos 🙌
For those whose computer doesn't support the latest OpenGL (4.5), you might get an "Access violation executing location 0x0000000000000000" error. To solve this, try changing glCreateTextures to glGenTextures, and change the rest of the OpenGL calls to their lower-version equivalents as well.
I know I'm late, but I'm having the same issue. I already replaced glCreateTextures with glGenTextures, but I can't seem to find good replacements for the other gl functions that are too modern. If anyone sees this and knows a solution, could you please tell me? I've been in this scenario before with glCreateBuffers, but this time it has already taken me weeks, to no avail. I also tried other tutorials to render an image, but nothing seems to work.
@@BigHug_yt I know I'm very late, but this is how you would set it up the old way:
```
int width, height, channels;
stbi_set_flip_vertically_on_load(true);
stbi_uc* data = stbi_load(path.c_str(), &width, &height, &channels, 0);
HZ_CORE_ASSERT(data, "Failed to load image!");
m_Width = width;
m_Height = height;

glGenTextures(1, &m_RendererID);
glBindTexture(GL_TEXTURE_2D, m_RendererID);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, m_Width, m_Height, 0, GL_RGB, GL_UNSIGNED_BYTE, data);
glGenerateMipmap(GL_TEXTURE_2D);
stbi_image_free(data);
```
Love how the episode thumbnails lately have looked like Yan is ready to spill some hot tea
btw, for texture ideas, just watch some videos about Substance Designer/Painter — great procedural stuff with nodes
Very good video! If anyone looks on Shadertoy, for example, they'll see what kind of crazy shit people shove into framebuffer textures. People store all sorts of things in textures: voxels, GUI state, triangle lists, particle positions, etc.
What you said about "debugging with colors" is super important, and I could not agree more! I'm in the Discord and often see people who are lost about debugging. Why don't you make a video where you show RenderDoc and/or NSight, to show how you can use those tools for debugging rendering issues?
Please stay at full energy and keep at making the world a better place to live :)
For people with an old PC that doesn't support OpenGL 4.2+:
glTexStorage2D() is 4.2+ only, glTextureStorage2D() is 4.5+ only.
This worked on my PC:
```
glGenTextures(1, &m_RendererID);
glBindTexture(GL_TEXTURE_2D, m_RendererID);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, m_Width, m_Height, 0, GL_RGB, GL_UNSIGNED_BYTE, data);
glGenerateMipmap(GL_TEXTURE_2D);
```
glTexParameteri(GL_TEXTURE_2D,... *
Thanks a lot!
Much appreciated!
your GPU must be really old if it doesn't support 4.2 lol — as a reminder, OpenGL hasn't been updated since 2017
@@Brad_Script it only supports 4.0 :(
If someone else gets an error saying the base class is missing, make sure you're including OpenGLTexture.h in your Texture.cpp file **after** you include your Texture.h file. I was stupid enough to include it in my Texture.h file itself.
So did 2D or 3D win the poll? It's really close, but 2D is slightly ahead, and I doubt many people still need to vote. You did 2D textures today, so I suppose we are getting 2D done first? I voted 3D, but I actually think it's better if we get "blazing fast" 2D done first, so that the community can start creating proper games and content with the engine.
Using 2D textures has nothing to do with whether we're going 2D or 3D. 2D textures are just the normal textures you'd use when texturing something; 3D textures are a way more advanced topic, which we especially don't need yet.
The Cherno is doing 2D first btw
3D textures have a _very_ niche use case — typically things like color-grading LUTs, volumetrics, or unusually complex environment maps. Nearly all textures used in modern games are 2D images wrapped around a 3D mesh.
I think this issue is macOS-specific or something, but if someone experiences a segfault in the glTextureSubImage2D call, just make sure the level argument there is a valid mip level for the level count specified in glTextureStorage2D a few lines prior (with a level count of 1, the only valid level is 0).
At 45:45, he talks about texture slots. He says that the uniform is bound based on the texture slot, and he gives an example of a shader with 4 textures (slots 0, 1, 2, and 3). Is it fair to assume that there will never be a Shader that uses a [Number Of Textures] greater than the [Number Of Texture Slots]?
If someone gets an error in stbi_image_free because there was an access violation reading location, try using a different picture, maybe the one Cherno uses. I spent way too much time debugging this :(
If you're getting an access violation from a freeing function, you are probably freeing memory that was never successfully allocated — check that stbi_load didn't return null before calling stbi_image_free.
Wow great!
Another great video, Cherno! I was curious whether you could make a video on synchronization mechanisms such as mutexes/semaphores using modern C++ (C++17 or later). It would be awesome.
At the start of the video I paused to get a coffee :)
thanks!
Lookin fabulous in dat thumbnail tho
I actually *did* switch to Mac (for work) because of the constant forced Windows restarts. Most of my system administration tools are accessed through remote software anyway, lol.
I wondered how compressed vs. uncompressed textures are handled in a game engine! Does this API support that, and does the graphics API actually care whether a texture is compressed? If there are interesting things to say about texture compression, I would love to see a video on the topic!
Came from the OpenGL series, amazing video to watch! BTW, may I ask what's the brand of your watch?
What about using the rendering abstraction to create a GUI library that doesn't rely on deprecated OpenGL functions?
Dude, don't switch to Mac, you'll regret it! If you're gonna switch anywhere, please make it some Linux distro. You can customize it to your liking, and it doesn't force you to do anything! Got an update? It can wait until you want it. Got a new printer? 9 times out of 10 it works out of the box. If you must switch from Windows, go to Linux! Maybe not Arch or Gentoo (coming from an Arch user), but maybe something based on Debian or Ubuntu. I recommend Pop!_OS. Yes, it literally is spelled Pop!_OS.
Ugh, I'm struggling after this episode. I understood everything and meticulously checked the diffs, and I can't see that I've done anything wrong, but now I'm getting tonnes of warnings and compiler errors. Might have to revert and compile regularly to see what introduced the errors.
It doesn't like my constructor in OpenGLTexture.h, "OpenGLTexture2D(const std::string& path);"
It's saying:
missing type specifier - int assumed. C++ does not support default-int
great 👌👍well explained :)
hey guys, here we are about 3000 lines of code! xd
what do you think about ECS?
My texture is fully white even though I have exactly the same code.
EDIT: FIXED IT
My make_shared in Texture2D.cpp was std::make_shared<OpenGLTexture2D>(OpenGLTexture2D(path))
CHANGED TO: std::make_shared<OpenGLTexture2D>(path).
Reason: passing a temporary makes make_shared copy-construct from it, and the temporary's destructor then deletes the GL texture out from under the copy. Passing the path forwards it straight to the constructor (and make_shared then also performs just one heap allocation).
Weird. I get a strange rainbow effect of vertical colored bars in the dark squares of the texture. In the code from the OpenGL series it doesn't do that (I plugged the same texture in there). It doesn't seem to be caused by the shader. Maybe those new-fangled OpenGL functions behave strangely on Intel Iris Xe graphics.
Yes, if I plug the code from the OpenGL series into the constructor instead of the code he uses here, it looks good. Must be Iris Xe doing strange things. I'll keep the code from the OpenGL series.
that is:
```
glGenTextures(1, &m_renderer_id);
glBindTexture(GL_TEXTURE_2D, m_renderer_id);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, m_width, m_height, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
glBindTexture(GL_TEXTURE_2D, 0);
```
Really nice video :) Can someone explain to me why we use GL_RGB8 in glTextureStorage2D but GL_RGB in glTextureSubImage2D? Why don't we use the same format for storing and uploading the same texture? And how do I find out what format I have to use for different pictures?
Nevermind, in the next episode, it is explained :)
Why does he use glTextureStorage2D instead of glTexImage2D? Is there a big difference?
His stb loader from the Opengl series works on version 4.5. This code does not.
Can the texture rendered after this video only be a grayscale black & white image, or can we also load an RGB color image? Because if we can load a color image, then my image is coming out fine as a texture on the square.
Btw, thank you for the nice, long, informative video!
Could you share your code-highlighting config? I noticed you use different code-highlight colors, and I would like to set mine up like that too.
I think that is part of Visual Assist, which is a non-free extension for Visual Studio. I am not sure though. I think he talks about it in one of his early C++ videos? Maybe in the "BEST Visual Studio Setup for C++ Projects!" video?
I'm getting a "Hazel::Texture::Bind is inaccessible" error on the line m_Texture->Bind();.
If I use a dynamic cast to Hazel::OpenGLTexture2D it works, but in Cherno's case it works without casting. Does anyone know what this is about? I checked my code against the GitHub code but couldn't find anything.
Turns out I forgot to put the "public" keyword when specifying the base class "Texture" in the Texture2D class, like this:
```
class Texture2D : public Texture
{
    ...
};
```
@@arlandria6036 thank you, I had the same problem too. You saved me a lot of time.
Can you provide a link to your visual studio theme?
The one in your discord FAQ seems to be outdated.
I think it's the default Visual Assist theme
Yes, it's the default Visual Assist Theme
Really, don't switch to Mac — macOS only supports OpenGL up to 4.1, so all of the newer OpenGL functions cause programs to crash. I can only get my code to run if I use glGenTextures etc.
ChatGPT3.5 generated:
- The transcript is from an episode in the "TEXTURES | Game Engine series."
- The speaker discusses various topics related to game development, including smart pointers, assets, and OpenGL code.
- They mention the flexibility of textures in game engines and how textures can contain different types of data.
- The speaker emphasizes the importance of visualizing and validating data in shaders.
- They discuss the process of loading and manipulating textures in the game engine and mention the need for an asset manager in the future.
So I'm just getting a black empty texture rendering. I have double and triple-checked that I didn't make any mistakes with the added code, and I can see that the OpenGLTexture2D object seems to have been created properly. I assume it's an issue with Open GL on my computer, which just has Intel integrated graphics. Anybody have any ideas?
I have the same problem, double- and triple-checked too! I don't think it's OpenGL — I've done many shader apps and they all worked in OpenGL. Though I'm on Windows 10 now, the rest of my system is still the same. I'm still testing a few things; I'll let you know if I find something.
You could clone and compile the Hazel GitHub repository and see if the issue happens there too. That should be a definite indication of whether the issue is with your code or with your PC.
glTextureStorage2D(m_RendererID, 1, GL_RGB8, m_Width, m_Height);
you could be using GL_RGB instead of GL_RGB8.
meee too
And I remember Cherno talked about this in one of the OpenGL videos, but I forget which one.
Well, I found that if you bind the texture after glTextureStorage2D and unbind after glTextureSubImage2D, everything works!
The OpenGL series shows this, but Cherno didn't have to bind here — my question is why?!
Is it the (absolute vs. relative) texture path at runtime?
There are no errors at runtime, so I don't know.
Why doesn't he need to do this!?
Woaw first !
So the Ref keyword doesn't exist for me, anyone else experience this?
Check the previous episode
it's an alias (typedef) for std::shared_ptr
Wait you don't work for EA anymore?
No
@@pooria_garrett3020 do you know why?
@@Grynjolf He explicitly mentioned he prefers working independently
@@pooria_garrett3020 oh so you suspect that it was a voluntary decision? That makes sense. Would be scared to do it myself 😅
@@Grynjolf Well, I think it doesn't even matter; what matters, though, is that he's dedicated to the community now, and Cherno + community = great things to happen...
no mac!
I thought the best game engines used jpegs xD
Seriously, I didn't get that JPEG shade-throwing xD What's wrong with compressed images? If the texture is some weird rasterized image, why would I use anything uncompressed for that? Genuinely interested in understanding that joke ;d
JPEG is overall a terrible format for storing game assets. It sacrifices a lot of quality for file size, doesn't support an alpha channel or baked mipmaps, and is just not worth using when you could go with .dds (most engines do this, but usually with a custom header) or even .png instead.
uuuu early