i'm going from "i should play around with mesh shaders" to "i NEED to play around with mesh shaders"
I'm prepping this old demo for public release. 👍 if your computer supports mesh shaders :)
Your work is amazing
Can't wait for the release! Mesh shaders are great, but
1) do they support textures? I could never find an example with that
2) are they really that much more efficient? I found that the overhead of populating the arrays/buffers of triangles was still there, and that, at the end of the day, the major gain came from occlusion culling of the meshlets.
I tried to integrate them in my project, but I couldn't deal well with those two points.
Textures are supported. You choose the struct you use for your vertices. Texture sampling will still be handled in the pixel shader, which pretty much stays the same.
They are more efficient if you take advantage of the flexibility to do custom culling. If you are rendering the exact same triangles as a vertex shader, VS might win out.
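A CPU-side sketch of the custom-culling idea mentioned above: the real test would run per meshlet in the amplification shader, but the math is the same. Struct names and the plane convention are illustrative, not from the demo.

```cpp
#include <array>
#include <cassert>

// Plane stored as n.p + d >= 0 meaning "inside"; six of these form a frustum.
struct Plane  { float nx, ny, nz, d; };
// Per-meshlet bounding sphere, precomputed at mesh build time.
struct Sphere { float x, y, z, r; };

// A meshlet survives culling if its bounding sphere is not entirely
// behind any one frustum plane. This is the test an amplification
// shader would run before launching mesh shader groups.
bool SphereInFrustum(const Sphere& s, const std::array<Plane, 6>& frustum) {
    for (const Plane& p : frustum) {
        float dist = p.nx * s.x + p.ny * s.y + p.nz * s.z + p.d;
        if (dist < -s.r) return false;  // entirely outside this plane
    }
    return true;  // inside or intersecting all six planes
}
```

On the GPU the surviving meshlets are counted and forwarded via `DispatchMesh`; here the function just returns the keep/cull decision.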
Thanks! On point 2, yes, in my case the vertex shader wins out, as my culling is done upstream.
Am I right in thinking that you don't use mesh shaders for your amazing voxel physics project?
Yeah, but I will be pulling it in. I will be working to combine my 3 demos with their different focuses, geo, lighting, scale.
Oh shit! I need to do like, exactly this! This is fuckin cool. I wanna render a big-ole' trash heap, with physics and stuff. Wouldn't even need everything to be dynamic all of the time; I'd hopefully find some way to "switch" between static and dynamic here and there
I would love to see a more in depth explanation of this!
i'm just completing my first year with opengl, coming from an audio dsp background. my next challenge is isosurface terrain. i wonder what function you are using for your green object, is it an octave of perlin with smoothstep? i'll start next week, but a few things have occurred to me: second order cubic interpolation would outperform gradient descent for surface nets. it also occurred to me that keeping an array of normals from the isosurface function (with zeros for outside points), similarly interpolated, will solve triangulation normals and collisions, conceptually and computationally simpler for complex functions.
the green objects are moving too rapidly for me to observe, but if you are using first order interpolation, second order interpolation also improves quality here :) i super wish inigo quilez had introduced people to higher order interpolants!
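One way to read the cubic-interpolation suggestion above: fit a Catmull-Rom cubic through four field samples along a grid edge and root-find the zero crossing, instead of the usual linear estimate. A minimal sketch (function names are made up; the field samples would come from the isosurface function):

```cpp
#include <cassert>
#include <cmath>

// Catmull-Rom cubic through samples f(-1), f(0), f(1), f(2),
// evaluated at t in [0,1]. Reproduces quadratics exactly, so it
// locates a curved isosurface crossing better than a straight line.
float CatmullRom(float fm1, float f0, float f1, float f2, float t) {
    float a = -0.5f * fm1 + 1.5f * f0 - 1.5f * f1 + 0.5f * f2;
    float b =  fm1 - 2.5f * f0 + 2.0f * f1 - 0.5f * f2;
    float c =  0.5f * (f1 - fm1);
    return ((a * t + b) * t + c) * t + f0;
}

// Bisect the cubic for its zero crossing, assuming f0 and f1
// straddle zero and the crossing in [0,1] is unique.
float FindCrossing(float fm1, float f0, float f1, float f2) {
    float lo = 0.0f, hi = 1.0f;
    for (int i = 0; i < 24; ++i) {
        float mid = 0.5f * (lo + hi);
        float v = CatmullRom(fm1, f0, f1, f2, mid);
        if ((v < 0.0f) == (f0 < 0.0f)) lo = mid; else hi = mid;
    }
    return 0.5f * (lo + hi);
}
```

For a field like f(x) = x^2 - 0.25 sampled at x = -1, 0, 1, 2, linear interpolation between f(0) and f(1) puts the crossing at t = 0.25, while the cubic recovers the true t = 0.5.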
The green objects were little heart emojis. The compression is pretty rough on this one. Here the goal wasn't to get a smooth liquid surface, but more to have a granulated look like a bunch of candy pieces or something.
This is genuinely so cool I love how it looks
this is awesome!
Love the mesh shader API, so elegant.
Yeah, honestly, I think it was easier to set this up with StructuredBuffers than instanced rendering in DX11 with the input layouts and everything. It got a bit killed by the GPU shortage and Nanite. But I think maybe at larger native resolution and MSAA rendering the hardware rasterization can start winning out again. I kind of want an MSAA option because of all the moving particles, or I wish with FSR you could pass in the opaque and transparent layers separately with both their depth buffers.
Would it make sense to use floating sprites for the lowest level of detail?
Yes, that could definitely be a good idea. Another might even be to draw the screen-space bounding box (with the depth test set to less-equal) and use a ray intersection with the geometry. Ray-cube intersection is very fast. The Tellusim engine also had luck with hardware raytracing.
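The "ray cube is very fast" part is the classic slab test: a handful of multiplies and min/max per axis. A self-contained sketch (names are illustrative; `invDir` is the precomputed per-component reciprocal of the ray direction):

```cpp
#include <algorithm>
#include <cassert>

struct Vec3 { float x, y, z; };

// Slab test: intersect a ray against an axis-aligned box.
// Returns true if the box is hit at any t >= 0 along the ray.
bool RayHitsBox(Vec3 origin, Vec3 invDir, Vec3 boxMin, Vec3 boxMax) {
    float t1   = (boxMin.x - origin.x) * invDir.x;
    float t2   = (boxMax.x - origin.x) * invDir.x;
    float tmin = std::min(t1, t2);
    float tmax = std::max(t1, t2);

    t1 = (boxMin.y - origin.y) * invDir.y;
    t2 = (boxMax.y - origin.y) * invDir.y;
    tmin = std::max(tmin, std::min(t1, t2));
    tmax = std::min(tmax, std::max(t1, t2));

    t1 = (boxMin.z - origin.z) * invDir.z;
    t2 = (boxMax.z - origin.z) * invDir.z;
    tmin = std::max(tmin, std::min(t1, t2));
    tmax = std::min(tmax, std::max(t1, t2));

    // Hit if the intervals overlap in front of the ray origin.
    return tmax >= std::max(tmin, 0.0f);
}
```

In the scheme described above, a pixel shader over the screen-space bounding box would run this per pixel and discard on a miss, so the rasterizer only touches the footprint of the cube.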
someone tell kitchen simulator about this!
Microsoft will never implement this in Minecraft because the game must only be worse with every update.
Would instancing not work with your original setup?
Although there are only 7 main shapes, there are also meshes at each LOD level, which makes the number of mesh choices balloon a bit. Instancing would definitely work. There are just additional things you can do in the amplification shader, such as choosing a different mesh for each particle, as well as frustum culling, to make things more efficient so the GPU has more room for other work, like lighting, reflections/refractions, etc.
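The per-particle choice described above could look something like this on the CPU side; the shape count matches the comment, but the LOD count, distance thresholds, and table layout are made-up placeholders:

```cpp
#include <cassert>

constexpr int kNumShapes = 7;  // the 7 main shapes from the demo
constexpr int kNumLods   = 4;  // hypothetical LOD levels per shape

// Pick an LOD level from camera distance. Thresholds are illustrative;
// in practice this runs per particle in the amplification shader.
int SelectLod(float distance) {
    if (distance <  10.0f) return 0;  // full detail
    if (distance <  40.0f) return 1;
    if (distance < 120.0f) return 2;
    return 3;                         // coarsest
}

// Flat index into a [shape][lod] table of mesh/meshlet ranges,
// so one dispatch can mix all 7 shapes at all LOD levels.
int MeshIndex(int shape, float distance) {
    return shape * kNumLods + SelectLod(distance);
}
```

With plain instancing you would need a separate draw per shape-LOD combination (up to 28 here); the amplification shader collapses that into one dispatch that also frustum-culls as it goes.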
@GrantKot Okay, gotcha! Thanks!