Sets up OpenGL for communication with GPU, proceeds to do all pixel calculations on the CPU and use the GPU as a pass through.
Jokes aside, nice work 👍 I think it's really creative to use the texture buffer like this, and given that it's low res the performance isn't really needed anyway.
Normally, the GPU receives input like the triangles of a mesh and produces pixel colors from that. That is a lot of work depending on how complicated the mesh is. In this case the GPU is given a texture with all colors already set. The only thing the GPU has to do is pass it on to the screen. This is a waste of the GPU's power, but given the pixel nature of the engine I don't think that power is needed yet.
Just a little correction, OpenGL doesn't handle your input, that's GLFW, which is the framework that creates the OpenGL context and Window, OpenGL is just for rendering. Awesome stuff though
_I was looking for this comment._
All i have to say: 🤓
@@moosethegoose8581 For some people, this doesn't matter. However, as somebody into game engines, I immediately, uhh, 😅 """felt offended""" when he stated that OpenGL does the windowing work 😂
To clarify, I did initially have a section discussing this in the video, but it was super boring so I cut it out.
@@moosethegoose8581 nah
Dude, I don’t think I could ever do this. This is genuinely one of the better things I’ve seen.
Nah you could do this. Just a bit of Googling and a tiny bit of prior knowledge and you'll be on your way.
You'd be surprised at how simple game engines are. I started making them as a teenager. Currently making one for a terminal based roguelike. Learning how to organize your code is the most important part. Once you get past that, it's just plugging in the code for each of the parts.
there are lots of good reasons for making your own engine.
- for fun
- avoiding bloat
- making it only have what you need
- performance
- having it support niece features you want
- it is your own (you won't have to pay anyone else for using it, since it is your own)
- getting better at programming (when you use something like unity you learn how to make something using unity, but when you make a game engine you begin to learn how to make a game by yourself, so you get better at programming, and in case unity ever goes away, you can continue making games without it)
and more that I don't remember atm.
A portion of a Jonathan Blow talk on this: ruclips.net/video/SOtxjOLst2k/видео.html
Another portion of a Jonathan Blow talk: ruclips.net/video/cL6vluhfKw8/видео.html
i love niece features
@@danielllanes5298 🤣
you can actually sell it.
My game engine only supports nephew features. Should I divert some time to adding niece ones too?
this was honestly one of the chadderbox videos of all time
Actually though
Couldn't agree more
Wow this is impressive. I have always wanted to do this, but never understood how to. Great video dude! Amazing job 😊
Do it
6:28 - Mother of God. You know the exact location of your pixel in the array and yet, instead of just setting it, you wrote a double for loop? o_O It's the strangest thing in programming I have ever seen.
I genuinely laughed out loud when you mentioned pipelining. It had approximately zero effect on the speed of that loop. That was 100% due to having turned an O(1) operation into a polynomial one.
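To make the thread's point concrete, the difference is roughly the one below. This is only a sketch, not the video's actual code; the buffer name `pixels`, the window dimensions and the 3-byte RGB layout are assumptions.

// What the comment describes: scanning every pixel to find the one you already know.
void SetPixelSlow(unsigned char* pixels, int windowWidth, int windowHeight,
                  int x, int y, unsigned char r, unsigned char g, unsigned char b) {
    for (int py = 0; py < windowHeight; py++)
        for (int px = 0; px < windowWidth; px++)
            if (px == x && py == y) {
                int i = (py * windowWidth + px) * 3;
                pixels[i] = r; pixels[i + 1] = g; pixels[i + 2] = b;
            }
}

// The constant-time version: compute the index directly and write once.
void SetPixelFast(unsigned char* pixels, int windowWidth,
                  int x, int y, unsigned char r, unsigned char g, unsigned char b) {
    int i = (y * windowWidth + x) * 3;
    pixels[i] = r; pixels[i + 1] = g; pixels[i + 2] = b;
}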
Insanely well timed video. I was thinking of making a game engine just to swap the engine of another game. Hopefully I can learn something from this!!
This is neat, and a good learning experience for sure. I've been doodling with Vulkan for a few years now. It's really REALLY good, but it's also a ton of work. I've been building my own general purpose game engine for a year now. I'm still working on the basic core fundamentals of it. I wanted to make my own reflection system so I'm writing my own C++ parser... I'm still at it. It's been pretty fun so far though.
Very cool!
Whoaaa, this video has so many coincidences with me it's crazy. Everything from your thought process choosing to learn OpenGL, the tutorial you used, the idea for a game engine that would run in the shader code instead of in the CPU code. All the same. My idea was inspired by Noita, trying to see if a game like it could be optimized further by running its pixel simulations on the GPU. It might use a cellular automaton to define the physics too, and the double buffering in the rendering pipeline is also coincidentally necessary for the cellular automaton to be stable. It was so perfect. But I never got past the point where you ended, i.e. the texture page. I know that's probably all I need, but now I'm procrastinating haha. Maybe your project will give me the push I need.
This was great! How much torture would it be to create bindings for something like Haskell or COBOL lmao
It might be possible, I've never used either so I've got no idea but it could be fun.
CHAD stack AAA game
@@rallokkcaz Hello fellow primeagen viewer lol
@@KaidenBird Love these callouts haha
comparing Haskell to COBOL should be a crime punishable by death
Superb video. I do think you're quite, quite mad for building your own but there were great lessons in there - especially about performance. I'd probably use Rust for low level. Tantan has had some good experience with wrappers for opengl.
Rust is an excellent shout, and I love Tantan videos. I've made a small project in Rust before but then forgot the interesting bits about how I made it before I made a video. I might have to give it another look. Thanks Rob 👍
Finally the long 10 month wait is over, the only classic game dev YouTuber to not die in the nuclear blast or something. I mean, they all just disappeared; in some cases that was for the better tho, *cough* *cough* Barji. There is some kind of conspiracy, I tell you.
What happens if I engineered the nuclear blast? I was actually off for 10 months to assassinate everyone.
This is such a cool concept and you explained it in a way that was both interesting and understandable. Subscribed!
Awesome, thank you! Glad you liked it :)
This architecture isn't actually all that bad. It's a common technique called software rendering, although converting it into a texture isn't the most common approach. You could have also used SDL2, which would have also let you convert a pixel array into a texture and then render it with the GPU, but with a far simpler API.
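For anyone curious what that SDL2 path looks like, here is a minimal sketch, assuming a 64x64 RGB buffer scaled up to the window; the names and sizes are illustrative, not the engine's actual code.

#include <SDL2/SDL.h>
#include <vector>

int main(int argc, char** argv) {
    const int W = 64, H = 64, SCALE = 8;
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window* window = SDL_CreateWindow("pixels", SDL_WINDOWPOS_CENTERED,
                                          SDL_WINDOWPOS_CENTERED, W * SCALE, H * SCALE, 0);
    SDL_Renderer* renderer = SDL_CreateRenderer(window, -1, 0);
    // A streaming texture the CPU pixel array is re-uploaded into every frame.
    SDL_Texture* texture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_RGB24,
                                             SDL_TEXTUREACCESS_STREAMING, W, H);
    std::vector<unsigned char> pixels(W * H * 3, 0);

    bool running = true;
    while (running) {
        SDL_Event e;
        while (SDL_PollEvent(&e))
            if (e.type == SDL_QUIT) running = false;

        // ... write into `pixels` on the CPU here ...

        SDL_UpdateTexture(texture, nullptr, pixels.data(), W * 3);  // pitch = bytes per row
        SDL_RenderClear(renderer);
        SDL_RenderCopy(renderer, texture, nullptr, nullptr);        // stretched to the window
        SDL_RenderPresent(renderer);
    }
    SDL_Quit();
    return 0;
}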
I feel like there is another good reason now to make a game engine...
For unity?
You finally came back!!
(Cool video btw)
Thanks!
I can think of a couple of reasons to make a game engine now.
Cool to see you take on your own framework! Great video
NEW CHADDERBOX!!!
0:20 if you want a bigger challenge, do it in C, or worse, x86 assembly
I've been trying to make a game in Assembly for ages, it's super hard!
This is pretty cool! Much better than what I've done lol.
You've got to do something cooler and show me now!
You could add Lua bindings. You can already use Lua code in C and vice versa, so it shouldn't be hard, and the pool of reserved keywords is quite small.
I'm not a fan of Lua but this could be fun, I could finally make a game in Lua.
@@Chadderbox May I ask why you don’t like lua? It’s fast, easy to learn, very readable, and has easy C/C++ integration built-in. It kinda just seems like the perfect tool for the job in this situation.
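To give an idea of what the Lua bindings suggested above might look like, here is a hedged sketch using the standard Lua C API; SetPixel and the script's update(dt) function are invented names, not anything from the video (LUA_OK assumes Lua 5.2 or newer).

#include <cstdio>
#include <lua.hpp>   // Lua's C API with the extern "C" wrapping

// Stand-in for a hypothetical engine call; it just logs so the sketch is self-contained.
static void SetPixel(int x, int y, int r, int g, int b) {
    std::printf("set_pixel(%d, %d, %d, %d, %d)\n", x, y, r, g, b);
}

// Exposed to Lua as set_pixel(x, y, r, g, b).
static int l_set_pixel(lua_State* L) {
    SetPixel((int)luaL_checkinteger(L, 1), (int)luaL_checkinteger(L, 2),
             (int)luaL_checkinteger(L, 3), (int)luaL_checkinteger(L, 4),
             (int)luaL_checkinteger(L, 5));
    return 0;  // no values returned to Lua
}

static void RunScriptFrame(lua_State* L, float deltaTime) {
    lua_getglobal(L, "update");            // call the script's update(dt), if defined
    if (lua_isfunction(L, -1)) {
        lua_pushnumber(L, deltaTime);
        lua_pcall(L, 1, 0, 0);
    } else {
        lua_pop(L, 1);
    }
}

int main() {
    lua_State* L = luaL_newstate();
    luaL_openlibs(L);
    lua_register(L, "set_pixel", l_set_pixel);
    if (luaL_dofile(L, "game.lua") == LUA_OK)  // loads the script and defines update()
        RunScriptFrame(L, 0.016f);
    lua_close(L);
    return 0;
}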
This is great, explained what the engine does without fancy frills.
1:51 correction: these are not graphics frameworks, these are graphics APIs. Graphics frameworks are the high level, abstracted stuff that you use to render things on screen, and a graphics API is then used for that job.
also, did you ever check out the Godot engine?
I'm not a fan of Godot's workflow :)
@@Chadderbox why?
Learned a lot and you were hilarious; thanks very much for sharing your experiences in this video 😆
This is a fantastic video, nice work! I made a game engine using the OpenGL wrapper OpenTK so I could use C#. Trying to create a game engine gives you so much insight into how most game engines work and is great for learning new programming concepts.
“What did you do over the weekend “
“I made a game engine”
This is pretty awesome! I feel like this could be made pretty well with SDL2 instead of OpenGL; it's really lightweight and handles everything: windows, rendering, input, sound, you name it!
Yeah SDL would be so much better, but part of the challenge was using OpenGL :)
@@Chadderbox ooohhh
Man, you really hate yourself, don’t you?
😂
Unless you used GLFW lol
(You prob did, I just forgot)
I used GLFW, so I don't hate myself too much.
Using function pointers for the user to specify their start, update and finish is the correct way, so good job. It allows the user not to worry about going into your engine code.
The only bad thing about your engine is there's no audio.
I can't lie, I completely forgot to add audio. Coming soon? Maybe?
@@Chadderbox I don't got audio yet in my engine as well lol.
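As a rough illustration of the callback pattern being praised above (hypothetical names, not the engine's real API):

typedef void (*StartFn)();
typedef void (*UpdateFn)(float deltaTime);
typedef void (*FinishFn)();

// The engine owns the loop; the game only hands over three callbacks.
void Run(StartFn start, UpdateFn update, FinishFn finish) {
    start();
    // A real engine would loop while the window is open, compute deltaTime,
    // poll input, call update, upload the pixel buffer and swap buffers.
    for (int frame = 0; frame < 3; frame++)   // stand-in for the real loop
        update(0.016f);
    finish();
}

// The user's game never has to touch engine internals:
void MyStart() { /* load sprites */ }
void MyUpdate(float dt) { /* move things, set pixels */ }
void MyFinish() { /* save, clean up */ }

int main() { Run(MyStart, MyUpdate, MyFinish); }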
Dude really said programming using pointers in C++ makes life easier. I use pointers on a daily basis and I still don't understand them fully. What a Chad.
"if i went much lower level i'd be making my own graphics card" - LOL
very nice work
Chadderbox is the perfect example to why you should learn about GLSL and Fragment Shaders!
This project has a tiny bit of GLSL in it, just to get basic rendering working.
@@Chadderbox it would give it a huge performance boost tho
@@Chadderbox If you had read up to the matrix multiplication part, your engine would probably perform way better.
"General Kenobi! You are a bold one." Shouldn't I answer this to the intro?
in 4:10 what music did you use, it's so beautifulll
are there any good tutorial for making a Game Engine?
Don't do what I did. Check out The Cherno.
Goodness your brain is big, unlike my smooth one.
Some impressive stuff being done here mate! :D
I've played your multiplayer game, you are much smarter than me :)
@@Chadderbox disagree! Don't make me challenge you and prove that you're smarter
I think since you're just rendering a texture to the window, you could probably drop OpenGL entirely and do software rendering with something like GLFW. I'm not sure how well that would work with scaling...but that would allow it to run entirely on the CPU
GLFW doesn't allow you to write to the window buffer without going through the GPU. SDL does have something like that, though.
@@Radgerayden-ist ooh, interesting.
Trust the boxman to release a banger like this fr.
It only took the boxman 9 months :despair:
3:40 but if you're just using a texture then why not just do it in Unity?
0:16 i feel called out
I'm making a basic 3d game using Modern OpenGL and its making me literally cry but it's rewarding.
Remember to enjoy and cherish your own accomplishments, it'll help you get through. :)
My stupidity outweighs yours
7:40 gonna "umm actually" this and say that in the code you showed you're using references, not pointers. Also, in C++ you can use std::function instead of a reference or pointer to accept a function as an argument, just so you know.
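A sketch of the std::function variant mentioned here; unlike plain function pointers it also accepts lambdas that capture state (the names are illustrative, not the engine's).

#include <functional>

void Run(std::function<void()> start,
         std::function<void(float)> update,
         std::function<void()> finish) {
    start();
    // ... the main loop would call update(deltaTime) every frame ...
    update(0.016f);
    finish();
}

int main() {
    int score = 0;
    // Capturing lambdas work directly, which raw function pointers can't do.
    Run([] {}, [&score](float dt) { score += (int)(dt * 100); }, [] {});
}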
That double for loop with an if statement was not your problem. Compiling it with -O3 would've increased its performance insanely. Also, your renderer was one-way. No need to get pixels back from OpenGL, just use your cached array?!
Welcome back!
Next up: directly writing to a framebuffer
Building my own processor to run my own programming language with my own game engine
@@Chadderbox sounding very How you say.... jdh like, if you will
What if you make a game using the Win32 API and C?
I was tempted, but I had already started with this after I realised I could.
@@Chadderbox WTF!? THE MAN ITSELF REPLIED ME!? OMG
Actually something that has a lot of potential for schools.
I'd like to take a look at it when it is finished, for my IT classes.
It's not cheating, it's keeping your will to live
OH MY GOD NEW CHADDERBOX VIDEO!!!!!!!!!!!!!!!!!!!!!!!!!! I cant believe it. HOLY SHIT. NO WAY UOGHSBOUIDBVIPSBDVYUIBS!!!!???
JDIHDHOEHSHSHHDHSHRHJMMFNNFMMMMM
Look at this internet micro-celebrity :0 This is mad stuff, I can barely use Godot for game making stuff, you're light years ahead mate lol
Ayy, not heard from you in a while, are you doing good?
@@Chadderbox yeah man I'm pretty well, (minus being an adult😮💨) how about you?
@@profsnowleopard5326 Pretty good, working as a software developer. Adult stuff 😮💨. Good luck!
@@Chadderbox yo that's pretty cool - you wanna chat sometime? about game dev or otherwise :)
Sure, my discord is chadderboxdev
Python.
I was making a CLI chess implementation in Python yesterday, and now you've inspired me to try to make a graphical board in 64*64 pixels. The hardest part might be designing readable sprites to fit within the checkerboard lol
Yeah it might be hard to read like that. 256x256 would be lots easier.
The speed myth is still a myth. OpenGL is not slower than the other two, and it saddens me that all sorts of people who started making engines believe that :(
OpenGL definitely doesn't perform worse than DirectX, in fact you could even get VK or DX12 like performance out of it if you know what you're doing. These "frameworks" are actually graphics APIs that send commands to the GPU, period. The performance you get is almost entirely dependent on your engine design/subsystems, draw call optimizations, etc.
I definitely had to use LearnOpenGL to make my game engine
Technically DirectX, Vulkan and OpenGL are not graphics frameworks, but graphics APIs.
I don't understand OpenGL but I write my own engine.
That's exactly the point: people creating custom engines for fun and learning (no, nobody in the comments needs a custom engine because of performance 😅)
For a programming language, maybe Squirrel or forth or tinyscheme. Something tiny.
You should charge an hourly payment for using your game engine, where the cheapest option is $50.
What editor theme are you using, it looks really nice :)
Thanks! It is Monokai Vibrant Amped for VSCode - you might recognize the creator of the theme ;)
I'm really struggling to find an easy-to-install game library that works with VS Code. Do you have a good library or a good IDE for C++?
Raylib is a fun library for C++, not too hard to install and use with VSCode, look for a tutorial for it.
@@Chadderbox ok thanks
You can make it so it only renders a pixel if it is different from the pixel at that x/y coordinate one frame ago!
That is a good idea, it is in the queue for when I get round to updating this stuff
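A minimal sketch of that idea, assuming the engine keeps a copy of last frame's buffer (the names and the per-pixel upload are assumptions; a real version would batch changed regions rather than touch pixels one at a time).

#include <cstring>
#include <vector>

void UploadChangedPixels(const std::vector<unsigned char>& current,
                         std::vector<unsigned char>& previous,
                         int windowWidth, int windowHeight) {
    for (int y = 0; y < windowHeight; y++)
        for (int x = 0; x < windowWidth; x++) {
            int i = (y * windowWidth + x) * 3;
            if (std::memcmp(&current[i], &previous[i], 3) != 0) {
                // ... redraw / re-upload just this pixel (e.g. a 1x1
                // glTexSubImage2D region, or collect into a dirty rectangle) ...
                std::memcpy(&previous[i], &current[i], 3);
            }
        }
}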
i think creating bindings for intercal would fit the engine
Wow, I've certainly never heard of that programming language, I will have to look it up.
Great info. I just want to know how to create a Lumen system like UE5's. The day someone figures it out will be a revolution, because it is needed for the open source world.
UE5 is open source?
I don't know buddy.@@Chadderbox
Vulkan time!
Hey, I don't know if you still respond to comments on this video, but can you tell me what app you used to code? Did you use VS Code or something else? And if you did, how'd you set it up?
This was in VSCode, I followed a video to set it up but unfortunately that video is no longer available. I'm sure if you look it up on RUclips then you'll find something. It is basically just setting up MinGW on windows and that is it.
Hi, in fact I would like to code projects like you, but since I hear here and there that we have to stay at the cutting edge of technology, I tell myself that if I start a project that takes me, let's say, 3 months, trends will change and I will no longer be up to date. This means that I abandon all projects that take more than 1 week max. So how do you do it?
There isn't much point in trying to appeal to trends, just do projects that you want to do. :)
Building a game engine up to the latest "trends" with all the newest complex rendering techniques would take you about 50 human lifetimes; it isn't a goal for anyone other than big companies like Unreal with large teams of engineers.
"Hobbyist" game engines are mostly basic and don't try to chase trends, as that is completely out of reach. Just reaching 2002-level graphics is a huge achievement for a custom-made game engine.
I'm curious to know why you chose C++ of all the languages to make a game engine in
I just wanted to tbh. Most people using OpenGL use C++, generally speaking. I have seen bindings for other languages.
C++ is very performant and has some useful abstractions over C or assembly like OOP which becomes very handy for Engine Development later on. But it's a memory unsafe language so you should use rust btw.
@@unendlicherping318 it's only unsafe if you do a bad job. C++ is designed to give the programmer almost full control of low level memory management
Kind of a weird and loaded question to ask, considering most game engines are made in C++, literally every major game engine is written in C++ (Unity, Unreal, Godot, etc). What was he gonna write it in, Python? Sure, if you want very weak performance, by all means. But you're not gonna get hundreds or even thousands of frames per second like you can with C++. It's a compiled language and creates a native binary executable for the system it's on. Really the only logical option.
@@nick15684 Rust nowadays is seen as a compelling alternative for C++, it has a lot of the features that C++ has, but without the fear of breaking things with manual memory management
Please port this to ArnoldC. ArnoldC doesn't support input, pixel-based rendering, or libraries, but port it anyway.
IT'S SHOWTIME
TALK TO THE HAND "Chadderbox says that doing that sounds difficult"
YOU HAVE BEEN TERMINATED
Next time, actually learn how to use shaders and how to use them to run stuff on the GPU, because right now it is like doing all the rendering on the CPU, which is not good at all.
Yep, I did say this method was bad :)
you should make it work with lisp
Oooh I've never used Lisp, might be cool :)
@@Chadderbox languages like lisp are really interesting, I’ve tried one for a framework called cel7 which was pretty fun but insanely confusing cause of the formatting lol
(That(would(be an(awesome(idea)))))(!)
You should make bindings for C#
OPEN GL: the true dog
Direct X: Microsoft’s “implementation” of OGL, but it is NO LONGER A STATE MACHINE (open gl IS).
Vulkan: literally OpenGL 2023.
At the end of the day, they all literally evolved from Open GL.
🙃
Correction: Real programmers make their games in c99 with vulkan
O(N) set_pixel routine wins you an award!
i mean, still impressive tbh
You should port it to python, even though pygame is a much better library.
Been SO LONG omg
Yeah it's been a while
That pixelIndex calculation is wrong and works only because you have a square window. It should be:
((y * windowWidth) + x) * 3
Also, the bounds checking should not allow x or y to equal the width or height respectively.
Oop, looks like I'm stupid (who would have guessed). Thanks!
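Spelled out as code, the corrected calculation plus bounds check could look like this (a sketch only; the buffer layout and names are assumptions, not the engine's real code):

bool SetPixel(unsigned char* pixels, int windowWidth, int windowHeight,
              int x, int y, unsigned char r, unsigned char g, unsigned char b) {
    // x == windowWidth or y == windowHeight would already index out of bounds.
    if (x < 0 || y < 0 || x >= windowWidth || y >= windowHeight)
        return false;
    int pixelIndex = ((y * windowWidth) + x) * 3;   // row-major, 3 bytes per RGB pixel
    pixels[pixelIndex]     = r;
    pixels[pixelIndex + 1] = g;
    pixels[pixelIndex + 2] = b;
    return true;
}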
*Good afternoon. I'm from Donetsk. And so, he made a game engine in C++, given that he fully learned to be a programmer WITHOUT PROBLEMS. I am very jealous and will immediately move to the UK. In order to learn my specialty, start a new life, and forget about this 10-year war in Ukraine.*
*I am very sad. I was not destined to be in this country, where there has been a war in Ukraine for 9 years.*
I'm also very sad for you :(
I was with some Ukrainian kids at my school who had to move, I can't even imagine it. Hopefully things will be better soon.
Real programmers program their own OS to make games. :D
Is that a challenge? 😉
@@Chadderbox I mean JDH already did it, but unless you wanna torture yourself with x86 16bit asm and try to figure out ancient interrupts, I can't really recommend it.
Having done both. This is true.
NO DON'T DO IT YOU'LL SPEND YEARS WRITING YOUR OS AND END UP WITH A MOUSE CURSOR ON A GREEN SCREEN THAT DOES NOTHING AND ESPECIALLY DOESN'T PLAY YOUR GAME EXCEPT FOR THAT ONE TEXT ADVENTURE GAME YOU WROTE FOR YOUR OS BUT SUBSEQUENTLY BROKE WHEN YOU ADDED IPC SUPPORT!
I mean, it's fun.
@@AugustCoder Yeah, I'm currently trying to make a 3D game in an OS.
It would maybe be less painful if I didn't want to do it "the ol' way" and just used UEFI like a normal person.
Temple os?
How about making bindings for Rust?!
Rust is super cool, it should be easier once I update the library.
Why even bother with c++ if you can just rewrite the whole thing in rust?
Yeah I didn't really understand the decision to make the game engine in C++
@@shadamethyst1258 My brain can not comprehend how someone who knows rust is able to choose c++ for his projects.
Good, now write it in Nim.
Wait.... what about audio?!
What did you say? I CAN'T HEAR YOU!
In all seriousness, I completely forgot about audio.
Sound?
I forgor 💀
If you're going to just manually put every pixel into a texture you should just use software rendering
That's basically what it is, since every pixel is basically managed by the CPU. I'm only using OpenGL to display.
I am making a 4D game engine, do you wanna collab?
I'd love to see it.
@@Chadderbox OK tell communication scr
You could have just used SDL2 and you'd be drawing pixels in software in no time.
sdl2 always got my back o7
I think `Lua` can be a good language to use
bro called glfw opengl...
Bro used the most commonly used OpenGL api while trying to keep the video short and accessible 😁
GLFW is the library/API used to create windows and create OpenGL contexts in those windows. By default GLFW imports OpenGL (or Vulkan if #define GLFW_INCLUDE_VULKAN is written in the file), which means he did indeed have to use OpenGL. Just because there aren't any shaders or anything doesn't mean he didn't use OpenGL; he literally talked about an OpenGL FUNCTION he used. So why do you, Mr. Zuckerberg, feel the need to chime in on his work even though you most likely have never written a single C library in your life? What has he done to you to make you angry? Do something different than usual?
@@Chadderbox that comment was just nit-picking, I probably typed it cause I was bored. It was a good video. Is it ok to contribute to the repo?
I understand, I'd probably recommend forking the repo if you want to do anything since I'd rather do everything myself for this project as a learning experience.
@@OkOkOkIMightKnowYoumy bad then mr zuck
If you've used openGL, you haven't actually made a game engine... not a contemporary one, there will be no ray tracing
You can definitely ray trace in opengl, just not using the dedicated hardware for it. People are still making engines in OpenGL because it is simple and all we need.
Well, your video is already quite old, so maybe you already figured these things out by yourself, but it looks like you don't quite understand the purpose of OpenGL.
You're not supposed to send an array of pixels as a texture and have it render there. The whole point of OpenGL is that you send to it primitive shapes and then it figures out by itself how the pixels should be. It can do this much faster than CPU code, because it runs on the graphics card.
If you already know how the pixels should be, then you did all the rendering in the CPU, and you don't need OpenGL.
What you're doing is rendering everything in the CPU, sending all the stuff to the graphics card, which puts them on the screen without doing anything.
It would be more straightforward, faster, and give you code with fewer dependencies if you just opened a pixmap to your OS display server and transferred the pixels there. In Linux, Xlib does this; I don't know about Windows, but pretty much any GUI toolkit should have a way to open a pixmap.
I always understood that you weren't supposed to do it like this. I just did it anyway because I could. ¯\_(ツ)_/¯
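For completeness, here is a rough Xlib version of the pixmap idea mentioned above. It assumes a common 24-bit TrueColor display and is a sketch, not drop-in engine code (compile with -lX11; the in-buffer pixel byte order depends on the server).

#include <X11/Xlib.h>
#include <cstdlib>

int main() {
    const int W = 256, H = 256;
    Display* dpy = XOpenDisplay(nullptr);
    if (!dpy) return 1;
    int screen = DefaultScreen(dpy);
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, screen), 0, 0, W, H, 0,
                                     BlackPixel(dpy, screen), BlackPixel(dpy, screen));
    XSelectInput(dpy, win, ExposureMask | KeyPressMask);
    XMapWindow(dpy, win);

    // CPU-side pixel buffer, 4 bytes per pixel; XDestroyImage frees it later.
    char* data = static_cast<char*>(calloc(W * H * 4, 1));
    XImage* img = XCreateImage(dpy, DefaultVisual(dpy, screen), DefaultDepth(dpy, screen),
                               ZPixmap, 0, data, W, H, 32, 0);

    for (bool running = true; running;) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.type == Expose)
            XPutImage(dpy, win, DefaultGC(dpy, screen), img, 0, 0, 0, 0, W, H);
        if (ev.type == KeyPress)
            running = false;
    }
    XDestroyImage(img);   // also frees `data`
    XCloseDisplay(dpy);
    return 0;
}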
lit
The reason why I write game engines is because that's where most of the hard challenges are... compared to the game engine, games are dead simple.
try odin instead of C++
it is way simpler and has libs ready
Less self doubt and bad criticism please.
After a few months actually programming, the self doubt and criticism was deserved 🙏
this is awesome
Gonk :]
Beta people: Making games with just common programming language
Alpha people: Making games with a game engine
Sigma people: Making games building their own game engine
Omega people: Making 3D games using assembly