I was disappointed when he said "don't worry about that, it's magic" in regards to the t.derived = t^ line. I really want to know what that is and why he didn't want to talk about it.
"derived" is a field on the struct type which has the "any" type. "any" is really just a "struct { ptr, typeid }" with syntactic sugar around it. So here, he's setting it to an instance of Entity (t is ^Entity and t^ means "de-ref t" so it would be of type Entity), which secretly will store the fact that it's of Entity type (this is the typeid part) as well as put in a reference to that Entity (this is the ptr part). You can later do logic to do type-checking/assertion on an "any" variable and get the data out of it.
@@Tekay37 I mean it's technically just storing a pointer to the data. It's pretty much like doing "(MyStruct *)((void *)p)" in C. Nothing magical really happening, it's not implicitly doing malloc + memcpy or anything, it's just a sugary wrapper around pointer casting. The only time you actually need to use this over tagged unions is in printing procedures, and maybe also serialization. But in my ~20k lines of Odin so far I've never found a single use for it.
@@ps4star286 I don't understand that C comparison. But if it makes the program treat some data as if it were of a certain type when it may actually not be, then I might actually have a use case for it.
Assert early (even with runtime asserts) is great in a lot of programming scenarios, especially with batch processing or asynchronous long running business back-end processes, where atomic revertable (or revert-on-fail) transactions are possible. Videogames are often different though. Players (rightly) really don't like or tolerate their game crashing, and they'd rather it keep playing in a buggy state. I wouldn't be surprised if people using creative tools had a somewhat similar take, since they'd like the chance to save their progress instead of just dumping out without warning. In such cases, you want to be careful and limit the places where you can trigger a fatal error, so at the very least you can get to a good, savable state before you dump out. Runtime asserts can act kind of like unchecked exceptions.
IMO "Crash Early" is the way to go for pretty much all __logic errors__ in games. Otherwise, you risk working with corrupted data that could cause worse errors down the line. User input errors should never crash a program though. In production, crash logs are incredibly useful if you have a team that can patch errors quickly.
@@davidholmin875 That's true. In release, I only disable asserts in hot code paths. For everything else, I spawn a popup allowing the user to send me the assert text and call stack info. You can also allow the user to "resume" in systems that don't impact the save state, such as audio. But my general approach is to exit the process completely.
@@davidholmin875 (and also Matias) people can mean different things when they say "asserts". People with a C background tend to assume C's macros that are disabled in release builds. Other languages and frameworks have different facilities. Sometimes people just talk about defensive checks that throw exceptions as "asserts", and continue to run them in production - that's pretty common in web dev, particularly on the backend. And yeah, crashing early and often is desirable in dev mode. Or in open-source software with a lot of maintainers and forum support. Or in business processes where you have dev staff on call to fix it for you, and robust rollback built in. It's not nearly as good a global strategy for customers who are running the software on their own computers and don't have a direct line to the devs - pretty common for paid desktop apps, games, and embedded hardware that doesn't have a ton of constant monitoring. In those cases, limping along in a partial state can be a better failure behavior, especially if you can still keep certain parts of the software robust, like not corrupting the customer's long-lived data.
i really wish it didn't implicitly dereference a pointer when you are accessing its data member. i want to know exactly what the code means by just looking at it. to date i only know c and c++ that don't do this.
Looks like a mix of Zig, Go and Dlang. Not too bad, actually. I think I'll stick to Dlang (I went through the entire sample code of all the features of Odin, and everything there can be done in D), but Odin is definitely worth exploring (a lot of the D standard library assumes garbage collection for historical reasons, so Odin could be better there - though with `@nogc` you can still easily write for games, embedded systems or kernels). All the polymorphism, dispatch, embedding, compile-time stuff, even SoA, can be done in D with just a little library code. So to me Odin feels limited, because these things are built into the language and you cannot easily create similar (but different) things yourself, whereas in D you can. Still, compared to things like Nim, Zig, Vlang, Go, Odin is really good (Go has a superb packaging system and concurrency control, so I doubt Odin actually beats it in that area - you wouldn't use Go-style concurrency in games, but everybody should be learning from the Go package manager, there is nothing better).
If writing in C robs you of your hair, then Odin surely gives you a beard
Bald people with beards:
That Java/JavaScript joke went under appreciated
3:53
@@sa-hq8jk True heroes provide timestamps
Best statement I've heard in ages. Highly agree
@@HalfMonty11 3:33
I also thought Ginger Bill's STD joke went under the radar as well 10:40
3:54 That jab at JS went completely unnoticed.
Jaba script
@@backendtower6580 No one out pizzas the hut.
I used Odin a fair amount about half a year ago to build a chess UI app, and I have to say it was a VERY PLEASANT experience. It's like if there was C but it's not the 80s anymore.
I REALLY WANT TO GO BACK TO WRITING ODIN ASAP, Imagine you had a technology that made you say that.
what UI lib did you use?
@@valethemajor I used microui with sdl.
I couldn’t get microui working fully, ended up going with Raygui. Any tips on getting SDL working with microui?
It's always worth listening to what Ginger Bill has to say! Would be good to see you have Andrew Kelley on too, I have the same big respect for both of those guys. Loris Cro would be interesting too actually.
Seconding Andrew Kelley, if he hasn’t been on yet
AoS or Array of Structs is an array of Enemies, while SoA is a struct containing an array for the Enemy positions, an array for their textures, etc. SoA allows you to loop over only the positions, meaning the cache doesn't get ruined with irrelevant textures.
Why hasn't Fireship done a 100 Seconds on this?!?
I've had Odin on my radar for a while. This talk might be the kick I needed to actually explore it.
Discovered him after his video on the Casey Muratori vs Uncle Bob "discussing" clean code a couple of weeks ago... Looked up where he was speaking from... and landed on the Odin page! I like his language so far...
Like everyone here, I've wanted to build my own toy language that's a hybrid of all the fun stuff from established langs. Odin is like 95% of what I imagined.
+ Odin isn't a toy lang, it might be new but it's incredibly capable
@@marcs9451 Oh I know. It being so capable makes my toy version just a poorly written copy. It would be like me trying to rewrite Rust.
Odin has been my go-to language for about 8 months now, and it is simply amazing for game development and creating random tools
unfortunately there's not many tutorials out there on it. I specifically looked for Raylib + Odin resources but couldn't find any. :(
@@Im_Ninooo Just use a C or C++ tutorial and translate it. I've used SDL2 before in C++ and it seems to translate well into Odin.
@@oscarsmith-jones4108 I've started learning Zig instead
@@Im_Ninooo yea there aren't many resources. I learnt by just using the Odin overview to get a grasp of the language and then applied the C examples to Odin.
The overview + the Raylib cheatsheet & examples is pretty much all you need
@@Im_Ninooo and which is easier to learn, Odin or Zig? I'm coming from OOP Java and I want to learn a low-level language, but I want something modern and easy, thanks.
I knew about Odin but never really looked at the syntax until now and it really surprised me just how similar it is to Go. I literally had to check multiple times if I was looking at the right website lol
Odin is a fantastic programming language. If you have the time, it's worth taking a look behind the curtain at the core library's source code. The code is straightforward and there's a lot to learn there.
@@rodrigoetoobe2536 dude, chill, that equation has no time in it.
Great video. It's so good to see Odin get more attention. For the "joy of programming," it really does deliver.
I did read up on Odin, but as someone who has not been programming for that long, some stuff went over my head. Having Bill go through language features with Prime is an awesome format for an introductory tutorial
Prime. Thank you. I've been using Odin since its early days. It does 99% of what I want, from ergonomics to syntax.
Glad to see Ginger Bill up in here!
Love that ginger bill! Thanks for putting him on.
Trying to stay away from STDs as much as possible - coffee all over keyboard
Ahhh, another language to add to my evergrowing collection of compiled languages I have yet to master...
We are at a stage in computing where we have an increasingly good assortment of high performance compiled languages:
Nim, Zig, Crystal, Odin, Go, Rust, many flavours of LISP and more.
Also, according to many people, OCaml recently became a very fast compiled language (it can be compiled to fast, optimized standalone static binaries, or run interactively by its interpreter, which is of course slower but lets you run chunks of the software as you work on it - great for testing/checking/prototyping stuff).
The Fortran 2018 standard may also be cool for programming GPUs using the NVIDIA and AMD toolchains; in fact it can be much clearer-looking code than the equivalent C and C++ for high-performance number crunching.
Ginger Bill nailed that Rust roast at the end. Primeagen needed that.
5:10 never knew the creator of Odin was a physicist
He created EmberGen, which is at its core a fluid physics simulation engine, so no surprise here...
C was developed by a physicist too
@@astroid-ws4py The EmberGen team is crazy good. Bill helped create the current version of EmberGen. Morten Vassvik is the co-founder and lead dev. rxi made the new UI (he's kind of a UI legend over at the Handmade Network). graphitemaster is very talented too. Just a small team of really smart folks.
@@matias-eduardo Thanks for the info, nice to know
Wow, I love what I see here! I didn't know Odin was this clean... taking inspiration from my favourite languages... Lua and Pascal :)
I'm learning zig right now... I will try Odin after that
Loved the dynamic, what a great guest!
Thank you for these videos, and helping me catch up to you Chads
Funny enough I was just getting into Odin the other day. Fantastic language, love it
Please do an Odin project asap!
Loved this on the stream; I'll give you a free view on YouTube too. Looking forward to your project for testing this.
Now I am even more excited about Odin! But I still have a question: interview with the Nim lang creator when? I am curious what he would have to say, and his response to you putting Nim in the "dog water" category.
Where did he pin it as dogwater?
@@element1111 When they were rating different languages. This interview is a consequence of that, because they pinned Odin as "dog water" as well. And Bill showed why it is not like that
Bless. Excited for this!
Looks promising. I really, really like the level of machine control it allows. I like no garbage collection, as long as garbage creation isn't enforced.
I like the Lua-like multiple returns. Good syntactic choices inspired by the best of Lua and Pascal. I like it.
Maybe I would like something more algebraic, yet I really like how uncluttered and systematic the syntax is.
Good job!
Great talk! Interesting to know Odin has love for the GPU, maybe good for migrating some of the ML stuff that's going on. Does Rust have GPU support?
He did say that odin doesn't have "gpu support" in the sense that odin code can run on the gpu, he mentioned they used glsl (or other shading languages) for that. Just like C cannot run on the gpu. It's just that they can run glsl code on the gpu and then "speak" with it to send data to the gpu for processing and receive the processed result back. He says this at 33:40.
Currently only C, C++ and Fortran have first-class direct GPU access support (through the official NVIDIA and AMD compilers - not sure about Intel, though). In every other language (Odin included) you have to jump through hoops, and even then there's no guarantee anything will work fully. Better to rely on the official toolchains provided by the companies than to make your own or use a third party that will never be able to catch up.
There's the experimental rust-gpu project, but that's both real early and more about being able to share code, I don't think there's any expectation of transparent GPU usage?
For ML stuff you can maybe look into Julia. It’s a pretty performant language for Data Science and ML and it can run natively on a GPU
I wanted to watch just the first few minutes to see what the language is about; almost an hour later, I've watched the whole thing. How did I never hear of Odin?
there's a slew of other good ones too, like: Beef, Hare, Jai, Vale, C3
Would be nice to see Nim's creator here too. It's a great language for those who need performance but are okay with high-level stuff (e.g. don't really need low-level C).
@23:13 Excuse me? Are you kidding me!? Struct of Array? MIND BLOWN!
So basically go with manual memory management, I like it.
Ohhhh yeahhhhh, more Odin please :D
8:45 What's also important is whether it's good for your company/team.
Hell yeah Ginger Bill!
Very intriguing language. Lots of new ideas, worth an explore.
Parametric polymorphism procedure is OUTSTANDING!
It's so weird that we collectively spent the 90s, the 00s and the 10s coming up with more and more managed and scripting languages. And now all of a sudden, after 50 years, people start to explore more low-level language designs. How come it didn't happen earlier?
For ages we've only had Pascal (its FreePascal variant is alive and well today), Ada, Fortran, C and C++ for targeting native development. Only in recent years did we suddenly start to get all the goodness with the likes of Zig, Rust, Nim, Odin - and if you allow a "little GC", maybe Crystal, Go and some LISPs can be added to this list too.
because they are an evolution and the right tool for 99% of apps
It might be because incredible performance increases made it possible to ignore the performance costs you get with OOP. But now we're at a point where you notice the costs again and it's getting painful. We're also at a point where you almost always have a competitor for your product (this used to be rare in many cases). With different features existing in all competing products, performance now becomes a selling point for the customers. So using lower-level languages actually gives you a competitive advantage as well.
My guess is that many products written with an OOP language will either lose to their competitors or be rewritten without OO.
Yeah, it seems like for decades there were realistically only two languages when you wanted to target low-level: C or C++. Sure, there were some others that had been around for ages, but for anything "mainstream" that was widely supported and ran on anything without much effort, those were realistically the only options. It feels nice being in an era where there are more languages gaining traction and on the horizon than I have time to learn.
I think that with the popularization of IoT, embedded devices and indie games, there is a lot more popular interest in low-level programming. We are back to dealing with much more restricted hardware. Also, worries about language-level safety. idk
It is also very important to pick a tool which most team members are comfortable with. Also, tools that have proper adoption. You don't want to get in a situation where you get blocked because some third party issue / abandonware.
no tool would ever be used in the first place if you were to follow this strictly.
I finally got around to giving Odin a serious look, and I must say, it is really good. I have been using Go pretty heavily lately, with some Zig mixed in here and there, and Odin feels like the two had a baby, with a heavy lean towards Go in its syntax.
Which do you recommend for learning to program a game engine, Zig or Odin? Which is more readable? Which is faster in performance? Which is shorter in lines? Thanks
Odin is like Go on steroids.
"Present Bill is not smart, future Bill is dumber" is such a good quote
Just amazing, the best show on this channel so far! Get Rich Hickey on board (Clojure creator)
i can't believe he pronounces tuple as tuple
Yeah, imagine pronouncing it, tuple.
This is looking very interesting so far, I will try it after the video ends.
"defer if" is interesting, but it will be hated in the future.
The fact that you can see a function end with
n = 123
return
and n could potentially not be 123 on return is so confusing.
Les GO !!!!! odin is really fun to use
Love the ergonomics, but killer feature: direct GPU integration.
He just used GLSL and interacted with the GPU in the standard graphics-pipeline way. Direct GPU integration in the CUDA-like sense is possible these days only through the C, C++ and Fortran toolchains from AMD and NVIDIA, and it will probably stay that way for the foreseeable future. (As long as those companies don't decide to integrate a new language, you will only get subpar solutions for other languages.)
@@astroid-ws4py you can compile to a shader language and generate the API stuff.
That's what languages like Futhark do (it compiles to CUDA with a C wrapper, iirc).
It'll still need to be JITed, but it won't be any different from how it's usually done.
It doesn't have this. You interact with the GPU the same as in C. Odin just provides bindings for APIs like Metal etc.
I think this is the third time that I am watching this... Nice one!
It's always good to keep away from STDs!!!
I like Zig but I also want to look at Odin, and this plus Bill's own presentation is a great primer. I like both languages because they remain small and simple.
23:05 Huh, I don’t know about ECS, but my mental analog for SoA vs AoS is columnar data store vs row based. might not be right
That's completely right and I wish I said that now on stream because that is a better framing for the more web-based audience.
Stay till the end of the video to see Prime get Rusted
You know what we need? A Treesitter and Language Server tutorial for Odin.
They have been officially merged to neovim treesitter and lspconfig recently
@@marcs9451 thanks. Installing now! 🙏🙏🙏🙏
I had the custom versions
49:32 Bill waited for this moment after you put Odin in the dogwater tier.
Thanks!
14:24 Very good point on the fact that it's easier porting from Windows to Linux rather than the other way around, especially with assuming UNIX tools are everywhere once you get too used to them.
I always install Cygwin when I must work on Windows. Then I can pretend I'm using a somewhat unpolished version of Linux, mostly.
It seems like a lot of the features of Odin are pretty similar to the Jai language from Jonathan Blow.
Not sure what came first though since Jai is not even publicly available.
Pointy pointers - Ginger Bill is my hero! Overall I like most of the design decisions made in this language. Few are the other languages with design decisions I like. Crystal is one of the other good ones. I also use D a lot, and Vala for gtk GUI and image processing programs. If I'm lucky in my career, I'll be using Odin a lot more, and never or rarely use C++, Python or the other popular languages.
Woah. I really like that they have matrices and swizzling built in...I might have to check this language out
Damn. No comptime. That sucks
Odin looks really promising. I think I'll try it out. It's about as close as I'll probably come to Jai. I have a bit of skepticism about it, since I'm a bit reluctant to agree with all their decisions.
need more languages with builtin vector math a+b should be element wise for arrays for example. I would love APL with modern conveniences of procedural/OOP/whatever ontop
what happens when you try to sum arrays with different lengths?
@@andreffrosa not that I think vector math is necessarily required, but that's hardly an issue. What happens when you try to add an int to a bool? Depends on implementation.
Languages with a truthy concept will coerce a result, languages without it will throw an exception. What's the issue with a similar concept for array addition?
@@arnontzori I think you misunderstood. I was asking what happens in odin, not pointing it out as a problem
Fortran has had that for ages; Zig has it only for SIMD vectors.
Why? What’s stopping you from using a library? How would you perform pointer arithmetic?
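To answer the length question in this thread: as far as I know, in Odin the length is part of a fixed array's type ([3]f32 and [4]f32 are distinct types), so adding arrays of different lengths is a compile-time error, not a runtime one. Go has no element-wise operator, so here is a sketch, in Go, of what the built-in does for one fixed length:

```go
package main

import "fmt"

// addVec3 spells out what Odin's built-in `a + b` does for a
// fixed-length array type like [3]f32: element-wise addition.
func addVec3(a, b [3]float32) [3]float32 {
	var c [3]float32
	for i := range c {
		c[i] = a[i] + b[i]
	}
	return c
}

func main() {
	fmt.Println(addVec3([3]float32{1, 2, 3}, [3]float32{4, 5, 6})) // [5 7 9]
}
```

Because the length is in the type signature, a mismatched call simply doesn't type-check, which is the same resolution Odin's operator gets for free.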
Love Odin, building a game in it
I'll probably stick to learning go, but this looks very nice
heck no.. golang is such a terrible language
@@_slier Nou.
Go Odin!!
Finally I remembered why I thought the voice was familiar: Ricky Gervais. He sounds a bit like him at times. :)
It feels like Golang with some weird quirks.
What quirks did you feel were weird?
MAKE THE SHORT
that was hilarious, i was hurt seeing you hurt, you were so hurt i felt it myself
Ginger Bill hot takes, spicy indeed haha
"it could be any language... ya know... like Rust for example" lol
8-space tabs. 👀
Odin looks great, but does the compiler throw an error if the programmer improperly managed the memory?
It depends on how you improperly managed memory.
segfault
Why can the 1.0 literal be cast to an int? Yes, it fits, but it doesn't make much sense to me, since it's obviously dumb to write "a: int = 1.0"; it's just confusing. The other way around, "a: float = 1", is really awesome and makes a lot of sense, because the real numbers are a strict superset of the integers.
From a mathematical standpoint it makes sense as 1 = 1.0
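If my reading of the thread is right, the rule here is the same one Go uses for untyped constants, so it can be demonstrated directly in Go: a literal like 1.0 is an untyped constant, and the implicit conversion to int is allowed only because the value is exactly representable; 1.5 would be rejected at compile time.

```go
package main

import "fmt"

func main() {
	var a int = 1.0 // legal: 1.0 is an untyped constant whose value is exactly an int
	// var b int = 1.5 // compile error: constant 1.5 truncated to integer
	var c float64 = 1 // the reverse direction is always exact
	fmt.Println(a, c) // 1 1
}
```

So the check is on the constant's value, not its spelling, which is why "it fits" is precisely the criterion.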
Prime, you forgot to add the links as you promised in the video.
What a nice language. I'm going to learn it.
C++ made the single-root inheritance tree a thing that everyone copied, except Ada. Go look at Ada's OOP. Ada has the package Ada.Finalization, in which Controlled is defined; that is one root type (or "class"). Another is Root_Stream_Type in package Ada.Streams. You can't do proper class-inclusion checks in C++'s single-root class trees, but in Ada you can: if My_Class in Root_Type'Class then ... end if;
Ada is probably safer than rust in most cases.
Pointers shouldn't even be a thing now.
If you like the sound of Odin you are going to LOVE Jai when it comes out
I'm starting to think that's never going to happen.
Isn't that designed by JBlow, the guy who's genetically incapable of being right about anything?
@@isodoubIet I rarely watch his content and I've seen him being right about some things. Namely that visual studio sucks, and that C++ sucks.
Tbh I'm not hyped for Jai. I don't quite like the syntax. I do quite enjoy Odin a lot, though.
@@skaruts He's wrong about both though. With C++ he can't even get basic syntax right, so his opinion is pretty much worthless.
Having graphics APIs built in could be key to Odin's success, given the current hype around ML and model training.
Graphics APIs are not for ML. Using a graphics API is still classic programming, no training involved; it just unlocks the power of the GPU.
Didn't know Simon Pegg is into programming.
interesting language. good insight
Will you ever try the language?
using the gpu for graphics, that's crazy who would do such a thing
You have to get Andrew Kelley on. Zig is underrated.
Loved the bad side of bad take HAHAHA lol
I really don't understand what Bethesda has to do with the Odin language (but it's mentioned on Odin's site).
EmberGen is made with Odin and is used by many companies, including Bethesda, and this is evidence of the viability/quality of Odin.
I was disappointed when he said "don't worry about that, it's magic" in regard to the t.derived = t^ line. I really want to know what that is and why he didn't want to talk about it.
"derived" is a field on the struct type which has the "any" type. "any" is really just a "struct { ptr, typeid }" with syntactic sugar around it. So here, he's setting it to an instance of Entity (t is ^Entity and t^ means "de-ref t" so it would be of type Entity), which secretly will store the fact that it's of Entity type (this is the typeid part) as well as put in a reference to that Entity (this is the ptr part). You can later do logic to do type-checking/assertion on an "any" variable and get the data out of it.
@@ps4star286 So that's a magic way around casting to a type only known at runtime?
@@Tekay37 I mean it's technically just storing a pointer to the data. It's pretty much like doing "(MyStruct *)((void *)p)" in C. Nothing magical really happening, it's not implicitly doing malloc + memcpy or anything, it's just a sugary wrapper around pointer casting. The only time you actually need to use this over tagged unions is in printing procedures, and maybe also serialization. But in my ~20k lines of Odin so far I've never found a single use for it.
@@ps4star286 I don't understand that C comparison. But if it make the program treat some data as if it was of a certain type when it may actually not be, then I might actually have a use case for it.
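The "(ptr, typeid)" description above maps almost one-to-one onto Go's `any`/interface values, so the mechanics can be sketched in Go; this is an analogy to Odin's `any`, not Odin itself:

```go
package main

import "fmt"

type Entity struct{ Name string }

func main() {
	e := Entity{Name: "goblin"}

	// Like Odin's `any`, a Go interface value is a (type identifier,
	// data pointer) pair; storing &e copies nothing but that pair.
	// No hidden malloc + memcpy is involved.
	var boxed any = &e

	// Later code can type-check the boxed value and recover the data.
	if p, ok := boxed.(*Entity); ok {
		fmt.Println(p.Name) // goblin
	}
}
```

That is the sense in which it "treats data as a type known only at runtime": the static type is erased at the storage site, and the type check plus pointer cast happen at the retrieval site.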
It's suspiciously similar to Jonathan Blow's Jai, which started in September 2014, but no credit was given.
Jai is credited as a major influence on syntax, but the difference is that anyone can download Odin right now
@@ezg5221 not on the language's site
@@DeathSugar my mistake, it's a "minor influence" on the github wiki
Ginger Bill must be a billionaire if that app is one of a kind and it's that revolutionary.
Assert early (even with runtime asserts) is great in a lot of programming scenarios, especially with batch processing or asynchronous long running business back-end processes, where atomic revertable (or revert-on-fail) transactions are possible. Videogames are often different though. Players (rightly) really don't like or tolerate their game crashing, and they'd rather it keep playing in a buggy state. I wouldn't be surprised if people using creative tools had a somewhat similar take, since they'd like the chance to save their progress instead of just dumping out without warning. In such cases, you want to be careful and limit the places where you can trigger a fatal error, so at the very least you can get to a good, savable state before you dump out. Runtime asserts can act kind of like unchecked exceptions.
IMO "Crash Early" is the way to go for pretty much all __logic errors__ in games. Otherwise, you risk working with corrupted data that could cause worse errors down the line. User input errors should never crash a program though. In production, crash logs are incredibly useful if you have a team that can patch errors quickly.
Asserts are usually disabled for release builds, but they’re great for catching bugs during development.
@@davidholmin875 That's true. In release, I only disable asserts in hot code paths. For everything else, I spawn a popup allowing the user to send me the assert text and call stack info. You can also allow the user to "resume" in systems that don't impact the save state, such as audio. But my general approach is to exit the process completely.
@@davidholmin875 (and also Matias) people can mean different things when they say "asserts". People with a C background tend to assume C's macros that are disabled in release builds. Other languages and frameworks have different facilities. Sometimes people just talk about defensive checks that throw exceptions as "asserts", and continue to run it in production - that's pretty common in web dev, particularly on the backend.
And yeah, crash early and often is desirable in dev mode. Or in open-source software with a lot of maintainers and forum support. Or in business processes where you have dev staff on call to fix things for you and robust rollback built in. It's not nearly as good a strategy for customers who are running the software on their own computers and don't have a direct line to the devs, which is pretty common for paid desktop apps, games, and embedded hardware without constant monitoring. In those cases, limping along in a partial state can be a better failure behavior, especially if you can keep certain parts of the software robust, like not corrupting the customer's long-lived data.
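The debug-only assert pattern this thread keeps circling can be sketched in a few lines. This is an illustrative helper in Go, not Odin's actual runtime; `debugMode` is a hypothetical flag that a real build would set via build tags or linker flags:

```go
package main

import "fmt"

// debugMode stands in for a per-build setting (e.g. build tags);
// hardcoded here for illustration only.
const debugMode = true

// assert panics on a violated invariant in debug builds and is a
// no-op in release builds, matching the C assert() convention the
// thread describes.
func assert(cond bool, msg string) {
	if debugMode && !cond {
		panic("assertion failed: " + msg)
	}
}

func main() {
	health := 10
	assert(health >= 0, "health must be non-negative")
	fmt.Println("invariant holds")
}
```

A shipped build could instead route the failure into the "show a dialog, let the user save, then exit" path described above rather than panicking outright.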
I really wish it didn't implicitly dereference a pointer when you access its data members. I want to know exactly what the code means just by looking at it. To date, C and C++ are the only languages I know that don't do this.
Agreed
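For comparison, Go makes the same choice the commenter dislikes: selecting a field through a pointer implicitly dereferences, so there is no visible `p->x` vs `s.x` distinction at the use site.

```go
package main

import "fmt"

type Point struct{ X int }

func main() {
	p := &Point{X: 42}
	// Implicit dereference: p.X is shorthand for (*p).X, so a reader
	// can't tell from this line alone whether p is a pointer.
	fmt.Println(p.X, (*p).X) // 42 42
}
```

The trade-off is exactly the one stated: less punctuation noise versus less information at a glance.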
Looks like a mix of Zig, Go, and Dlang. Not too bad, actually. I think I'll stick with Dlang (I went through the entire sample code for all of Odin's features, and everything there can be done in D), but Odin is definitely worth exploring (a lot of D's standard library assumes garbage collection for historical reasons, so Odin could be better there; still, with `@nogc` you can easily write for games, embedded systems, or kernels). All the polymorphism, dispatch, embedding, compile-time stuff, even SoA, can be done in D with just a little library code. So Odin then feels limited to me, because those things are built into the language and you cannot easily create similar (but different) things yourself, whereas in D you can.
Still, compared to things like Nim, Zig, Vlang, and Go, Odin is really good (Go has a superb packaging system and concurrency control, so I doubt Odin actually beats it in that area; you wouldn't use Go-style concurrency in games, but everybody should be learning from Go's package manager, there is nothing better).
C# vs Odin would have been interesting
wow that is one way to end an interview...
As C++ dev trying Odin I desperately miss something like RAII. `defer` is not a sufficiently effective replacement. I wish for Odin++
All the graphics APIs built in natively? WTF :D
I wonder what his thoughts on D (with the disable gc flag on) are. Any other D fans?
Still haven't found anything (that I can use anyway... darn Jai being closed beta) that has the compile time code analysis and generation power of D.
LMAOO @3:55