"Imagine you're writing a nuclear warhead in your mom's basement" Oh I don't have to imagine fireship
Lol worst day to possibly make that joke.
Relatable
FBI would like to know your location
@@gezenews?
@@GLUBSCHI he must know something we don't 😳
"Why did you start using JAX?"
Me: "I like the logo"
Now I have 10 years of experience in JAX. NICE!
Well done! Congratulations!
parallelized
Enough years for Jax
Good because that's the min req for the newest jobs using it
*writes in resume*
Active: Jax enters Evasion, a defensive stance, for up to 2 seconds, causing all basic attacks against him to miss. Jax also takes 25% reduced damage from all champion area of effect abilities. After 1 second, Jax can reactivate to end it immediately.
Active: Jax enters Evasion, a defensive stance, for up to 2 seconds, causing all basic attacks against him to miss. Jax also takes 25% reduced damage from all champion area of effect abilities. After 1 second, Jax can reactivate to end it immediately.
Imagine if Jax had a real weapon?
ACTIVE: Jax Dash dashes to the target unit's location.
If the target is an enemy and they are in range upon arrival, Jax deals them physical damage.
Jax can cast any of his abilities during the dash
Gotta try AP Jax someday, one time I had this one rando Jax who went AP in my team. He was melting everyone!
Was just scrolling waiting to find a comment like this.
As a software engineer with 10 years of experience, I understood some of those words.
I felt so stupid watching this
@@MaxMalm You're not the only one. 😉
Yeah, I guess that goes to show that those years of experience were wasted
This is what happens when people become software engineers from the internet lol
As a ML student I understood everything. That just means you're not into this stuff.
Time to add JAX in my resume
10 years experience
😂😂
Good idea. Thank you for the career advice
Gg no top
I watched this vid too, I'm going to do the same
Jax sounds like less abstract version of pytorch and more complicated version of numpy
Pretty much yeah
huh
However, performance is somehow above torch
@@Michallote Because JAX basically compiles down to machine code when run; while a lot of PyTorch is done at a low level, there's still some overhead from Python running things. Conversely, torch lets you do dynamic ML (edit: actually PyTorch also lets you compile now, but I haven't played with it personally), whereas JAX has to be perfectly laid out from the start, so it depends what you're trying to do. Torch is also just useful to more people, so it's better supported; even if JAX would be better for your project, there's an incentive to follow the herd and use the thing everyone else is using.
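The "perfectly laid out from the start" part is about tracing: `jax.jit` traces a function once per input shape and hands the result to XLA, so the Python overhead disappears on later calls. A minimal sketch (toy function, not from the video):

```python
import jax
import jax.numpy as jnp

@jax.jit
def f(x):
    # Traced once per input shape, then compiled by XLA;
    # subsequent calls with the same shape skip Python entirely.
    return jnp.sum(x * x)

a = jnp.arange(4.0)
first = float(f(a))    # first call: trace + compile, then run
second = float(f(a))   # later calls reuse the cached compiled code
```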
jax borrows a lot of ideas from haskell and functional programming
it just has a more familiar (Python-style) syntax and calls things differently (`scan` is essentially Haskell's `mapAccumL`, for instance)
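`jax.lax.scan` is the functional stand-in for a stateful Python loop: it threads a carry through the iterations and also collects a per-step output. A toy sketch (hypothetical cumulative-sum example):

```python
import jax
import jax.numpy as jnp

def step(carry, x):
    new_carry = carry + x          # running sum threaded through the loop
    return new_carry, new_carry    # (next state, per-step output)

# cumulative sum of [1..5] without a Python for-loop
final, cumsums = jax.lax.scan(step, jnp.array(0.0), jnp.arange(1.0, 6.0))
```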
The sample code you shared at the end is so simple and elegant — it shows what training is all about: approximating a function F defined as a product of multiple matrices (its layers), calculating its error over known data E(F(x), y), computing the gradient of that error E', and then updating F's parameters so that E' approaches zero. I will share this code with my students. Thanks again.
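That loop can be sketched in a few lines of JAX (a hypothetical toy setup with one linear layer learning y = 3x, not the code from the video):

```python
import jax
import jax.numpy as jnp

# Toy data: learn y = 3x
x = jnp.array([[1.0], [2.0], [3.0]])
y = 3.0 * x

def predict(params, x):
    return x @ params["w"]                          # F(x): one linear layer

def loss(params, x, y):
    return jnp.mean((predict(params, x) - y) ** 2)  # E(F(x), y)

grad_fn = jax.jit(jax.grad(loss))                   # E': gradient w.r.t. params

params = {"w": jnp.zeros((1, 1))}
for _ in range(200):                                # update params so E' -> 0
    g = grad_fn(params, x, y)
    params = {"w": params["w"] - 0.05 * g["w"]}
```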
Oh boy another Fireship video!
truly the first
At least that one hasn't made me anxious about AI taking over the world
that's a very clever logo
Then you should also love Tensorflow.
2:43 that sure looks cool and inspiring. but whats clever about it?
@@yash1152 it incorporates an impossible geometry
@@joseville is that related to the branding in any way?
> _"it incorporates an impossible geometry"_
don't know JAX about JAX
The impossible geometry in the logo is similar to a Penrose triangle
"Imagine if i had a real weapon!"
I WAS LOOKING FOR IT xd
now we need another library called annie
My thoughts exactly, especially the color scheme looking like his default costume
League x Programming = Legendary
@@potatosheep found the scripter
i'm interested in computer science, calculus and statistics, but i only have 300 seconds in total
Then you're in the right place.
Then you need Matlab - short, fast, mathematical!
give 100 seconds to each and you're done.
I have never heard a better explanation of a derivative from calculus than "if you want to know how big the mushroom cloud is growing".
Do Elm in 100 seconds! You'll love it! It's an older JavaScript framework that had a lot of thought behind it. It's funny to see more popular ones catch up, but Elm did a lot of the things very well and it deserves some love!
0:25 wow, they incorporated an impossible triangle into their logo! that's really cool!
- “Mum, could I practice my predictions from automatic differentiation in our garden?”
1:47
"Let's start with X: Accelerated (...)"
"???"
That's because X is pronounced "eks", and accelerated is pronounced like "eks-elerated"
Clearly you haven't watched BEN10
ai fireship strong with this one
@@bruwyvn For non-native speakers, XLR8 is just XLR8 in the translated language; outside English-speaking countries, whoever watched Ben 10 at 10 years old usually wouldn't speak English. It took me years to realise that XLR8 is spelled out as "accelerate".
@@onça_pintuda999 Nah it really depends.
For example, in the Hungarian dub its name would literally translate back to English as "Lightning Gremlin". They just took the message of it and gave XLR8 a name a 10-year-old could have come up with, instead of a letter sequence that is otherwise nonsensical.
Or, in the Japanese dub, they just called it "Accelerate", as they kept the pronunciation of all the aliens, not the spelling.
Active: Jax enters Evasion, a defensive stance, for up to 2 seconds, causing all basic attacks against him to miss. Jax also takes 25% reduced damage from all champion area of effect abilities. After 1 second, Jax can reactivate to end it immediately.
0:55 the points of inflection are marked wrong on the graph. The yellow points are local extremums, not inflection points.
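For reference, the distinction can be checked with JAX's own autodiff (hypothetical f(x) = x³ − 3x, which has local extrema at x = ±1 and an inflection point at x = 0):

```python
import jax

f = lambda x: x**3 - 3.0 * x   # extrema at x = ±1, inflection at x = 0
df = jax.grad(f)               # f'(x) = 3x^2 - 3
d2f = jax.grad(df)             # f''(x) = 6x

at_extremum = float(df(1.0))     # f' vanishes at a local extremum
at_inflection = float(d2f(0.0))  # f'' vanishes at an inflection point
```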
Dang missed opportunity using the Jax from league of legends on the video
missed opportunity to take a shower
Who?
great video but still very sad we did not work in a mortal kombat jax reference
Was looking for this comment.
I always wanted to see a library or tech called Jax, because all I got was AJAX concept from Javascript, which would not entirely fit it, but somehow always reminded me of the MK fighter.
Thanks, I can now add 10 years experience of JAX to my résumé!
Thank you, Fireship. Up until I watched this video, I thought I was smart.
Could you do a ROCm video (AMD's CUDA couterpart) ? Love you and your vids !
For those who don't know:
Fireship lives on a planet whose gravity is stronger than our Earth's; that's why his 100 seconds are our 226.
Brilliant back at it again, everywhere I go I see brilliant.
Holy shit automatic differentiation is so freaking cool. Imma need to remember this when I jump back into calculus
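A small taste of it: `jax.grad` differentiates an ordinary Python function, loops and all (toy example, nothing from the video):

```python
import jax

def poly(x):
    # an ordinary Python loop, not a symbolic expression
    acc = 0.0
    for k in range(1, 4):
        acc = acc + x**k   # builds x + x^2 + x^3
    return acc

dpoly = jax.grad(poly)     # derivative: 1 + 2x + 3x^2
```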
I just opened YouTube and thought it must be about time for a new Fireship video. And bang! There it is
A must-like video, given your effort to research this type of content. Keep up the good work!
Man that logo design for Jax is amazing
Love the visuals in this one
Actually understood a lot more than I thought I would randomly clicking on this video.
Can I suggest R? It seems to be gaining traction in Google courses, EdX, uni, etc. It'd be good to get a high-level overview of the language that can be used to summarize it for business, friends, family, etc.
You videos with Brilliant sponsorship are max quality ser.
This video really clarified how this isn't just another accelerated linear algebra library, but a tool developed by Google for futuristic new hardware. The constraints on immutability and the way it compiles to low-level code for accelerated Hardware seems ideal for machine learning. And the automatic differentiation feature is truly next level.
Thanx totally not chat gpt or equivalent
How do you edit your videos? It was an amazing edit. And which platform do you use for such arrows and animations?
100 seconds is a bit of a stretch, but well what can you do. Keep up the great work.
Oh, that is actually useful. The abstraction is very much welcomed here. Can't wait to see RandBLAS and RandLAPACK too!
That's exactly the thing I needed lol, thanks a lot.
This is the best YouTube channel on the internet. I just wish YouTube wasn't censored so I could express how much I enjoy the content
nice I just literally started learning JAX after using TF and torch for a long time
I don't understand anything you say, but I love watching these videos anyway.
Automatic differentiation is fascinating
ah yes that one character from The Amazing Mortal Kombat Circus
Finally something I’ve used in this channel :)
Just about time I add it in my CV
Hey Fireship, what software do you use to edit/produce your videos? thx
common W fireship vid
i love the 100 seconds series lol
I'm an artist who worries constantly about saying the wrong thing on occasion. Meanwhile, I'm watching some guy talk about nuclear warheads.
What a fascinating channel. 👏
Please do scala next!
I have 20 years experience in JAX and I must say I love it
There is an error in the mushroom example with the array: it's a 3-element array but you try to get arr[3]. The gradient is [60., 4., 3.]
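Worth noting: JAX wouldn't even raise an error for that index. Out-of-bounds reads are clamped to the last valid index, which makes mistakes like this easy to miss (a small demonstration, not the video's code):

```python
import jax.numpy as jnp

arr = jnp.array([10.0, 20.0, 30.0])
# NumPy raises IndexError here; JAX silently clamps the index
# to the last valid position instead of failing.
val = float(arr[3])   # same as arr[2]
```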
My experience: ~1200 to 1500x acceleration of a function with numpy operations on a conventional 3070 GPU, just by adjusting the numpy code to JAX syntax. Just remember that array dimensions must be constant.
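The "dimensions must be constant" caveat is about retracing: `jax.jit` caches one compiled version per input shape, and every new shape triggers a fresh trace and compile. A small sketch (hypothetical function) that makes the retraces visible:

```python
import jax
import jax.numpy as jnp

traces = []

@jax.jit
def total(x):
    traces.append(x.shape)   # Python side effect: runs only while tracing
    return jnp.sum(x)

total(jnp.ones(3))
total(jnp.ones(3))   # same shape: compiled version reused, no new trace
total(jnp.ones(4))   # new shape: traced and compiled again
```

So feeding it arrays of ever-changing sizes pays the compile cost over and over, which can wipe out the speedup.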
The linear algebra field going to the moon with this one
I saw the title and thought it was going to be on Jax-RS. I'm glad it wasn't!
Those who didn't come from Tiktok, are allowed to like this comment.
Who df would come from TikTok?
The video dropped 2 fucking minutes ago
Skibidi toilet?
what is Tiktok :)
Let's get out of this schizophrenia of social media and embrace ❤
I said "I'm not going to use JAX, probably not going to watch." I'm glad I did — it actually looks useful lol.
The most understandable 100 seconds ever
*jax from the amazing digital circus* ⁉️
What i thought
Nah, from Mortal Kombat
please end it all my bro
I was sure there would be at least 1 comment like this...
No, it's the classic creator from Geometry Dash, the one behind the legendary The Nightmare
Good to see its not just JS languages in the merry-go-round
Who else is proud of our little Jeff for graduating to a brilliant sponsorship?
That logo is sickly designed
All those new languages and libraries are converging closer to functional programming
I would love to see Smalltalk in 100 seconds
You explained it better and faster than my ML professor
Hoped to get some tips on top lane
1) Always keep track of jungler pathing.
2) Freeze lane when ahead; even if the opponent does not contest and thus does not die, the gold and XP lead from the minions is enough to snowball.
3) Death timers are very low until level 7, don't greed a plate if a takedown is taken, you will lose in tempo and therefore reduce the advantage from the kill.
4) If the opponent roams, hard shove, then follow up if the action hasn't ended. Best case scenario, the roam does not connect and the opponent loses the wave. Worst case scenario, the roam is successful, but as the lane is under the opponent's tower, they can neither base nor push for plates, therefore the play is even in tempo even if it is losing in gold/XP.
Jax seems really cool.
it's about time for another 100 seconds
My head is exploding like a nuclear mushroom cloud after this one
I just LOVED this logo
Really want to see how improved versions of KANs will perform using JAX
I feel like I'm always just out of the grasp of understanding how something like this is useful for like 3d animation data... idk how the array of partial derivatives would be useful but it seems like it would be for some of the stuff I'm thinking about
My math and CS degrees allow me to understand everything you just said without really knowing how to put it to good use.
Can i put a jax differentiation as the gradient of my scipy simulation?
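It can be done, since `scipy.optimize.minimize` accepts a user-supplied gradient via `jac`. A minimal sketch (hypothetical quadratic objective; x64 enabled so JAX matches SciPy's float64):

```python
import jax
import jax.numpy as jnp
import numpy as np
from scipy.optimize import minimize

jax.config.update("jax_enable_x64", True)   # match SciPy's float64

def objective(x):
    return jnp.sum((x - 3.0) ** 2)          # minimum at x = [3, 3]

grad = jax.grad(objective)

res = minimize(
    lambda x: float(objective(x)),          # SciPy expects plain scalars
    x0=np.zeros(2),
    jac=lambda x: np.asarray(grad(x)),      # JAX supplies the exact gradient
    method="BFGS",
)
```

The same pattern works with any simulation you can express in `jax.numpy`.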
JAX actually stands for "Just After eXecution" according to the original white paper
Thanks to Fireship, my GPU keeps me warm at night.
I have a 30-day streak in Brilliant, and I can confirm it's really useful
Thanks for the video mate now I can add JAX to my resume
Thought this was a TADC compilation video, but I'll take this instead
What a cool logo it has!
"i invented a new hyper parallel execution language!"
new or shading language?
"shading language"
Annie would love this programming language
100 seconds (adjusted for inflation)
oooh how much I have to learn.
Please, which software is best for screen recording for coding?
Very cool video! May I suggest you tackle Dyalog APL (or APL in general) in the future?
normally, i feel a tad bit smarter after watching every one of these videos. this is one of the few ones that makes me realize how little i know and how much of a dumb-ass i really am.
Thanks Fireship, this library will definitely accelerate the development of my nuclear warhead!
"Imagine if I had a real weapon"
We missed you
I think Numba also deserves its own "100 seconds of" now
I'm using Numba and NumPy extensively now, combined as well. Is Jax worth learning?
@@incremental_failure I haven't used JAX myself. The Numba+NumPy combo has served me well so far. JAX feels like a more hardware-focused thing with an emphasis on linear algebra. I don't do data science/ML but rather astrophysics, so I don't have to deal with linear algebra directly that much. Just numerical stuff
@@kinomonogatari Yeah I'm now into the ML stuff but my level is that of a toddler. Too much new tech to learn.
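For numerical (non-ML) work, `jax.vmap` is often the feature that justifies the switch: it vectorizes a function written for scalars over whole arrays without a rewrite. A toy sketch (hypothetical kinetic-energy function):

```python
import jax
import jax.numpy as jnp

def kinetic_energy(m, v):
    # written as if for scalars...
    return 0.5 * m * v**2

# ...vectorized over whole arrays with no rewrite
batched = jax.vmap(kinetic_energy)
energies = batched(jnp.array([1.0, 2.0]), jnp.array([3.0, 4.0]))
```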
All this reminds me of haskell.
Btw, is there a way to write code for the GPU in Haskell?
Recruiters are gonna ask for ppl with 15 years of experience with JAX
Mr fireship, please don't scam us, this ain't no 100 seconds anymore, Thank you ❤
You know you're balling if you have your own TPU
Imagine I tell the interviewer I have 100 seconds of experience in JAX
as a math student learning java I can confirm I have ten years of experience in JAX
If it doesn't maintain internal state (with assorted memory overhead) or actually produces new copies of arrays, this is Hugely wasteful on the GPU memory bus.
Wide scale SIMD architecture does Not model after OOP or FP concepts perfectly, there is a damn good reason they aren't driven the same way CPUs operate.
Both approaches seem faulty and unavoidably slow/wasteful translation layers to CPU styled paradigms.
Far more useful would be having access to well designed wrappers on most languages to the underlying APIs which too often require a shit ton of boilerplate and horrible docs along with badly designed interop calls.
So you're saying it's matlab's lin alg libraries in python?
Jax from TaDC?