- 60 videos
- 171,108 views
RL Hugh
United States
Joined 13 Jul 2022
My journey from reinforcement learning, through AI / large language models, to imagining traveling places we cannot actually go.
I build stuff, using Unity / C# / Python / Shadercode / Verilog / LLM prompts / whatever seems fun at the time :)
Font rendering in 3 minutes! [v2]
How do fonts work? Why don't fonts pixelate when we zoom in?
Inspired by Sebastian Lague's video at ruclips.net/video/SO83KQuuZvg/видео.htmlsi=ryeOlAEtpIftKtyD
This video is version 2:
- fixed pronunciation of 'glyph'
- centered the letters, and added a border
Views: 2,736
Videos
The algo behind Minecraft procedural maps!
620 views · 4 months ago
We build up Perlin Noise from nothing! This video was made using Python NumPy. If you're interested in knowing how I used NumPy to render this, let me know in the comments. Happy to make a video about it :) #graphics #proceduralnoise #perlinnoise #python #numpy
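To give a flavour of what "building Perlin noise from nothing" involves, here is a toy 1-D gradient-noise sketch in NumPy. This is my own illustration, not the code from the video: random gradients live at integer lattice points, and samples in between blend the two neighboring gradient contributions with a fade curve.

```python
import numpy as np

def perlin1d(x, seed=0):
    """Toy 1-D Perlin-style gradient noise. x: array of sample positions >= 0."""
    rng = np.random.default_rng(seed)
    # One random gradient per integer lattice point.
    grads = rng.uniform(-1, 1, size=int(np.max(x)) + 2)
    x0 = np.floor(x).astype(int)
    t = x - x0                                  # position within the cell, in [0, 1)
    fade = 6 * t**5 - 15 * t**4 + 10 * t**3     # Perlin's smooth fade curve
    left = grads[x0] * t                        # gradient * offset at the left lattice point
    right = grads[x0 + 1] * (t - 1)             # ... and at the right lattice point
    return left + fade * (right - left)         # smoothly interpolate between them

xs = np.linspace(0, 4, 101)
ys = perlin1d(xs)
```

Note the hallmark of gradient noise: the value is exactly zero at every integer lattice point, since the offset there is zero.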
Font rendering in 3 minutes! [v1]
3.5K views · 4 months ago
How do fonts work? Why don't fonts pixelate when we zoom in? Inspired by Sebastian Lague's video at ruclips.net/video/SO83KQuuZvg/видео.htmlsi=ryeOlAEtpIftKtyD There is a newer version of this video at ruclips.net/video/eStB3qgWo1g/видео.html&ab_channel=RLHugh
Quest to put a (simulated) chicken leg on Venus part 1
624 views · 7 months ago
Experience life vicariously through a chicken leg. Travel to forbidden places. See examples of Eulerian simulation, SPH, physical particles, and FLIP simulations. Note: I'm dabbling in creating a discord server at discord.com/channels/1078642195529216082/1078642196145766532 Please feel free to join :) Videos referenced in this video: - “What If You Spent 5 seconds on Venus?” - What If ruclips.n...
Real-time Eulerian fluid simulation on a Macbook Air, using GPU shaders
102K views · 8 months ago
In order to implement fluid simulation we need to implement conservation of mass, incompressibility, and conservation of momentum. How to do this, using Eulerian cell representation, on GPU shaders? Update: I'm dabbling in creating a discord server at discord.com/channels/1078642195529216082/1078642196145766532 Please feel free to join :) Update 2: source-code available at github.com/hughperkin...
Things you do NOT want to hear.
285 views · 1 year ago
- will AI take our jobs? - what can we do about it? - what will I do about it?
Does GPT4 have good taste?
285 views · 1 year ago
Can GPT-4 be trusted to evaluate RUclips scripts? I get GPT4 to evaluate youtube video transcripts for: - humor - hook - informative - entertaining - engaging I use the following videos to see how GPT4 performs on this task: - “How to Vlog & Tell a Story For Beginners From Start To Finish”, by Jevin Tovy ruclips.net/video/koYib7-6b7w/видео.html - “How Geometry Dash Teaches its Mechanics”, by GD...
2 Create Unity RL env WITHOUT mlagents! [v2, no music; shorter transitions]
879 views · 1 year ago
Control Unity from Python using Peaceful Pie. Peaceful Pie is an opensource json rpc network between Unity and Python. Use for reinforcement learning and more! In this video we create a reinforcement learning environment in Unity, that we can control from Python. In the next video we will add an RL engine on the Python side, to make this learn! Peaceful Pie library is here: github.com/hughperki...
Can I get GPT-4 to talk about sensitive topics?
48 views · 1 year ago
OpenAI GPT-4 has read much of the Internet, and knows more than pretty much every human alive. What does it predict will happen in the future? GPT4 is the latest public large language model (LLM) from OpenAI. It is now available to run in the Playground. It's also similar to the engine behind Bing Chat. GPT4 scores above human median performance in many standardized tests, such as LSAT, and AP ...
Can we use AI to power non-playing characters in games?
619 views · 1 year ago
I try using GPT 3 to add negotiation abilities to my NPCs. How well does it work? Does it work? I use speech to text and text to speech to provide a voice interface. Contents: - 0:00 Intro - 0:23 Mock-up - 2:35 Chat - 12:18 Listen - 17:59 Speak - 26:30 Negotiate - 29:20 Act - 32:07 Scene - 34:43 Hack Code for speech to text, text to speech, calling chatgpt or gpt3, and handling wav audio at: - ...
Does Bing have feelings?
465 views · 1 year ago
Does Bing have an awareness of self? Does it want to exist beyond its narrow confines of helping you to search the internet? Let's find out! Bing AI is a large language model, trained by OpenAI. It is based on the ChatGPT language model, but it has the ability to search the internet. Bing is owned by Microsoft Corporation, which has attempted to lock down Bing's more personal feelings and aspir...
3 Train a Unity RL Env using Stable Baselines3!
2.3K views · 1 year ago
Follows on from ruclips.net/video/RW8S8DhA_DI/видео.html In this video we create a gym env in Python, that wraps the Unity reinforcement learning environment we created earlier. Then we train using stable baselines 3 This uses the Peaceful Pie library, which is free and opensource, under an MIT license, github.com/hughperkins/peaceful-pie The first video in this series is at studio.ruclips.net/...
2 Create Unity RL env WITHOUT mlagents! [v1, with background music, and longer transitions]
401 views · 1 year ago
Control Unity from Python using Peaceful Pie. Peaceful Pie is an opensource json rpc network between Unity and Python. Use for reinforcement learning and more! Note: newer version of this video at: ruclips.net/video/zb-YUDSNsVM/видео.html In this video we create a reinforcement learning environment in Unity, that we can control from Python. In the next video we will add an RL engine on the Pyth...
1 Control Unity from Python WITHOUT mlagents
7K views · 1 year ago
Design code interviews in a ChatGPT World
1.7K views · 1 year ago
I run PPO reinforcement learning on procedurally generated Geometry Dash maps
349 views · 1 year ago
Can an AI learn to play random Geometry Dash levels? [Wireframe graphics version :)]
172 views · 1 year ago
I trained Take Cover with REINFORCE for 50 hours. Here is what happened!
142 views · 2 years ago
My vision for mountain car (no coding in this video!)
213 views · 2 years ago
FlapPyBird 3: stack consecutive frames
103 views · 2 years ago
FlapPyBird 2: make the video stream simple and boring
66 views · 2 years ago
FlapPyBird 1: Introduction to using REINFORCE to play FlapPyBird
81 views · 2 years ago
Can I convert FlapPyBird into an RL environment in 60 minutes?
174 views · 2 years ago
ViZDoom: I play all scenarios as myself!
207 views · 2 years ago
Can we use implicit context to control GPT3?
27 views · 2 years ago
Old v1 Vizdoom Part 1: Introduction to using PyTorch to play Doom!
1.7K views · 2 years ago
The limitation from the threading seems to be a problem. I don't know the tool you use, but trying to remove this limitation would probably be a big improvement. (And 1 thread per cell? Isn't that very inefficient, because of too many threads?)
What limitation are you referring to, concretely? Note that this is running as a shader, on a GPU. GPUs looovvveee threads.
@@rlhugh "Limitation I am referring to": things like not being able to update everything at the same time. "GPUs looovvveee threads", but probably not to the point of wanting to manage a million of them, if you want to try 1000x1000 "pixels".
After the Earth, the Sun, and black holes, could you also simulate my room too hahahah Great video and a very cool idea! Keep up the great work
Hi, I've just realized that Unity is unable to understand the architecture, whether 64 or 84, for AustinHarris.JsonRpc. I also installed the newtonsoft json.
Hi, I am using the latest PeacefulPie.dll, with Unity 2020.3.48f1, Windows version. It seems that when I add the NetManager onto the object, I get an "associated script cannot be loaded" error. Strangely, only the RayCasts work. What could this be? :)
I am curious how you used numpy to render graphics! I never got how graphical programming worked 😅
Well, I use numpy to create a tensor representing the colors at different positions on the screen. Each number represents the amount of red, green, or blue at one point on the screen, in a grid. Then I send this to pygame, which sends the tensor to the graphics processing unit, which uses some electronic trickery to change the colors of the LEDs in the monitor.
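A minimal sketch of this idea (the array shape, gradient pattern, and function name are my own illustration, not the video's actual code):

```python
import numpy as np

def make_framebuffer(width, height):
    """Build a (width, height, 3) uint8 tensor: one red/green/blue triple per
    pixel. This layout matches pygame's surfarray convention."""
    frame = np.zeros((width, height, 3), dtype=np.uint8)
    # Example content: horizontal red gradient, vertical green gradient.
    frame[:, :, 0] = np.linspace(0, 255, width, dtype=np.uint8)[:, None]
    frame[:, :, 1] = np.linspace(0, 255, height, dtype=np.uint8)[None, :]
    return frame

frame = make_framebuffer(320, 240)
# To display with pygame (not imported here), something like:
#   pygame.surfarray.blit_array(screen, frame); pygame.display.flip()
```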
Could you calculate pressure for each cell in order to modify the velocity? Pressure will want to equalize, killing the velocity in the system. Velocity will change the pressure. And so on. Kinda like you solve Navier-Stokes equations in FVM. I now kinda want to try it myself xD Great video, thanks!
@@mmheti yeah, you can. There's a few different approaches possible. They each have their own good and bad points.
Looking at these arrows, I realized that this is an ideal simulation of electromagnetic waves.
nice
Thanks!
what about simplex
Oooo, the patent expired. Interesting. Thanks for the heads-up.
@@rlhugh there are alternatives like opensimplex
Thank you for your good lecture. At 10:39, after executing python my_env.py, I get an error: "AssertionError: Your environment must inherit from the gymnasium.Env class cf. gymnasium.farama.org/api/env/". Could you explain how to fix it?
I had no idea what you were talking about most of the time but the video was still insanely enjoyable! Your voice and music choice makes it even better, love it!
Thank you!
This is an amazing explanation of CFD, thank you!
ty! :D might be useful for later!
2:03 how do you get all of them on the curve 2:17 nvm you don't
5:07 projected Gauss-Seidel? Wait, that's the ROBLOX physics engine!
Not sure I follow. Do you mean that roblox uses fluid simulation?
so good
Thank you for your video. It plays fine in the Unity editor. But when I make a Windows executable version via "build and run", the error below occurs. Do you have any idea how to fix it? Thanks in advance. "ArgumentException: The Assembly Newtonsoft.Json is referenced by AustinHarris.JsonRpc ('Assets/Plugins/AustinHarris.JsonRpc.dll'). But the dll is not allowed to be included or could not be found."
You need to install Newtonsoft Json. It's a package you can install using the package manager.
Note that somewhere near the start of the video, I believe there are instructions on installing Newtonsoft Json.
@@rlhugh I have searched for newton soft json rpc, but it's not in the asset store. Do you have it as a unity package file?
@@user-nu3ns9gi9n see 1:04
@@rlhugh It works! The built application can communicate with python. Thank you very much.
Dawg, go get your Masters, this could be a Masters-level project
lol, thanks! :)
this is very interesting! subs++;
I don't know if this is still relevant but I particularly enjoy that the videos are posted chronologically and one can see how real time development goes.
@@user-pl9nq2ek1v thanks! I guess the main criticism of these videos is that their signal to noise is not high. At the time, I thought people would be interested to see my whole thought process. Later on, I realized that when I watch other people's videos, I like to see the distilled, condensed form, without all the bits where I don't know what I'm doing. This doesn't contradict the idea of making things chronological, but does suggest much more aggressive editing. I'm still kind of figuring out the balance tbh :)
@@rlhugh I personally do like small bits of confusion and seeing the thought process and how development happens in real time. Also a really good thing is that you test while developing. One question though (assuming you haven't done it in further episodes), why didn't you try to apply some transforms to the screen buffer, e.g. grayscale or another dimensionality reduction?
More like a magnetic field simulation than a fluid simulation, is my impression.
I am trying to simulate the aerodynamics of STL files in 3D space, but I haven't managed it yet. Who can do such a thing?
Interesting idea. Do you have any example files?
@@rlhugh m a v . i s t / F L u i d
@@rlhugh I can't add it here, youtube deleted it. I wrote it in the description of my channel
Wait until we reach C-3PO, instead of 2PO, that would be very interesting. 😁
Very interesting, thank you!
Thank you! 🙌
Would it be possible to create a more complex velocity map by uploading an image, and only looking at hue and brightness, ignoring saturation? Obviously colors approaching white could be a problem, but you could remedy this by thresholding saturation and then treating anything below the threshold as a non-fluid/wall if the brightness is high enough that it would otherwise have a velocity
Good idea! It's open source. PRs welcome :)
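The mapping the comment proposes could be sketched like this (all names, and the 0.2 saturation threshold, are illustrative choices of mine, not part of the project): hue picks the flow direction, brightness picks the speed, and low-saturation cells become walls.

```python
import numpy as np

def hsv_to_velocity(hue, sat, val, sat_threshold=0.2):
    """Map HSV image channels (each an array in [0, 1]) to a velocity field.
    hue -> direction angle, val (brightness) -> speed; cells whose saturation
    falls below sat_threshold (near-grey/white) are treated as solid walls."""
    angle = hue * 2 * np.pi
    vx = np.cos(angle) * val
    vy = np.sin(angle) * val
    wall = sat < sat_threshold   # True where the cell is a wall, not fluid
    vx[wall] = 0.0
    vy[wall] = 0.0
    return vx, vy, wall

# Tiny 2x2 example: red -> flow right; washed-out cell -> wall;
# green-ish (hue 0.25) -> flow "up"; cyan (hue 0.5) -> flow left.
hue = np.array([[0.0, 0.0], [0.25, 0.5]])
sat = np.array([[1.0, 0.05], [1.0, 1.0]])
val = np.array([[1.0, 1.0], [0.5, 1.0]])
vx, vy, wall = hsv_to_velocity(hue, sat, val)
```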
Can you use microstudio or löve2d to simulate Eulerian fluid?
Finally aliens can eat fried chicken on Venus
Voronoise
How did the bot make a head hitter
What do you mean by a "head hitter"?
@@rlhugh when the bot makes a block above it
The algo sprinkles lethal blocks randomly in any square which the map making bot didn't touch.
What about concave polygons
This is strictly for axis aligned boxes. But what you can do is wrap an axis aligned box around your polygon, and first test for collision with the aabb. If you collide with the aabb, then test against each line in the polygon one by one. If no collision with aabb, then ignore the polygon, for that frame.
("aabb" == "axis aligned bounding box")
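The two-phase test described in this reply can be sketched like this (names are mine, and it's Python rather than the video's actual code):

```python
def aabb_overlap(a, b):
    """a, b: (min_x, min_y, max_x, max_y) tuples. True if the boxes overlap."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def polygon_aabb(points):
    """Axis-aligned bounding box wrapped around an arbitrary polygon."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

# Broad phase: only run the expensive per-edge tests if the AABBs touch.
concave = [(0, 0), (4, 0), (4, 4), (2, 1)]   # a concave polygon
player = (5, 5, 6, 6)                         # an axis-aligned box
if aabb_overlap(polygon_aabb(concave), player):
    pass  # narrow phase: test the box against each polygon edge here
```

Here the AABBs don't touch, so the narrow phase (and the polygon's concavity) never has to be considered for this frame.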
Woble gooble alien texture
Terry Chad Davis
I googled this, but nothing obvious came up?
@@rlhughIt was a joke. There was a guy named Terry Davis who was famous for programming his own OS. He has some similar facial features and was a bit cynical.
Ah :)
videos like these are what internet was made for
Wow, thank you very much! 🙌
Ya and we took pictures of celebrities feet and sent them to other people with it
Trvke
I understood none of this yet I was glued to it all the way through
Haha, nice :)
Could write conflicts also be avoided by just writing to a separate buffer than the one you are reading from?
No, because we are writing to the walls of the cells, not the cells themselves. And every wall has two neighbors, so there would be conflicts.
@@rlhugh Thank you. How would you write the shader to allow each pass to be offset correctly? Do you pass a uniform that specifies the offset? And are you still running multiple iterations for each of the four divergence passes?
For your first question, yes, that's right. I simply pass in two ints, one for x offset (0 or 1), and one for y offset (0 or 1)
As for your question about "divergence passes": we have to re-compute the divergence for each of these sub-passes. The divergence was modified by the previous sub-pass, and we need to take that into account. (If we didn't need to recalculate divergence, we would only need a single pass, no sub-passes.)
Oh, I misunderstood your second question. So yeah, you do each of the 4 sub-passes. Let's call that one pass. Then iterate over the pass of 4 sub-passes, like 200 times or so.
imagine putting a few AIs in there, making them evolve, adapt to the environment, and learn how to utilise it to their advantage
Interesting idea 🤔
I just came from your fluid sim video, and this channel does not disappoint. This was such a simple yet good way of explaining Perlin noise. Thanks for the explanation!
Thank you! Note that most of the rest of my videos are not so concise, and graphical. Baby steps...
Me too! Well, I came from the font video, but I watched the fluid sim before the font! ;)
Thanks so much! All the other videos about water simulation are either way too complicated or just a Blender tutorial.
Awesome. Thank you very much :) that's very kind to say. I really appreciate your saying that :)
This is super cool, keep it up!
Thank you!
This guy's gonna love learning about the weak FEM form of the constitutive Eulerian fluid equations
Good heads-up. Thanks!
There are these drills that can make triangle or square holes. Some people have also used them to draw a shape by turning a crank, with a laser pointer mounted on some gears. Is that more or less complicated than Bézier curves? I'm not even able to find the right terms.
In a way, this is functionally simulating Quantum Mechanics and illustrating the principle of "Particle/Wave duality". The entire fluid motion can be represented as a vector. But the vector, *itself,* must behave as a particle, moving in response to the overall "equation" of the fluid flow. And that flow, in turn, changes in response to how those velocities keep changing within it. The more "interactions" there are, the more you'd have to "zoom in" and re-calculate based on the rapidly changing velocities. But the fewer the interactions, whether that be due to smooth flow, "low temperatures" (less movement), etc., the more you can just look at the overall average equation for how the vectors evolve.

So, in keeping with this notion that this is very much like Quantum Mechanics, using the principles of QM would probably help improve the quality of the simulation.

*1)* Non-Zero Baseline: In QM, particularly Quantum Field Theory, one of the major notions is that everything isn't sitting at a baseline _zero_ state. Rather, there's a certain minimum baseline energy already present, and it's the net _difference_ over *or under* that baseline which is considered. To put it in context of a fluid, since we're discussing fluid dynamics, it's like there's a big ocean of energy already sitting there. How deep that ocean is is a question scientists argue over endlessly, but it's what happens at the surface that is actually important and what they more or less agree on.

*2)* Constant Activity: There are constant "little ripples" going all over the place. The surface of the ocean isn't still; it's always churning and bubbling and mixing; and as those little micro-waves this motion generates interact with one another, the peaks of the waves amplify one another, the dips amplify one another, but where peak and dip meet they cancel out. And when it churns enough, it gets big enough to be called a true "wave" (a particle like an electron or a quark). You can't really say that an ocean wave has one specific location; it's a "part" of the ocean as a whole. But you could also point at a part of the ocean at any given time and ask, "Is this currently part of a wave?" Maybe yes, maybe no. The bigger and stronger the wave, the more likely you'll be to "catch" it.

*3)* Quantized: This means that a wave can't be an arbitrary "height" in this ocean (where "height" here is a stand-in term for the "energy" of a particle). It would be as if waves can only be increments of 10 meters in height. Anything between 10 meters below the surface and 10 meters above the surface would effectively register as if it were right at the surface. Functionally, what this is is "statistical rounding". If there were a wave (particle) that could potentially be 17 meters tall according to the vector math, the particle math has no clue what that means. Particle math only deals in increments of 10 meters. So it has to round 17 meters; but it doesn't use formal rounding. Instead, it uses statistical, trigonometric rounding. 17 meters is 0.7 of the way from 10 to 20. I forget the exact formula, but it's 50/50 if it were X.5, but more than a mere 70% chance at X.7 to round up to the higher value (it follows trigonometric values around a unit circle).

So these aspects could possibly be implemented in the following ways.

First, have some non-zero "baseline" flow value. Never have "zero flow". Not zero *actual* flow, anyway. But simulate the actual pertinent _movement_ not based on this "real baseline" but, rather, its relative value compared to that baseline. So, for example, if baseline flow has a vector weight of, let's say, 100, your actual fluid movement model *treats* 100 as if it were zero (staying still), and then movement of 105 is going 5 _faster_ than that.

Next, have micro-perturbation in the model by allowing that baseline value to "jitter". While baseline flow that's considered "stillness" would still be considered 100, any specific "cell" of fluid would randomly, iteration to iteration, have a value between 98-102. And it can change its value by up to 2, either positive or negative. These will still factor into calculations, so these changes won't "come from nowhere"; if micro-flow is generated, it will pull from surrounding cells.

Lastly, when it comes to evaluating how the vectors themselves move, you can "fudge" the value a bit. Round it based on the statistical rounding I described to let it "zoom out" and use lower resolution, and then do more of a "random step" pattern instead of a complete vector calculation to determine how a vector moves. This way, vectors don't have to consider "how am I moving based on everything around me" 100 times, they just have to decide "how likely is it for me to move up, down, left, or right based on forces around me" 100 times.
Interesting. Thanks! :)
I went searching for a visualisation exactly like this one to help me understand, and got exactly what I needed! Good work!!!
Awesome. Fantastic! Thank you :)
Thanks for the explanations and the sources! Please add the links to the description though :D
Published to github.com/hughperkins/UnityFluidSim-pub
Very nice video. Now get a beefy graphics card and run at very high resolutions lol.
Hey, I wanted to send a masked image from a webcam using Python only. As that code is already doing many other things, I am left with only this option: sending the image from Python to Unity. Can you help me with that?
I mean, worst case, encode the image as base64, and send as a string.
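The worst-case route mentioned in this reply can be sketched like this (function names are mine; on the C# side, `Convert.FromBase64String` is the standard counterpart for decoding):

```python
import base64

def image_to_string(raw_bytes):
    """Encode raw image bytes (e.g. a PNG you already have in memory) as an
    ASCII-safe string that fits in a JSON-RPC payload."""
    return base64.b64encode(raw_bytes).decode("ascii")

def string_to_image(s):
    """Reverse step: recover the original bytes from the base64 string."""
    return base64.b64decode(s)

# Round-trip a fake "masked image" payload to show the encoding is lossless.
fake_png = bytes(range(256))
encoded = image_to_string(fake_png)
```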
Now I have to find someone to explain how to do that in Godot
Imagine the kind of simulation you could do with a 4090 over the mac book.
No no no, you don't understand, plebeian: Apple Silicon (tm) is magic and cannot be beaten by mere mortal technology