I like how Valve's response wasn't "No, you can't sell our character", but "No, that sounds like an untenable business idea."
Valve is very kind to allow "fan projects" or fan-made things like recreations
lol yea they pretty much said "with what money?"
@@reauxbloxx6963 they are Steam, Steam is Valve, Steam is all about game designers selling stuff, so why wouldn't they dip their fat corporate toes into selling things made by a maker? (As long as it's reasonable enough)
@@isaacmurray8490 so that's why I saw a video on Steam multiplayer on "valve"'s YouTube channel
@@reauxbloxx6963 Contractors: Team Fortress 2 VR? Team Fortress 2: Source 2?
"If someone wants to pay me a lot of money to make personality cores.... there's ways to reach me", I approve of this
Now you only have to make combustible lemons and you can be the new Cave Johnson!
If that happens, then there must be an international law forbidding any woman who goes by the name 'Caroline' from interacting with this man.
Last time that happened, a robot committed mass murder on its company’s employees with gas.
He's an entire Aperture Science team in a nutshell.
What about the
Genetic
Lifeform
and
Disk
Operating
System
@@AUTISM.GAMING He did it earlier, and I am very GLaD about it.
There are already lemon liqueurs on the market that are alcoholic enough to readily ignite, so in a sense combustible lemons are already a thing.
Things get more complicated if you want the lemons to still be lemon-shaped.
Barring serious genetic engineering to create a cultivar of lemon that produces some highly flammable chemical(s) and concentrates enough of it in the fruit to behave as a proper incendiary, there are a few methods that come to mind.
A) Inject a highly flammable substance into regular lemons, e.g. with large needle and syringe.
B) Slit open the skin of a regular lemon, remove the inner lemon flesh, replace with some highly flammable substance, close and seal the cut in the lemon skin.
C) Use the fact that with sufficient chemists, time, energy, and budget you can rearrange any chemical(s) into any other chemical(s) with the same number of each type of atom to build an arbitrary incendiary device from the lemons.
D) Use (C) along with the fact that with sufficient high-energy physicists, time, energy, and budget you can rearrange any chemical(s) into any other chemical(s) via fission/fusion elemental transmutation to build an arbitrary incendiary device from the lemons.
E) Heavily ferment the lemons into flammable substances such as methane, ethanol, and acetone, and declare the resultant product to be combustible lemons.
F) Buy a Ford Pinto, it's already a combustible lemon.
0:50 LOVE the huge side eye Wheatley gives you
This is the part where he kills you
@@Valveiscool.well this is the part where he kills us
@@OmarGarcia-kr4jz "The part where he kills you"
They told me never to disengage from my management rail. But im going to have to... 3 2 1 *fucking dies*
Roll credits
😂😂
If you do still have the thought of eventually making an original personality core with the voice of a consenting person, I'd suggest contacting a YouTuber called Harry101UK. He's done voices for multiple Portal-related fan projects and has replicated the voices of some canon characters as well. I think he'd be great for something like that.
Harry and Mr. Volt would be fantastic
Harry101UK... *drags a cigarette* that's a name I ain't heard in a long time...
@@luckygitane he posted cerveza crystal a few days ago
this
@@sd1gaming typical has-been behavior
This man is fulfilling one of every Portal 2 player's dreams right here
so real, I’m excited for the life-size GLaDOS
@@S14M07gonna cost at least 100k
@@litillmeow millions because of working ai
@@cememems283 and Neurotoxin! wait... Neurotoxin?
@@captain02rex yes!
Using the Wrangler is actually genius
reading this without watching the video is crazee
@@lloydmartel I still don't get it.
@@parkerlreed 4:21 it's a device from Team Fortress 2 designed to control an AI sentry gun. Without it, the turret slowly points toward a target.
As an engie main THIS makes my day.
Now if only he made his Wrangler control the sentry he made the same way it functions in game...(foreshadowing???)
i love your take on the ai part, glad you actually took into consideration the fact that it's a real human being who voiced wheatley, considering a large part of creators don't seem to care
Huge props for not using AI to generate more Wheatley content. Way too many people out here comfortable with AI just casually being trained on unauthorized data sets then set loose to produce unique content for profit
Also bad impressions are cooler than... cgi Leia...
But I doubt he is making profit with the voice, he can only profit from the views, and the view count wouldn't change just because he put in a few AI-generated voices
He has used AI to do GLaDOS voice many times too
There's a somewhat straightforward solution to the animation mapping issue. You could just have a manual keyframing system in which you sort of eyeball the in-game animation positions and then manually copy those into your animatronic via keyframes for each motor over time. That would allow for very specific movements to be made without any puppeteering system.
As for the physics problem, you could get around it by simply limiting how far you move a given piece of Wheatley in a given timeframe, though it might be challenging. Either way I think the end result would be pretty cool.
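For what it's worth, here's a rough sketch of what that manual keyframing idea could look like in code. Everything in it is hypothetical: the motor names, the angles, and the set_servo_angle() function are placeholders for whatever servo driver the real build actually uses.

```python
# Hypothetical sketch: play back hand-authored keyframes by interpolating each
# motor's angle between the surrounding keyframes. set_servo_angle() stands in
# for whatever servo interface the real build uses.
import time

# time in seconds -> {motor name: angle in degrees}, eyeballed from game footage
KEYFRAMES = [
    (0.0, {"pan": 90, "tilt": 90, "lid_top": 20}),
    (0.5, {"pan": 60, "tilt": 100, "lid_top": 5}),
    (1.2, {"pan": 120, "tilt": 80, "lid_top": 35}),
]

def lerp(a, b, u):
    return a + (b - a) * u

def play(keyframes, set_servo_angle, fps=50):
    start = time.monotonic()
    duration = keyframes[-1][0]
    while (now := time.monotonic() - start) <= duration:
        # find the pair of keyframes that brackets 'now' and blend between them
        for (t0, pose0), (t1, pose1) in zip(keyframes, keyframes[1:]):
            if t0 <= now <= t1:
                u = (now - t0) / (t1 - t0)
                for motor, angle in pose0.items():
                    set_servo_angle(motor, lerp(angle, pose1[motor], u))
                break
        time.sleep(1 / fps)

# e.g. play(KEYFRAMES, lambda motor, angle: print(motor, round(angle, 1)))
```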
@@turtletrimmings I just wrote this too haha. It's totally doable, but as everything it requires time.
Respect for the stance on AI, consent always matters and I thank you for that
It's a losing battle, but one worth fighting
honestly good on Valve for actually talking with you. Love the project!
It'd be so cool if you took your Wheatley and GLaDOS and animated them to talk to each other to re-enact in-game conversations
Thank you for not using an AI voice of the actor for your project 🙏🙏🙏🙏🙏🙏🙏🙏🙏🙏 (And gosh i really want to make a Wheatley too !!! Congratulation for you work!!)
So much respect to you for honoring Stephen over the AI part. I'm happy Valve got to see your design!
DJ's "priceless" comment was priceless.
No way is this man actually using a sentry wrangler to puppeteer a personality core
Wheatley is awesome. Now the fun bit: Wheatley and GLaDOS chatting together
I love how the animatronic turned on! it's hilarious how he looks at you at 0:43. Great Work!
So very happy to see you aren’t using ai for his voice. I already have so much respect for you but that has absolutely grown in that little explanation, so very happy to hear that.
2d animator here, but I guess it applies here too.
You could make some key poses and program it so the values of the mechanism interpolate between the values of each pose (and make the eye blink separately). Also you could add in-out acceleration to make it look better
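To illustrate the in-out acceleration idea above: easing between two key poses is usually just a smoothstep curve applied to the interpolation factor. A minimal sketch, assuming poses are plain dicts of motor angles (all names and values here are made up):

```python
# Ease-in/ease-out blending between two key poses (poses are dicts of motor -> angle).
def smoothstep(u):
    # zero velocity at both ends, which is the "in-out acceleration" described above
    return u * u * (3 - 2 * u)

def blend_poses(pose_a, pose_b, u):
    eased = smoothstep(max(0.0, min(1.0, u)))
    return {motor: a + (pose_b[motor] - a) * eased for motor, a in pose_a.items()}

# made-up example poses
neutral = {"pan": 90, "tilt": 90, "lid_top": 20, "lid_bottom": 20}
surprised = {"pan": 90, "tilt": 70, "lid_top": 60, "lid_bottom": 60}
print(blend_poses(neutral, surprised, 0.5))
```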
There are two different Wheatley voices. Obviously Stephen Merchant is in the released game. But there was an early preview where Richard Lord, a Valve artist (and a Brit), spoke the lines. He's actually also Wheatley's original modeller and rigger. When the game came out, lots of people were disappointed, because of course they were: they had already fallen in love with Lord's performance. He works at Double Dagger now, I think.
A way to deal with differences in the rig is animation retargeting, which is an active research field with several approaches. The only plausible way I know of to handle differences in fundamental rig design is through IK: you calculate from the animation where the final positions of the important endpoints are (on a vaguely humanoid character that's typically done with, say, a toe, a wrist, a pointing fingertip), because you want to boil the complex animation down to as little data as possible, plus constraint guide angles (so, say, the elbow position is not used directly, but it is used in the solving process). You then apply some transformation to bring your target points within the target rig's movement range, and finally calculate (solve) back to the joint positions needed to achieve that pose with your rig. Obviously IK is a whole can of worms in and of itself, since not every position has a valid solution, and not all solutions are temporally coherent, but it's a "solved" problem of sorts. And then probably filtering and fixup, so basically insert magic here. So for sure it's going to be a whole research endeavour with no 100% solid recipes.
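To make the "solve back to the joint positions" step concrete, here is the simplest possible case: analytic IK for a planar two-link chain. Wheatley's actual rig is nothing like this, so treat it purely as an illustration of the idea, not something that maps onto the animatronic directly.

```python
# Analytic IK for a planar two-link chain: given a target endpoint (x, y) and
# link lengths l1, l2, solve back to the two joint angles that reach it.
import math

def two_link_ik(x, y, l1, l2):
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # law of cosines
    if not -1.0 <= cos_elbow <= 1.0:
        return None  # target is outside the reachable range
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

print(two_link_ik(1.2, 0.5, 1.0, 1.0))
```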
Harry101 does a good Wheatley impression
It's damn near spotless
"He didnt even read it"
Wheatley: *bombastic side eye*
very respectable response on the ai/voice actor question! I’m glad you’ve gone this route
2:17 You can solve your anxiety that easily, it's how I solve mine at least.
The Afton method
So, on the topic of transferring animations from the game. Inertia, can't do anything about that. But as far as retargeting, if you wanted to play with this (because it might make a good video, or not idk) YOLO Pose could, in theory, be used to collect animations from pertinent animation sequences recorded from the game. If these bones mapped to your control inputs you could at least collect the motions even if they are impossible to perform at the same speeds in the real world.
Disney also has a lot of literature on retargeting skeletal animations specifically for animatronics, I'd honestly be shocked if there's not a paper on going from animated character -> animatronic from them because it feels like the exact sort of problem they have a lot.
Not saying you should do any of this, but since you seem to enjoy creating game characters it might be a toolset that's useful down the road.
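As a sketch of how the YOLO Pose suggestion could look, assuming a custom-trained keypoint model (the stock pose weights are trained on human keypoints, so "wheatley-pose.pt" and the video filename below are placeholders, not real files):

```python
# Placeholder filenames; the stock YOLO pose weights detect human keypoints,
# so this assumes a model trained on Wheatley-specific keypoints instead.
from ultralytics import YOLO

model = YOLO("wheatley-pose.pt")
frames = []
for result in model("wheatley_idle_capture.mp4", stream=True):
    kp = result.keypoints
    if kp is not None and len(kp.xy) > 0:
        # kp.xy: per-detection (num_keypoints, 2) pixel coordinates
        frames.append(kp.xy[0].tolist())

# 'frames' now holds per-frame 2D keypoint positions that could be retimed and
# mapped onto the animatronic's control inputs.
print(f"captured {len(frames)} frames of keypoints")
```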
I've brought it up before, but he could semi-easily design keyframe software for the various servos in Wheatley and just manually add in keyframes for the in-game animations by just eye-balling the motor positions based on a video of in-game Wheatley's animations each frame.
Honestly, the way the battery connects on the side looks cooler than the og rail, it's really nice to look at!
such an amazing animatronic and so cool to learn more about how it works! also, thank you for taking such a respectful stance on AI in this instance; to a fellow artist, it means the world
The mechanical arm version of the management rail has an almost identical design to the arms of the panels, so if you ever decide to make either of those, knowing that might be a good time saver.
I like how most designs do not include Wheatley’s fractured eye. Shows that it comes BEFORE he awakens GLaDOS.
Dude, amazing work, design and engineering. So awesome!
Thank you!
Firstly I wanted to say this project is AMAZING! I always wondered if something like this were possible, and your version turned out even better than I imagined. I completely understand and agree with your stance on using an AI model as a TTS, but I do think it would be interesting to see a simpler system, using an LLM like Mixtral 8x7B or similar running on a remote computer, to control devices like lights/AC without any auditory response (maybe just a nod of the head). Thanks for the awesome video as usual!
Hopefully Mr. Merchant answers soon! He has to see this wonderful recreation. As a compromise, you might try contacting Nolan North, voice actor of the other cores.
11:17 SPAAAAACCEE!
Love your stuff man! Thanks for making these videos. I enjoyed just listening to you talk about this stuff :)
havent seen it yet but im excited
using the wrangler on wheatley is the most cursed thing i've seen today. also, when sentry gun
i betcha there are likely at least a few animators out there that would want to manually reanimate a few of the game animations into the irl wheatley. i guess the only thing necessary would be some kind of animation software for the animatronic
Eventually you’ll make Jerry and the nanobot crew that tried to fire Wheatley
Regarding using the animations: not going in depth with what I did, but I have personally wanted to build Wheatley for years and haven't had the financials for it. But software work is free 😉 I was able to extract the model and animations into a three.js and also a PyOpenGL instance; since the skeleton is different you don't need the mechanical movements, just the rotation in physical space. So extracting a couple of the bone rotations or physical positions, such as the eye or faceplate, could be translated back onto your robot. But it is a big endeavour, and honestly idle animations could just be hand-coded as sine equations.
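The hand-coded sine idle idea from the comment above is about as simple as it sounds; a minimal sketch (all motor names and constants are made up):

```python
# Idle motion as per-motor sine waves around a neutral centre angle (values made up).
import math, time

IDLE = {
    "pan":  {"centre": 90, "amp": 8, "freq": 0.10, "phase": 0.0},
    "tilt": {"centre": 90, "amp": 5, "freq": 0.07, "phase": 1.3},
}

def idle_pose(t):
    return {
        motor: p["centre"] + p["amp"] * math.sin(2 * math.pi * p["freq"] * t + p["phase"])
        for motor, p in IDLE.items()
    }

print(idle_pose(time.monotonic()))
```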
Hey Loulogio, gringo, what a great, beautiful job!
love u AND wheatley, sending all my support as always!! 👍
It is absolutely amazing what you have done, and if I ever win the lottery you will not be able to refuse the amount of money I offer you to build me a Wheatley. Plus if I do win the lottery I'm paying you obscene amounts of money to create me a GLaDOS hahahahahahaha
looks good, brother.
Thank you for sharing this part of the build!
Great project and thank you for sharing the files!!
alrighty, im a lover of personality cores, and ive even been thinking about replicating your IRL version into a game model as a mod for Portal 2.
1. for "boss" Wheatley (the model of Wheately that takes control of the facility), youd have to add at *minimum* four extra servos for moving the side plates and top and bottom bits. in your current design, that would cause an issue since your design mounts to the rail from the side, therefore takes power from the side. it would add way too much complexity, plus might cause problems with the design keeping *any* stability. plus, you'd have to change the handles to be rotated independently from each other on both sides. besides, the boss Wheately model doesnt even have the handles lol
2. the "stick" from the intro sequence literally stabs *into* the model awkwardly
3. i would 100% be down to recreate animations that mimic the in-game ones, but that actually follow your rigs. its not too hard to import the animations into another software (for me, it ends up being Blender)
4. my inner child is SCREAMING, cause i WANT ONE, but im *damn* too poor to make one, even with changing the prints to PLA and whatnot.
all in all, friggin love this! i cant wait to see more of your projects, and id honestly be curious to see you do more Portal stuff in the future. there's a couple of other bots in P2 you could do (id love to see you do the animated arms that the test chambers use), but id also like to challenge you with the Portal 1 personality core. it has a completely nonsensical design in terms of IRL replication (it's basically a sphere sandwiched between two floating plates), but im curious as to how one could possibly work around that. maybe that could be a challenge for myself later on... who knows!
Thanks :D
This guy is underrated
The music in the ad segment for Onshape is the music from Worlds Adrift. Miss that game.
I LOVE THIS SO MUCH
Just imagine that Wheatley would just randomly turn on while he was off in the background
My FNaF awareness would just go wild
3:15
Inaccurate!1!111!!
everything in aperture can survive in apocalyptic environments as low as 1.1 volts!!!1!!1!!
I mean, changing some stuff would make that possible, like the motherboard and power control stuff
wrangling Wheatley is HILARIOUS
I'm sure you trusted that your wheel arrangement would hold him, but I believe it could hold more weight, or a bigger build, or have something hung from him like clothes or whatever, if you were to add another wheel to the top and configure them in something like a W.
3:20 - They told me that if I ever disconnected myself from my management rail, I WOULD DIE.
Dude you deserve a lot
@Harry101UK would be a great choice for a personality core's voice, he makes great videos about cores. (He's also a Valve employee [or former, idk])
3:35 where can i get these
When you unplug Wheatley make him scream which gets cut off😈
Wheatley would do nicely with an internal battery (if possible) that you can charge with those magnetic charger... things
so if he ever disconnects from his rail, he should not die... yet
It's crazy that people are taking Valve's characters that are AI and making them real
In layman's terms, it's really sick
wheatley but he's your morning alarm
Wouldn't it be possible to use some sort of inverse kinematics and use the eye as the "point" the IK follows?
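Something along those lines could work; if the "point" is just a gaze target, you don't even need a full IK solver, only the pan and tilt angles that aim the eye at a 3D point. A small sketch, assuming the eye pivots at the origin with x forward, y left, z up (a made-up convention, not the animatronic's actual frame):

```python
# Aim the eye at a 3D point: pan around the vertical axis, tilt up/down.
import math

def look_at(x, y, z):
    pan = math.degrees(math.atan2(y, x))
    tilt = math.degrees(math.atan2(z, math.hypot(x, y)))
    return pan, tilt

print(look_at(1.0, 0.3, -0.2))  # a point slightly to the left and below eye level
```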
This is so dope.
You should make more of them like the one that’s based off the turret
Harry101UK can do a Wheatley voice, maybe get him to do animation, and James Burton does a lot of work in inverse kinematics that could help
Next project: Building an Aperture Science Wheatley assembly machine
(Just kidding, I'm looking forward to what you're doing next, even if it's not portal related :) )
Kudos!!
Maybe adapt the management rail so that when you enter the room it comes out of a hatch/box and slides into the room
I love everything about this video & am thoroughly enjoying watching all of your work - but I mainly want to comment on respecting the actor's performances & not building an AI model without consent. You already had my respect for your amazing work, but it's somehow increased even more. 🙏
This guy's great!
10:10 have you considered talking to a professional animation rigger? If the animation could be extracted, it's possible that it could be translated to the other model using inverse kinematics if an expert was at the wheel.
At that point though, original animation is probably the move
What if you used facial tracking software, like Vtubers use, to have Wheatley mimic your expressions in real time?
10:47 now i kinda wanna see you make Wheatley in his central core mode
(you know, when he replaces GLaDOS)
12 volts? I wonder if there's a way to get it down to as low as 1.1 volts for apocalyptic low-power environments
remind me to come back here when i’m a billionaire so i can commission a fleet of aperture animatronics
I would love to voice a personality core some day, kinda makes me want to workshop a character in theory for fun
imagine if you used the newest version's pieces to make a more accurate Wheatley crab!
it's a shame you can't use the animation files. wish there was another way to replicate them exactly.
i think i just have his animations engraved into my head.
i actually have an idea to give wheatley more life without using the original voice actor's voice. You could hire someone like Harry101UK who does a good Wheatley impression and get his permission
When you detached the power source from Wheatley, he just died so dramatically
neat
Wheatley runs on 12 volts? Gotta make sure he can still function at 1.1 volts.
You're living my dream
Aperture is becoming real… just very slowly…
You should try to make panels!
That wrangler is sick, did you release any information about it? Would love to try and build one
People might make things like the Combine Advisor
I came from watching the GLaDOS short after I got nostalgic and listened to Still Alive; your robots move really well, it's incredible
What I think would be neat is face tracking, translated to PS3 controller inputs. It would take hella code but it would allow you to record voice lines and facial animations at the same time. Voice line plays, synced animation plays
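As a rough sketch of the face-tracking half of that idea, OpenCV's stock face detector can already turn a webcam feed into two normalised axes; mapping those onto the actual controller input (or straight onto the motors) is the part that would take the real work. The axis names and mapping below are assumptions, not anything from the video:

```python
# Webcam face tracking reduced to two stick-like axes in [-1, 1].
import cv2

face_det = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_det.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = faces[0]
        # normalise the face centre like an analogue stick axis
        stick_x = (x + w / 2) / frame.shape[1] * 2 - 1
        stick_y = (y + h / 2) / frame.shape[0] * 2 - 1
        print(f"pan axis {stick_x:+.2f}  tilt axis {stick_y:+.2f}")

cap.release()
```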
Apple vision pro: ❌
Animatronic Wheatley: ✅
8:40 So real for that.
Ok, i have a SERIOUS QUESTION! I haven't asked this yet, but I haven't seen his eye piece move inwards or outwards yet. I get that outwards may be hard given this shell is exactly the size of the eye piece, but inwards should not be a problem.
Why hasn't it happened yet?
Wheatley could have won Home Assistant's Year of the Voice contest. I believe they'll be doing another...
You know what, I think Valve was thinking right for once on this one. Totally understandable. :)
Hey, when will you make the video with the steps? It's because I'm making one