Honestly this would be great for a Grievous or Goro cosplay!
Or a Transformer based on a Fiero...
If someone managed to use this successfully in a Goro cosplay, that would be insane.
_"A fine addition to my collection!"_
It has to have the spinning hands to make it just right lol.
I prefer something more Doctor Octopus style XD!
But are you gonna stuff that arm with sensors, tools and gadgets? Please stuff it with gadgets... laser pointer, mini vacuum cleaner, Walkman, Nerf gun, abacus, ...
Using quaternions for spatial rotation will address your gimbal lock (where one axis of rotation gets confused with another once they become aligned).
Rotation matrices can also avoid the numerical singularities which we experience as gimbal lock, and they are much easier to understand so maybe he could start with that. The only downside is that rotation matrices carry 9 numbers instead of 4 for quaternions.
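For anyone who wants to poke at the difference, here's a minimal numpy sketch (the two elemental rotations are just made-up examples): both representations compose orientations without ever hitting gimbal lock, and you can see the 4-numbers-vs-9-numbers trade-off directly.

```python
import numpy as np

def quat_multiply(q1, q2):
    # Hamilton product of quaternions stored as [w, x, y, z]
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def axis_angle_to_quat(axis, angle):
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    return np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])

def axis_angle_to_matrix(axis, angle):
    # Rodrigues' rotation formula -> 3x3 rotation matrix
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

# Apply a 90-degree pitch, then a 45-degree yaw, in both representations.
q = quat_multiply(axis_angle_to_quat([0, 0, 1], np.pi / 4),
                  axis_angle_to_quat([0, 1, 0], np.pi / 2))
R = axis_angle_to_matrix([0, 0, 1], np.pi / 4) @ axis_angle_to_matrix([0, 1, 0], np.pi / 2)
print(q)  # 4 numbers, always a well-defined orientation
print(R)  # 9 numbers, same orientation, also singularity-free
```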
The problem is, his arm is quite literally a physical gimbal. It's a limitation of the anatomy to an extent, and as a result (if you want something that attaches to your arms) I believe you will have to duplicate sensors in different orientations.
ruclips.net/video/hhDdfiRCQS4/видео.html
you might like this
@@paul_vantieghem ruclips.net/video/hhDdfiRCQS4/видео.html
Geometric Algebra has a concept of bivector rotations which is also useful
Very cool! Using EEG (brain electric signals) combined with an AI is more or less what is being done by Neuralink. EMG (muscle electric signals) is also being used for moving prosthetics. Seeing these topics explored here is exciting! I can't wait for what's next.
Still planning my next move. I want to see what other suggestions there are on the video and have some time to think about what's practical.
@@jamesbruton Can this project be made into a much bigger one (like Hacksmith Industries' alien loader) with a higher budget, a crew, and more time? I would love to see something like this as open source, because we as humans are not that far off.
This wouldn't just benefit people with disabilities, but also people working with heavy machinery: they could use a much stronger, beefed-up version of this arm for lifting heavy stuff or working in unsurvivable environments (such as the deep sea and space).
I get what you're saying, but I want to nitpick: generally EEG as a term is restricted to noninvasive, through-the-skull electrical signal detection, not the more direct route Neuralink is taking.
@@Taygetea Yeah, that's why I tempered my statement with "more or less". I just wanted brevity as this is just a YouTube comment.
**thump**
**thump**
"No, bend more in this direction."
**whirring noises**
**thump**
**thump**
"Stop jerking so much."
**thump**
**thump**
**whirring noises**
Imagine what his neighbors think when they hear that coming from his attic.
This is exactly the kind of thing I've wanted to do since I mistakenly chose to focus on computer engineering in Jr. High. 🙏 Thank you so much for letting me live the science vicariously through you 😂
Changing careers is hard but not impossible. Make sure you do hobby projects in your desired field and document them properly, it'll help you get a job later!
@@experimentalcyborg can you please explain more about the documentation part?
@@ashwin372 If you document your private projects with the same quality as you are expected to document a professional project, you can use them to impress employers. "Holy shit, if this is how he documents his hobby code, his work must be amazing."
Junior high has computer engineering focus? What?
We’re one step closer to getting a real Battle Droid. Roger roger.
Roger Roger
Roger Roger
Roger roger
Roger Roger
Roger Roger
You’ve done it mate
1Mil
Awesome job. I've been watching you for years and you are continuously improving your content and builds. Keep it up. Love your work.
James, I love your content. I'm 14 and I love building things. Right now I'm trying to learn how to code for Arduino, and I hope that I can be as good as you one day.
Stick with it and you'll get there dude
Go on champ! You will be even better than him one day
You can, and you will, mate :)
Congrats on 1 million, James! Please make a video on your 3D modelling.
This is pretty impressive for a few minutes' training on a microcontroller with limited sensors. I wonder what it could do with something like an Xsens mocap suit worn for a few days and trained on a powerful GPU.
You could consider deep reinforcement learning with an added imitation-learning term in the reward function. PPO and DDPG are some solid options if you're considering it. For my thesis I'm working on the control of an active transfemoral prosthesis during normal walking using DRL.
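For anyone curious what "imitation in the reward function" can look like in code, here's a toy sketch; the weighting scheme, joint values, and demo-frame source are all invented for illustration, not taken from any actual thesis setup.

```python
import numpy as np

def shaped_reward(task_reward, joint_angles, demo_angles, w_task=1.0, w_imitate=0.5):
    # Blend the environment's task reward with an imitation term that
    # penalises deviation from the matching frame of a human demonstration.
    tracking_error = np.sum((np.asarray(joint_angles) - np.asarray(demo_angles)) ** 2)
    return w_task * task_reward - w_imitate * tracking_error

# One timestep: the policy's shoulder/elbow angles (rad) vs the demo frame.
print(shaped_reward(task_reward=1.0,
                    joint_angles=[0.8, 1.1],
                    demo_angles=[0.9, 1.0]))  # 1.0 - 0.5 * 0.02 = 0.99
```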
HAPPY 1 MILLION SUBS JAMES!!!! AND WHAT A GREAT PROJECT TO START HEADED FOR 5 MILLION!!!!
We should make public research like this more common! It's for people who are serious about researching new technology or ideas but who A. don't have access to equipment, B. aren't interested in getting a doctorate and doing research at an academic institute, and C. have practical knowledge of real-world use cases. There is lots of untapped potential for innovation out there in the world!
IMUs are so fiddly to get working if you want anything more complicated than "did the sensor tip." Have you looked at the 9 axis versions with magnetometers? You might be able to filter out the roll/pitch confusion. I'd love to see a whole series making a full IMU digital motion capture suit if you wanted to make one.
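A 9-axis setup basically lets the magnetometer anchor yaw the same way the accelerometer anchors pitch/roll. A rough complementary-filter sketch of the idea (the sensor values and the 0.98 blend factor are placeholders, and it assumes the sensor is held roughly level; real code needs tilt compensation):

```python
import math

def fuse_heading(prev_yaw, gyro_z, dt, mag_x, mag_y, alpha=0.98):
    # Gyro integration is smooth but drifts; the magnetometer heading is
    # absolute but noisy. Blend them so drift can't accumulate forever.
    gyro_yaw = prev_yaw + gyro_z * dt
    mag_yaw = math.atan2(-mag_y, mag_x)  # level-sensor approximation
    # wrap the disagreement into [-pi, pi] before correcting
    err = math.atan2(math.sin(mag_yaw - gyro_yaw), math.cos(mag_yaw - gyro_yaw))
    return gyro_yaw + (1 - alpha) * err

yaw = 0.0
# In a real loop these values come from the IMU driver every tick.
yaw = fuse_heading(yaw, gyro_z=0.01, dt=0.01, mag_x=0.3, mag_y=0.1)
print(yaw)
```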
Damn, he works quick. I've been working on this exact same project for 2 years. Definitely gonna take some stuff from this video.
This was an insane project, and it would be amazing to see it used to help disabled people in the future.
Another cool project, you have me hooked. Congratulations on the 1M !!!!!
James: *making an exoskeleton*
CIA: "Interesting"
Stuff like this is why subscribing and being a patron of yours is such an easy choice.
Now to scale it up, add the rest of the limbs, an omni treadmill, and make a mech.
You could use the learning as feedback for a 'fly-by-wire' controller, even if you have all your limbs.
Theoretically if it can predict your movements, it would react quicker than a direct 1:1 controller, with enough training.
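A toy sketch of the simpler "keep doing what you were doing" version of that idea: a constant-velocity extrapolator, nothing learned. The 100 Hz rate and the angle values are made up.

```python
def predict_next(angle_history, dt):
    # Constant-velocity extrapolation: assume the joint keeps moving the
    # way it just was until a new sensor reading says otherwise.
    if len(angle_history) < 2:
        return angle_history[-1]
    velocity = (angle_history[-1] - angle_history[-2]) / dt
    return angle_history[-1] + velocity * dt

# At 100 Hz the controller can command the predicted pose before the next
# reading lands, hiding roughly one frame of latency.
print(predict_next([0.50, 0.52], dt=0.01))  # 0.54
```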
It would probably be more like "assume I'm still doing what I was doing and continue it until you get a different input", rather than predicting the future.
You make me wonder why I'm even in college for mechanical engineering when I could have just done this type of stuff for a living. You have the dream job. I wish I could do stuff like this every day; instead I'm doing math and memorizing formulas and stuff.
Wow Just Amazing!
Would love to see an open source build for an assisted walking device, exoskeleton. Possibly for older people with walking troubles. Thanks for your great content!
I have no idea how this man has this many ideas and consistently puts out these insane videos
Thank you _so much_ for this - been working on a conceptually identical project for a few years now (but on the tensorflow/cuda level and non-public) and this honestly was a huge asset to those works. Really appreciate the upload - you may have just supercharged a bunch of efforts to make the concept closer to a deployable reality thanks to a lot of the ideas you have introduced here. All the best!
After thinking about this for a while, I realized that you had already designed a pretty good arm concept for what you're trying to do. Remember the Performance Robots? They had shoulders with a good range of motion, elbows, forearms that rotate and grippers. Anyway, it just seems like those arms would have the range of motion to adhere more consistently to the trained behavior.
I really hope you continue this project! It could end up helping a lot of people.
It seems that if you walked around normally for a day or more training the algorithm, then switched it to playback, it would probably surprise you how good it is.
Just noticed you are already at 1 million subscribers, congratulations James!!!!
Congrats on one mil!
Cool! Carry on with the useful robot project too
This is awesome. This is where people working on prosthetics should focus: not more electrodes on the skin, but more use of modern neural networks. I can hear Károly from "Two Minute Papers" saying "two more papers down the line". I can't wait for that to happen.
You're just amazing. How this video didn't get a million views is simply beyond my comprehension.
I see absolutely no one so far, sooo:
Congratulations on finally reaching the 1 million mark!
You deserve it sooo much, and I hope the next years are going to be interesting AF and also awesomer, better and much more complicated haha.
Again:
Congratulations on the 1 million!
Thank you so much 😀
Functionality aside, this is the best looking stuff you have designed, imo. Sleeker and seems even more refined.
Great work and presentation, as always!
11:38 "That's why there's a slight lag" - I like your positivity :)
Congratulations for 1 million subscribers. You DESERVE more. 👏👏
This plus the muscle monitors could be amazingly unique. Not only would you have accurate motor control, but also unique situational awareness relating signal strength to force feedback. This would allow someone the ability to differentiate between crushing a brick and shaking a hand, whereas currently gripping is much more of an on/off signal.
This project reminds me of how the multiple brains in an Octopus are thought to function.
Octopuses have multiple brains? That explains how they're so good at problem solving.
@@Matty.Hill_87 Yup, and about 60% of their nervous system is located in their tentacles, so essentially each tentacle has a mind of its own, which makes the octopus an excellent multitasker.
Also, scientists have observed that a severed tentacle is still able to move and do stuff, and one has even been seen grabbing food and bringing it to where the octopus's mouth would be (if the tentacle were still attached).
@@this_commenter_had_a_stroke That's crazy, I'm going to have to watch some documentaries now.
@@Matty.Hill_87 Cephalopods are weird and almost completely alien when compared to other animals. Granted, chordates are pretty freaky too. Sharks are literally all teeth on the outside, avians/dinosaurs grow super fast, and then there are mammals.
@@KnightsWithoutATable no source needed for mammals being freaks XD
This could really update your 2 leg walking robots. Train the AI on walking/standing still motion + remote control input. Cool stuff man!
Longtime member here, congratulations on hitting a million subs.
You’re my favorite kind of nerd !!!!!!! Love your stuff bubba
You should wear a camera on your head and gloves with markers to train a hand that picks up variously shaped objects. You could add the same markers to the robotic arm and have it mimic your gestures based on the object you want to grab.
Happy Gold Play Button James :)
Excellent demonstration with your prototype!
much excitement for this series
I don't think EMG would be helpful in this situation since you already have sensors on each joint. But the best way to get good results with a system like this is to try to get data from everything you can. For instance, try combining an EEG headset with this system using machine learning, and work on improving response time; then you should end up with a very good prosthetic system to compact, improve, and finalize. It's quite ambitious, but I believe that with your skill and knowledge you could create a fully functioning prosthetic system using machine learning, "motion capture", and EEG. I'm excited for the future of this project and this channel. Maybe you could even create a prosthetics company. Good luck!
This is really cool, can’t wait to watch what you are doing next.
1 Million subs! I've been watching you since your The RPF days. Congratulations!
James hit 1 Million Subscribers great job. 😊
CONGRATS ON THE 1M!!!
Anyone else notice at 3:43 that the company's name, Ogma, is also written in what I think is Ogham? (Thanks to a two-year-old Tom Scott video.)
Great video again, please keep up the work as it's so interesting to watch the progression while you piece all the components together. Thank you.
Thank you so much James, may god bless you and your family, and youtube.
This tech could be used for something like a mech suit (cosplay or otherwise). Of course a direct read from the arm sticker things would probably be the better option in that particular scenario.
Yeah I was thinking with a bigger motor you could strap steel to it and it wouldn’t be exhausting to wear as a suit of armor.
Very cool to see this explored in depth.
Thanks!
CONGRATS ON 1 MILLION JAMES 🎉🎉🎉
yes james, this is what I've been talking about! you can 3D print a Dexter's Laboratory exosuit if you throw in some aluminum extrusion. just make opendog longer and you're done!
Very cool. I'm really impressed with the results you got! You are a super skilled individual.
This is an amazing video; James really is a genius. I dabble in electronics & robotics, so I know how difficult it is to learn and build stuff like this, especially stuff this complicated and at the rate that he releases videos. The only critique/question I have about this one is: why didn't he just use potentiometers and MPUs/gyroscopic mechanisms in the motion suit, with a machine learning algorithm receiving data from those sensors? It could also simultaneously be fed data from another pre-trained AI algorithm trained to recognize his arms. Putting long strips of reflective tape down his arms, as well as adding strips around their circumference, would make the arm-recognition algorithm easier to train and more reliable. Once he had trained the model for long enough and was wearing the arm, he could add buttons to tell the algorithm whether a motion was good or bad, using them to progressively tune the algorithm to exactly how he likes it. Also, I'm not criticizing in any way; his method got the job done. I'm just probing a little to see if I might have any good ideas for improvements.
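The good/bad button idea is essentially online reward labelling. A toy sketch of what that nudge could look like for a linear policy (the model shape, exploration noise, and learning rate are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(3, 6))  # toy linear policy: 6 sensor inputs -> 3 joint outputs

def update_on_button(W, sensors, action, label, lr=0.01):
    # label=+1 pulls the policy toward the action it just executed,
    # label=-1 pushes it away (gradient of squared error, sign-flipped).
    sensors = np.asarray(sensors)
    grad = np.outer(W @ sensors - np.asarray(action), sensors)
    return W - label * lr * grad

sensors = rng.normal(size=6)
action = W @ sensors + rng.normal(scale=0.05, size=3)  # executed with exploration noise
W = update_on_button(W, sensors, action, label=+1)     # "good" button press
```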
“The next stage, of course, is to build an actual robot arm” is a very good sentence.
Congratulations for 1 million subscribers 😎
Hey you have 1M subscribers. Congratulations 👍👍👍👍👍
Congratulations on the 1 million subscribers 👏👏👏
Idk why, but this is so funny. All I can imagine is your wife or someone from your family just entering the room and seeing you marching in place with a plastic arm; I wonder what their reaction would be ahahahah. But very nice video and an interesting concept, like usual in all your other videos ;) Keep it going. I love robots and motors and electronics; all I wich is that I had more money to mess around doing projects like yours, with robot arms and automatic stuff :)
*wish
I designed something like this in middle school, cool to see someone with the money making it real.
WOW. simply amazing. Keep up the great work
Divide the arm up into forearm, elbow, upper arm, and shoulder. These can then be combined for the whole movement. The same can be done for legs and other body parts.
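A minimal planar forward-kinematics sketch of how per-segment data combines into one movement (the link lengths and angles are made up):

```python
import math

def forward_kinematics(segments, angles):
    # Each joint angle is relative to the previous link, so per-segment
    # readings chain together into one end-effector position.
    x = y = total = 0.0
    for length, angle in zip(segments, angles):
        total += angle
        x += length * math.cos(total)
        y += length * math.sin(total)
    return x, y

# upper arm and forearm lengths (m), shoulder and elbow angles (rad)
print(forward_kinematics([0.30, 0.25], [math.pi / 4, -math.pi / 6]))
```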
Awesome! loving this project already. Also congrats on 1 Mil
You should definitely make a choreographed slow motion fight. Have sensors on one arm and set the robot arm to react to those sensors. Then make the outputs be motions that would block slow punches and attacks. That would look really cool.
you deserve so much more
This is pretty much exactly what I've been wanting to do - but lack the money and expertise. My ultimate would be a wearable backpack rig that has at least one robotic arm that can, at the very least, reach out and hold objects.
You never cease to amaze me.
phenomenal, hoping to see more!
Thank you so very much for sharing your process and results.
I have been thinking of working on a similar concept, but utilising FPV quad flight controller boards, as they work on all axes in a very small form factor.
They could be used to record the data of one limb, and also to output the data to manually control a prosthetic.
The flight controllers are small enough to attach just below each point of articulation on the "learning" limb. Combined with your outlined machine learning, they should then be able to output motion commands to result in a more natural movement style.
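For reference, most of those boards speak the classic MultiWii Serial Protocol, so pulling attitude off one can be quite simple. A rough sketch using pyserial and MSP v1's MSP_ATTITUDE command (ID 108); the frame layout is from the MSP v1 spec as I remember it, so double-check it against your firmware:

```python
import struct
import serial  # pyserial

def msp_attitude(port):
    # Request MSP_ATTITUDE: "$M<" + size + cmd + checksum, where the
    # checksum is the XOR of the size, cmd and payload bytes.
    cmd = 108
    port.write(b"$M<" + bytes([0, cmd, 0 ^ cmd]))
    header = port.read(5)          # b"$M>" + size byte + cmd byte
    size = header[3]
    payload = port.read(size)
    port.read(1)                   # trailing checksum, not verified here
    # Payload: roll and pitch as int16 in 0.1 degree, heading as int16 degrees.
    roll, pitch, heading = struct.unpack("<hhh", payload[:6])
    return roll / 10.0, pitch / 10.0, float(heading)

# ser = serial.Serial("/dev/ttyACM0", 115200, timeout=1)  # port name will vary
# print(msp_attitude(ser))
```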
You should make a wearable, low-profile and lightweight powered exoskeleton that can be worn while driving a car, and can assist in driving heavy vehicles on rough terrain without power steering.
I think it would do you good to create a wireless IMU tracker "puck" you can mesh into different configurations: an IMU + Arduino Nano + BT/WiFi module "unit" you can attach to your body and, with some configuration, attach to an IK rig.
This way any time you need to read limb position/rotation for a project, you just strap on pucks on the required body parts, assign each puck to a rig body part, and read out the IK limb positions.
It would require a bit of research and development, but would free you from further case-specific tracker development.
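A sketch of what each puck's streaming loop might boil down to, written in Python for brevity (the broadcast address, port, puck ID, and the stubbed IMU read are all placeholders):

```python
import json
import socket
import time

HOST, PORT = "192.168.1.255", 9000   # placeholder broadcast address
PUCK_ID = "upper_arm_left"           # assigned when configuring the rig

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)

def read_imu_quaternion():
    # stand-in for the real sensor driver on the puck
    return (1.0, 0.0, 0.0, 0.0)

while True:
    w, x, y, z = read_imu_quaternion()
    packet = {"id": PUCK_ID, "q": [w, x, y, z], "t": time.time()}
    sock.sendto(json.dumps(packet).encode(), (HOST, PORT))
    time.sleep(0.01)  # ~100 Hz; the host maps each ID onto an IK rig bone
```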
In the future we'll be able to add prosthetic limbs and control them with our minds. I'd imagine it would take about a week of real use to master them, but the possibilities of things you can do once you do will be so fun.
The rudimentary motion capture, if worked on, could be a good system for VR full-body tracking.
You're the real-life Dr. Octavius.
3:00 Quaternions!
Just saw a Scott Manley video about Gimbal Lock and how quaternions solve this issue
You might want to get some 6-axis IMUs, 3-axis magnetic sensors, piezoelectric flexing sensors, velocity encoders and PID controllers all involved. The more sensors, the better. (And let's not forget a camera-based motion capture rig, and maybe some additional sensors on your chest to detect your abdominal muscle movements.)
If you combined this with both OpenBCI's and MyoWare's stuff, you might get some pretty awesome results. Multiple layers of context-sensitive filtering might help as well.
Honestly, with the current state of brain-computer interfaces, they really are best suited for supplementing other input methods rather than replacing them.
This is fascinating! I'm sure Ogma can open a lot of possibilities for future projects but I wonder if you could use it to improve the efficiency of past projects like OpenDog and/or one of your Walking Robots (which I always like to see)!
you are an amazing engineer who looks like he is still living in his childhood bedroom
Always wanted some Doc Ock arms; this is the way to do it :)
Nice work!
It would have been really interesting to see you wear motion capture for a day as you went about your normal routine, training a model, and then wear the arm with the other motion capture and see what the arm did.
Just casually makes a motion capture suit. This would be really good for people who do 3D animation and VR! If you can do a video purely on making a motion capture suit, I'm sure it would help a lot.
Congratulations on 1 million subs, Mr. James! I hope your channel grows even more.
awesome stuff as always James
This is just damn cool
Congrats on 1mil!
Very cool project, got me excited to fire up the ol' 3D printer again.
Awesome work, James!
Coolest channel on YouTube.
How about controlling a large-scale excavator with the motions of a toy excavator? Or a backhoe or a bobcat/skid steer/whatever the British word for them is. I think it would make excavator operation more accessible, which... might be good.
James, you could give the BNO080 IMU a try. It has internal processing that outputs quaternions, which are highly useful for finding global orientation. That will reduce the processing cost and allow you to sample data at a higher rate, also solving your vertical orientation problem. Amazing video!
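A sketch of how little host-side code that takes, assuming Adafruit's CircuitPython BNO08x driver (the BNO080 and BNO085 share the protocol it speaks); pin names depend on the host board:

```python
import board
import busio
from adafruit_bno08x import BNO_REPORT_ROTATION_VECTOR
from adafruit_bno08x.i2c import BNO08X_I2C

i2c = busio.I2C(board.SCL, board.SDA)
bno = BNO08X_I2C(i2c)
bno.enable_feature(BNO_REPORT_ROTATION_VECTOR)

while True:
    # The on-board fusion core hands back a ready-made unit quaternion,
    # so the host does no filtering of its own.
    i, j, k, real = bno.quaternion
    print(real, i, j, k)
```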
I wonder if full body tracking for like a VR suit would provide better data, or perhaps a camera system?
The other source of data could be measuring EKG on the limbs as opposed to the head, which might yield less noisy data.
Is Ultron on display in a science center or an amusement park somewhere?
Recycled unfortunately (I still have the head).
This is awesome! Great start!
What a wild project. This is so cool!