MIT News: news.mit.edu/2...
Paper: arxiv.org/abs/...
Authors: Sandra Liu (MIT CSAIL) & Ted Adelson (MIT CSAIL)
Videographer: Mike Grimmett
Director: Rachel Gordon
PA: Alex Shipps
Good progress all in all, but still a long way to go toward a holistic robotic hand that is as compliant as a human hand and adept at varying actuation pressure based on the object being grasped.
You guys are amazing!
The machines rose from the ashes of GPT.
go ted
This is so cool, a deeper dive into this would be amazing!
Can you please elaborate on:
1) What parameter is being sensed? Is it the extent (angle) of flexion/extension, or a pressure analog for the loading on the digits?
2) What is the inference parameter, and what does it have to do with color? What does the camera "see" to make its inference?
Very inspiring demos. A simple and minimalistic, yet practical approach.