Robotic palm mimics human touch

  • Published: 27 Sep 2024
  • MIT News: news.mit.edu/2...
    Paper: arxiv.org/abs/...
    Authors: Sandra Liu (MIT CSAIL) & Ted Adelson (MIT CSAIL)
    Videographer: Mike Grimmett
    Director: Rachel Gordon
    PA: Alex Shipps

Comments • 7

  • @arods 4 months ago +1

    You guys are amazing!

  • @shivamduhan7700 4 months ago +1

    Good progress all in all, but still a long way to go to make a holistic robotic hand that is as compliant as a human hand and adept at varying actuation pressure based on the object being grabbed.

  • @Lagoad 4 months ago

    The machines rose from the ashes of GPT.

  • @macratak 4 months ago

    go ted

  • @paulalubbe 4 months ago +2

    This is so cool, a deeper dive into this would be amazing!

  • @AdityaMehendale 4 months ago +1

    Can you please elaborate upon:
    1) What parameter is being sensed? Does it sense the extent (angle) of flexion/extension, or is it a pressure analog for the loading on the digits?
    2) What is the inference parameter, and what does it have to do with color? What does the camera "see" to make its inference?

  • @AerialWaviator 4 months ago +1

    Very inspiring demos. A simple and minimalistic, yet practical approach.