2013 | Original Paper | Book Chapter
Grasping with Your Face
Authors: Jonathan Weisz, Benjamin Shababo, Lixing Dong, Peter K. Allen
Published in: Experimental Robotics
Publisher: Springer International Publishing
BCI (Brain-Computer Interface) technology shows great promise in the field of assistive robotics. In particular, severely impaired individuals lacking the use of their hands and arms would benefit greatly from a robotic grasping system that can be controlled by a simple and intuitive BCI. In this paper we describe an end-to-end robotic grasping system that is controlled by only four classified facial EMG signals, resulting in robust and stable grasps. A front-end vision system is used to identify objects to be grasped and register them against a database of models. Once the model is aligned, it can be used in a real-time grasp planning simulator that is controlled through a non-invasive and inexpensive BCI interface in both discrete and continuous modes. The user can control the approach direction through the BCI interface, and can also assist the planner in choosing the best grasp. Once the grasp is planned, a robotic hand/arm system can execute it. We show results in using this system to pick up a variety of objects in real time, from a number of different approach directions, using facial BCI signals exclusively. We believe this system is a working prototype for a fully automated assistive grasping system.
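To make the control scheme concrete, the loop implied by the abstract (four classified facial EMG events driving a grasp planner) could be sketched as a simple command dispatcher. This is a minimal illustrative sketch only: the four class labels and the planner commands below are assumptions for exposition, not the authors' actual interface or the GraspIt!-style planner API.

```python
from enum import Enum

class EMGClass(Enum):
    """Four discrete facial EMG classes (hypothetical labels, not the
    paper's actual classifier outputs)."""
    BROW_RAISE = 0
    JAW_CLENCH = 1
    LEFT_SMIRK = 2
    RIGHT_SMIRK = 3

# Assumed mapping from classified EMG event to a grasp-planner command:
# two signals steer the approach direction, one cycles candidate grasps,
# one confirms the chosen grasp and triggers execution.
COMMAND_MAP = {
    EMGClass.LEFT_SMIRK: "rotate_approach_ccw",
    EMGClass.RIGHT_SMIRK: "rotate_approach_cw",
    EMGClass.BROW_RAISE: "next_candidate_grasp",
    EMGClass.JAW_CLENCH: "confirm_and_execute",
}

def dispatch(signal: EMGClass) -> str:
    """Translate one classified facial EMG event into a planner command."""
    return COMMAND_MAP[signal]
```

In a discrete mode, each classified event would fire one command; in a continuous mode, a sustained signal (e.g. a held smirk) could instead be integrated over time to rotate the approach vector smoothly.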