Details

Talk 1: Exploration of Neural-Gestural Interfaces for the Control of Robots - by Rebecca Oet, Melissa Kazazic

This project aims to use gestures and neural feedback to create a natural communication channel between the user and a robot. The gestural data is processed by the Myo armband; the neural data (the user’s concentration level) is processed by the Emotiv headset. Combined, these two streams direct the robot’s movement.

We’ll discuss:

  • Background on neural interfaces, gestural interfaces, and robots
  • Overview of technology used in the project: Emotiv headset, Myo Band, Raspberry Pi
  • Python scripts
  • Applications and future directions
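
For a sense of how the two streams described above might be combined, here is a minimal sketch in Python. One natural approach (assumed here, not confirmed by the talk) is to gate gesture commands on the concentration level. The helpers read_gesture(), read_concentration(), and the robot object are hypothetical placeholders for the Myo SDK, the Emotiv API, and the Raspberry Pi motor code; the gesture names and threshold are illustrative assumptions.

    # Hypothetical fusion loop: drive the robot from gestures, but only
    # while the headset reports that the user is concentrating.
    import time

    CONCENTRATION_THRESHOLD = 0.6  # assumed 0.0-1.0 scale from the headset

    # Assumed mapping from Myo gesture names to robot commands.
    GESTURE_COMMANDS = {
        "fist": "stop",
        "wave_in": "turn_left",
        "wave_out": "turn_right",
        "fingers_spread": "forward",
    }

    def control_loop(read_gesture, read_concentration, robot):
        """Poll both devices and forward commands while the user is focused."""
        while True:
            gesture = read_gesture()              # e.g. "fist", "wave_in", ...
            concentration = read_concentration()  # e.g. 0.0 (idle) to 1.0 (focused)

            if concentration < CONCENTRATION_THRESHOLD:
                robot.send("stop")                # ignore gestures when unfocused
            elif gesture in GESTURE_COMMANDS:
                robot.send(GESTURE_COMMANDS[gesture])

            time.sleep(0.05)                      # poll at roughly 20 Hz

    # Usage would look something like:
    # control_loop(myo.read_gesture, emotiv.read_concentration, RobotDriver())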

---

Module of the Month: pybleno - by Gary Johnson
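
pybleno is a Python port of the Node.js bleno library for building Bluetooth Low Energy peripherals, which makes it a natural fit for Raspberry Pi projects like the one above. Below is a minimal sketch of its bleno-style API: a peripheral that advertises one service with a single read-only characteristic. The UUIDs, device name, and payload are made up for illustration and may differ from what Gary covers.

    import time
    from pybleno import Bleno, BlenoPrimaryService, Characteristic

    SERVICE_UUID = 'ec00'          # illustrative 16-bit UUIDs, not from the talk
    CHARACTERISTIC_UUID = 'ec0e'

    bleno = Bleno()

    class StatusCharacteristic(Characteristic):
        """Read-only characteristic that returns a short status string."""
        def __init__(self, uuid):
            Characteristic.__init__(self, {
                'uuid': uuid,
                'properties': ['read'],
                'value': None,
            })

        def onReadRequest(self, offset, callback):
            callback(Characteristic.RESULT_SUCCESS, bytearray(b'ok')[offset:])

    def on_state_change(state):
        # Start advertising once the Bluetooth adapter is powered on.
        if state == 'poweredOn':
            bleno.startAdvertising('clepy-demo', [SERVICE_UUID])
        else:
            bleno.stopAdvertising()

    def on_advertising_start(error):
        if not error:
            bleno.setServices([
                BlenoPrimaryService({
                    'uuid': SERVICE_UUID,
                    'characteristics': [StatusCharacteristic(CHARACTERISTIC_UUID)],
                })
            ])

    bleno.on('stateChange', on_state_change)
    bleno.on('advertisingStart', on_advertising_start)

    if __name__ == '__main__':
        bleno.start()
        try:
            while True:
                time.sleep(1)      # keep the process alive while advertising
        except KeyboardInterrupt:
            bleno.stopAdvertising()
            bleno.disconnect()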

---

Talk 2: Open

Want to present a talk? Submit it here: https://www.papercall.io/clepy

Sponsors

Python Software Foundation

Thank you for supporting ClePy on meetup.com

Happy Dog

Thank you Happy Dog for providing us with space and equipment.
