Haptics for XR – How designing great haptics can support use case accessibility


Details
Hand interaction with virtual content in XR is mainly mediated by controllers equipped with haptic actuators.
Haptic feedback can enhance the perception and perceived quality of XR content in many applications, gaming, training, and marketing among them. Haptics can also play a strong role in accessibility through sensory substitution: delivering through the sense of touch information that is otherwise conveyed by sound or vision.
One of the challenges of haptics in XR today is creating compelling haptic experiences that support and enhance the XR content for the use case. The talk will focus on haptic design guidelines for XR and a practical approach to implementing and realizing them with the Interhaptics Haptic Composer.
Tim will follow Eric's presentation with a brief overview of how haptics are used on the Magic Leap Control and Mobile Companion App as a means of expanding sensory feedback for a mixed reality headset, complementing 2D/3D spatial interactions, signaling system functions, and more. He will pay particularly close attention to the accessibility use cases at play.
Presenter Bio:
Eric is a haptics enthusiast working to break down the barriers that keep haptics from scaling.
Eric designed the haptics architecture of Interhaptics and led a team of engineers and marketing colleagues to the launch of the Haptics Composer, the first multiplatform haptics design tool on the market.
He co-founded Go Touch VR, exploring skin-indentation technologies for haptic applications in VR. Go Touch VR raised $2 million to create new technologies for XR.
Eric was the CTO of Hap2U, in charge of the software team designing the rendering stack to drive surface haptics technologies at their full capabilities. Hap2U raised $6 million from Daimler and aims to revolutionize mobile haptics as we know it.
He has published 20+ scientific papers on haptics and human-machine interaction and filed 5 patents on haptics technologies and applications.
Presenter Bio:
Tim Stutts is a multifaceted designer drawn to challenges touching on interaction, sensory feedback, systems design, data visualization, and spatial computing. He is a cautiously optimistic technologist and an advocate for data privacy and inclusive design. Tim is currently Director of Product Design at Faceware, a prominent developer of facial motion capture software and hardware. Previous endeavors include leading the design of various aspects of the Magic Leap spatial computing platform, developing AI-driven news data visualizations at IBM Watson's Cognitive Visualization, and prototyping augmented reality situational-awareness head-up display concepts for Honda Research Institute.
Location:
Mozilla Hubs Space
https://hubs.mozilla.com/G6edX79/a11yvr-project/
YouTube Channel https://www.youtube.com/channel/UCqhCc1b6Cq69eg-iYeVKOog
Timeline:
Please NOTE: Meetup does not handle time zones well. This Meetup is set in Tokyo, Japan, so it will display differently from your local time zone. When you RSVP, use the "Add to calendar" link to receive a calendar invite converted to your time zone.
This timeline is based on the EST time zone.
1:00 - 1:10
Login/explore/troubleshooting audio and video
1:10 - 1:15
Introduction to A11yVR
1:15 - 2:00
Presentation
2:00 - 2:10
Q&A with attendees
2:10 - 3:00
Breakout sessions in small groups to get to know each other in virtual reality. It is up to you if you want to stay and continue meeting other people in the room.
Accessibility Details:
Mozilla Hubs holds weekly office hours that can be used to practice and gain an understanding of how to interact in Mozilla Hubs.
