
Accessible Multimodal Input in Augmented Reality Training Applications

Hosted By
Meryl K. E. and Thomas L.

Details

In this talk, Ashley Coffey will share recent PEAT research on making immersive hybrid workplaces more inclusive and accessible. She'll reflect on themes explored in an "Inclusive XR in the Workplace" white paper co-authored with the XR Association:
https://www.peatworks.org/futureofwork/xr/inclusiveworkplacexr/

Tim Stutts will share his design experience with multimodal inputs (hardware controls, gesture, voice) and sensory feedback (graphics, LEDs, sound, haptics) for augmented reality head-mounted displays. He will touch briefly on his previous work on Magic Leap's Lumin OS, symbolic input, and related accessibility efforts, before diving into the current Vuforia Work Instructions applications for HoloLens, RealWear, and most recently Magic Leap, with the debut of the Capture and Vantage applications on that platform.

PRESENTER BIOS
Tim Stutts (he/him) is a multifaceted designer drawn to challenges involving interaction, input, user experience, prototyping, sensory feedback, systems design, data visualization, and spatial computing. Tim is an ethical technologist who values data privacy and accessibility. He has experience as an individual contributor, as well as directing the efforts of small design teams to solve complex challenges.

He is a Principal Augmented Reality Product Designer at PTC Vuforia, where he leads the design of Work Instructions applications for head-mounted displays. He previously worked at Faceware Technologies, Magic Leap, and IBM. https://bit.ly/3ySQll2

Ashley Coffey (she/her) is a Consultant on Emerging Technology Accessibility at the Partnership on Employment and Accessible Technology (PEAT). In this role, she works to advance the accessibility of emerging workplace technologies to increase employment opportunities for people with disabilities. Ashley also leads the Business Case XR workstream within the XR Access initiative and serves as a leader on the Community & Engagement Team. She is passionate about furthering the adoption of inclusive design for accessible technology.

Previously, Ashley worked as an Emerging Technologies Librarian at the University of Oklahoma, where she implemented inclusive design practices for integrating XR tools into research, instruction, and innovation.

ACCESSIBILITY
The presentation will be captioned (CC). The event will also have real-time captions and a YouTube live stream.

LIVESTREAM
The location of the virtual event is to be determined. Wherever it is hosted, the event will be streamed to YouTube through a partnership with the Internet Society Accessibility SIG (a11ySIG).

Link: https://youtu.be/HERoRreHNAQ

SPONSORS
Thanks to Equal Entry for sponsoring.

ACCREDITATION
All A11yVR meetups are pre-approved for IAAP Continuing Accessibility Education Credits (CAEC).

TIMELINE
Please NOTE: Meetup does not handle time zone conversion well. This Meetup is set to Tokyo, Japan time, so it will display differently from your local time zone. When you RSVP, use the "Add to calendar" link to receive a calendar invite converted to your time zone.

This timeline is based on New York Eastern Time (ET).

7 - 7:15 PM
Thomas Logan: Introduction to A11yVR

7:15 - 7:50 PM
Presentation

7:50 - 8 PM
Q&A with attendees

A11yVR - Accessibility Virtual Reality Group