Non-Gaming XR #17: Eye-Tracking & Immersive Tech - Hardware, Content & Ethics

This is a past event

68 people went


Details

Eye-tracking technology within VR/AR headsets is a breakthrough innovation enabling next-generation applications. Non-Gaming XR is proud to bring you an opportunity to test this groundbreaking immersive technology and interact with our distinguished speakers about the future of VR/AR technology as it relates to eye tracking!

Join us at our 17th meetup as we welcome a host of speakers who will discuss the broad uses of eye tracking in immersive technology. From hardware and content to ethical implications, we will cover a wide spectrum of innovation with real-world applications.

TICKETS HERE: https://bit.ly/2NZ6LEx

SPEAKERS

Cory Corvus - HTC Vive Pro Eye

Cory Corvus is a Developer Relations Engineer at HTC VIVE and has degrees in Psychology and Computer Science with a game programming specialization. He has been developing in virtual reality for over five years and has experience with desktop and mobile VR, including eye tracking.

Cory will explain how eye tracking works on the new Vive Pro Eye and how to implement Dynamic Foveated Rendering with NVIDIA VRS (Variable Rate Shading). He will review the headset's features and how to get started developing with eye tracking.
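The core idea of foveated rendering is simple: shade at full rate only where the user is looking, and coarsen the shading rate toward the periphery. A minimal sketch of that region selection, using hypothetical normalized screen coordinates and illustrative radii (the actual rates and thresholds would come from the VRS API, not from here):

```python
import math

def shading_rate(tile_center, gaze, inner=0.15, outer=0.35):
    """Pick a VRS-style shading rate for a screen tile based on its
    distance from the current gaze point (coords normalized to 0..1).
    The rate names mirror common VRS options (1x1 full rate, 2x2,
    4x4 coarse); the radii `inner`/`outer` are illustrative only."""
    d = math.dist(tile_center, gaze)
    if d < inner:
        return "1x1"   # foveal region: full shading rate
    elif d < outer:
        return "2x2"   # parafoveal: one sample per 2x2 pixel block
    else:
        return "4x4"   # periphery: coarsest shading rate
```

For example, a tile under the gaze point gets `"1x1"`, while a tile in the far corner of the screen gets `"4x4"`. In a real integration the gaze point updates every frame from the eye tracker, which is what makes the foveation "dynamic."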

Pierre-Yves Laffont - Lemnis Technologies

Pierre-Yves is the Co-founder and CEO of Lemnis Technologies, a company tackling the sensory conflicts that cause negative side effects in VR, including dizziness, nausea, and eye strain. After two years of development, the computer vision scientists and engineers at Lemnis Technologies have developed an accurate eye tracker that is calibrated only once per user. This significantly lowers the barrier to entry for eye tracking in a professional setting, with no recalibration needed when repositioning the headset or after taking it off.
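To see what "calibration" means here in the simplest case: the user fixates a few known on-screen points, and a per-user mapping from raw tracker output to screen coordinates is fitted from those samples. The toy sketch below fits a gain and offset per axis by least squares; Lemnis's actual one-time calibration is not public and is certainly more sophisticated than this:

```python
def fit_axis(raw, target):
    """Least-squares gain/offset for one axis: target ~ gain*raw + offset."""
    n = len(raw)
    mr, mt = sum(raw) / n, sum(target) / n
    var = sum((r - mr) ** 2 for r in raw)
    gain = sum((r - mr) * (t - mt) for r, t in zip(raw, target)) / var
    return gain, mt - gain * mr

def calibrate(raw_pts, target_pts):
    """Fit an independent gain/offset per axis from paired 2D samples."""
    cal_x = fit_axis([p[0] for p in raw_pts], [p[0] for p in target_pts])
    cal_y = fit_axis([p[1] for p in raw_pts], [p[1] for p in target_pts])
    return cal_x, cal_y

def apply_cal(cal, point):
    """Map a raw tracker point to calibrated screen coordinates."""
    (gx, ox), (gy, oy) = cal
    return gx * point[0] + ox, gy * point[1] + oy
```

The "once per user" claim is the interesting part: it implies the fitted mapping depends only on the user's eyes, not on how the headset happens to sit on their face that session.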

Daniel Beeler - SyncThink

Dan joined SyncThink in 2010 and serves as CTO. His background is in astrophysics, and he was recruited into SyncThink for his expertise in image analysis. Dan is an inventor on three eye-tracking patents. He led the technology development and regulatory effort for EYE-SYNC, SyncThink's first medical device. Dan has worked closely with Dr. Jam Ghajar, Stanford University, and the Brain Trauma Foundation to deliver their clinical innovations to users in need.

The SyncThink platform is a VR-based eye-tracking system that can assess brain health and improve visual performance. The platform provides a battery of assessments that can be used to rapidly detect visual impairments within the oculomotor and oculo-vestibular systems.
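One common building block for such assessments is a smooth-pursuit task: the user follows a dot moving on a circle while the tracker records gaze, and the deviation between gaze and target becomes a tracking score. The sketch below computes a mean pursuit error for that setup; the task geometry, parameters, and scoring here are illustrative, not EYE-SYNC's actual clinical metrics:

```python
import math

def pursuit_error(gaze_samples, times, radius=1.0, period=2.0):
    """Mean distance between gaze samples and a target dot moving on a
    circle of `radius`, one lap per `period` seconds. A toy version of
    a circular smooth-pursuit assessment; lower scores mean tighter
    tracking. All units/thresholds are illustrative."""
    errs = []
    for (gx, gy), t in zip(gaze_samples, times):
        angle = 2 * math.pi * t / period
        tx, ty = radius * math.cos(angle), radius * math.sin(angle)
        errs.append(math.hypot(gx - tx, gy - ty))
    return sum(errs) / len(errs)
```

A participant tracking the dot perfectly would score near zero; elevated or erratic error over the trial is the kind of signal an oculomotor assessment battery would flag for clinical follow-up.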

DEMOS

Cognitive3D will demo eye-tracking data captured from the Vive Pro Eye, including gaze data and fixations on their 3D data visualization tool, SceneExplorer. Additional tools for aggregate object engagement will be demonstrated. They will explore how using smooth pursuit in 3D engines has enabled accurate object engagement data. Use cases will include market research as well as training and simulation.
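The intuition behind smooth-pursuit-based engagement is that eyes following a moving object stay locked onto it across many consecutive frames, which is a much stronger signal than a single gaze-ray hit. A simplified stand-in for that matching, with made-up thresholds (this is not Cognitive3D's actual algorithm):

```python
import math

def engaged_with(gaze_track, object_track, max_dist=0.1, min_frac=0.8):
    """Attribute engagement with a moving object when the gaze stays
    within `max_dist` of the object's position for at least `min_frac`
    of the sampled frames. Both tracks are per-frame 2D/3D positions
    in the same coordinate space; thresholds are illustrative."""
    hits = sum(
        1 for g, o in zip(gaze_track, object_track)
        if math.dist(g, o) <= max_dist
    )
    return hits / len(gaze_track) >= min_frac
```

Comparing whole tracks rather than single samples is what makes pursuit-based attribution robust to tracker noise, which matters for the market-research and training use cases mentioned above.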