Free - Stanford SCIEN: “Co-Optimizing Human-System Performance in XR”


Details
You must register in advance to receive the Zoom link.
Talk Abstract: Extended Reality (XR) enables unprecedented possibilities for displaying virtual content, sensing physical surroundings, and tracking human behaviors with high fidelity. However, we still haven’t created “superhumans” who can outperform what we are capable of in physical reality, nor a “perfect” XR system that delivers infinite battery life or fully realistic sensation. In this talk, I will discuss some of our recent research on leveraging eye/muscular sensing and learning to model our perception, reaction, and sensation in virtual environments. Based on this knowledge, we create just-in-time visual content that jointly optimizes human performance (such as reaction speed to events) and system performance in XR.
Speaker Biography: Qi Sun is an assistant professor at New York University. Before joining NYU, he was a research scientist at Adobe Research. He received his PhD from Stony Brook University. His research interests lie in computer graphics, VR/AR, computational cognition, and human-computer interaction. He is a recipient of the IEEE Virtual Reality Best Dissertation Award, as well as an ACM SIGGRAPH Best Paper Award.
Online: You can join online by registering in advance for this meeting:
https://stanford.zoom.us/meeting/register/tJIrf-GpqjwuHtLg5IBGXPAAlxIvR_V3eAQb
After registering, you will receive a confirmation email containing information about joining the meeting.
