When: Wednesday, November 6, 2024, 12:00 PM - 1:00 PM
Where: Hybrid - (on campus) PL 4.24 (Peter Landin) or online
Speaker: Dr. Sean Banerjee
In this talk, Dr. Sean Banerjee will present his research group's work on securing serious VR applications by leveraging a user's behavior within the VR task as a biometric signature.
Dr. Banerjee will present his group's work on using deep neural networks to enable the first approach to cross-system VR biometrics, where users provide enrollment data on one VR system, such as an HTC Vive, and use-time data on another, such as an Oculus Quest. He will demonstrate how temporal effects across short-, mid-, and long-range time scales impact behavioral biometrics in VR, and show how his group uses multi-range data to provide resilience to changing user behavior in a VR environment. While deep-learning approaches to behavioral biometrics achieve high accuracy when given complete or near-complete portions of the user trajectory, they struggle when given only limited data from the start of the task. He will present results from his group's work on the first approach that performs user authentication by forecasting future user behavior with Transformer-based forecasting. Finally, he will discuss his future vision of connecting an understanding of real-world multi-person interaction behaviors, acquired using multi-modal sensing, to the widespread development of secure VR applications for areas such as healthcare, finance, and education.
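To make the forecast-then-authenticate idea concrete, below is a minimal, hypothetical PyTorch sketch, not the group's actual model: a Transformer encoder summarizes the first frames of a VR motion trajectory, one head forecasts future frames, and another produces an identity embedding that is compared against a stored enrollment embedding by cosine similarity. The feature layout, layer sizes, and acceptance threshold are all invented for illustration.

```python
# Illustrative sketch only (not the speaker's implementation): a Transformer
# encoder over the observed start of a VR trajectory, with a forecasting head
# and an identity-embedding head for authentication. All dimensions are
# hypothetical.
import torch
import torch.nn as nn

FEAT = 21      # hypothetical: pose features for headset + two controllers
SEEN = 30      # frames observed from the start of the task
HORIZON = 90   # future frames to forecast

class ForecastAuthNet(nn.Module):
    def __init__(self, d_model=64):
        super().__init__()
        self.proj = nn.Linear(FEAT, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        self.forecast = nn.Linear(d_model, HORIZON * FEAT)  # predict future frames
        self.embed = nn.Linear(d_model, 32)                 # identity embedding

    def forward(self, seen):                  # seen: (B, SEEN, FEAT)
        h = self.encoder(self.proj(seen))     # (B, SEEN, d_model)
        pooled = h.mean(dim=1)                # summarize observed motion
        future = self.forecast(pooled).view(-1, HORIZON, FEAT)
        emb = nn.functional.normalize(self.embed(pooled), dim=-1)
        return future, emb

model = ForecastAuthNet()
enrolled = nn.functional.normalize(torch.randn(32), dim=-1)  # stored at enrollment
seen = torch.randn(1, SEEN, FEAT)                            # start-of-task motion
_, emb = model(seen)
score = torch.dot(emb.squeeze(0), enrolled)                  # cosine similarity
print("accept" if score > 0.8 else "reject")                 # hypothetical threshold
```

In this toy version the forecasting and authentication heads merely share an encoder; in the approach described in the talk, the forecast trajectory itself is used to support authentication from limited start-of-task data.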
About the speaker:
Dr. Sean Banerjee is an Associate Professor and Lexis-Nexis Endowed Co-Chair for Advanced Data Science and Engineering in the Department of Computer Science and Engineering at Wright State University, Dayton, Ohio. He is co-founder and co-director of the Terascale All-Sensing Research Studio (TARS) and performs research at the intersection of human-computer interaction and AI. His research focuses on using human behavior data, collected with multi-view, multi-modal sensing systems in real and virtual environments, to develop AI algorithms that are aware of the nuances of everyday human-human and human-object interactions, with applications in VR security, immersive learning environments, assistive robotics, and healthcare.
Key contact: Manolis Chiou