
YouTube

VF-Lens: Enhancing Visual Perception of Visually Impaired Users in VR via Adversarial Learning with Visual Field Attention

IEEE via YouTube

Overview

This IEEE conference talk presents VF-Lens, a generative adversarial model that enhances image perception for visually impaired users in virtual reality environments. Learn how researchers from The University of Sydney, Beijing Technology and Business University, and Shandong University developed an adaptive system that compensates for light sensitivity and generates customized hyperimages tailored to each user's visual field parameters. Discover how the "visual field attention" mechanisms prioritize critical visual information, enabling visually impaired users to experience perception closer to that of normally sighted viewers. The presentation also covers the model's applicability to various types of visual impairment without added engineering complexity, the extensive evaluations demonstrating its effectiveness, and a standardized evaluation process that ensures reusability for future research in VR accessibility.

Syllabus

VF-Lens: Enhancing Visual Perception of Visually Impaired Users in VR ...

Taught by

IEEE Virtual Reality Conference

