- VRSight: An AI-driven Scene Description System to Improve Virtual Reality Accessibility for Blind People (UIST 2025)
We present VRSight, an end-to-end system that recognizes VR scenes post hoc through a set of AI models (e.g., object detection, depth estimation, LLM-based atmosphere interpretation) and generates tone-based, spatial audio feedback, empowering blind users to interact in VR without developer intervention.
- Inclusive Avatar Guidelines for People with Disabilities: Supporting Disability Representation in Social Virtual Reality (CHI 2025)
Our work aims to advance avatar design practices by delivering a set of centralized, comprehensive, and validated design guidelines that are easy to adopt, disseminate, and update. Through a systematic literature review and interviews with 60 participants with various disabilities, we derived 20 initial design guidelines covering diverse disability expression methods across five aspects: avatar appearance, body dynamics, assistive technology design, peripherals around avatars, and customization control. We further evaluated the guidelines via a heuristic evaluation study with 10 VR practitioners, validating their coverage, applicability, and actionability. Our evaluation resulted in a final set of 17 design guidelines with recommendation levels.
- Springboard, Roadblock or “Crutch”?: How Transgender Users Leverage Voice Changers for Gender Presentation in Social Virtual Reality (IEEE VR 2024)
We interviewed 13 transgender and gender-nonconforming users of social VR platforms, focusing on their experiences with and without voice changers to explore the connection between avatar embodiment and voice representation (Povinelli and Zhao 2024).
- A Diary Study in Social Virtual Reality: Impact of Avatars with Disability Signifiers on the Social Experiences of People with Disabilities (ASSETS 2023)
We conducted a diary study with 10 People with Disabilities who freely explored VRChat for two weeks, comparing their experiences between using regular avatars and avatars with disability signifiers (i.e., avatar features that indicate the user’s disability in real life) (Zhang et al. 2023).
- A Preliminary Interview: Understanding XR Developers’ Needs towards Open-Source Accessibility Support (IEEE VRW 2023)
We investigated XR developers' practices, challenges, and needs when integrating accessibility in their projects (Ji et al. 2023).
- “It’s Just Part of Me:” Understanding Avatar Diversity and Self-presentation of People with Disabilities in Social Virtual Reality (ASSETS 2022)
We explored people with disabilities’ avatar perception and disability disclosure preferences in social VR by (1) conducting a systematic review of fifteen popular social VR applications to evaluate their avatar diversity and accessibility support and (2) interviewing 19 participants with different disabilities to understand their avatar experiences (Zhang et al. 2022).
- VRBubble: Enhancing Peripheral Awareness of Avatars for People with Visual Impairments in Social Virtual Reality (ASSETS 2022)
We designed VRBubble, an audio-based VR technique that provides surrounding avatar information based on social distances. Based on Hall’s proxemic theory, VRBubble divides the social space with three Bubbles—Intimate, Conversation, and Social Bubble—generating spatial audio feedback to distinguish avatars in different bubbles and provide suitable avatar information (Ji, Cochran, and Zhao 2022).
Extended Reality technologies, including Augmented Reality, Mixed Reality, and Virtual Reality, are seeing increasingly widespread adoption, and accessible solutions must be developed to keep pace.
