
SpatialSense is an AI-powered assistive eyewear system designed to enhance independent navigation.
The system converts complex depth and proximity sensor data into immediate spatial audio cues. The final output is a video prototype simulation demonstrating the system's ability to translate the environment into intuitive auditory awareness.
Project Overview
Made a video prototype simulating the system in use.
Tools used: Adobe After Effects
Research & Data Foundation
Empathy drove the initial research, which studied how people interact with and navigate complex spaces. While the system is primarily for assistive navigation, its core logic is adaptable to spatial analysis and interior design contexts.
The system logic is built upon translating three critical real-time data types into sound:
Proximity Data
Directional Data
Pathfinding Data
Core Rationale: Spatial audio was chosen over haptic or visual cues because sound allows for 360-degree awareness without interfering with existing senses.
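As one way to picture this logic, the sketch below maps a single obstacle reading to cue parameters. Everything here is an illustrative assumption: the Obstacle structure, the 5 m range, and the pitch and volume formulas are invented for demonstration, and pathfinding cues are omitted for brevity.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float   # proximity data: how far away the obstacle is
    azimuth_deg: float  # directional data: -90 (hard left) to +90 (hard right)

def to_audio_cue(obstacle: Obstacle, max_range_m: float = 5.0) -> dict:
    """Translate one obstacle reading into spatial audio cue parameters."""
    # Closer obstacles produce louder, higher-pitched cues.
    closeness = max(0.0, 1.0 - obstacle.distance_m / max_range_m)
    volume = closeness                 # 0.0 (silent) .. 1.0 (full volume)
    pitch_hz = 220 + 660 * closeness   # 220 Hz (far) .. 880 Hz (right in front)
    # Direction maps to stereo panning, giving 360-degree awareness.
    pan = obstacle.azimuth_deg / 90.0  # -1.0 (left) .. +1.0 (right)
    return {"volume": volume, "pitch_hz": pitch_hz, "pan": pan}

# Example: an obstacle 1.5 m away, slightly to the right.
print(to_audio_cue(Obstacle(distance_m=1.5, azimuth_deg=30.0)))
```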
Technical Flow
Stage: Persona
Action: Mapped out the main difficulties faced by our users.

Stage: Logic Definition
Action: Defined how proximity, directional, and pathfinding data translate into audio cues.

Stage: Prototyping
Action: Chose a physical prop and filmed usage scenarios.
Output: Footage demonstrating product form.

Stage: Simulation
Action: Edited the footage with digitally produced spatial audio cues corresponding to obstacle proximity.
Output: Video prototype demonstrating real-time interaction.

Stage: Final Deliverable
Action: Generated a video showing the user experience with the spatial audio cues.
Output: Final video prototype.
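To make the "digitally produced spatial audio cues" step concrete, here is a small sketch of how one cue could be rendered offline with NumPy and the standard-library wave module. The synth_cue function, the 5 m range, and the simple linear pan are illustrative assumptions, not the actual production pipeline.

```python
import numpy as np
import wave

def synth_cue(distance_m, azimuth_deg, path="cue.wav", sr=44100, dur=0.25):
    """Render a short stereo beep: closer -> louder and higher, azimuth -> pan."""
    closeness = max(0.0, 1.0 - distance_m / 5.0)   # assumed 5 m working range
    freq = 220 + 660 * closeness                   # same pitch mapping as above
    t = np.linspace(0, dur, int(sr * dur), endpoint=False)
    tone = np.sin(2 * np.pi * freq * t) * closeness
    pan = (azimuth_deg / 90.0 + 1) / 2             # 0 = hard left, 1 = hard right
    stereo = np.stack([tone * (1 - pan), tone * pan], axis=1)
    pcm = (stereo * 32767).astype(np.int16)
    with wave.open(path, "wb") as f:
        f.setnchannels(2)
        f.setsampwidth(2)
        f.setframerate(sr)
        f.writeframes(pcm.tobytes())

synth_cue(1.5, 30.0)  # a close obstacle, slightly to the right
```

Rendering each cue to a short stereo WAV like this makes it straightforward to drop the sounds onto the video timeline during editing.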
Design & Interaction (Auditory UX)
The user experience is entirely auditory and seamless, demonstrating an empathetic translation of data:
Spatial Audio Cues: Obstacles are represented by 3D sounds whose pitch and volume change with proximity and direction. This is the core interaction logic.
Assistive UX: The technology is engineered to recede, providing information only when a hazard is detected, thus prioritizing the user's natural experience (a sketch of this gating logic follows the list below).
Minimalist Prop: A physical prop was built to create a contextually accurate visual for the usage simulation.
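The "recede until needed" behavior can be expressed as a simple gate. This sketch is illustrative only: should_alert, the 3 m range, and the 45-degree forward cone are assumptions for demonstration, not specifications from the project.

```python
HAZARD_RANGE_M = 3.0     # assumed alert threshold, not a real spec
FORWARD_CONE_DEG = 45.0  # assumed width of the "in my path" cone

def should_alert(distance_m: float, azimuth_deg: float) -> bool:
    """Stay silent unless an obstacle is both close and near the walking path."""
    in_range = distance_m < HAZARD_RANGE_M
    in_path = abs(azimuth_deg) < FORWARD_CONE_DEG
    return in_range and in_path

# A wall 4 m away stays silent; a pole 1 m ahead triggers a cue.
assert not should_alert(4.0, 0.0)
assert should_alert(1.0, 10.0)
```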
SpatialSense Glasses was the defining project for Conceptual System Design and Computational Empathy. It demonstrated how to architect an effective, intuitive system based purely on required interaction logic.
Turning Point: The process proved that video prototyping is a valid and powerful method for testing complex sensory UX systems before physical manufacturing.
My Insight
I learned that empathy can be engineered by systematically defining interaction logic. I plan to evolve this work into an open-source accessibility toolkit.

Final Video Prototype





