To design and build a wearable smart goggle system that enables visually impaired individuals to navigate complex urban environments independently and safely, using real-time feedback driven by computer vision and sensor fusion. PRODIGY is a lightweight wearable device offering visual place recognition, traffic light detection, obstacle avoidance, and audio and haptic feedback.
Navigating busy cities is a major challenge for blind and visually impaired people. Traditional aids like canes and guide dogs are limited: a cane senses only what it touches, and a guide dog cannot interpret traffic signals or signage.
The development incorporates cutting-edge AI and embedded systems technologies:
YOLOv8 model detecting traffic lights, pedestrians, and urban obstacles in real time
Raspberry Pi, camera system, and haptic feedback components integrated into the goggle frame
Demonstration of the complete system with real-time object detection, haptic feedback, and audio guidance for visually impaired navigation
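A per-frame detector like YOLOv8 classifies each frame independently, so a single misdetection can make the reported traffic light state flicker. One common remedy, sketched here as a minimal illustration (the class names `red` and `green` are placeholders, not the project's actual label set), is a majority vote over a short window of recent predictions:

```python
from collections import Counter, deque

class TrafficLightDebouncer:
    """Smooth per-frame traffic light predictions with a sliding majority vote.

    Voting over the last N frames suppresses one-off misdetections before
    the state is announced to the user. Class names are illustrative.
    """

    def __init__(self, window: int = 7):
        self.history = deque(maxlen=window)

    def update(self, frame_state: str) -> str:
        """Record this frame's prediction and return the smoothed state."""
        self.history.append(frame_state)
        state, _count = Counter(self.history).most_common(1)[0]
        return state

debouncer = TrafficLightDebouncer(window=5)
for state in ["red", "red", "green", "red", "red"]:  # one spurious "green"
    smoothed = debouncer.update(state)
print(smoothed)  # the stray "green" frame is voted out -> "red"
```

A small window keeps the added latency low (a five-frame window at 30 fps delays a genuine state change by well under a second) while filtering single-frame noise.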
Real-time detection of people, crosswalks, vehicles, poles, bikes, and other urban obstacles using advanced computer vision
YOLOv8 model trained on self-collected Manhattan dataset for accurate traffic light detection
Proximity sensing with directional haptic alerts keyed to the user's surroundings
Dual feedback system via earphones and vibration motors integrated in the frame
Directional guidance for urban mobility with step-by-step auditory instructions
8–10 hours of continuous use with power-saving mode optimization
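As a rough sketch of how the directional haptic feedback above could be driven (the two-motor interface, frame width, and range threshold are assumptions for illustration, not the project's actual design): map a detection's horizontal position in the camera frame to a left/right split of vibration intensity, with nearer obstacles vibrating harder.

```python
def haptic_command(bbox_center_x: float, frame_width: int,
                   distance_m: float, max_range_m: float = 4.0) -> dict:
    """Map an obstacle's bearing and distance to left/right motor duty cycles.

    Hypothetical interface: returns duty cycles in [0, 1] for two vibration
    motors in the goggle frame. Objects beyond max_range_m are ignored;
    nearer objects vibrate harder, and the left/right split encodes bearing.
    """
    if distance_m >= max_range_m:
        return {"left": 0.0, "right": 0.0}
    # Normalized bearing: 0.0 = far left of frame, 1.0 = far right.
    bearing = min(max(bbox_center_x / frame_width, 0.0), 1.0)
    # Closer obstacles -> stronger overall vibration.
    intensity = 1.0 - distance_m / max_range_m
    return {"left": round(intensity * (1.0 - bearing), 2),
            "right": round(intensity * bearing, 2)}

# A pole 1 m away, centered in the right half of a 640 px frame:
print(haptic_command(bbox_center_x=480, frame_width=640, distance_m=1.0))
# -> {'left': 0.19, 'right': 0.56}: mostly right-side vibration, fairly strong
```

Encoding bearing as a ratio rather than a hard left/right switch gives the wearer a continuous sense of where the obstacle sits relative to their path.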
Raspberry Pi compute bottlenecks under heavy real-time inference - addressed through model optimization and AI accelerators
Hardware resource constraints on the Hailo accelerator - ongoing optimization of neural network architectures
Dataset collection and labeling across multiple urban environments - creation of a custom Manhattan dataset
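One simple way to ease the inference bottleneck on a Raspberry Pi, sketched here as an assumed approach rather than the project's actual scheduler, is adaptive frame skipping: measure per-frame inference latency and drop just enough camera frames that the detector always works on a near-current view instead of a growing backlog.

```python
import math

def frames_to_skip(inference_ms: float, camera_fps: float = 30.0) -> int:
    """Return how many camera frames to drop after each inference pass.

    If inference takes longer than one camera frame interval, the capture
    side must discard frames or end-to-end latency grows without bound.
    Skipping ceil(inference_time / frame_interval) - 1 frames keeps the
    detector aligned with (near-)current frames.
    """
    frame_interval_ms = 1000.0 / camera_fps
    return max(0, math.ceil(inference_ms / frame_interval_ms) - 1)

print(frames_to_skip(20))   # faster than the ~33 ms frame interval -> 0
print(frames_to_skip(120))  # spans ~4 frame intervals -> skip 3
```

The skip count adapts automatically if inference speeds up (e.g., after offloading to an accelerator), so the same loop works on both the bare Pi and the Hailo-assisted build.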
This assistive technology project addresses critical mobility challenges for visually impaired individuals in urban environments:
Intuitive feedback system that distills raw sensor data into meaningful, actionable guidance
Real-time obstacle detection and traffic light recognition for safer navigation
Lightweight, comfortable, and intuitive design based on user feedback and testing
This project provided deep insight into how AI and robotics can directly improve lives, especially for vulnerable populations. From dataset collection in Manhattan to training YOLOv8 models, and integrating real-time haptic and auditory feedback, I was involved in both hardware prototyping and AI system design.
The experience exposed me to regulatory frameworks, real-world deployment constraints, and the critical importance of user-centered design in accessibility technology. PRODIGY represents a significant step forward in assistive technology, combining cutting-edge AI with intuitive human-computer interaction to address real-world challenges faced by visually impaired individuals.
The project demonstrates the potential for AI-powered wearable devices to serve as meaningful assistive tools, replacing complexity with intuitive feedback that enhances independence and safety in urban navigation.