
Robot Guide Dog for the Visually Impaired

ROS-based quadruped guidance system using YOLO perception and audio feedback to support visually impaired users in navigation tasks.

Python · ROS · YOLO · OpenCV · Gazebo · Unitree Go1 · Audio Feedback · Linux

Platform

Unitree Go1 + Gazebo

Perception

Real-time YOLO pipeline

Output

Synchronized audio cues

Project Overview

Built and integrated perception modules for a Unitree Go1 guide-dog concept. The system detects obstacles, estimates scene context, and produces actionable audio cues.
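As a minimal sketch of how a detection can become an actionable audio cue (illustrative only — the function name and the left/center/right split are assumptions, not the project's actual code), the perception output can be mapped to a spoken direction like so:

```python
# Illustrative sketch: turn a YOLO detection into a short spoken cue.
# Assumes the detector provides a class label and a bounding-box center
# in pixel coordinates; the thirds-based split is a simplification.

def direction_cue(label: str, box_center_x: float, image_width: float) -> str:
    """Map a detection to a cue string such as "person to your left".

    label:        object class name from the detector (e.g. "person")
    box_center_x: horizontal center of the bounding box, in pixels
    image_width:  width of the camera frame, in pixels
    """
    # Split the frame into left / center / right thirds.
    ratio = box_center_x / image_width
    if ratio < 1 / 3:
        side = "to your left"
    elif ratio > 2 / 3:
        side = "to your right"
    else:
        side = "ahead"
    return f"{label} {side}"
```

For example, a person detected near the left edge of a 640-pixel-wide frame (`direction_cue("person", 100, 640)`) yields `"person to your left"`, which the audio layer can then speak.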

Challenge

Assistive navigation requires dependable obstacle detection with low-latency, understandable feedback for end users.

Solution

Integrated YOLO-based object detection with ROS topics and an audio feedback layer to announce hazards and nearby objects.
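An audio layer that announces hazards needs to avoid repeating the same announcement on every frame. One common way to handle this — sketched below with a hypothetical helper class and an assumed 3-second window, not taken from the project's code — is a per-cue cooldown filter between the detection topic and the speech output:

```python
# Illustrative sketch (hypothetical helper): suppress repeated audio
# announcements of the same hazard within a cooldown window, so the
# user is not spammed with identical cues on every detection frame.

class AnnouncementFilter:
    def __init__(self, cooldown_s: float = 3.0):
        self.cooldown_s = cooldown_s
        self._last_spoken = {}  # cue text -> timestamp of last announcement

    def should_announce(self, cue: str, now: float) -> bool:
        """Return True if `cue` has not been spoken within the cooldown.

        `now` is a monotonic timestamp in seconds (e.g. from time.monotonic()
        or a ROS clock), injected so the logic is easy to test.
        """
        last = self._last_spoken.get(cue)
        if last is not None and now - last < self.cooldown_s:
            return False
        self._last_spoken[cue] = now
        return True
```

Injecting the clock rather than reading it inside the class keeps the filter deterministic under test and lets the same logic run against simulated time in Gazebo or ROS time on the robot.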

Results

Demonstrated reliable perception behavior both in Gazebo simulation and on physical robot runs, validating that the perception-to-audio pipeline is practical for assistive robotics use.

Media Gallery

Simulation scene with perception overlays
Obstacle detection and path context
Perception-to-audio pipeline snapshot
Navigation run in simulated environment
Robot Guide Dog for the Visually Impaired | Nasir Nasir-Ameen