Advanced obstacle avoidance and path planning system for autonomous mobile robots with real-time computer vision, achieving 99.0% collision avoidance in complex warehouse environments.
Developed a comprehensive autonomous navigation system for industrial AMRs operating in dynamic warehouse environments. The system combines computer vision, sensor fusion, and machine learning to enable safe autonomous operation in shared human-robot workspaces.
Hardware: Custom AMR Platform, NVIDIA Jetson AGX Orin, 2D/3D LiDAR, Wheel Encoders
Frameworks: ROS2, PCL, OpenCV, TensorFlow Lite
Perception: YOLOv7, SLAM, Depth Estimation, Object Tracking
Planning: A* Pathfinding, RRT*, Dynamic Window Approach
A logistics company needed autonomous mobile robots for warehouse inventory management and delivery operations. Key challenges included safe operation in shared human-robot workspaces, dynamic obstacles, challenging lighting conditions, and GPS-denied indoor navigation.
Integrated cameras, LiDAR, IMU, and ultrasonic sensors to create a comprehensive perception system. The fusion algorithm provides robust 3D mapping and localization even in challenging lighting conditions.
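To make the fusion step concrete, here is a minimal sketch of a Kalman-style pose update that blends a wheel-odometry estimate with a LiDAR-derived pose fix. The three-state (x, y, theta) pose, the function names, and the example covariances are illustrative assumptions, not the production fusion pipeline.

```python
import numpy as np

def wrap_angle(a):
    """Wrap an angle to (-pi, pi] so heading residuals stay small."""
    return (a + np.pi) % (2 * np.pi) - np.pi

def fuse_pose(odom_pose, odom_cov, lidar_pose, lidar_cov):
    """Kalman-style fusion of two (x, y, theta) pose estimates.

    The gain K weights the correction by relative uncertainty, so a
    noisy LiDAR fix barely moves a confident odometry estimate, and
    vice versa. Poses are length-3 arrays; covariances are 3x3.
    """
    innovation = lidar_pose - odom_pose
    innovation[2] = wrap_angle(innovation[2])
    K = odom_cov @ np.linalg.inv(odom_cov + lidar_cov)
    fused_pose = odom_pose + K @ innovation
    fused_pose[2] = wrap_angle(fused_pose[2])
    fused_cov = (np.eye(3) - K) @ odom_cov
    return fused_pose, fused_cov

# Confident odometry, noisier LiDAR match: the fused pose stays
# close to odometry but shifts slightly toward the LiDAR fix.
odom = np.array([1.00, 2.00, 0.10])
lidar = np.array([1.05, 1.95, 0.12])
pose, cov = fuse_pose(odom, np.diag([0.01, 0.01, 0.005]),
                      lidar, np.diag([0.04, 0.04, 0.020]))
print(pose)
```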
Developed a lightweight YOLOv7 model optimized for edge deployment that detects and classifies obstacles in real time. The system identifies people, vehicles, structures, and dynamic objects with 95% accuracy.
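As a rough sketch of how such a model might be invoked per camera frame with TensorFlow Lite (the runtime named in the stack above): the model filename, input normalization, and output layout below are assumptions that depend on how the model was exported, and decoding boxes, scores, and classes is omitted.

```python
import numpy as np
import tensorflow as tf  # on-device, tflite_runtime.Interpreter is the usual drop-in

# Hypothetical export name; the real file and tensor layout may differ.
interpreter = tf.lite.Interpreter(model_path="yolov7_tiny_warehouse.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def detect(frame_rgb):
    """Run one RGB frame through the model and return its raw output.

    Assumes a float32 [1, H, W, 3] input scaled to [0, 1]; turning the
    output tensor into boxes, scores, and class IDs depends on the export.
    """
    _, h, w, _ = inp["shape"]
    x = tf.image.resize(frame_rgb[None].astype(np.float32) / 255.0, (h, w))
    interpreter.set_tensor(inp["index"], x.numpy())
    interpreter.invoke()
    return interpreter.get_tensor(out["index"])

# detections = detect(camera_frame)  # camera_frame: HxWx3 uint8 RGB
```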
Implemented an adaptive path planning algorithm that combines A* for global planning with the Dynamic Window Approach for local obstacle avoidance. The system recalculates paths in real time as the environment changes.
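To illustrate the global-planning half, here is a compact A* over a 2D occupancy grid. The grid encoding, 4-connected moves, and unit step costs are illustrative simplifications; a warehouse planner would typically run over an inflated costmap.

```python
import heapq
from itertools import count

def astar(grid, start, goal):
    """A* over a 2D occupancy grid (0 = free, 1 = occupied).

    Uses 4-connected moves with a Manhattan-distance heuristic, which
    is admissible for unit step costs, so the grid path it returns is
    optimal. A tie counter keeps heap entries totally ordered.
    """
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    tie = count()
    open_set = [(h(start), next(tie), 0, start, None)]
    parents, g_best = {}, {start: 0}
    while open_set:
        _, _, g, cur, parent = heapq.heappop(open_set)
        if cur in parents:
            continue                      # already expanded via a cheaper route
        parents[cur] = parent
        if cur == goal:                   # walk parent links back to start
            path = []
            while cur is not None:
                path.append(cur)
                cur = parents[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_best.get(nxt, float("inf")):
                    g_best[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), next(tie), ng, nxt, cur))
    return None                           # no path to goal

# Toy map: the robot must route around the wall in row 1
grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
```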
Integrated Simultaneous Localization and Mapping (SLAM) for navigation in GPS-denied warehouses. The system builds and maintains 2D/3D maps while tracking the robot's position with centimeter-level accuracy using LiDAR and wheel odometry.
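The wheel-odometry side of that estimate can be sketched as a standard differential-drive dead-reckoning update. The tick counts and geometry constants below are placeholder values, not the platform's actual parameters; in the full system this estimate is corrected by LiDAR scan matching inside the SLAM loop.

```python
import math

def update_pose(x, y, theta, ticks_left, ticks_right,
                ticks_per_rev=2048, wheel_radius=0.075, wheel_base=0.40):
    """Dead-reckon a differential-drive pose from encoder ticks.

    Converts tick counts to wheel arc lengths, then applies the
    standard midpoint odometry update: translate along the average
    heading over the step, rotate by the wheel-speed difference.
    """
    dl = 2 * math.pi * wheel_radius * ticks_left / ticks_per_rev
    dr = 2 * math.pi * wheel_radius * ticks_right / ticks_per_rev
    d = (dl + dr) / 2.0               # distance traveled by the midpoint
    dtheta = (dr - dl) / wheel_base   # heading change
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    theta = (theta + dtheta + math.pi) % (2 * math.pi) - math.pi
    return x, y, theta

# 100 control cycles of a gentle left arc
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = update_pose(*pose, ticks_left=40, ticks_right=44)
print(pose)
```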
The perception system processes the camera, LiDAR, IMU, and ultrasonic inputs described above, fusing them into obstacle detections and a 3D map for the planner.
The navigation system consists of the A* global planner, the Dynamic Window Approach local controller, and the SLAM-based localization module described above; a bare-bones sketch of the DWA scoring loop follows.
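The loop below samples and scores (v, w) commands the way a minimal Dynamic Window Approach does. All limits, sample counts, and score weights are illustrative assumptions rather than the tuned production values.

```python
import math
import numpy as np

def dwa_velocity(pose, vel, goal, obstacles, dt=0.1, horizon=1.5,
                 v_max=1.0, w_max=1.5, a_v=0.5, a_w=1.0):
    """Pick a (v, w) command with a bare-bones Dynamic Window Approach.

    Samples velocities reachable within one control step, rolls each
    pair forward along a constant arc, and scores trajectories by goal
    progress, obstacle clearance, and speed. Falls back to (0, 0),
    i.e. stop, if every sampled trajectory would collide.
    """
    x, y, th = pose
    v0, w0 = vel
    best, best_score = (0.0, 0.0), -math.inf
    for v in np.linspace(max(0.0, v0 - a_v * dt), min(v_max, v0 + a_v * dt), 7):
        for w in np.linspace(max(-w_max, w0 - a_w * dt), min(w_max, w0 + a_w * dt), 11):
            px, py, pth = x, y, th
            clearance = math.inf
            for _ in range(int(horizon / dt)):   # forward-simulate the arc
                pth += w * dt
                px += v * math.cos(pth) * dt
                py += v * math.sin(pth) * dt
                for ox, oy in obstacles:
                    clearance = min(clearance, math.hypot(px - ox, py - oy))
            if clearance < 0.3:                  # would pass too close
                continue
            progress = -math.hypot(goal[0] - px, goal[1] - py)
            score = progress + 0.1 * min(clearance, 1.0) + 0.05 * v
            if score > best_score:
                best_score, best = score, (v, w)
    return best

print(dwa_velocity((0, 0, 0), (0.5, 0.0), goal=(3.0, 1.0), obstacles=[(1.5, 0.2)]))
```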
Collision Avoidance Rate: 99.0% in complex warehouse environments
Obstacle Detection Latency
Position Accuracy: centimeter-level, from fused LiDAR and wheel odometry
Continuous Operation Time
The system includes comprehensive safety mechanisms:
The autonomous navigation system enables various applications, including warehouse inventory management and delivery operations.
Planned improvements include: