Autonomous Mobile Robot Navigation: AI-Powered AMR System

Advanced obstacle avoidance and path-planning system for autonomous mobile robots (AMRs) with real-time computer vision, achieving a 99.0% collision avoidance rate in complex warehouse environments.

Project Overview

Developed a comprehensive autonomous navigation system for industrial AMRs operating in dynamic warehouse environments. The system combines computer vision, sensor fusion, and machine learning to enable safe autonomous operation in shared human-robot workspaces.

Hardware

Custom AMR Platform, NVIDIA Jetson AGX Orin, 2D/3D LiDAR, Wheel Encoders

AI Framework

ROS2, PCL, OpenCV, TensorFlow Lite

Computer Vision

YOLOv7, SLAM, Depth Estimation, Object Tracking

Navigation

A* Pathfinding, RRT*, Dynamic Window Approach

The Challenge

A logistics company needed autonomous mobile robots for warehouse inventory management and delivery operations. Key challenges included:

  • Navigation in cluttered warehouse aisles
  • Real-time obstacle detection (forklifts, humans, pallets)
  • Dynamic path planning in changing layouts
  • Precise localization without GPS
  • Integration with Warehouse Management Systems (WMS)
  • Compliance with ISO 3691-4 safety standards

Our Solution

1. Multi-Sensor Fusion System

Integrated cameras, LiDAR, IMU, and ultrasonic sensors to create a comprehensive perception system. The fusion algorithm provides robust 3D mapping and localization even in challenging lighting conditions.
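A core step in camera-LiDAR fusion is projecting 3D LiDAR points into the camera image so that point-cloud geometry can be associated with detected objects. A minimal sketch of that projection, using a standard pinhole model with made-up intrinsics and identity extrinsics (not the project's actual calibration):

```python
import numpy as np

# Hypothetical pinhole intrinsics for a 640x480 camera
# (illustrative values, not the project's calibration).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Extrinsics: rotation and translation from the LiDAR frame to the
# camera frame. Identity/zero here for simplicity.
R = np.eye(3)
t = np.zeros(3)

def project_lidar_point(point_lidar):
    """Project a 3D LiDAR point into image pixel coordinates.

    Returns None when the point is behind the image plane.
    """
    p_cam = R @ np.asarray(point_lidar, dtype=float) + t
    if p_cam[2] <= 0:  # behind the camera: not visible
        return None
    u, v, w = K @ p_cam
    return (float(u / w), float(v / w))

# A point 4 m ahead and 1 m to the right lands right of image centre.
print(project_lidar_point([1.0, 0.0, 4.0]))  # -> (445.0, 240.0)
```

In the real system this association runs per LiDAR scan, letting the fusion layer attach depth to each camera detection even when stereo depth is unreliable.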

2. Real-time Obstacle Detection

Developed a lightweight YOLOv7 model optimized for edge deployment that detects and classifies obstacles in real time. The system identifies people, vehicles, structures, and other dynamic objects with 95% accuracy.
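A detector head like YOLOv7 emits many overlapping candidate boxes per object, so the post-processing stage applies non-maximum suppression (NMS). A self-contained sketch of that step (box coordinates and labels are illustrative):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def non_max_suppression(detections, iou_threshold=0.5):
    """Keep the highest-confidence box among heavily overlapping ones.

    `detections` is a list of (box, score, label) tuples, as produced
    by a detector head after confidence thresholding.
    """
    kept = []
    for det in sorted(detections, key=lambda d: d[1], reverse=True):
        if all(iou(det[0], k[0]) < iou_threshold for k in kept):
            kept.append(det)
    return kept

raw = [((100, 100, 200, 200), 0.9, "person"),
       ((105, 105, 205, 205), 0.8, "person"),   # duplicate detection
       ((400, 120, 480, 260), 0.7, "pallet")]
result = non_max_suppression(raw)
print(len(result))  # the duplicate "person" box is suppressed
```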

3. Dynamic Path Planning

Implemented an adaptive path-planning algorithm that combines A* for global planning with the Dynamic Window Approach for local obstacle avoidance. The system recalculates paths in real time as the environment changes.
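The global planner's core is standard A* over an occupancy grid. A minimal sketch on a 4-connected grid with a Manhattan-distance heuristic (the grid and costs are illustrative, not the production planner):

```python
import heapq

def astar(grid, start, goal):
    """A* over a 4-connected occupancy grid (0 = free, 1 = occupied).

    Manhattan distance is the admissible heuristic for 4-connected
    motion; returns the path as a list of (row, col) cells, or None.
    """
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    open_set = [(h(start), 0, start, [start])]
    best_g = {start: 0}
    while open_set:
        _, g, cell, path = heapq.heappop(open_set)
        if cell == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cell[0] + dr, cell[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0
                    and g + 1 < best_g.get(nxt, float("inf"))):
                best_g[nxt] = g + 1
                heapq.heappush(open_set,
                               (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None

# A 4x4 aisle map with a wall across row 1, one gap at column 3.
grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 0, 0, 0]]
path = astar(grid, (0, 0), (3, 0))
print(len(path) - 1)  # path length in steps: 9
```

In the deployed system this runs on a costmap built from the SLAM map, and the resulting global path is handed to the local planner for smoothing and obstacle avoidance.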

4. SLAM Integration

Integrated Simultaneous Localization and Mapping (SLAM) for navigation in GPS-denied warehouses. The system builds and maintains 2D/3D maps while tracking the robot's position with centimeter-level accuracy using LiDAR and wheel odometry.
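Wheel odometry feeds the SLAM front end with a dead-reckoned pose prior between scan matches. For a differential-drive base, the per-tick pose update looks roughly like this (wheel-base value is illustrative):

```python
import math

def update_pose(x, y, theta, d_left, d_right, wheel_base):
    """Dead-reckoning pose update for a differential-drive robot.

    d_left / d_right are wheel travel distances (m) since the last
    encoder reading; wheel_base is the lateral wheel separation (m).
    """
    d = (d_left + d_right) / 2.0               # centre travel
    d_theta = (d_right - d_left) / wheel_base  # heading change
    # Midpoint integration: assume the turn happens halfway through.
    x += d * math.cos(theta + d_theta / 2.0)
    y += d * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

# Driving straight: both wheels advance 0.10 m, heading unchanged.
x, y, th = update_pose(0.0, 0.0, 0.0, 0.10, 0.10, wheel_base=0.4)
print(round(x, 3), round(y, 3), round(th, 3))  # 0.1 0.0 0.0
```

On its own this drifts over distance, which is why the SLAM layer periodically corrects the pose against the LiDAR map to hold centimeter-level accuracy.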

Technical Implementation

Perception Pipeline

The perception system processes multiple sensor inputs:

  • Visual Processing: RGB and depth image analysis for obstacle detection
  • LiDAR Processing: Point cloud analysis for 3D mapping and obstacle detection
  • Sensor Fusion: Kalman filtering for robust state estimation
  • Object Tracking: Multi-object tracking for dynamic obstacle avoidance
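The Kalman-filtering step above can be illustrated with a scalar toy version; the real system tracks a full pose/velocity state vector, and the noise variances here are made up:

```python
class Kalman1D:
    """Scalar Kalman filter: fuse noisy measurements into one estimate."""

    def __init__(self, x0, p0, process_var, meas_var):
        self.x, self.p = x0, p0          # state estimate and its variance
        self.q, self.r = process_var, meas_var

    def predict(self):
        self.p += self.q                 # uncertainty grows between updates

    def update(self, z):
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)       # pull estimate toward measurement
        self.p *= (1.0 - k)              # measurement reduces uncertainty
        return self.x

kf = Kalman1D(x0=0.0, p0=1.0, process_var=0.01, meas_var=0.1)
for z in (1.0, 1.1, 0.9, 1.0):           # noisy readings of a ~1.0 value
    kf.predict()
    est = kf.update(z)
print(round(est, 3))                      # estimate settles near 1.0
```

The same predict/update structure generalizes to the multi-sensor case: each sensor contributes an update step weighted by its own measurement noise.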

Navigation Architecture

The navigation system consists of:

  • Global path planner for facility-wide navigation
  • Local path planner for dynamic obstacle avoidance (Timed Elastic Band / Dynamic Window Approach)
  • Kinematic controller for smooth motion profiles
  • Safety-rated stop system
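The local planner's Dynamic Window Approach samples feasible (v, ω) velocity pairs and scores each predicted arc. A simplified scoring function in the DWA spirit, with illustrative weights and candidate values rather than the tuned production parameters:

```python
import math

def score_velocity(v, w, heading_error, obstacle_clearance,
                   alpha=1.0, beta=1.0, gamma=0.5):
    """DWA-style objective: trade off heading, clearance, and speed.

    heading_error is the goal-heading error (rad) the candidate (v, w)
    would leave after the simulation horizon; obstacle_clearance is the
    closest obstacle distance (m) along the predicted arc.
    """
    if obstacle_clearance < 0.2:         # inside safety margin: reject
        return float("-inf")
    return (alpha * (math.pi - abs(heading_error))   # prefer goal-facing
            + beta * min(obstacle_clearance, 2.0)    # prefer clear arcs
            + gamma * v)                             # prefer faster motion

# Candidates: (v, w, heading_error, obstacle_clearance)
candidates = [
    (0.8, 0.0, 0.0, 0.15),  # fast and straight, but too close to obstacle
    (0.5, 0.3, 0.2, 1.50),  # slight turn with good clearance
    (0.2, 0.6, 0.9, 2.00),  # sharp turn away from the goal
]
best = max(candidates, key=lambda c: score_velocity(*c))
print(best[:2])  # chosen (v, w): the slight-turn candidate
```

The rejected first candidate shows the hard constraint at work: arcs that intrude on the safety margin are discarded outright before any scoring.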

Results & Impact

99.0%

Collision Avoidance Rate

15ms

Obstacle Detection Latency

2cm

Position Accuracy

8 hours

Continuous Operation Time

Performance Metrics

  • Successfully completed 5,000+ km of autonomous travel
  • Reduced material handling costs by 40%
  • Achieved 99.0% collision avoidance in mixed traffic
  • Maintained 2cm docking accuracy
  • Processed obstacle detection in under 15ms

Safety Features

The system includes comprehensive safety mechanisms:

  • Emergency Stop: Hardware-level e-stop for immediate halting
  • Safety Zones: Dynamic speed reduction in high-traffic areas
  • Battery Management: Autonomous charging when battery is low
  • Load Stability: Active monitoring of payload stability
  • Manual Override: Joystick control for manual maneuvering
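The "Safety Zones" behavior above amounts to a speed limit as a function of separation distance. A sketch of that mapping with illustrative zone thresholds (not the project's certified ISO 3691-4 parameters):

```python
def speed_limit(distance_to_person, v_max=1.5):
    """Dynamic speed limit (m/s) based on separation distance (m).

    Illustrative zones: protective stop inside 0.5 m, creep speed to
    1.5 m, linear ramp up to full speed at 3.0 m.
    """
    if distance_to_person < 0.5:
        return 0.0                        # protective stop
    if distance_to_person < 1.5:
        return 0.3                        # creep speed near people
    if distance_to_person < 3.0:
        # linear ramp from creep speed to v_max between 1.5 m and 3.0 m
        frac = (distance_to_person - 1.5) / 1.5
        return 0.3 + frac * (v_max - 0.3)
    return v_max

for d in (0.3, 1.0, 2.25, 4.0):
    print(d, "->", round(speed_limit(d), 2))
```

In a certified system the stop zone is enforced by safety-rated hardware (the e-stop chain above), with this software limit acting as the outer, non-safety-critical layer.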

Applications

The autonomous navigation system enables various applications:

  • Intralogistics: Automated material transport in factories
  • E-commerce Fulfillment: Goods-to-person picking systems
  • Hospital Logistics: Autonomous delivery of medicine and linens
  • Retail Inventory: Shelf scanning and stock monitoring
  • Disinfection: Autonomous UV-C disinfection robots

Future Enhancements

Planned improvements include:

  • Fleet management for multi-robot coordination
  • Advanced semantic mapping for better context understanding
  • Integration with 5G for low-latency cloud processing
  • Reinforcement learning for optimized path planning
  • Opportunity charging optimization