
Autonomous Mobile Robot Navigation: AI-Powered AMR System

Advanced obstacle avoidance and path planning system for autonomous mobile robots with real-time computer vision and LiDAR fusion, achieving 99.7% obstacle detection on dry surfaces across three distribution centre environments.

Project Context

A regional logistics operator with 3 distribution centres across Yorkshire engaged YF Studio for an 8-month project from June 2023 to January 2024. The client handles parcel sorting and pallet movement for regional e-commerce fulfilment, operating 18-hour shifts across their facilities. Prior to this engagement, all internal material transport was handled by manned forklifts and manual pallet jacks, creating bottlenecks during peak periods and contributing to a high rate of minor workplace incidents (averaging 3.2 per quarter across all sites).

Timeline

8 months (Jun 2023 – Jan 2024)

Team

1 robotics engineer, 1 CV specialist, 1 embedded systems developer

Industry

Regional logistics & e-commerce fulfilment

Project Overview

Developed a comprehensive autonomous navigation system for industrial AMRs operating in dynamic warehouse environments. The system combines computer vision, sensor fusion, and machine learning to enable safe autonomous operation in shared human-robot workspaces.

Hardware

Custom AMR Platform, NVIDIA Jetson AGX Orin, 2D/3D LiDAR, Wheel Encoders

AI Framework

ROS2, PCL, OpenCV, TensorFlow Lite

Computer Vision

YOLOv7, SLAM, Depth Estimation, Object Tracking

Navigation

A* Pathfinding, RRT*, Dynamic Window Approach

The Challenge

The client had previously trialled a commercially available AMR platform from a European vendor in 2022 at their largest distribution centre. The off-the-shelf system relied on fixed magnetic tape guidance with limited obstacle response — when encountering an unexpected object, the robot would simply halt and wait for manual clearance. During peak season, this resulted in an average of 14 stoppages per shift, effectively negating the throughput benefits of automation. The system was also unable to handle the dynamic racking reconfigurations the client performs monthly, requiring expensive re-mapping by the vendor each time. After 5 months, the pilot was discontinued.

The client approached YF Studio to develop a truly autonomous solution that could navigate dynamically without fixed infrastructure. Key challenges included:

  • Navigation in cluttered warehouse aisles with variable racking configurations
  • Real-time obstacle detection for forklifts, pedestrians, dropped pallets, and shrink-wrap debris
  • Dynamic path planning that adapts to weekly layout changes without manual re-mapping
  • Precise localisation (sub-5cm) in GPS-denied steel-framed buildings
  • Integration with the client's existing Warehouse Management System (WMS) via REST API
  • Full compliance with ISO 3691-4 safety standards for industrial AMRs in shared workspaces

Our Solution

1. Multi-Sensor Fusion System

Integrated stereo cameras, a 2D LiDAR (SICK TiM781), an IMU, and ultrasonic proximity sensors into a unified perception pipeline. We chose the SICK TiM781 over the Hokuyo UST-10LX due to its superior angular resolution (0.33°) at the required range, and its IP65 rating suited to the dusty warehouse environment. Camera-only approaches were evaluated but rejected because of the highly variable lighting across the three facilities — some areas are well-lit while loading dock zones are intermittently dark. The fusion architecture uses an Extended Kalman Filter to combine sensor modalities, providing robust localisation even when individual sensors degrade (e.g., LiDAR reflections on wet floors, camera exposure shifts near dock doors).
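The core of the fusion step can be illustrated with a minimal one-dimensional Kalman filter that blends a wheel-odometry prediction with a LiDAR scan-match measurement. This is a simplified sketch: the production EKF tracks the full (x, y, heading) pose and fuses all four sensor modalities, and the noise values and measurements below are illustrative assumptions.

```python
# Minimal 1-D sketch of the odometry/LiDAR fusion step.
# Illustrative only: the deployed EKF tracks the full pose and
# fuses IMU and ultrasonic inputs as well; q and r are made up here.

def predict(x, p, u, q):
    """Propagate the estimate with wheel-odometry motion u; q = process noise."""
    return x + u, p + q

def update(x, p, z, r):
    """Correct with a LiDAR scan-match position z; r = measurement noise."""
    k = p / (p + r)                      # Kalman gain: measurement vs. prediction
    return x + k * (z - x), (1.0 - k) * p

x, p = 0.0, 1.0                          # initial position estimate and variance
for u, z in [(0.5, 0.48), (0.5, 1.02), (0.5, 1.49)]:
    x, p = predict(x, p, u, q=0.01)
    x, p = update(x, p, z, r=0.04)
# x converges near the true 1.5 m, and p (uncertainty) shrinks each cycle
```

The same structure is what lets the system degrade gracefully: when one sensor's noise grows (wet-floor LiDAR reflections, camera exposure shifts), its measurement variance is inflated and the filter automatically leans on the remaining modalities.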

2. Real-time Obstacle Detection

Developed a lightweight YOLOv7-tiny model quantised to INT8 via TensorRT for deployment on the NVIDIA Jetson AGX Orin. YOLOv7-tiny was selected over YOLOv8-nano after benchmarking showed it achieved comparable mAP (91.3% vs. 92.1%) with 20% lower inference latency on the target hardware, which was critical for meeting the 15ms detection budget. The model classifies 8 obstacle categories: pedestrians, forklifts, pallet jacks, pallets (loaded/empty), racking, debris, and unknown-dynamic. A DeepSORT tracker maintains object identity across frames for velocity estimation, enabling predictive avoidance of moving obstacles.
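The velocity estimate that predictive avoidance relies on is derived from successive positions along each DeepSORT track. The sketch below shows the core arithmetic under an assumed 30 FPS camera rate; the track format, frame rate, and coordinates are illustrative, not the production interface.

```python
# Sketch of per-track velocity estimation, the quantity the DeepSORT
# tracks feed into predictive avoidance. FRAME_DT assumes a 30 FPS
# camera, which is an illustrative assumption.

FRAME_DT = 1.0 / 30.0    # seconds between consecutive tracked frames

def track_velocity(track):
    """Estimate (vx, vy) in m/s from the last two track centroids (map frame)."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    return (x1 - x0) / FRAME_DT, (y1 - y0) / FRAME_DT

# A forklift moving 3 cm between frames along x => ~0.9 m/s
forklift_track = [(2.0, 5.0), (2.03, 5.0)]
vx, vy = track_velocity(forklift_track)
```

In practice the estimate is smoothed over several frames before being used, since a single-frame difference amplifies detection jitter.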

3. Dynamic Path Planning

Implemented a two-layer planning architecture: A* on a regularly updated occupancy grid for global route planning, and Dynamic Window Approach (DWA) for real-time local obstacle avoidance. We evaluated RRT* as an alternative global planner but found A* produced more predictable, human-readable paths in the structured aisle environment — an important factor for safety certification and operator trust. The global map is incrementally updated from SLAM data, meaning the system automatically adapts to racking reconfigurations without requiring manual re-mapping.
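A minimal sketch of the global layer is shown below: A* over a 4-connected occupancy grid with a Manhattan heuristic. The grid, unit step costs, and heuristic are simplified assumptions; the deployed planner runs on an inflated costmap and hands the resulting route to DWA for local execution.

```python
import heapq

def astar(grid, start, goal):
    """A* over a 4-connected occupancy grid (0 = free, 1 = occupied)."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, [start])]               # (f, g, node, path)
    best_g = {}
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if best_g.get(node, float("inf")) <= g:
            continue                                         # already expanded cheaper
        best_g[node] = g
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nxt = (nr, nc)
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None                                              # goal unreachable

aisle = [[0, 0, 0],
         [1, 1, 0],    # a racking row blocking the direct route
         [0, 0, 0]]
route = astar(aisle, (0, 0), (2, 0))
```

When a racking reconfiguration flips cells in the occupancy grid, the next global replan simply routes around the new obstacles, which is what removes the need for vendor re-mapping.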

4. SLAM Integration

Integrated Cartographer SLAM (Google's open-source library) for simultaneous localisation and mapping in the GPS-denied warehouse environment. Cartographer was chosen over GMapping because of its superior loop-closure performance in large facilities (the client's largest site is 12,000 m²). The system builds and maintains 2D occupancy grid maps while tracking the robot's position with 2cm accuracy using LiDAR scan-matching and wheel odometry. Map updates are shared across the fleet via a central ROS2 map server, so when one robot detects a layout change, all robots receive the update within 5 seconds.
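Conceptually, the fleet-wide map sharing behaves like a version-checked patch merge: a robot that observes a layout change publishes only the changed cells, and peers apply the patch if it is newer than their local copy. The sketch below is a toy model of that behaviour; the actual transport is the central ROS2 map server, and the data structures and field names here are illustrative.

```python
# Toy sketch of fleet map-update sharing. One robot publishes a patch of
# changed occupancy cells; others merge it only if the version is newer.
# Field names and the version scheme are illustrative assumptions.

def apply_patch(local_map, local_version, patch):
    """Merge a {(row, col): occupancy} patch if it is newer than our copy."""
    if patch["version"] <= local_version:
        return local_map, local_version          # stale or duplicate: ignore
    merged = dict(local_map)
    merged.update(patch["cells"])
    return merged, patch["version"]

robot_map = {(4, 7): 1, (4, 8): 1}                        # old racking position
patch = {"version": 2, "cells": {(4, 7): 0, (5, 7): 1}}   # racking moved one row
robot_map, version = apply_patch(robot_map, 1, patch)
```

Sending deltas rather than full maps is what keeps the fleet-wide propagation within the 5-second budget even for the 12,000 m² site.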

Technical Implementation

Perception Pipeline

The perception system processes multiple sensor inputs:

  • Visual Processing: RGB and depth image analysis for obstacle detection
  • LiDAR Processing: Point cloud analysis for 3D mapping and obstacle detection
  • Sensor Fusion: Kalman filtering for robust state estimation
  • Object Tracking: Multi-object tracking for dynamic obstacle avoidance

Navigation Architecture

The navigation system consists of:

  • Global path planner (A* on occupancy grid) for facility-wide navigation
  • Local path planner (DWA) for dynamic obstacle avoidance with 10Hz replanning
  • Kinematic controller for smooth trapezoidal motion profiles with jerk limiting
  • Safety-rated stop system with dual-channel redundancy (SIL 2 rated)
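The trapezoidal profile the kinematic controller follows can be sketched as a simple piecewise velocity command: ramp up at a fixed acceleration, cruise at v_max, then ramp down symmetrically. The 1.5 m/s cruise speed is the system's standard; the 0.5 m/s² acceleration is an illustrative assumption, and the jerk-limited blending used in production is omitted for brevity.

```python
def trapezoid_velocity(t, distance, v_max=1.5, a_max=0.5):
    """Commanded speed (m/s) at time t for a trapezoidal motion profile.

    Accelerate at a_max to v_max, cruise, then decelerate symmetrically.
    Assumes the move is long enough to reach v_max (no triangular case);
    a_max is an assumed value, and jerk-limited blending is omitted.
    """
    t_ramp = v_max / a_max                   # time to reach cruise speed
    d_ramp = 0.5 * a_max * t_ramp ** 2       # distance covered while ramping
    t_cruise = (distance - 2 * d_ramp) / v_max
    t_total = 2 * t_ramp + t_cruise
    if t < t_ramp:
        return a_max * t                     # acceleration phase
    if t < t_ramp + t_cruise:
        return v_max                         # cruise phase
    if t < t_total:
        return v_max - a_max * (t - t_ramp - t_cruise)  # deceleration phase
    return 0.0                               # move complete
```

In the real controller these velocity setpoints are additionally smoothed so acceleration changes gradually (jerk limiting), which keeps loaded pallets stable through speed transitions.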

Limitations & Edge Cases

Thorough field testing across all three distribution centres identified several operational limitations that are documented in the operator handbook:

  • Wet-floor conditions: Performance in wet-floor conditions drops to 94% obstacle detection accuracy (vs. 99.7% on dry surfaces) due to LiDAR beam reflections off pooled water creating phantom obstacles. The system compensates by switching to camera-primary mode when floor reflectance anomalies are detected, though this increases detection latency to approximately 25ms.
  • Smoke and steam: Steam venting from nearby loading dock heating systems and exhaust from delivery vehicles occasionally triggers false positive obstacle detections. This was addressed through temporal filtering — transient detections that do not persist across 3 consecutive frames (150ms) are suppressed. This reduces false emergency stops by 87% with negligible impact on genuine obstacle response time.
  • Narrow aisles: Aisles below 1.2m width require the system to enter reduced speed mode (0.3 m/s vs. the standard 1.5 m/s) due to limited lateral clearance for obstacle avoidance manoeuvres. This affects approximately 15% of routes in the client's oldest facility where racking was installed before AMR deployment was planned.
  • Battery constraints: Battery-constrained operation limits autonomous runtime to 6.5 hours before requiring dock return for charging, compared to the client's 18-hour shift pattern. This is mitigated by deploying robots in staggered charging rotation, but means a minimum fleet size of 3 robots is needed per facility for continuous coverage.
  • Highly reflective racking: New galvanised steel racking installed in one facility wing produced LiDAR multi-path reflections that degraded localisation accuracy to approximately 8cm (vs. the standard 2cm). Applying anti-reflective tape to racking uprights at LiDAR height resolved the issue.
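The temporal filtering used for the smoke-and-steam case boils down to a per-object persistence counter: a detection is only confirmed once it has appeared in three consecutive frames, matching the 150 ms window described above. The sketch below is a simplified model; the production filter operates on tracker IDs, and the class and ID names here are illustrative.

```python
from collections import defaultdict

PERSISTENCE_FRAMES = 3   # must persist 3 consecutive frames (150 ms at 20 Hz)

class PersistenceFilter:
    """Suppress transient detections that vanish before N consecutive frames."""

    def __init__(self, n=PERSISTENCE_FRAMES):
        self.n = n
        self.streak = defaultdict(int)       # detection ID -> consecutive frames

    def step(self, detected_ids):
        """Feed one frame's detection IDs; return the IDs confirmed as real."""
        for tid in list(self.streak):
            if tid not in detected_ids:
                del self.streak[tid]         # streak broken: was transient
        for tid in detected_ids:
            self.streak[tid] += 1
        return {tid for tid, s in self.streak.items() if s >= self.n}

f = PersistenceFilter()
f.step({"steam_1", "pallet_7"})              # frame 1: nothing confirmed yet
f.step({"steam_1", "pallet_7"})              # frame 2: still pending
confirmed = f.step({"pallet_7"})             # frame 3: steam gone, pallet persists
```

Because confirmation requires only three frames, a genuine obstacle still triggers a response well inside the robot's stopping distance at 1.5 m/s, while a wisp of steam that dissipates between frames never does.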

Results & Impact

99.7%

Obstacle Detection (dry surfaces)

Previous system: halt-and-wait only

15ms

Obstacle Detection Latency

Previous system: N/A (no detection)

2cm

Localisation Accuracy

Previous system: fixed-path only

6.5 hours

Continuous Operation Time

Per charge cycle

Performance Metrics

  • Successfully completed 5,000+ km of autonomous travel over 14 months of continuous operation across all three distribution centres
  • Reduced material handling labour costs by 35% through partial automation of internal pallet transport (remaining 65% of cost is manned forklift operations for elevated storage)
  • Achieved 99.7% obstacle detection rate on dry surfaces in mixed human-robot traffic (94% in wet-floor conditions, see Limitations)
  • Maintained 2cm docking accuracy for pallet pick-up and drop-off points
  • Processed obstacle detection in under 15ms end-to-end on Jetson AGX Orin
  • Reduced warehouse workplace incidents from 3.2 to 0.8 per quarter (75% reduction) by removing manned pallet jack trips from high-traffic zones

Safety Features

The system includes comprehensive safety mechanisms:

  • Emergency Stop: Hardware-level e-stop for immediate halting
  • Safety Zones: Dynamic speed reduction in high-traffic areas
  • Battery Management: Autonomous charging when battery is low
  • Load Stability: Active monitoring of payload stability
  • Manual Override: Joystick control for manual manoeuvring

Applications

The autonomous navigation system enables various applications:

  • Intralogistics: Automated material transport in factories
  • E-commerce Fulfilment: Goods-to-person picking systems
  • Hospital Logistics: Autonomous delivery of medicine and linens
  • Retail Inventory: Shelf scanning and stock monitoring
  • Disinfection: Autonomous UV-C disinfection robots

Ongoing & Next Steps

The system has been in production operation since January 2024. YF Studio provides ongoing support and is working with the client on the following enhancements:

  • Fleet coordination (in progress): Expanding from single-robot operation to a coordinated fleet of 4 robots per facility, with centralised task allocation and traffic management to prevent aisle congestion. Initial multi-robot testing began in Q2 2024.
  • Opportunity charging: Implementing short-burst charging at intermediate docking points to extend effective operational runtime beyond the current 6.5-hour limit, targeting 95% uptime during 18-hour shifts.
  • Semantic mapping: Augmenting the occupancy grid with semantic labels (loading zone, pedestrian crossing, forklift-only aisle) to enable context-aware speed and behaviour policies rather than relying solely on geometric obstacle avoidance.
  • Wet-floor detection model: Training a dedicated floor-condition classifier to proactively switch sensor modes before LiDAR degradation occurs, rather than reacting to anomalies after the fact.
  • Third-party WMS integration expansion: The client is migrating from their legacy WMS to a cloud-based system; YF Studio is developing an updated API adapter to maintain seamless task dispatch.