Advanced obstacle avoidance and path planning system for autonomous mobile robots with real-time computer vision and LiDAR fusion, achieving a 99.7% obstacle detection rate across three distribution centre environments.
A regional logistics operator with 3 distribution centres across Yorkshire engaged YF Studio for an 8-month project from June 2023 to January 2024. The client handles parcel sorting and pallet movement for regional e-commerce fulfilment, operating 18-hour shifts across their facilities. Prior to this engagement, all internal material transport was handled by manned forklifts and manual pallet jacks, creating bottlenecks during peak periods and contributing to a high rate of minor workplace incidents (averaging 3.2 per quarter across all sites).
8 months (Jun 2023 – Jan 2024)
1 robotics engineer, 1 CV specialist, 1 embedded systems developer
Regional logistics & e-commerce fulfilment
Developed a comprehensive autonomous navigation system for industrial AMRs operating in dynamic warehouse environments. The system combines computer vision, sensor fusion, and machine learning to enable safe autonomous operation in shared human-robot workspaces.
Custom AMR Platform, NVIDIA Jetson AGX Orin, 2D/3D LiDAR, Wheel Encoders
ROS2, PCL, OpenCV, TensorFlow Lite
YOLOv7, SLAM, Depth Estimation, Object Tracking
A* Pathfinding, RRT*, Dynamic Window Approach
The client had previously trialled a commercially available AMR platform from a European vendor in 2022 at their largest distribution centre. The off-the-shelf system relied on fixed magnetic tape guidance with limited obstacle response — when encountering an unexpected object, the robot would simply halt and wait for manual clearance. During peak season, this resulted in an average of 14 stoppages per shift, effectively negating the throughput benefits of automation. The system was also unable to handle the dynamic racking reconfigurations the client performs monthly, requiring expensive re-mapping by the vendor each time. After 5 months, the pilot was discontinued.
The client approached YF Studio to develop a genuinely autonomous solution that could navigate dynamically without fixed infrastructure. Key challenges included safe operation in shared human-robot workspaces, adapting to the monthly racking reconfigurations without manual re-mapping, coping with highly variable lighting between facilities, and sustaining throughput across 18-hour shifts without the frequent stoppages that undermined the earlier pilot.
Integrated stereo cameras, a 2D LiDAR (SICK TiM781), an IMU, and ultrasonic proximity sensors into a unified perception pipeline. We chose the SICK TiM781 over the Hokuyo UST-10LX due to its superior angular resolution (0.33°) at the required range, and its IP65 rating suited to the dusty warehouse environment. Camera-only approaches were evaluated but rejected because of the highly variable lighting across the three facilities — some areas are well-lit while loading dock zones are intermittently dark. The fusion architecture uses an Extended Kalman Filter to combine sensor modalities, providing robust localisation even when individual sensors degrade (e.g., LiDAR reflections on wet floors, camera exposure shifts near dock doors).
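The fusion step can be illustrated with a simplified linear Kalman filter over a 2-D position state. This is a sketch only: the production EKF also carries heading and velocity in its state and linearises the motion and measurement models, and all function names and noise values below are illustrative assumptions, not the deployed code.

```python
import numpy as np

def kf_predict(x, P, u, dt, Q):
    """Predict step: propagate the 2-D position by the odometry velocity u."""
    x = x + u * dt                  # constant-velocity motion model
    P = P + Q                       # F is identity here, so F P F^T + Q = P + Q
    return x, P

def kf_update(x, P, z, R):
    """Update step: correct with a position fix (e.g. a LiDAR scan-match)."""
    H = np.eye(2)                   # the fix measures position directly
    y = z - H @ x                   # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Toy cycle: odometry says 1 m/s along x; the scan-match fix disagrees slightly.
x, P = np.zeros(2), np.eye(2) * 0.5
Q, R = np.eye(2) * 0.01, np.eye(2) * 0.1
x, P = kf_predict(x, P, u=np.array([1.0, 0.0]), dt=1.0, Q=Q)
x, P = kf_update(x, P, z=np.array([0.9, 0.05]), R=R)
```

The key property the fleet relies on is visible even in this toy: the corrected estimate lands between prediction and measurement, weighted by their covariances, and the posterior uncertainty is smaller than either input's.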
Developed a lightweight YOLOv7-tiny model quantised to INT8 via TensorRT for deployment on the NVIDIA Jetson AGX Orin. YOLOv7-tiny was selected over YOLOv8-nano after benchmarking showed it achieved comparable mAP (91.3% vs. 92.1%) with 20% lower inference latency on the target hardware, which was critical for meeting the 15ms detection budget. The model classifies 8 obstacle categories: pedestrians, forklifts, pallet jacks, pallets (loaded/empty), racking, debris, and unknown-dynamic. A DeepSORT tracker maintains object identity across frames for velocity estimation, enabling predictive avoidance of moving obstacles.
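To show how per-frame track positions become the velocity estimates used for predictive avoidance, here is a minimal stand-in with exponential smoothing. DeepSORT itself maintains a Kalman-filtered track state; this simplification and all names in it are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Track:
    """Minimal stand-in for a tracked obstacle: last position plus smoothed velocity."""
    x: float
    y: float
    vx: float = 0.0
    vy: float = 0.0

def update_track(t: Track, x: float, y: float, dt: float, alpha: float = 0.5) -> Track:
    """Fold a new detection into the track, exponentially smoothing the velocity."""
    t.vx = alpha * (x - t.x) / dt + (1 - alpha) * t.vx
    t.vy = alpha * (y - t.y) / dt + (1 - alpha) * t.vy
    t.x, t.y = x, y
    return t

def predict_position(t: Track, horizon: float) -> tuple[float, float]:
    """Extrapolate where the obstacle will be `horizon` seconds ahead."""
    return t.x + t.vx * horizon, t.y + t.vy * horizon

# An obstacle advancing 0.2 m per 0.1 s frame (2 m/s) along x.
t = Track(0.0, 0.0)
t = update_track(t, 0.2, 0.0, dt=0.1)
t = update_track(t, 0.4, 0.0, dt=0.1)
px, py = predict_position(t, horizon=1.0)
```

The predicted position, not the current one, is what the local planner treats as the obstacle's footprint, which is what makes the avoidance predictive rather than purely reactive.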
Implemented a two-layer planning architecture: A* on a regularly updated occupancy grid for global route planning, and Dynamic Window Approach (DWA) for real-time local obstacle avoidance. We evaluated RRT* as an alternative global planner but found A* produced more predictable, human-readable paths in the structured aisle environment — an important factor for safety certification and operator trust. The global map is incrementally updated from SLAM data, meaning the system automatically adapts to racking reconfigurations without requiring manual re-mapping.
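The global layer's behaviour can be sketched with textbook A* over a small 4-connected occupancy grid. The production planner runs on the live SLAM-derived grid with tuned costs; the grid, names, and unit step cost below are illustrative.

```python
import heapq

def astar(grid, start, goal):
    """A* route search on a 4-connected occupancy grid (0 = free, 1 = occupied)."""
    rows, cols = len(grid), len(grid[0])
    def h(p):  # Manhattan distance: admissible on a 4-connected grid
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start)]        # entries are (f = g + h, g, cell)
    came_from = {start: None}
    g_best = {start: 0}
    while open_set:
        _, g, cur = heapq.heappop(open_set)
        if cur == goal:                      # reconstruct the path back to start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        if g > g_best[cur]:
            continue                         # stale queue entry, skip
        r, c = cur
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nb[0] < rows and 0 <= nb[1] < cols and grid[nb[0]][nb[1]] == 0:
                ng = g + 1
                if ng < g_best.get(nb, float("inf")):
                    g_best[nb] = ng
                    came_from[nb] = cur
                    heapq.heappush(open_set, (ng + h(nb), ng, nb))
    return None                              # goal unreachable

# A blocked middle row forces a detour around the right-hand side.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
```

The deterministic, aisle-following routes this produces are exactly the "predictable, human-readable paths" property that favoured A* over RRT* in the evaluation above.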
Integrated Cartographer SLAM (Google's open-source library) for simultaneous localisation and mapping in the GPS-denied warehouse environment. Cartographer was chosen over GMapping because of its superior loop-closure performance in large facilities (the client's largest site is 12,000 m²). The system builds and maintains 2D occupancy grid maps while tracking the robot's position with 2cm accuracy using LiDAR scan-matching and wheel odometry. Map updates are shared across the fleet via a central ROS2 map server, so when one robot detects a layout change, all robots receive the update within 5 seconds.
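The occupancy grids Cartographer maintains rest on the standard log-odds cell update, which can be sketched as follows. The increment constants here are illustrative assumptions, not Cartographer's tuned defaults.

```python
import math

# Log-odds increments for a "hit" (beam endpoint in the cell) and a "miss"
# (beam passed through the cell). Illustrative values only.
L_HIT = math.log(0.7 / 0.3)
L_MISS = math.log(0.4 / 0.6)

def update_cell(logodds: float, hit: bool) -> float:
    """Fold one LiDAR observation into a cell's log-odds occupancy."""
    return logodds + (L_HIT if hit else L_MISS)

def probability(logodds: float) -> float:
    """Convert log-odds back to an occupancy probability in [0, 1]."""
    return 1.0 - 1.0 / (1.0 + math.exp(logodds))

# Three consecutive hits push a cell from unknown (p = 0.5) towards occupied.
l = 0.0
for _ in range(3):
    l = update_cell(l, hit=True)
```

Because each observation is an additive increment, a cell recovers gracefully when a pallet is removed: subsequent misses simply drive the log-odds back down, which is what lets the fleet absorb racking changes without a full re-map.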
The perception system processes four sensor inputs: stereo camera depth estimates, 2D LiDAR scans, IMU readings, and ultrasonic proximity data, fused through the Extended Kalman Filter described above.
The navigation system consists of three cooperating layers: the A* global planner operating on the live occupancy grid, the DWA local planner for real-time obstacle avoidance, and the Cartographer SLAM layer that maintains localisation and keeps the shared fleet map current.
Thorough field testing across all three distribution centres identified several operational limitations, notably degraded LiDAR returns on wet floors and camera exposure shifts near loading dock doors, all of which are documented in the operator handbook.
Obstacle detection rate (dry surfaces): 99.7% across all three sites. Previous system: halt-and-wait only.
Obstacle detection latency: within the 15 ms per-frame budget. Previous system: N/A (no detection).
Localisation accuracy: 2 cm via LiDAR scan-matching and wheel odometry. Previous system: fixed-path only.
Continuous operation time: per charge cycle.
The system includes comprehensive safety mechanisms, among them predictive avoidance of moving obstacles using DeepSORT velocity estimates and graceful degradation of localisation when individual sensors underperform.
The autonomous navigation system supports the client's core material-transport applications, including autonomous pallet movement and parcel transport for e-commerce fulfilment, across all three sites.
The system has been in production operation since January 2024. YF Studio provides ongoing support and is working with the client on further enhancements.