Application of Monocular Visual SLAM on a Cost-Effective Microcomputer

By Enoch Essien

Published: 2023


This project explores the feasibility of running ORB-SLAM2 on a Raspberry Pi 4 to enable affordable autonomous mapping using a monocular camera.

Autonomous robots require efficient mapping methods to navigate and interact with their environment. LiDAR is widely used for accurate real-time mapping, but its high cost makes it inaccessible for low-budget applications. A cost-effective alternative is Visual SLAM, which relies on camera modules to generate maps of the surroundings.

A mobile robotic platform was designed around a Raspberry Pi 4, with an Arduino and an L298N motor driver handling motor control and a Pi Camera Module providing the visual input, while keeping the total build cost under £200.

[Figure: Robot hardware schematic]
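The motor-control side of a platform like this can be sketched as a small helper that turns a velocity command into values for the L298N's direction and PWM pins. This is a minimal illustration only; the function names, wheel base, and speed limit below are assumptions, not taken from the project's actual code:

```python
def twist_to_wheel_speeds(linear, angular, wheel_base=0.15):
    """Convert a (linear m/s, angular rad/s) command into left/right
    wheel speeds for a differential-drive robot.
    wheel_base is an assumed track width in metres."""
    left = linear - angular * wheel_base / 2.0
    right = linear + angular * wheel_base / 2.0
    return left, right

def speed_to_l298n(speed, max_speed=0.5):
    """Map a signed wheel speed to L298N inputs: a direction flag
    (for the IN pins) and an 8-bit PWM duty cycle (for the EN pin).
    max_speed is an assumed top speed used for scaling."""
    duty = min(abs(speed) / max_speed, 1.0)
    return speed >= 0, int(duty * 255)
```

In practice the Arduino would receive these values over serial from the Raspberry Pi and drive the L298N pins accordingly; clamping the duty cycle keeps an over-range command from wrapping the PWM value.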

The implementation process involved assembling the robot chassis, setting up Ubuntu and ROS on the Raspberry Pi, and configuring remote control via ROS. ORB-SLAM2 was installed to process camera data and attempt real-time mapping of an indoor environment.
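The remote-control step could, for instance, map keystrokes to velocity commands; in a typical ROS setup these would then be published as `geometry_msgs/Twist` messages on `/cmd_vel`. The bindings and speed values below are hypothetical, shown only to illustrate the idea:

```python
# Hypothetical key bindings for a simple teleop node. In a real ROS
# node these tuples would populate a geometry_msgs/Twist message
# published with rospy on /cmd_vel.
KEY_BINDINGS = {
    "w": (0.2, 0.0),   # forward  (linear m/s, angular rad/s)
    "s": (-0.2, 0.0),  # backward
    "a": (0.0, 1.0),   # turn left
    "d": (0.0, -1.0),  # turn right
    " ": (0.0, 0.0),   # stop
}

def key_to_twist(key):
    """Return the (linear, angular) command for a pressed key,
    defaulting to a full stop for any unbound key."""
    return KEY_BINDINGS.get(key, (0.0, 0.0))
```

Defaulting unbound keys to a stop is a deliberate safety choice: an unexpected input halts the robot rather than leaving the last command running.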

Several challenges emerged, including dependency conflicts and the computational limitations of the Raspberry Pi 4. The device struggled to meet the real-time processing demands of ORB-SLAM2, resulting in performance bottlenecks.

[Figure: ORB-SLAM2 algorithm overview]
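One common mitigation for this kind of bottleneck is to throttle the camera stream so the tracker is only handed frames it can keep up with, dropping the rest. The sketch below is a generic frame-drop helper under that assumption, not code from the project:

```python
import time

class FrameThrottle:
    """Drop camera frames so a slow tracker never receives frames
    faster than it can process them (a common workaround on
    constrained hardware such as a Raspberry Pi 4)."""

    def __init__(self, max_fps, clock=time.monotonic):
        self.min_interval = 1.0 / max_fps  # seconds between kept frames
        self.clock = clock                 # injectable for testing
        self.last = None                   # timestamp of last kept frame

    def accept(self):
        """Return True if the current frame should be processed,
        False if it should be dropped."""
        now = self.clock()
        if self.last is None or now - self.last >= self.min_interval:
            self.last = now
            return True
        return False
```

A capture loop would call `accept()` once per incoming frame and skip the SLAM tracking step whenever it returns False, trading map update rate for stable latency.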

Despite these challenges, the project demonstrated the potential of low-cost Visual SLAM solutions. Future improvements could involve lightweight SLAM algorithms or hardware accelerators such as the NVIDIA Jetson Nano or Google Coral.

Although ORB-SLAM2 could not run fully in real time on the Raspberry Pi 4, the research highlights how affordable hardware can still contribute to autonomous robotics development and informs future work on cost-effective mapping systems.
