Autonomy Stack for Quadruped - Addverb

Developed the autonomy stack—including SLAM and navigation—for Addverb’s in-house quadruped, TRAKR, enabling efficient navigation in unknown environments with minimal human intervention.

C++ · Graph Optimization · Bundle Adjustment · Loop Closure · Sensor Fusion · Hybrid A* · Pure Pursuit Controller · Design Pattern · OpenCV

Project Overview

Challenge

Developing autonomy pipelines for wheeled robots is comparatively straightforward, since wheel encoders provide a reliable source of odometry. This is not the case for legged robots such as quadrupeds and humanoids: foot slip and intermittent ground contact make accurate, consistent odometry difficult to obtain.

Solution

To address this challenge, we built the stack around a visual-inertial SLAM pipeline, which performs well in most scenarios. Because it often fails in featureless or textureless environments, we introduced a LiDAR fallback mechanism (sketched below); using a 2D LiDAR kept costs low while still achieving reliable results.
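The write-up does not describe the switching logic itself, so the following is only a minimal sketch of how such a fallback might be arbitrated, assuming a simple health signal from the visual-inertial frontend. All type names, metrics, and thresholds here are hypothetical, not taken from the TRAKR codebase.

```cpp
#include <cstddef>

// Illustrative health report from the visual-inertial frontend.
// Field names and thresholds are hypothetical, not from the TRAKR codebase.
struct VioHealth {
    std::size_t tracked_features;  // inliers surviving the last pose refinement
    double reprojection_rmse_px;   // RMS reprojection error in pixels
};

enum class OdomSource { VisualInertial, Lidar2D };

// Hysteresis-based arbiter: drop to the 2D-LiDAR source when visual tracking
// degrades, and only return to VIO after it has stayed healthy for a while,
// so the odometry source does not flap from frame to frame.
class FallbackArbiter {
public:
    OdomSource update(const VioHealth& h) {
        const bool vio_ok = h.tracked_features >= kMinFeatures &&
                            h.reprojection_rmse_px <= kMaxRmsePx;
        if (source_ == OdomSource::VisualInertial) {
            if (!vio_ok) {                      // tracking degraded: fall back
                source_ = OdomSource::Lidar2D;
                healthy_streak_ = 0;
            }
        } else if (vio_ok && ++healthy_streak_ >= kRecoverFrames) {
            source_ = OdomSource::VisualInertial;  // VIO has recovered
        } else if (!vio_ok) {
            healthy_streak_ = 0;                // recovery streak broken
        }
        return source_;
    }

private:
    static constexpr std::size_t kMinFeatures   = 30;   // illustrative
    static constexpr double      kMaxRmsePx     = 2.0;  // illustrative
    static constexpr int         kRecoverFrames = 15;   // ~0.5 s at 30 Hz
    OdomSource source_ = OdomSource::VisualInertial;
    int healthy_streak_ = 0;
};
```

The hysteresis matters more than the exact thresholds: without it, a robot hovering near the feature-count boundary would oscillate between odometry sources and corrupt the pose estimate.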

Implementation

For our SLAM system, we designed a visual-inertial frontend that leverages Structure-from-Motion (SfM) with ORB features and motion-only bundle adjustment (sketched below). On the backend, we employ a LiDAR-visual full bundle adjustment to correct accumulated drift, along with a loop-closure module based on a Bag-of-Words approach. For the navigation stack, we implemented a custom Hybrid A*-style planner optimized for faster computation, paired with a Pure Pursuit controller and a behavior decision tree; sketches of the planner and controller follow as well.
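The write-up does not name an optimization library, so the sketch below assumes Ceres purely for illustration. It shows the essence of motion-only bundle adjustment: the map points are held fixed and only the current 6-DoF camera pose is refined to minimize the reprojection error of the matched ORB keypoints.

```cpp
#include <vector>

#include <ceres/ceres.h>
#include <ceres/rotation.h>

// Reprojection residual for one ORB keypoint / map-point match. The pose is
// parameterized as angle-axis rotation (3) plus translation (3), world to camera.
struct ReprojectionError {
    ReprojectionError(double u, double v, const double pt[3],
                      double fx, double fy, double cx, double cy)
        : u_(u), v_(v), fx_(fx), fy_(fy), cx_(cx), cy_(cy) {
        p_[0] = pt[0]; p_[1] = pt[1]; p_[2] = pt[2];
    }

    template <typename T>
    bool operator()(const T* const pose, T* residuals) const {
        // Rotate the fixed map point into the camera frame, then translate.
        T p[3] = {T(p_[0]), T(p_[1]), T(p_[2])};
        T pc[3];
        ceres::AngleAxisRotatePoint(pose, p, pc);
        pc[0] += pose[3]; pc[1] += pose[4]; pc[2] += pose[5];

        // Pinhole projection; residual is the pixel error against the keypoint.
        residuals[0] = T(fx_) * pc[0] / pc[2] + T(cx_) - T(u_);
        residuals[1] = T(fy_) * pc[1] / pc[2] + T(cy_) - T(v_);
        return true;
    }

    double u_, v_, fx_, fy_, cx_, cy_;
    double p_[3];  // associated map point, held constant (motion-only)
};

struct Observation { double u, v, point[3]; };

// Refines the 6-DoF pose in place against all current keypoint matches.
void MotionOnlyBA(const std::vector<Observation>& obs, double pose[6],
                  double fx, double fy, double cx, double cy) {
    ceres::Problem problem;
    for (const auto& o : obs) {
        auto* cost = new ceres::AutoDiffCostFunction<ReprojectionError, 2, 6>(
            new ReprojectionError(o.u, o.v, o.point, fx, fy, cx, cy));
        // A robust loss downweights bad matches instead of rejecting them hard.
        problem.AddResidualBlock(cost, new ceres::HuberLoss(1.0), pose);
    }
    ceres::Solver::Options opts;
    opts.linear_solver_type = ceres::DENSE_QR;  // tiny problem: one 6-DoF block
    opts.max_num_iterations = 10;               // frontend budget is per-frame
    ceres::Solver::Summary summary;
    ceres::Solve(opts, &problem, &summary);
}
```

A full bundle adjustment, as used on the backend, would additionally mark the point coordinates as optimization variables rather than constants.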
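The planner internals are not public, but the core of any Hybrid A*-style search can be sketched: states live in continuous (x, y, theta), successors follow short kinematically feasible arcs rather than grid moves, and a coarse discrete grid prunes near-duplicate states. Every resolution, motion primitive, and the heuristic below is an illustrative placeholder.

```cpp
#include <algorithm>
#include <cmath>
#include <functional>
#include <queue>
#include <unordered_map>
#include <utility>
#include <vector>

struct Node {
    double x, y, theta;  // continuous pose
    double g;            // cost-so-far (path length here)
    int parent;          // index into the node pool, for path reconstruction
};

constexpr double kStep = 0.2;                  // arc length per expansion (m)
constexpr double kCurv[] = {-1.0, 0.0, 1.0};   // curvatures: left / straight / right
constexpr double kXYRes = 0.1, kThRes = 0.15;  // de-duplication grid resolution

// Hash of the discretized state; simple mix, collisions tolerated in a sketch.
long long Key(double x, double y, double th) {
    const long long xi = std::llround(x / kXYRes);
    const long long yi = std::llround(y / kXYRes);
    const long long ti = std::llround(th / kThRes);
    return (xi * 100003 + yi) * 100003 + ti;
}

// Straight-line distance: admissible, ignores heading and obstacles.
double Heuristic(const Node& n, double gx, double gy) {
    return std::hypot(gx - n.x, gy - n.y);
}

// `free(x, y)` is the caller's occupancy check. Returns the path, start first.
template <typename FreeFn>
std::vector<Node> PlanHybridAStar(Node start, double gx, double gy, FreeFn free) {
    start.g = 0.0;
    start.parent = -1;
    std::vector<Node> nodes{start};
    std::unordered_map<long long, double> best_g{
        {Key(start.x, start.y, start.theta), 0.0}};

    using QItem = std::pair<double, int>;  // (f = g + h, index into nodes)
    std::priority_queue<QItem, std::vector<QItem>, std::greater<>> open;
    open.push({Heuristic(start, gx, gy), 0});

    while (!open.empty()) {
        const int idx = open.top().second;
        open.pop();
        const Node cur = nodes[idx];
        if (cur.g > best_g[Key(cur.x, cur.y, cur.theta)] + 1e-9) continue;  // stale

        if (std::hypot(gx - cur.x, gy - cur.y) < 0.3) {  // goal tolerance (m)
            std::vector<Node> path;
            for (int i = idx; i != -1; i = nodes[i].parent) path.push_back(nodes[i]);
            std::reverse(path.begin(), path.end());
            return path;
        }
        for (const double k : kCurv) {  // one short feasible arc per primitive
            const double th = cur.theta + k * kStep;  // forward-Euler along the arc
            Node nxt{cur.x + kStep * std::cos(th), cur.y + kStep * std::sin(th),
                     th, cur.g + kStep, idx};
            if (!free(nxt.x, nxt.y)) continue;        // collision check
            const long long key = Key(nxt.x, nxt.y, nxt.theta);
            auto it = best_g.find(key);
            if (it != best_g.end() && it->second <= nxt.g) continue;  // worse dup
            best_g[key] = nxt.g;
            nodes.push_back(nxt);
            open.push({nxt.g + Heuristic(nxt, gx, gy),
                       static_cast<int>(nodes.size()) - 1});
        }
    }
    return {};  // open set exhausted: no path
}
```

Replacing the straight-line heuristic with a precomputed obstacle-aware one is the usual first step when tuning such a planner for speed.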
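Pure Pursuit itself is standard, so it can be sketched directly: pick the path point one lookahead distance ahead (the carrot), express it in the robot frame, and command the curvature of the circular arc through it. The lookahead distance and speed below are placeholder values, not TRAKR's actual tuning.

```cpp
#include <cmath>
#include <vector>

struct Pose2D { double x, y, theta; };
struct Twist  { double v, omega; };

// Returns the velocity command steering the robot toward the carrot point.
Twist PurePursuit(const std::vector<Pose2D>& path, const Pose2D& robot,
                  double lookahead = 0.5, double v_cmd = 0.4) {
    if (path.empty()) return {0.0, 0.0};  // nothing to follow: stop

    // First path point at least one lookahead distance away is the carrot;
    // fall back to the final point near the end of the path.
    Pose2D target = path.back();
    for (const auto& p : path) {
        if (std::hypot(p.x - robot.x, p.y - robot.y) >= lookahead) {
            target = p;
            break;
        }
    }

    // Express the carrot in the robot frame (rotate by -theta).
    const double dx = target.x - robot.x, dy = target.y - robot.y;
    const double xr =  std::cos(robot.theta) * dx + std::sin(robot.theta) * dy;
    const double yr = -std::sin(robot.theta) * dx + std::cos(robot.theta) * dy;

    // Curvature of the arc through the carrot: kappa = 2 * y_r / L^2.
    const double L2 = xr * xr + yr * yr;
    const double kappa = (L2 > 1e-9) ? 2.0 * yr / L2 : 0.0;
    return {v_cmd, v_cmd * kappa};  // omega = v * kappa
}
```

A shorter lookahead tracks the path more tightly but oscillates at speed; lengthening it smooths the motion at the cost of cutting corners.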


Project Results

IsaacSim Benchmarking
IsaacSim Testing
Autonomous Traversal
