I joined Addverb in 2023 as a Robotics Intern, where I initially focused on firmware development. Over time, I transitioned to working on the application layer of the robots, contributing primarily to Navigation, SLAM, and Computer Vision modules.
End-to-end autonomy for Addverb’s in-house quadruped (TRAKR): robust VI-SLAM with LiDAR fallback, global loop-closure, and a fast Hybrid A* planner + Pure Pursuit controller tuned for legged motion.
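The Pure Pursuit controller mentioned above can be sketched with its standard steering law. This is a minimal illustrative version, not the actual TRAKR implementation; the function name, velocity default, and lookahead-point selection are assumptions.

```python
import math

def pure_pursuit_cmd(pose, lookahead_pt, v=0.5):
    """Velocity command toward a lookahead point on the planned path.

    pose: (x, y, yaw) of the robot in the world frame.
    lookahead_pt: (x, y) point on the path, one lookahead distance ahead.
    Returns (v, omega). Pure Pursuit sets the path curvature to
    kappa = 2*sin(alpha) / L_d, where alpha is the heading error to the
    lookahead point and L_d its distance; omega = v * kappa.
    """
    x, y, yaw = pose
    dx = lookahead_pt[0] - x
    dy = lookahead_pt[1] - y
    L_d = math.hypot(dx, dy)                 # lookahead distance
    alpha = math.atan2(dy, dx) - yaw         # heading error to the point
    kappa = 2.0 * math.sin(alpha) / L_d      # commanded path curvature
    return v, v * kappa
```

For a legged platform the same command can be consumed directly as a body-frame angular velocity, which is one reason Pure Pursuit pairs well with quadruped locomotion controllers.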
I developed a module to robustly estimate the ground plane under a quadruped robot by combining visual and kinematic inputs. The first estimate comes from a downward-facing camera mounted at the base; the second is computed by fitting a plane through the four leg contact points via singular value decomposition (SVD). The two estimates are fused with an Extended Kalman Filter, yielding a more accurate and reliable estimate of the ground-plane inclination as the robot moves. The fused estimate feeds into a higher-level control stack, where it is used to reject external disturbances and keep the torso posture stable during locomotion.
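The SVD-based plane fit through the leg contact points can be sketched as follows. This is an illustrative sketch, not the production module; the function name and frame conventions are assumptions.

```python
import numpy as np

def fit_ground_plane(contact_pts):
    """Least-squares plane through leg contact points via SVD.

    contact_pts: (N, 3) array of foot contact positions in the base frame.
    Returns (normal, centroid). The plane normal is the right singular
    vector associated with the smallest singular value of the centered
    point cloud, i.e. the direction of least variance.
    """
    pts = np.asarray(contact_pts, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    if normal[2] < 0:      # orient the normal to point upward
        normal = -normal
    return normal, centroid
```

The inclination of the estimated plane (e.g. `arccos` of the normal's z-component) is the quantity that would then be fused with the camera-based estimate in the EKF.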
I developed a Unity-based haptic simulation platform for training professionals to interpret and act on haptic device feedback. The system renders haptic forces in real time for a wide variety of virtual objects, letting users experience realistic tactile sensations and interactions. By simulating different material properties, shapes, and resistance levels, the simulator builds familiarity with the nuances of touch-based feedback in a safe, controlled, and repeatable environment, improving skill development and readiness for real-world applications where precise haptic perception is critical.
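A common way to render contact forces like these is the penalty (spring-damper) model. The sketch below shows the idea in one dimension; it is not the Unity implementation, and the stiffness and damping values are hypothetical.

```python
def render_contact_force(depth, depth_rate, k=800.0, b=2.0):
    """Penalty-based 1-D contact force for a haptic proxy.

    depth: penetration of the device proxy into the virtual surface
           (metres; positive means in contact).
    depth_rate: rate of penetration (m/s), used for damping.
    k, b: surface stiffness (N/m) and damping (N*s/m), which together
          model different material properties and resistance levels.
    """
    if depth <= 0.0:
        return 0.0                       # no contact, no force
    f = k * depth + b * depth_rate       # spring-damper along the normal
    return max(f, 0.0)                   # a surface can push, never pull
```

Varying `k` and `b` per object is one simple way to give virtual materials distinct tactile signatures (stiff metal vs. compliant rubber).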
I implemented a SIL-3 compliant safety pipeline using OS virtualization, enabling a real-time operating system (RTOS) and a Yocto-based OS to run side by side. In this architecture, the Yocto-based OS runs the robot's primary functions, while the RTOS continuously monitors it and other safety-critical peripherals for failures using On-Demand Cross Comparison (ODCC). The design achieves a safety response time of 4 ms, ensuring rapid detection and mitigation of faults. The solution has been deployed and tested on a collaborative robot (cobot), demonstrating its reliability in meeting stringent functional safety requirements.
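The monitoring logic can be sketched as a cross-comparison check: the RTOS compares a result computed on the primary channel against its own redundant computation, and also verifies that the primary channel's heartbeat is fresher than the response budget. This is a conceptual sketch with hypothetical names, not the deployed RTOS code; only the 4 ms budget comes from the project itself.

```python
FAULT_BUDGET_S = 0.004   # 4 ms safety response budget

def odcc_check(primary_val, monitor_val, last_heartbeat_s, now_s,
               tol=0.0, budget_s=FAULT_BUDGET_S):
    """On-Demand Cross Comparison between two redundant channels.

    Returns "OK" when the two independently computed values agree within
    `tol` and the primary channel's heartbeat is fresher than the budget;
    otherwise returns a fault code so the monitor can trigger a safe stop.
    """
    if now_s - last_heartbeat_s > budget_s:
        return "FAULT_TIMEOUT"           # primary channel stopped responding
    if abs(primary_val - monitor_val) > tol:
        return "FAULT_MISMATCH"          # redundant computations disagree
    return "OK"
```

In a real SIL-3 system this check runs on the RTOS side with a hardware watchdog behind it, so a hang in the checker itself still leads to a safe state.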
I developed firmware for a quadruped robot, an exoskeleton, and a haptic device, including actuator control over CAN and integration of sensors via I²C, SPI, and UART. I implemented a lightweight TCP/IP communication module with a custom encoding/decoding scheme, and designed a sensor fusion algorithm to reduce uncertainties in multi-sensor data. Additionally, I built a synchronization module to align data from multiple sources, ensuring reliable and coherent system performance.
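A custom encoding/decoding scheme like the one mentioned above typically wraps each message in a framed packet with a sync byte, length field, and checksum. The sketch below illustrates that pattern under assumed framing choices (sync byte, XOR checksum); it is not the actual wire format.

```python
import struct

SYNC = 0xA5  # hypothetical start-of-frame marker

def encode_frame(msg_id, payload):
    """Frame layout: sync byte | message id | payload length | payload | XOR checksum."""
    body = struct.pack("<BBB", SYNC, msg_id, len(payload)) + payload
    checksum = 0
    for b in body:
        checksum ^= b
    return body + bytes([checksum])

def decode_frame(frame):
    """Validate and unpack a frame; raises ValueError on corruption."""
    if len(frame) < 4 or frame[0] != SYNC:
        raise ValueError("bad sync")
    checksum = 0
    for b in frame[:-1]:
        checksum ^= b
    if checksum != frame[-1]:
        raise ValueError("checksum mismatch")
    msg_id, length = frame[1], frame[2]
    if length != len(frame) - 4:
        raise ValueError("length mismatch")
    return msg_id, frame[3:-1]
```

Keeping the framing this small is what makes the scheme practical over a lightweight TCP/IP link on embedded targets: the receiver can resynchronize on the sync byte and reject corrupted frames with a single pass.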