I’m a Robotics Engineer specializing in SLAM, navigation, computer vision, and reinforcement learning. With over 3 years of experience, I’ve contributed to the development of diverse autonomous systems and robotic platforms.
I am currently pursuing a Master’s degree in Robotics at Purdue University, with a focus on multi-modal models, reinforcement learning, and classical control theory to drive more efficient and intelligent robotic operations.
Purdue University, West Lafayette
M.S. in Robotics • 2025–Present
IIIT Jabalpur
B.Tech. in Electronics and Communication Engineering • 2019–2023
Developing advanced navigation algorithms for Eli Lilly’s warehouse robots, integrating deep scene understanding, human behavior modeling, and reinforcement learning to enhance safety, efficiency, and human-robot collaboration.
Developed a complete autonomy stack for quadruped robots, enabling navigation in complex and unmapped environments.
Developed low-level drivers, communication modules, and data filters to enable seamless sensor integration and efficient data processing.
Developed the complete software stack—including SLAM, navigation, firmware, safety layers, and communication protocols—for an Autonomous Mobile Robot (AMR) equipped with a collaborative robot (Cobot) for automated wall painting.
Research and Development Eng. Intern
May 2021 - Aug 2021
Developed a photogrammetry pipeline for 3D reconstruction of teeth using an in-house intra-oral scanner device, enabling highly accurate models for dental applications.
Research and Development Eng. Intern
Jan 2020 - Mar 2020
Benchmarked state-of-the-art computer vision models, including VGG-16 and AlexNet, for fault detection in metal 3D printing processes.
Developed the autonomy stack—including SLAM and navigation—for Addverb’s in-house quadruped, TRAKR, enabling efficient navigation in unknown environments with minimal human intervention.
Built a VR haptic simulator in Unity to train professionals in adapting to force feedback for long-distance teleoperation over 5G.
Developed an intelligent navigation approach for dense, cluttered indoor and outdoor environments, using the Boston Dynamics Spot robot to interact effectively with its surroundings.
Developed an extensive software stack—including navigation, mapping, localization, communication, data filtering, and kinematics—for an AMR equipped with a robotic arm for wall painting, using ROS 2 as the core framework.
Developed a plug-and-play SMART BOX to track the 6-DoF pose of a crane’s hook, enhancing safety by enabling precise position and orientation tracking. Built as part of a robotics hackathon organized by L&T.
Built a smart sewer monitoring robot that maps underground pipelines and performs structural analysis, enabling efficient inspection, early fault detection, and data-driven maintenance planning. Built as part of a hackathon organized by La Trobe University, Australia.
Designed and implemented an autonomous UV-disinfection robot for hospitals, featuring a custom algorithm that intelligently optimizes zap patterns to reduce power usage and maximize area coverage. Completed as my undergraduate capstone project.
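One way to frame this kind of pattern optimization — a purely illustrative sketch, not the project’s actual algorithm — is greedy set cover over candidate dwell positions: each stop irradiates a known footprint of cells, and choosing the fewest stops that jointly cover every target cell reduces both dwell time and power.

```python
def plan_zap_stops(candidates, targets):
    """Greedy set cover over candidate dwell positions (illustrative).

    candidates: dict mapping stop id -> set of cells that stop's UV
                footprint covers.
    targets:    iterable of cells that must be disinfected.
    Returns (ordered list of chosen stops, set of cells left uncovered).
    """
    uncovered = set(targets)
    plan = []
    while uncovered:
        # Pick the stop that newly covers the most remaining cells.
        best = max(candidates, key=lambda s: len(candidates[s] & uncovered))
        gain = candidates[best] & uncovered
        if not gain:
            break  # remaining cells are unreachable from any stop
        plan.append(best)
        uncovered -= gain
    return plan, uncovered
```

Greedy set cover is a standard approximation (the exact problem is NP-hard), which makes it a reasonable fit for a low-power onboard planner.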
Designed and implemented a RANSAC-based ground plane segmentation algorithm, using IMU readings as an initial estimate to reduce computation time and improve real-time performance.
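A minimal sketch of the idea (function names and thresholds here are illustrative, not the project’s actual code): the IMU’s gravity vector seeds the expected ground normal, and RANSAC candidates whose normals stray too far from that prior are rejected before the expensive inlier count, which is where the computation savings come from.

```python
import numpy as np

def fit_plane(p1, p2, p3):
    """Plane through three points: returns (unit normal, d) with n.x + d = 0."""
    n = np.cross(p2 - p1, p3 - p1)
    norm = np.linalg.norm(n)
    if norm < 1e-9:
        return None  # degenerate (near-collinear) sample
    n = n / norm
    return n, -np.dot(n, p1)

def ransac_ground_plane(points, imu_gravity, dist_thresh=0.05,
                        angle_thresh_deg=15.0, iters=100, seed=0):
    """RANSAC ground-plane fit seeded by an IMU gravity estimate."""
    rng = np.random.default_rng(seed)
    up = -imu_gravity / np.linalg.norm(imu_gravity)  # expected ground normal
    cos_lim = np.cos(np.radians(angle_thresh_deg))
    best_n, best_d, best_count = None, None, 0
    for _ in range(iters):
        idx = rng.choice(len(points), size=3, replace=False)
        fit = fit_plane(*points[idx])
        if fit is None:
            continue
        n, d = fit
        if np.dot(n, up) < 0:        # orient the normal upward
            n, d = -n, -d
        if np.dot(n, up) < cos_lim:  # prune candidates far from the IMU prior
            continue
        dist = np.abs(points @ n + d)
        count = int((dist < dist_thresh).sum())
        if count > best_count:
            best_n, best_d, best_count = n, d, count
    if best_n is None:
        raise ValueError("no plane consistent with the IMU prior was found")
    inliers = np.abs(points @ best_n + best_d) < dist_thresh
    return best_n, best_d, inliers
```

Because most random 3-point samples in a cluttered scan yield non-ground planes, the cheap dot-product test against the IMU prior skips them before the full per-point distance pass.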
Developed a mechatronic rehabilitation glove aimed at supporting motor recovery in children from refugee camps in India. Integrated with a game to reduce the physical and emotional strain of therapy by making the process playful and engaging. Built as part of a government-funded project.
Engineered a fully functional 6-DoF robotic arm from scratch, writing low-level firmware for actuators and implementing ROS-based control architectures. The project included testing various motion planning strategies to enable precise and adaptive manipulation.
Developed a lightweight, ROS-independent A* path planning algorithm with path smoothing, fully implemented in C++ for deployment on resource-constrained boards.
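The project itself was written in C++ for embedded targets; as an illustrative sketch of the same two steps — grid A* followed by greedy line-of-sight shortcutting — a minimal Python version (all names and details here are my own, not the project’s code) might look like:

```python
import heapq
import math

def astar(grid, start, goal):
    """A* on a 2D occupancy grid (0 = free, 1 = blocked), 8-connected."""
    rows, cols = len(grid), len(grid[0])
    def h(p):  # octile-distance heuristic, admissible for 8-connectivity
        dx, dy = abs(p[0] - goal[0]), abs(p[1] - goal[1])
        return max(dx, dy) + (math.sqrt(2) - 1) * min(dx, dy)
    open_set = [(h(start), 0.0, start)]
    came, g = {}, {start: 0.0}
    while open_set:
        _, gc, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]
            while cur in came:
                cur = came[cur]
                path.append(cur)
            return path[::-1]
        if gc > g.get(cur, math.inf):
            continue  # stale queue entry
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if dx == dy == 0:
                    continue
                nx, ny = cur[0] + dx, cur[1] + dy
                if not (0 <= nx < rows and 0 <= ny < cols) or grid[nx][ny]:
                    continue
                ng = gc + math.hypot(dx, dy)
                if ng < g.get((nx, ny), math.inf):
                    g[(nx, ny)] = ng
                    came[(nx, ny)] = cur
                    heapq.heappush(open_set, (ng + h((nx, ny)), ng, (nx, ny)))
    return None  # no path

def line_free(grid, a, b):
    """Check the straight segment a->b for collisions by dense sampling."""
    steps = max(abs(b[0] - a[0]), abs(b[1] - a[1])) * 2 + 1
    for i in range(steps + 1):
        t = i / steps
        x = round(a[0] + t * (b[0] - a[0]))
        y = round(a[1] + t * (b[1] - a[1]))
        if grid[x][y]:
            return False
    return True

def smooth(grid, path):
    """Greedy shortcutting: skip waypoints that have a clear line of sight."""
    if not path:
        return path
    out, i = [path[0]], 0
    while i < len(path) - 1:
        j = len(path) - 1
        while j > i + 1 and not line_free(grid, path[i], path[j]):
            j -= 1
        out.append(path[j])
        i = j
    return out
```

Using only the standard library (no ROS dependencies) is what makes this style of planner easy to drop onto a resource-constrained board.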
Designed a humanoid robot controlled via a sensor-equipped wearable suit, enabling real-time imitation of human movements. Developed as part of a national-level hackathon project during high school.
Ayush Kumar, Co-author • published by Intellectual Property India, 2023
Ayush Kumar, Co-author • published by Intellectual Property India, 2023
Ayush Kumar, Co-author • published by Intellectual Property India, 2022
Awarded the D&M Proficiency Gold Medal for the best project across the entire B.Tech batch of 2023.
Nominated as Best Performer among all freshers in the Advanced Robotics division at Addverb.
Ranked in the top 10 at TIGC, an international robotics hackathon organized by La Trobe University, Australia, and received AUD 1,000 (~USD 670) in funding to develop a prototype of the proposed solution.
Secured All India Rank 1 (AIR-1) among 800+ participants in a hackathon organized by Larsen & Toubro (L&T), winning a cash prize of INR 1M (~USD 11K) for developing a solution to mitigate crane accidents at construction sites.
Awarded first place in Technovate, a national-level exhibition at IIIT Jabalpur during the annual techno-cultural and design festival.
Secured 1st place in BullsEye, a national-level rocket design competition at IIIT Jabalpur’s annual techno-cultural and design festival, earning titles for both longest flight time and maximum range.
Finished as runner-up in Robothon, a national-level all-terrain semi-autonomous vehicle design competition at IIIT Jabalpur during the annual techno-cultural and design festival.
Won the CBSE National Level Science Exhibition for developing a motion-synchronized humanoid robot torso that mimics the movements of a human operator.