
Drones and robotics platforms have become essential tools for many industries. They can help construction crews map rough terrain, let energy companies inspect infrastructure, and take on tasks that were once impossible for people to do.
But knowing exactly where these devices are in three-dimensional space and maintaining their sense of direction isn't easy, especially in dense or remote areas where GPS signals go dark.
Shwetabh Singh, RESEPI Applications Engineer at Inertial Labs, works to overcome these challenges by integrating different sensor technologies, processing the data those sensors gather, and building safe, accurate systems for testing autonomous navigation in real time.
Learn how Singh's innovations are helping drones and other devices find their way, even in the most complex and harsh environments.
Shwetabh Singh and the RESEPI System
Shwetabh Singh's work in navigation technology started with a childhood fascination for flight. Growing up near an air base in India, he was mesmerized by the roar of military planes overhead, often rushing outside to watch them take off and land.
Singh is now a RESEPI Applications Engineer at Inertial Labs. The RESEPI system is an advanced platform that combines and analyzes information from different sensors mounted on drones and autonomous vehicles. This process, called "sensor fusion," helps these vehicles navigate areas where GPS signals are weak or unavailable.
Each sensor brings something different to the table. LiDAR (Light Detection and Ranging), for example, uses laser pulses to measure distances. By bouncing light off nearby objects and timing how long it takes to return, it builds a detailed 3D map of its surroundings. An IMU (Inertial Measurement Unit) tracks motion and rotation so the system knows if the vehicle is tilting, turning, or accelerating, even when other data is unavailable.
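The arithmetic behind LiDAR ranging is simple enough to show directly. The sketch below is purely illustrative (it is not RESEPI code): the one-way distance is half the pulse's round trip at the speed of light.

```python
# Illustrative LiDAR time-of-flight arithmetic (not RESEPI code): the pulse
# travels to the target and back, so one-way distance is half the round trip.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to a target given the laser pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

print(f"{tof_distance(100e-9):.2f} m")  # a ~100 ns echo is roughly 15 m away
```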
Then there's GPS (Global Positioning System), which uses satellite signals to determine the vehicle's location. While GPS is highly effective in open areas, its signals can weaken or disappear entirely in tunnels, dense cities, or forests, leaving devices without a reliable way to navigate.
By combining data from these sensors, the RESEPI system fills in the gaps where one sensor alone might struggle, creating a navigation system that's accurate, dependable, and built to handle tough environments.
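As a rough illustration of the idea (RESEPI's actual fusion algorithms are proprietary and far more sophisticated), a navigation loop might blend an inertial dead-reckoning estimate with a GPS fix whenever one is available, and coast on the IMU alone when it is not:

```python
import numpy as np

def fuse_position(imu_estimate, gps_fix, gps_weight=0.3):
    """Blend a dead-reckoned IMU position with a GPS fix (toy example).

    In a GPS-denied stretch (gps_fix is None), the system coasts on the
    inertial estimate; when a fix returns, it pulls the solution back.
    """
    imu_estimate = np.asarray(imu_estimate, dtype=float)
    if gps_fix is None:  # tunnel, urban canyon, dense forest...
        return imu_estimate
    return (1 - gps_weight) * imu_estimate + gps_weight * np.asarray(gps_fix)

# A drifting IMU estimate corrected by a fresh satellite fix:
print(fuse_position([10.2, 4.9, 1.1], [10.0, 5.0, 1.0]))
```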
Advancing Navigation with Visual SLAM and Odometry
A big part of Singh's work has been improving the RESEPI system's Visual SLAM (Simultaneous Localization and Mapping) and Visual Odometry—two key technologies that help devices understand where they are and how they're moving.
Visual SLAM allows drones and autonomous vehicles to simultaneously create a map of their surroundings and determine their position within that map. For a drone flying through a dense urban area, this means it doesn't have to rely solely on GPS signals, which can fail between tall buildings. Instead, Visual SLAM combines data from cameras, LiDAR, and motion sensors to interpret its environment and maintain its course.
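Production SLAM pipelines are substantial pieces of software, but the core loop of predicting motion, correcting against the map, and extending the map can be sketched in a few lines. The toy below is hypothetical, with the hard problem of data association assumed away by giving every landmark a known ID:

```python
import numpy as np

landmark_map = {}   # landmark id -> estimated 2D position in the world
pose = np.zeros(2)  # estimated vehicle position (x, y)

def slam_step(odometry_delta, observations):
    """One SLAM iteration. observations maps landmark id -> offset from vehicle."""
    global pose
    pose = pose + np.asarray(odometry_delta)            # 1. predict from motion
    seen = {i: r for i, r in observations.items() if i in landmark_map}
    if seen:                                            # 2. correct pose vs. map
        errors = [landmark_map[i] - np.asarray(r) - pose for i, r in seen.items()]
        pose = pose + 0.5 * np.mean(errors, axis=0)     # crude Kalman-like gain
    for i, r in observations.items():                   # 3. extend the map
        if i not in landmark_map:
            landmark_map[i] = pose + np.asarray(r)
```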
Singh worked to improve how these different sensors communicate and share data so the RESEPI system could guide devices safely.
Visual Odometry complements this by tracking movement in real time, which is almost like giving the system a pair of eyes. By analyzing sequential images from a camera, it calculates how far the device has moved and in what direction. To improve RESEPI's Visual Odometry, Singh made sure the visual data from cameras aligned with inputs from other sensors, like LiDAR and the IMU.
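In broad strokes, frame-to-frame visual odometry can be built with off-the-shelf tools. The sketch below uses OpenCV (with a made-up camera matrix; a real pipeline uses calibrated intrinsics) to match features between consecutive frames and recover the relative rotation and direction of travel. Absolute scale has to come from another sensor such as the IMU or LiDAR, which is exactly where the alignment work matters:

```python
import cv2
import numpy as np

# Hypothetical camera intrinsics; a real system uses calibrated values.
K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])

def relative_motion(prev_gray, curr_gray):
    """Estimate rotation R and translation direction t between two frames."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
    return R, t  # t is a unit direction; scale must come from IMU or LiDAR
```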
Part of this alignment work involved boresighting: correcting any slight misalignments between sensors. For example, if the LiDAR detected a building 15 meters ahead and slightly to the left, but the camera captured it at a slightly different angle, boresighting would reconcile the two inputs so the drone wouldn't crash or head in the wrong direction.
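Mathematically, a boresight calibration boils down to a small fixed rotation between sensor frames, plus a lever-arm offset for their physical separation. A minimal sketch with invented calibration numbers:

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Hypothetical calibration results: small fixed angles between the LiDAR
# frame and the IMU body frame, plus the physical offset between them.
boresight = Rotation.from_euler("xyz", [0.4, -0.2, 1.1], degrees=True)
lever_arm = np.array([0.05, 0.00, -0.12])  # meters, LiDAR origin vs. IMU origin

def lidar_to_body(points_lidar: np.ndarray) -> np.ndarray:
    """Rotate LiDAR points into the IMU body frame and shift by the lever arm."""
    return boresight.apply(points_lidar) + lever_arm
```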
Finally, to make sure the RESEPI system worked reliably in real-world conditions, Singh created detailed testing procedures for its hardware, firmware, and software. These tests were designed to catch problems early, whether it was a LiDAR sensor not syncing properly with the IMU or the software misinterpreting data. He also introduced automated tests to speed up the process, making it easier to find and fix issues with how the system's different sensors shared information.
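An automated check of that kind might look something like the following (illustrative only; the 5 ms tolerance and function names are invented for the example), asserting that every LiDAR frame has an IMU sample close to it in time:

```python
import numpy as np

MAX_SKEW_S = 0.005  # hypothetical tolerance: 5 ms between sensor clocks

def test_lidar_imu_sync(lidar_stamps, imu_stamps):
    """Every LiDAR frame should have a nearby IMU sample (toy regression test)."""
    imu = np.asarray(imu_stamps)
    for t in lidar_stamps:
        gap = np.min(np.abs(imu - t))
        assert gap < MAX_SKEW_S, f"LiDAR/IMU sync gap of {gap:.4f}s at t={t:.3f}s"
```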
Flight Testing and Real-World Applications
One of the most hands-on aspects of Singh's work has been extensive flight testing using small Unmanned Aerial Systems (UAS), essentially drones equipped with the RESEPI system. These tests weren't just about flying the drones—they were about putting the system through its paces in real-world conditions to see how well it performed.
For example, Singh tested the RESEPI system in environments where GPS signals were weak or non-existent, like dense cityscapes or remote areas. These flights gathered important data on how the system's sensor fusion algorithms performed under pressure.
Did the LiDAR and IMU provide accurate readings when the drone tilted or accelerated? Was the Visual SLAM able to correctly map its surroundings and maintain positioning in GPS-denied zones? The answers to these questions helped Singh and his team pinpoint areas for improvement.
"These tests have helped validate our systems' performance and led to improvements in our sensor fusion algorithms," Singh explains. "The insights gained from these flights have directly influenced our product development roadmap and helped us stay ahead of market trends."
The Future of Reliable Navigation with Shwetabh Singh
Keeping track of unmanned aircraft and other autonomous devices isn't easy—especially when GPS signals go dark. Nobody wants these systems crashing into buildings or falling out of the sky just because they can't figure out where they are.
Thanks to Shwetabh Singh's innovations, devices equipped with the RESEPI system are navigating more accurately and reliably than ever.
Looking ahead, Singh hopes to push the boundaries of navigation even further. He's working on systems that can handle increasingly complex environments, make split-second decisions with edge computing, and use AI and machine learning to improve sensor fusion's accuracy.