By Applied Technology Review | Wednesday, March 16, 2022
Sensor fusion is a critical capability for autonomous robots to complete complicated tasks consistently. It frequently combines sensors on the robot and sensors in the surrounding environment to produce highly accurate location information.
FREMONT CA: Sensor fusion is becoming increasingly important as robots become more autonomous. Sensor fusion combines data from several on- and off-robot sensors to reduce uncertainty as a robot navigates or performs specified tasks. It offers multiple benefits to autonomous robots: greater accuracy, reliability, and fault tolerance in sensor input; broader spatial and temporal coverage from sensor systems; and higher resolution and better recognition of the surroundings, particularly in dynamic environments. Sensor fusion can also reduce robot cost and complexity by using algorithms that handle data preprocessing, allowing a variety of sensors to be used without modifying the core robot application.
Autonomous mobile robots, stationary robots, aerial robots, and marine robots all use sensor fusion. A basic example is combining a wheel encoder with an inertial measurement unit (IMU) to help determine a robot's location and orientation. Such robots are common in fast-paced settings like warehouses and factories, where adding data from external cameras placed strategically around the facility can significantly improve a robot's ability to navigate the many fixed and dynamic obstacles it encounters.
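One simple way to realize the encoder/IMU pairing is a complementary filter: the gyro is accurate over short intervals but drifts, while encoder-derived heading is drift-free on average but degrades with wheel slip. The sketch below assumes a differential-drive robot; the wheel base, filter gain, and sensor readings are illustrative values, not figures from this article.

```python
# A minimal sketch of encoder + IMU fusion for 2-D pose tracking, assuming a
# differential-drive robot. The complementary filter blends the gyro's
# short-term heading accuracy with the encoders' slip-prone but drift-free
# heading estimate; all parameter values and readings here are illustrative.
import math

WHEEL_BASE = 0.30      # metres between the wheels (assumed)
ALPHA = 0.98           # weight placed on the gyro for heading changes

def fuse_step(pose, d_left, d_right, gyro_rate, dt):
    """Advance (x, y, theta) one time step from encoder and gyro data."""
    x, y, theta = pose
    d_center = (d_left + d_right) / 2.0            # forward distance travelled
    d_theta_enc = (d_right - d_left) / WHEEL_BASE  # heading change from encoders
    d_theta_gyro = gyro_rate * dt                  # heading change from gyro
    # Complementary filter: gyro dominates, encoders bound the slow drift.
    d_theta = ALPHA * d_theta_gyro + (1.0 - ALPHA) * d_theta_enc
    theta += d_theta
    x += d_center * math.cos(theta)
    y += d_center * math.sin(theta)
    return (x, y, theta)

# Illustrative readings: a robot driving a gentle left arc, sampled at 50 Hz.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = fuse_step(pose, d_left=0.009, d_right=0.011, gyro_rate=0.33, dt=0.02)
print("estimated pose:", pose)
```

The single gain ALPHA is the simplest possible fusion rule; production systems typically replace it with a Kalman filter that weights each sensor by its estimated noise instead of a fixed constant.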
Another example is combining Global Navigation Satellite System (GNSS) data with IMU sensor data to provide more robust information about a robot's position. This is especially valuable where GNSS is unavailable, such as the dead spots found in many industrial buildings; there, IMU data can be combined with encoder odometry to continue delivering reliable position information. The inertial sensor also contributes velocity information, and the fused result offers higher resolution and better data quality.
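A common way to implement this kind of GNSS/IMU fusion is a Kalman filter that dead-reckons with IMU acceleration between fixes and corrects the accumulated drift whenever a GNSS fix arrives. The following is a minimal one-axis sketch under assumed noise levels and update rates; none of the numbers come from this article.

```python
# A minimal sketch of GNSS/IMU fusion along one axis with a linear Kalman
# filter, assuming noisy 1 Hz GNSS position fixes and a 100 Hz IMU. The IMU
# propagates the state between fixes (and through GNSS dead spots); each fix
# corrects the accumulated drift. All noise values are illustrative.
import numpy as np

dt = 0.01                       # IMU sample period (s)
F = np.array([[1.0, dt],        # state transition for [position, velocity]
              [0.0, 1.0]])
B = np.array([[0.5 * dt**2],    # how measured acceleration enters the state
              [dt]])
H = np.array([[1.0, 0.0]])      # GNSS measures position only
Q = np.diag([1e-6, 1e-4])       # process noise covering IMU drift
R = np.array([[4.0]])           # GNSS noise, ~2 m standard deviation

x = np.zeros((2, 1))            # initial state: at rest at the origin
P = np.eye(2)                   # initial state covariance

rng = np.random.default_rng(0)
true_pos, true_vel, accel = 0.0, 0.0, 0.2   # constant 0.2 m/s^2 push

for step in range(1000):
    # Simulate the true motion and a noisy IMU acceleration reading.
    true_vel += accel * dt
    true_pos += true_vel * dt
    imu_accel = accel + rng.normal(0.0, 0.05)

    # Predict: dead-reckon with the IMU between GNSS fixes.
    x = F @ x + B * imu_accel
    P = F @ P @ F.T + Q

    # Update: once per second, a GNSS fix corrects the drift.
    if step % 100 == 99:
        z = np.array([[true_pos + rng.normal(0.0, 2.0)]])
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (z - H @ x)
        P = (np.eye(2) - K @ H) @ P

print(f"true position {true_pos:.2f} m, estimated {x[0, 0]:.2f} m")
```

When GNSS drops out entirely, the update step simply never fires and the filter keeps dead-reckoning, which is exactly the behavior described above for indoor dead spots.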
Aerial robots are another platform that relies on GNSS for localization and navigation. Increasingly, however, these robots are deployed in GNSS-denied environments such as warehouses, infrastructure, oil and gas facilities, and other locations where obstacles block the GNSS signal. In the three-dimensional environments where aerial robots and drones operate, conventional sensor fusion increases computing complexity and energy consumption, neither of which is desirable. The added capabilities also add weight and constrain the platforms' payload capacity.
A range of other sensor technologies is being employed to work around these constraints, including motion-capture camera systems, vision, LIDAR, radio beacons, and even map-based techniques. Each alternative comes with its own advantages and disadvantages.