Stop, Look, Listen – How do Autonomous Mobile Robots Navigate their Environments?

SarahCrowe28
4 Mar 2026
[Image: An autonomous robot navigates through a spacious warehouse, illustrating the logistics of goods transportation and storage.]

by Sarvesh Pimpalkar

Previously, we discussed the important role that inertial measurement units (IMUs) play in localization for autonomous mobile robots (AMRs) in Localization: The Key to Truly Autonomous Mobile Robots - element14 Community. Today we will elaborate on how navigation relies on a fusion of sensor technologies working together to give AMRs true freedom within dynamically changing environments.

So how do mobile robots learn to get around? As kids we are all taught to “stop, look and listen” before crossing a road, but does this same concept apply to robots? As humans, we rely on our eyes and ears to help us “navigate” our environment; robots, on the other hand, use sensors to build an awareness of their surroundings.

AMRs use Simultaneous Localization and Mapping (SLAM) techniques to navigate. The process involves driving the AMR around the facility while it scans its environment; these scans are combined to generate a complete map of the area. AMRs utilize an array of sensors and algorithms for localization and navigation. Sensor technologies such as industrial vision time-of-flight cameras, radar, and lidar are the “eyes” of an AMR, combined with data from IMUs and wheel odometry (position encoders).
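To make the mapping half of SLAM concrete, here is a minimal sketch (not from the post) that ray-traces lidar returns from a known robot pose into a 2D occupancy grid, marking traversed cells free and each beam endpoint occupied. Real ROS SLAM stacks do this probabilistically while also estimating the pose; the grid size, resolution, and beam angles below are purely illustrative.

```python
import math

SIZE, RES = 20, 0.5            # 20x20 cells, 0.5 m per cell (illustrative)
grid = [["?"] * SIZE for _ in range(SIZE)]  # "?" = unknown

def integrate_scan(x, y, ranges, angle_step=math.pi / 8):
    """Ray-trace each (angle, range) lidar return into the grid."""
    for i, r in enumerate(ranges):
        a = i * angle_step
        # step along the beam, marking traversed cells as free space
        for s in range(int(r / RES)):
            cx = int((x + s * RES * math.cos(a)) / RES)
            cy = int((y + s * RES * math.sin(a)) / RES)
            if 0 <= cx < SIZE and 0 <= cy < SIZE:
                grid[cy][cx] = "."
        # the beam endpoint is the detected obstacle
        ex = int((x + r * math.cos(a)) / RES)
        ey = int((y + r * math.sin(a)) / RES)
        if 0 <= ex < SIZE and 0 <= ey < SIZE:
            grid[ey][ex] = "#"

# One scan from pose (5 m, 5 m): four beams, each returning at 3 m.
integrate_scan(5.0, 5.0, [3.0, 3.0, 3.0, 3.0])
print(sum(row.count("#") for row in grid))  # number of obstacle cells marked
```

Combining many such scans, each registered at the pose the robot occupied when it took them, is what produces the complete facility map described above.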

However, no single sensor is perfect. The true power lies in the diverse sensor types working together to produce effortless navigation in dynamically changing environments.

Each sensor has strengths and weaknesses that are balanced out by relying on more than one sensor type for navigation. Let’s consider how multiple sensors can enhance overall AMR performance.

Environmental Factors while Navigating

Lidar sensors can be sensitive to various environmental factors, such as ambient light, dust, fog, and rain. These factors can degrade the quality of the sensor data and, in turn, affect the performance of the SLAM algorithm. Similarly, other sensor modalities can be affected by reflective surfaces and dynamic moving objects (such as other AMRs or workers), further confusing SLAM. The table below summarizes how the environment affects different sensor modalities.


Table 1: Comparison table of sensor modalities

While IMUs and wheel odometry are not affected by visual elements within the working environment, using their data in conjunction with visual data means the AMR can operate better in any scenario it encounters. Let’s consider the challenge of navigating on a sloping floor surface.

Navigating on a Slope 

While maneuvering on a slope, traditional SLAM algorithms relying on 2D lidar encounter challenges, as the 2D point data carries no gradient information. Consequently, slopes can be misconstrued as walls or obstacles, inflating the costmap. As a result, conventional 2D SLAM approaches become ineffective on slopes. IMUs help solve this challenge by providing gradient information, allowing the AMR to negotiate the slope effectively.
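As a hypothetical sketch of the idea, the gradient can be recovered from the IMU's accelerometer: when the robot is quasi-static, gravity dominates the reading, so pitch follows from the axis components, and the planner can reinterpret the "wall" ahead as a traversable ramp. The function names and the climbable-angle threshold below are illustrative assumptions, not values from this post.

```python
import math

def pitch_from_accel(ax, ay, az):
    """Pitch angle (radians) from a gravity-dominated accelerometer
    reading, with x forward and z up."""
    return math.atan2(-ax, math.hypot(ay, az))

def classify_return(pitch_rad, max_climbable_deg=15.0):
    """Decide whether the surface ahead is flat, a climbable ramp,
    or a genuine obstacle (threshold is an illustrative assumption)."""
    deg = abs(math.degrees(pitch_rad))
    if deg < 1.0:
        return "flat floor"
    if deg <= max_climbable_deg:
        return "traversable slope"
    return "obstacle / wall"

# Robot tilted nose-up on a ~10 degree ramp (accelerations in m/s^2):
g = 9.81
pitch = pitch_from_accel(-g * math.sin(math.radians(10)), 0.0,
                         g * math.cos(math.radians(10)))
print(classify_return(pitch))  # traversable slope
```

The key point is that this decision is impossible from the 2D lidar data alone; only the IMU sees the tilt.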


How does the sensor data get combined?

In a typical ROS (Robot Operating System) setup, vision sensors along with IMU and wheel odometry data are combined through a process called sensor fusion. A widely used open-source ROS package is robot_localization1, which utilizes an Extended Kalman Filter (EKF) at its core. By fusing data from diverse sensors such as lidar, cameras, IMUs, and wheel encoders, the EKF produces a better estimate of the robot's state and its environment. Through recursive estimation, the EKF refines the robot's position, orientation, and velocity while simultaneously creating and updating a comprehensive map of the surroundings. This fusion of sensor data enables mobile robots to overcome individual sensor limitations and navigate complex terrains with greater precision and reliability. Techniques like the EKF derive collective insight from multiple sensor modalities, allowing mobile robots to effectively perceive and interact with their environment and navigate autonomously.
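To illustrate the recursive predict/update cycle at the heart of such a filter, here is a minimal one-axis Kalman filter sketch (the EKF reduces to this linear form for a linear model). It is not the robot_localization implementation; it simply fuses two assumed sensor modalities, a drifting wheel-odometry velocity and a noisy absolute position fix (as a lidar-based localizer might provide), and all noise values are illustrative.

```python
import numpy as np

dt = 0.1
F = np.array([[1.0, dt],        # constant-velocity motion model
              [0.0, 1.0]])
Q = np.diag([1e-4, 1e-3])       # process noise (model uncertainty)

H_odom  = np.array([[0.0, 1.0]])  # wheel odometry observes velocity
H_lidar = np.array([[1.0, 0.0]])  # lidar localization observes position
R_odom  = np.array([[0.05]])      # measurement noise covariances
R_lidar = np.array([[0.02]])

def predict(x, P):
    """Propagate the state estimate through the motion model."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, H, R):
    """Correct the estimate with one sensor measurement z."""
    y = z - H @ x                     # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    return x + K @ y, (np.eye(2) - K @ H) @ P

# Simulate a robot moving at 1 m/s, fusing both sensors each step.
rng = np.random.default_rng(0)
x, P, true_pos = np.array([0.0, 0.0]), np.eye(2), 0.0
for _ in range(100):
    true_pos += 1.0 * dt
    x, P = predict(x, P)
    x, P = update(x, P, np.array([1.0 + rng.normal(0, 0.2)]), H_odom, R_odom)
    x, P = update(x, P, np.array([true_pos + rng.normal(0, 0.1)]),
                  H_lidar, R_lidar)

print(x)  # estimate should land near position 10.0 m, velocity 1.0 m/s
```

Either sensor alone drifts or jitters; fused, the estimate tracks the true state far more tightly, which is exactly the benefit the EKF brings to AMR navigation.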

A future blog in this series will cover the Robot Operating System in more detail. For now, the takeaway is that sensor fusion offers increased reliability, improves data quality, and provides greater safety for objects and people within the environment, since AMRs aren't relying on a single means to navigate. To learn more, visit analog.com/mobile-robotics.

Reference / Resources:

1 https://docs.ros.org/en/melodic/api/robot_localization/html/index.html

ADTF3175 1 MegaPixel Time-of-Flight Module

ADTF3175BMLZ ANALOG DEVICES, Imaging Module, 1024 x 1024 Active Pixel, 3.5µm x 3.5µm | Farnell® Ireland

EVAL-ADTF3175 Time-of-Flight Evaluation Kit

EVAL-ADTF3175D-NXZ ANALOG DEVICES, Evaluation Board, ADTF3175, ADSD3100 | Farnell® Ireland
