element14 Community
Tech Connection
Author: rscasny
Date Created: 1 May 2018 4:34 PM
Last Updated: 11 Oct 2021 2:33 PM
548 views | 11 likes | 3 comments

What is Sensor Fusion?


Sensor technology has evolved to the point where many single physical sensors are now available to measure temperature, pressure, position, location, humidity, moisture, chemical composition, smoke, gas, and many other environmental parameters, each able to provide data in analog or digital form. However, these single sensors have limitations, such as:

 

  • Sensory deprivation / fault tolerance: a sudden failure of a single sensor causes complete loss of perception of the targeted object.
  • Limited spatial coverage: a single sensor usually covers only a limited region. For example, if we read the temperature of a large container using a single thermometer, it reports the temperature near the thermometer and fails to measure the average temperature of the whole container.
  • Limited temporal coverage: some sensors need time to execute a measurement and transmit the value, and this delay may be detrimental to a real-time system.
  • Imprecision: measurements from an individual sensor are limited by the precision of that single sensing element.
  • Uncertainty: this depends more on the object being observed than on the observing device. A single sensor cannot measure all relevant parameters of a perception, so the observation may be ambiguous and uncertainty arises.
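The fault-tolerance and imprecision limitations above suggest a first, minimal form of fusion: combining several redundant sensors so that one faulty reading does not corrupt the result. A small sketch in Python (the readings, the outlier threshold, and the function name are illustrative, not from the article):

```python
import statistics

def fuse_redundant(readings, max_deviation=2.0):
    """Fuse redundant sensor readings by rejecting outliers and averaging.

    Readings further than max_deviation from the median are treated as
    faulty (sensory deprivation) and ignored; the rest are averaged,
    which also reduces the imprecision of any single sensing element.
    """
    med = statistics.median(readings)
    good = [r for r in readings if abs(r - med) <= max_deviation]
    return sum(good) / len(good)

# Four thermometers around a container; one has failed and reads low.
fused = fuse_redundant([21.2, 21.5, 2.0, 21.4])  # failed sensor rejected
```

Averaging the surviving readings also improves spatial coverage: each thermometer only sees its own corner of the container, but the fused value approximates the container-wide average.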

 

Sensor Fusion Technology

Sensor fusion technology is a solution to overcome the limitations of a single physical sensor. Here, the system acquires data from many different sensors and the overall data is analyzed using specific algorithms to provide the desired accurate and precise result.

 

We can define sensor fusion as a (typically software) process that combines data from multiple sensors to provide a more accurate picture of the object's environment than information from any single sensor alone. A smartphone is the most popular example of sensor fusion: to determine its orientation and heading, the smartphone processor acquires data from three different sensors, the accelerometer, gyroscope, and magnetometer, and combines them into a single accurate estimate.
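One classic, simple scheme for fusing the smartphone's gyroscope and accelerometer is the complementary filter. The sketch below is illustrative only; the blending factor alpha, the 100 Hz loop, and the sensor values are assumed, not taken from the article:

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    # Gyro integration is trusted at short time scales (weight alpha),
    # because the gyroscope is precise but drifts; the drift-free but
    # noisy accelerometer angle pulls the estimate back slowly.
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Simulate 1 s at 100 Hz: gyro reads 0 deg/s, accelerometer reads 10 deg.
# The fused estimate converges smoothly toward the accelerometer's angle.
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 accel_angle=10.0, dt=0.01)
```

Each sensor's weakness (gyro drift, accelerometer noise) is covered by the other's strength, which is exactly the point of fusing them rather than using either alone.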


Figure 1: Sensor Fusion Block Diagram

 

Sensor Fusion Example: The Human Body

The human body's method of analyzing its environment is analogous to sensor fusion technology. It detects the external environment by way of various senses, namely vision, hearing, touch, smell, and taste. The various sensors in our body collect corresponding data about our surroundings and pass it to the brain through the nervous system. The brain acts as a processor, computing on the data in real time and combining the sensory inputs so it can decide how to respond to changes in the environment.

 

Generally, the brain makes a decision on the basis of several sensory inputs, validating an event and compensating for a lack of information from any one sense. For example, if there is a fire in some corner of a building, we may or may not be able to see it from our location, but we can smell the smoke and sense the rising temperature, and our brain decides to leave the area.

 

Categorizations of Sensor Fusion

The sensors used as data sources in a fusion process are not generally identical. We can differentiate sensor fusion into direct fusion, indirect fusion, and the fusion of the outputs of the two. Direct fusion is the fusion of sensor data from a set of sensors of similar or different types. Indirect fusion is based on prior knowledge about the environment that is already available, and is not real-time.

 

Sensor fusion can also be categorized by implementation methodology, such as the level and extent of fusion, the types of inputs and outputs, and the sensor configuration.

 

Fusion processes are normally classified into three levels: low-, intermediate-, and high-level fusion:

 

  • Low-level or data-level fusion (also called the direct approach) is a process in which raw data from several sources are combined, providing more informative data that represents the physical environment without removing predefined features.
  • Intermediate or feature-level fusion is a feature-based approach, which compresses raw data into predefined features like edges, lines, and textures to represent the objects in the physical environment.
  • High-level or decision-level fusion combines several decision-level inputs and makes a final decision, using methods like fuzzy logic or statistical modelling.
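Decision-level fusion can be sketched very simply as a weighted vote over the yes/no outputs of several detectors. This is a minimal illustration; the fire-detection scenario echoes the earlier human-body example, and the confidence weights are hypothetical:

```python
def decision_fusion(decisions, weights):
    # Weighted vote over boolean decisions from several detectors:
    # return True when the "yes" weight exceeds half the total weight.
    score = sum(w for d, w in zip(decisions, weights) if d)
    return score > sum(weights) / 2

# Smoke sensor and temperature sensor say "fire"; the camera says "no
# fire" (perhaps the flames are out of view). Confidences are assumed.
alarm = decision_fusion([True, True, False], [0.6, 0.7, 0.5])  # True
```

Fuzzy logic or statistical modelling would replace the hard vote with graded memberships or probabilities, but the structure, several decisions in, one decision out, is the same.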

 

Another categorization, derived from the three-level model, is based on the abstraction level of the input and output data, and covers fusion patterns whose input and output belong to different levels. For example, pattern recognition and pattern processing operate between the feature and decision levels. Such cross-level fusion patterns are sometimes classified by the level of their input data and sometimes by the level of their output data. This yields five fusion categories:

 

  • Raw data output from raw data input
  • Feature output from raw data input
  • Feature output from feature input
  • Decision output from feature input
  • Decision output from decision input

 

On the basis of sensor configuration, sensor fusion falls into three categories: complementary, competitive, and cooperative.

 

In the complementary configuration, sensors do not directly depend on each other, but are combined to give a more complete picture of the phenomenon under observation. This rectifies the incompleteness of the sensor data.

In the competitive configuration, each sensor conveys independent measurements of the same parameter. There are two possible competitive arrangements: one in which the fused data comes from different sensors, and another in which measurements from a single sensor taken at different instants are fused.
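A standard way to fuse two competitive measurements of the same quantity is inverse-variance weighting, where the less noisy sensor gets proportionally more weight. A minimal sketch (the measurement values and variances are illustrative):

```python
def fuse_competitive(m1, var1, m2, var2):
    # Inverse-variance weighting of two independent measurements of the
    # same parameter. The fused variance is always smaller than either
    # input variance, so the combination is strictly more precise.
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * m1 + w2 * m2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Two thermometers read 20.0 and 22.0; the second is four times less
# noisy (variance 1.0 vs 4.0), so the fused value lands nearer 22.0.
value, var = fuse_competitive(20.0, 4.0, 22.0, 1.0)
```

This is the one-shot core of what the Kalman filter (discussed later) does repeatedly over time.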

 

In the cooperative configuration, information is taken from two independent sensors to derive information that cannot be obtained from either sensor alone. A simple example of the cooperative configuration is stereoscopic vision, where we use two cameras at different viewpoints, each producing a two-dimensional image, and combine their data to derive a three-dimensional view of the observed scene.

 

Sensor Fusion Models and Algorithms

As the fusion sensor configuration depends heavily on the application, there is as yet no broadly accepted model of sensor fusion, and it is questionable whether any universal technique would be uniformly superior. However, there are standard architectures, such as the JDL fusion model, the Waterfall fusion process, the Boyd loop, and the LAAS architecture, which can be adopted as the application demands.

 

Sensors generally provide data about the environment by taking measurements. Since these measurements can be noisy, we have to filter them and reconstruct the parameters of interest. Sensor fusion uses specific algorithms for smoothing, filtering, and prediction, such as the central limit theorem, the Kalman filter, Bayesian networks, Dempster-Shafer theory, and convolutional neural networks, to obtain an optimal result. Such algorithms are used in aircraft altitude detection, traffic situation analysis, and orienting systems in three-dimensional space.
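Of the algorithms named above, the Kalman filter is the workhorse for filtering noisy measurements. A minimal one-dimensional sketch, assuming a constant-state model and illustrative noise parameters (q, r) and readings:

```python
def kalman_1d(measurements, q=1e-3, r=0.5):
    """Minimal 1-D Kalman filter sketch for a roughly constant quantity.

    q = process noise, r = measurement noise (illustrative values).
    Each step predicts, then corrects the estimate toward the new
    measurement in proportion to the Kalman gain k.
    """
    x, p = measurements[0], 1.0          # initial estimate and covariance
    estimates = []
    for z in measurements:
        p += q                           # predict: uncertainty grows
        k = p / (p + r)                  # Kalman gain: trust in z vs x
        x += k * (z - x)                 # correct with measurement z
        p *= (1 - k)                     # covariance shrinks after update
        estimates.append(x)
    return estimates

# Noisy altimeter-style readings around a true value of 5.0.
est = kalman_1d([5.1, 4.9, 5.2, 5.0, 4.8])
```

As more measurements arrive, the gain k falls and the estimate stops chasing individual noisy readings; a real altitude or orientation filter would extend the state to include velocity or angular rate, but the predict/correct structure is the same.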

 

Driverless cars, which require accurate information about their surroundings to make driving decisions, are one of the most discussed applications of sensor fusion today. Various consumer and industrial applications, such as industrial robots, automotive traction control, smartphones, tablets, IoT devices, and fitness bands, also require sensor fusion capability.

 

Modern Trends

Silicon technology has evolved to the point where sensor fusion can be achieved by integrating and fabricating multiple sensors into a single MEMS device. This may be accompanied by a sensor hub: an on-board microcontroller that integrates and processes data from the different sensors, reducing the load and power consumption of the central processor.

 

There are sensor fusion chipsets available, such as Microchip's SSC7102-GQ-AA0 controller, whose sensor fusion firmware features self-contained 9-axis sensor fusion, sensor data pass-through, fast in-use background calibration of all sensors with a calibration monitor, magnetic immunity (enhanced magnetic distortion detection and suppression), gyroscope drift cancellation, and ambient light sensor support, all hosted on a 32-bit embedded controller.


SSC7102 Sensor Fusion Hub

 

The FEBFIS1100MEMSIMU6D3X development board from ON Semiconductor is another complete solution for 3D motion tracking, with an optimized 9D sensor fusion library. It incorporates the FIS1100 Inertial Measurement Unit (IMU) with AttitudeEngine motion co-processor and sensor fusion library, and includes a 3-axis gyroscope, a 3-axis accelerometer, and a 3-axis magnetometer.

 


FIS1100 6D Inertial Measurement Unit with Motion Co-Processor and Sensor Fusion Library

Tags: sensor hub, mems, iot_design, sensor, sensor fusion

Comments

snidhi, over 4 years ago (+1)

And to add to the topic, one also has to consider the overall increased processing time from all the sensors. So the decision-making matrix gets quite complex, which in turn may result in slower action and feedback from the sensor fusion box to the external world.

Cheers

oksbwn, over 4 years ago, in reply to snidhi

Ya, there is always a trade-off...

DAB, over 4 years ago (+4)

I was first involved with sensor fusion ideas about twenty years ago. Your first thought is that the more data the better, but then you realize that too much data just results in more confusion.

When combining data from sensors you have to be very careful to define what information you want and under what conditions.

Trust me, this is not as easy as it sounds. You must study the environment in which the sensors are placed, and you have to understand what is just data, what is information, and what is really useful.

DAB
An Avnet Company © 2023 Premier Farnell Limited. All Rights Reserved.