Including Human Motion in the R&D of GNSS/Inertial Sensor Systems

By Rahul Gupta On September 13, 2012

Accelerometers, barometers, magnetometers and gyroscopes are all common components in the latest generation of smartphones—and the information such inertial sensors provide can be used to augment a device’s positioning capabilities, improving accuracy where GNSS is weak or denied.

Marrying the output of a device’s GNSS receiver with that of its MEMS inertial sensors, however, is far from simple. It requires the development of unique sensor fusion algorithms—a process that’s especially challenging in the case of handheld devices.
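To give a flavour of what such fusion involves, here is a minimal one-dimensional complementary filter that blends high-rate accelerometer dead-reckoning with low-rate GNSS position fixes. This is a generic textbook-style sketch, not Spirent's or any vendor's actual algorithm, and real handheld fusion (typically an extended Kalman filter over many sensor axes) is far more involved:

```python
# Illustrative 1-D complementary filter: inertial dead-reckoning
# corrected by GNSS fixes when they are available.
# A generic sketch for illustration only -- not a production algorithm.

def fuse(gnss_pos, accel, dt, alpha=0.98):
    """Blend dead-reckoned position with GNSS position fixes.

    gnss_pos: GNSS fixes (metres), one per step; None = GNSS denied.
    accel:    accelerometer readings (m/s^2), same length as gnss_pos.
    dt:       time step (seconds).
    alpha:    weight on the inertial prediction vs. the GNSS fix.
    """
    pos, vel = 0.0, 0.0
    fused = []
    for fix, a in zip(gnss_pos, accel):
        vel += a * dt                  # integrate acceleration to velocity
        pred = pos + vel * dt          # inertial position prediction
        if fix is None:
            pos = pred                 # GNSS denied: dead-reckon only
        else:
            pos = alpha * pred + (1 - alpha) * fix  # blend the two
        fused.append(pos)
    return fused

# With GNSS fully denied and a constant 1 m/s^2 acceleration, the filter
# falls back to pure inertial navigation:
track = fuse(gnss_pos=[None] * 10, accel=[1.0] * 10, dt=0.1)
```

The `alpha` parameter captures the core trade-off: trust the smooth, drift-prone inertial estimate in the short term, and let the noisy but drift-free GNSS fixes correct it over time.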

The Handheld Challenge

Traditionally, the R&D of sensor fusion algorithms has relied on extensive field-testing, with the device under test being placed on a moving platform, specially constructed to ensure consistency of trajectory between each test run. By its nature, such an approach is:

  • Time-consuming
  • Expensive
  • Inflexible

Moreover, when testing algorithms for use in handheld devices, these field tests allow little scope for incorporating the effects of common human motions and gestures—for example, of removing a smartphone from a pocket.

Given that a smartphone will often be pulled from a pair of jeans immediately before it’s used for a positioning-based application, this represents a significant testing weakness.

Solving the Challenge with Simulation

Spirent recently released SimSENSOR, a tool that accelerates and enhances the R&D of sensor fusion algorithms by enabling these crucial tests to take place within the controlled conditions of the R&D lab.
SimSENSOR extends the capabilities of our Multi-GNSS simulators, simulating the output of MEMS inertial sensors along the same fixed trajectory as the simulated GNSS signals.

In addition to eliminating the time, cost and consistency drawbacks of field trials, a sensor simulation approach gives R&D teams far greater scope to test an algorithm’s performance when a device is subjected to human motion—from being carried down a street on foot, to being removed from a pocket and raised to a user’s face.

To this end, SimSENSOR features a library of common human motion and gesture models that can be quickly and simply layered onto the device’s simulated linear trajectory.
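The idea of layering a gesture onto a base trajectory can be pictured as simple superposition: at each instant, the simulated sensor output is the base trajectory’s acceleration plus the gesture’s acceleration profile. The sketch below illustrates this with invented motion profiles and function names; it is not SimSENSOR’s actual model library or API:

```python
import math

# Hypothetical sketch of layering a gesture acceleration profile onto a
# linear walking trajectory. Profiles and names are invented for
# illustration -- they are not SimSENSOR's models or interface.

def walk_accel(t):
    """Base trajectory: steady walk with a ~2 Hz vertical step bounce (m/s^2)."""
    return [0.0, 0.0, 0.3 * math.sin(2 * math.pi * 2.0 * t)]

def pocket_lift_accel(t, start=1.0, duration=0.5, peak=4.0):
    """Gesture model: device lifted from a pocket, a half-sine vertical pulse."""
    if start <= t < start + duration:
        return [0.0, 0.0, peak * math.sin(math.pi * (t - start) / duration)]
    return [0.0, 0.0, 0.0]

def layered_accel(t):
    """Superpose the gesture onto the base trajectory, axis by axis."""
    return [b + g for b, g in zip(walk_accel(t), pocket_lift_accel(t))]

# Sample 3 seconds of the combined motion at 100 Hz:
samples = [layered_accel(i / 100.0) for i in range(300)]
```

Because the gesture is composed onto the trajectory rather than baked into it, the same motion model can be reused against any simulated route, which is what makes a library of such models practical.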

Testing that takes advantage of these models is:

  • More comprehensive
  • More rigorous
  • More representative of actual device use

More comprehensive, nuanced testing ultimately supports the development of higher-performance handheld devices, which in turn supports higher levels of customer satisfaction. And in a market increasingly driven by online reviews and recommendations, customer satisfaction plays a pivotal role in driving sales.

Find out more

We’re currently working to further enhance the capabilities of this innovative feature. If you’d like to know more about what we’ve got planned, don’t hesitate to get in touch.
