As part of the Sonova Global Media Day, we were taken on a tour of the new Real Life Lab, which will undertake research for Sonova brands like Phonak. I think by now you all know we are nerds here at Hearing Aid Know, but you should actually be excited about this as well. I think the Real Life Lab is going to provide key new insights that will lead to ever greater innovation in hearing aids moving forward. Let me explain why.
Real World Data
The lab is set up to completely replicate real-world environments. More than that, for the first time, it is set up so that test subjects can move around the lab while being continuously tracked. That means the data and results gathered are, for the first time, as close to real life as they can get while still allowing researchers to monitor and control all of the variables. Sonova says:
The Real Life Lab allows Sonova to let hearing impaired persons experience audiological testing in a completely new way. It enables any test person to move around freely in controlled acoustic settings. They can move and interact with acoustic sources as they would do in everyday listening situations. The complex setup allows Sonova’s researchers and audiologists to test hearing aids and hearing aid algorithms in close to realistic environments. Furthermore, it is possible to investigate the behavior of people in social interactions with and without hearing aids. On the screens surrounding the room 360° pictures or videos can be presented to increase the plausibility of the used audiological scenes. Standard setup can be used for specific sound sources or for diffuse background scene. The raised floor construction even allows to cater for sounds from the floor, for example steps, dropping objects or reflections of sound waves – to name a few.
That means they can emulate pretty much any real-life situation, including the reverberation and other acoustic properties of any environment. You want to know how best to handle sound in a noisy cafeteria? No problem. An outdoor event with a thousand people? They've got your back. At home with five people at the dinner table and music in the background? No worries! That is an amazing boon to research, but there's more, oh my god, much more, and this is what I am really excited about.
Motion Tracking & The Difference it Makes
In the blurb, they say that they can investigate the behaviour of people in social interactions. That dry sentence doesn't convey the true meaning in any way, shape or form. Damn, this is one of the most fascinating aspects. It should read something truly sexy like:
For the first time ever, our testing lab will give us a clear understanding of your physical activity and listening intent, so we can design hearing aids that assist it.
The lab set-up includes limited motion tracking of the subjects, including head orientation, which adds a further layer of data that allows researchers to understand how people interact and react in different sound situations. It will let them see exactly how you physically behave, your body position and head position, when trying to focus on a sound source or speaker. For the first time ever, we may be able to gather data on listening intent.
Listening intent is manna from heaven for hearing aid brands, and the so-called mind-controlled hearing aids we are hearing about now are trying to use listening intent to steer focus. The trouble is, we don't really have hard data on listening intent and how we physically act on it. For instance, the hearing aid brands believe that when you are in a noisy environment, your intent more often than not is to get the clearest signal from whoever is in front of you. Makes sense, right? That's why directional microphones work the way they do.
The thing is, though, when we are interacting with someone, we don't look straight at them; our head is often turned about 30 degrees away while our eyes look more directly at them. In fact, if you try to look at someone directly face on, it actually feels a little weird. That means that head direction is not necessarily a precise indicator of listening intent.
If they can gather enough data on different sound situations and how test subjects act when focusing on a sound source, it will mean that they can begin to understand listening intent and the physical cues that go along with it. That would be a big breakthrough, let me tell you why.
Big Data and Listening Intent
So they have all this big data on listening intent; they feed it into the number cruncher and use machine learning to spit out an intelligent algorithm to utilise it. Add a set of accelerometers (sensors that track positioning, in this case of the head) to the hearing aids and voila: for the first time ever, we have hearing aids that are intelligent enough to at least half understand what you actually want to focus on, and that steer all of their features to make it happen. How cool would that be? Damned cool!
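To make that accelerometer-plus-algorithm idea a bit more concrete, here is a toy Python sketch of how head-motion readings might be turned into a beam-steering decision. To be clear, this is purely my own illustration, not anything Sonova has published: the function names, the 30-degree conversational offset and the thresholds are all my assumptions, and a real algorithm learned from the lab's data would be far more sophisticated.

```python
# Illustrative sketch only: inferring a listening-intent direction from
# head-yaw data. Names, offsets and thresholds are hypothetical, not
# taken from any Sonova algorithm.

from statistics import mean

def estimate_focus_direction(yaw_samples, offset_deg=30.0):
    """Estimate where the wearer is trying to listen.

    yaw_samples: recent head-yaw readings in degrees (0 = straight ahead),
                 as might come from head-worn accelerometers/gyros.
    offset_deg:  the assumed conversational offset -- as noted above,
                 people tend to hold their head ~30 degrees away from
                 the person they are actually attending to.
    """
    smoothed = mean(yaw_samples)  # crude low-pass filter over recent motion
    # Correct a sustained head turn back toward the likely attention target.
    if smoothed > offset_deg / 2:
        return smoothed - offset_deg
    if smoothed < -offset_deg / 2:
        return smoothed + offset_deg
    return smoothed

def steer_beam(yaw_samples):
    """Map the estimated focus direction to a microphone beam setting."""
    direction = estimate_focus_direction(yaw_samples)
    if abs(direction) < 15:
        return "front"
    return "left" if direction < 0 else "right"
```

For example, a head held steadily about 30 degrees off-centre would, under these made-up assumptions, still steer the beam to the front, on the theory that the wearer is in a face-to-face conversation rather than listening off to one side.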
So you can probably see why I am basking in nerdy excitement. Like I said, if you wear hearing aids, you should be too, because they are probably going to be better than ever within a few years. Like us on Facebook by clicking the button below. I ain't joking: Steve's wife gives us a quota, and if you lot don't meet it, it's bed without tea for us.