Autonomous vehicles have eyes: cameras, lidar, radar. But what about ears? Researchers at the Fraunhofer Institute for Digital Media Technology's Oldenburg branch for Hearing, Speech and Audio Technology, in Germany, are building the Hearing Car. The idea is to outfit vehicles with external microphones and AI that can detect, localize, and classify environmental sounds, with the goal of helping cars react to hazards they cannot see. For now, that means approaching emergency vehicles; eventually it could include pedestrians, a punctured tire, or failing brakes.
“This is about giving the car another sense, so it can understand the acoustic world around it,” says Moritz Brandes, project manager for the Hearing Car.
In March 2025, Fraunhofer researchers drove a Hearing Car prototype 1,500 kilometers from Oldenburg to a proving ground in northern Sweden. Brandes says the journey tested the system against dirt, ice, slush, road salt, and cold temperatures.
How to build a car that hears
The team had some important questions to answer: What happens if the microphone housings get dirty or frost over? How does that affect localization and classification? Once the modules were cleaned and dried, the tests showed less degradation than expected. The team also confirmed that the microphone modules could withstand a carwash.
Each external microphone module (EMM) houses three microphones in a package about 15 centimeters long. Mounted at the rear of the car, where wind noise is lowest, the modules capture sound, digitize it, and convert it into spectrograms, which are fed to a region-based convolutional neural network (R-CNN) for audio event detection.
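To make that pipeline concrete, here is a minimal sketch of the spectrogram-to-classifier idea: audio is converted to a log spectrogram and passed to a small convolutional network. It is only an illustration of the general technique; the sample rate, window sizes, class list, and network layout are assumptions, not details of Fraunhofer's R-CNN.

```python
# Illustrative sketch only: audio clip -> log spectrogram -> small CNN classifier.
# Sample rate, class list, and network shape are assumptions, not Fraunhofer's design.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import spectrogram

SAMPLE_RATE = 16_000                          # assumed microphone sampling rate
CLASSES = ["siren", "horn", "background"]     # assumed label set

def to_spectrogram(audio: np.ndarray) -> torch.Tensor:
    """Convert a mono clip to a log-magnitude spectrogram tensor of shape (1, freq, time)."""
    _, _, sxx = spectrogram(audio, fs=SAMPLE_RATE, nperseg=512, noverlap=256)
    log_sxx = np.log10(sxx + 1e-10)
    return torch.tensor(log_sxx, dtype=torch.float32).unsqueeze(0)

class AudioEventNet(nn.Module):
    """Tiny CNN standing in for the audio-event detector described in the article."""
    def __init__(self, n_classes: int = len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

if __name__ == "__main__":
    clip = np.random.randn(SAMPLE_RATE)                     # one second of placeholder audio
    logits = AudioEventNet()(to_spectrogram(clip).unsqueeze(0))
    print(CLASSES[int(logits.argmax())])
```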
If the R-CNN classifies an audio signal as a siren, the result is cross-checked against the vehicle's cameras: are blue lights flashing? Fusing the two senses this way improves the vehicle's reliability by reducing false positives. Audio signals are localized through beamforming, although Fraunhofer declined to provide details on the technique.
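As an illustration of that cross-check, the sketch below applies a simple fusion rule: an acoustic "siren" classification is promoted to an alert only if the camera reports flashing blue lights or the acoustic confidence is very high. The thresholds and data structures are hypothetical; Fraunhofer has not published its fusion logic.

```python
# Hypothetical audio-plus-camera fusion rule; not Fraunhofer's actual logic.
from dataclasses import dataclass

@dataclass
class AudioDetection:
    label: str          # e.g. "siren"
    confidence: float   # classifier score in [0, 1]
    bearing_deg: float  # direction estimate from beamforming

def confirm_siren(audio: AudioDetection,
                  camera_sees_blue_lights: bool,
                  high_conf: float = 0.9,
                  low_conf: float = 0.6) -> bool:
    """Return True only when audio and vision together justify an alert."""
    if audio.label != "siren" or audio.confidence < low_conf:
        return False
    # Visual confirmation lets a moderately confident detection through;
    # without it, demand a much higher acoustic confidence.
    return camera_sees_blue_lights or audio.confidence >= high_conf

if __name__ == "__main__":
    det = AudioDetection(label="siren", confidence=0.72, bearing_deg=-35.0)
    print(confirm_siren(det, camera_sees_blue_lights=True))   # True
    print(confirm_siren(det, camera_sees_blue_lights=False))  # False
```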
All processing happens on board to reduce latency. It also “eliminates concerns about what will happen in an area with bad internet connectivity or [radio-frequency] noise,” says Brandes. The computational load, he says, can be handled by a modern Raspberry Pi.
According to Brandes, the initial benchmark for the Hearing Car system involves detecting sirens at 400 meters in low-speed scenarios. That figure, he says, shrinks to about 100 meters at highway speeds because of wind and road noise. An alert is triggered in about two seconds – enough time for drivers or autonomous systems to react.
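A back-of-the-envelope calculation shows what those ranges mean in practice: warning time is roughly the detection range divided by the closing speed, minus the two-second alert latency. The closing speeds below are illustrative assumptions, not figures from Fraunhofer.

```python
# Rough warning-time estimate from the detection ranges quoted above.
# Closing speeds are illustrative assumptions.
ALERT_LATENCY_S = 2.0

def warning_time(detection_range_m: float, closing_speed_kmh: float) -> float:
    """Seconds left to react once the alert fires."""
    closing_speed_ms = closing_speed_kmh / 3.6
    return detection_range_m / closing_speed_ms - ALERT_LATENCY_S

# Low-speed urban case: 400 m range, ~50 km/h closing speed.
print(f"urban:   {warning_time(400, 50):.1f} s")    # ~26.8 s
# Highway case: 100 m range, ~150 km/h closing speed (two vehicles approaching).
print(f"highway: {warning_time(100, 150):.1f} s")   # ~0.4 s
```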
This display doubles as a control panel and dashboard that lets the driver activate the vehicle’s “hearing.” Fraunhofer
The history of the hearing car
The roots of the Hearing Car go back more than a decade. “We have been working on the hearing car since 2014,” says Brandes. Early experiments were modest: detecting a nail in a tire by its rhythmic tapping on the pavement, or opening the trunk with a voice command.
A few years later, a Tier 1 supplier (a company that provides complete systems or major components, such as transmissions, braking systems, batteries, or advanced driver-assistance systems (ADAS), directly to automakers) pushed the work toward automotive-grade development. It was soon joined by a leading automaker; as EV adoption grew, vehicle manufacturers began to see that the ear matters as much as the eye.
“A human hears a siren and reacts – even before seeing where the sound is coming from. An autonomous vehicle has to do the same if it is to coexist safely with us.” –Eoin King, University of Galway Sound Lab
Brandes recalls a telling moment: sitting on a test track inside an electric vehicle that was so well insulated against road noise that he failed to hear an emergency siren until the vehicle was almost on top of him. “That was a big ‘aha!’ moment. It shows how important the hearing car will be as EV adoption grows,” he says.
Eoin King, a mechanical engineering professor at the University of Galway in Ireland, sees the jump from physics to AI as transformative.
“My team took a very physics-based approach,” he says, recalling his 2020 work in this research field at the University of Hartford in Connecticut. “We estimated the direction of arrival of a sound by triangulating the delays between microphones.”
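That physics-based method can be sketched in a few lines: for a far-field source, the sine of the arrival angle is the speed of sound times the inter-microphone delay, divided by the microphone spacing. The spacing and delay values below are illustrative, not taken from King's work.

```python
# Classic time-difference-of-arrival (TDOA) direction estimate for a two-microphone pair.
# Spacing and delay are illustrative values.
import math

SPEED_OF_SOUND = 343.0   # m/s in air at ~20 degrees C

def direction_of_arrival(delay_s: float, mic_spacing_m: float) -> float:
    """Angle (degrees) of a far-field source relative to the mic pair's broadside."""
    # For a plane wave, path difference = c * delay, and sin(theta) = path difference / spacing.
    ratio = SPEED_OF_SOUND * delay_s / mic_spacing_m
    ratio = max(-1.0, min(1.0, ratio))   # clamp against measurement noise
    return math.degrees(math.asin(ratio))

if __name__ == "__main__":
    # e.g. a 0.25 ms delay across microphones 15 cm apart
    print(f"{direction_of_arrival(0.25e-3, 0.15):.1f} degrees")  # ~34.9 degrees
```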
Physics still matters, King says: “It is almost like physics-informed AI. The traditional approach shows what is possible. Now, machine learning systems can generalize better across environments.”
Audio’s future in autonomous vehicles
Despite the progress, King, who leads the Galway Sound Lab’s research into sound, noise, and vibration, is cautious.
“In five years, I see it as a niche,” he says. “Technologies take time to become standard. Driver-assistance warning features were once niche – but now they are everywhere. Hearing technology will get there, but step by step.” Near-term deployments will probably appear in premium vehicles or autonomous fleets, with large-scale adoption coming later.
King does not mince words about why audio perception matters: autonomous vehicles have to coexist with humans. “A human hears a siren and reacts – even before seeing where the sound is coming from. An autonomous vehicle has to do the same if it is to coexist safely with us,” he says.
King’s vision is of vehicles with multisensory awareness – cameras and lidar for vision, microphones for hearing, perhaps even vibration sensors for road monitoring. “Smell,” he jokes, “may be a step too far.”
Fraunhofer’s Swedish road test suggested that durability is not a major obstacle. King points to another area of concern: false alarms.
“If you train a car to stop when it hears someone screaming for help, what happens when children do it as a prank?” he asks. “We have to test these systems thoroughly before putting them on the road. This is not consumer electronics, where, if ChatGPT gives you the wrong answer, you can simply ask the question again – people’s lives are at stake.”
Cost is less of an issue: microphones are cheap and rugged. The real challenge is ensuring that the algorithms can make sense of city sounds such as horns, garbage trucks, and construction noise.
Fraunhofer is now refining its algorithms with a broader dataset, including sirens from the United States, Germany, and Denmark. Meanwhile, King’s lab is improving sound detection in indoor contexts, work that could be adapted for cars.
Some scenarios – like a hearing car detecting a red-light runner’s engine before the car comes into view – remain hypothetical, but King emphasizes the principle: “With the right data, it is possible in theory. The challenge is getting the data and training for it.”
Both Brandes and King agree that no single sense is enough. Cameras, radar, lidar – and now microphones – work together. “Autonomous vehicles that rely only on vision are limited to the line of sight,” King says. “Adding hearing adds another degree of safety.”