Researchers at Rice University have developed EyeDAR, a compact, orange-sized radar sensor for safety in autonomous vehicle (AV) technology.
EyeDAR provides a second “set of eyes” that functions reliably where cameras and lidar often fail.
Mounted on streetlights, it uses radar to track traffic through the mist and communicates that vital information directly to the vehicles below.
This low-power millimeter-wave radar sensor gives self-driving cars a clearer, more accurate view of traffic by feeding them essential data from the surrounding road.
Instead of cramming more expensive computers into the cars themselves, the team led by postdoctoral researcher Kun Woo Cho decided to put the brains into the road.
“EyeDAR is an example of what I like to call ‘analog computing,’” said Cho.
“Over the past two decades, people have been focusing on the digital and software side of computation, and the analog, hardware side has been lagging behind. I want to explore this overlooked analog design space,” the researcher added.
Luneburg lens design
Autonomous vehicle safety has mostly focused on onboard tech, but true reliability may require upgrading the roads themselves.
Standard sensors like cameras and lidar often fail in heavy rain, thick fog, or low light, creating dangerous blind spots.
To solve this, researchers turned to infrastructure-based radar, which maintains high accuracy in all weather conditions and can even detect hazards hidden behind physical obstacles.
The secret to EyeDAR’s power is in a 3D-printed Luneburg lens.
Modeled after the human eye, the lens features over 8,000 tiny, uniquely shaped elements.
When a radar signal hits it, the physical structure of the resin naturally bends the waves toward a focal point. It’s “analog computing” — using the shape of an object to do the math that a digital processor usually struggles with.
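The article describes EyeDAR's 3D-printed lens with thousands of uniquely shaped resin elements; the details of that design are not public here, but the classic Luneburg lens it approximates is textbook physics: a sphere whose refractive index falls smoothly from the center to the rim, so any incoming plane wave is focused to a point on the opposite surface. A minimal sketch of that ideal gradient-index profile (the function name `luneburg_index` is my own, not from the researchers):

```python
import math

def luneburg_index(r, R=1.0):
    """Refractive index of an ideal Luneburg lens at radius r.

    Classic gradient-index law: n(r) = sqrt(2 - (r/R)^2).
    Any plane wave hitting the sphere is bent toward a focal
    point on the far surface, regardless of arrival angle --
    the lens shape itself "does the math."
    """
    if not 0 <= r <= R:
        raise ValueError("r must lie within the lens radius")
    return math.sqrt(2.0 - (r / R) ** 2)

# The index falls smoothly from sqrt(2) at the center to 1 at the rim.
print(round(luneburg_index(0.0), 3))  # → 1.414 (center)
print(round(luneburg_index(1.0), 3))  # → 1.0 (rim, matches free space)
```

Because the index matches free space at the rim, there is no preferred "front" to the lens, which is why such a design can focus radar energy arriving from any direction.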
Interestingly, EyeDAR sensors can be installed on streetlights and traffic signals. This would enable cities to create a safety net that catches lost radar signals that would normally bounce away from a vehicle.
These low-cost, strategically placed units can detect hidden hazards — such as a pedestrian obscured by a truck or a car approaching an intersection — and relay that data to autonomous vehicles in real time.
These units extend a car’s sensing range and ensure reliable detection even when onboard sensors are limited by distance or poor visibility.
200 times faster
Standard radar is often a one-way street. A car sends out a signal, and if it hits a cyclist, only a tiny fraction of that signal bounces back to the vehicle. Most of the data simply scatters into the void.
EyeDAR catches those lost reflections and puts them to work: it alternates between absorbing and reflecting radar waves to send data back to the car as a sequence of 0s and 1s.
Cho describes it as “blinking Morse code.” It is also the first “talking sensor” to integrate sensing and communication into a single low-power design.
“EyeDAR is a talking sensor; it is a first instance of integrating radar sensing and communication functionality in a single design,” she noted.
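The absorb-or-reflect scheme amounts to simple on-off keying: the car reads the presence or absence of an echo in each time slot as a 1 or a 0. A toy model of that idea (the function names and the 1-means-reflect convention are illustrative assumptions, not the researchers' actual protocol):

```python
def encode_message(bits):
    """Map data bits to radar-surface states (hypothetical model).

    1 -> REFLECT: bounce the incoming radar wave back to the car
    0 -> ABSORB:  soak up the wave so no echo returns
    """
    states = {1: "REFLECT", 0: "ABSORB"}
    return [states[b] for b in bits]

def decode_echoes(echoes):
    """Recover bits from whether an echo arrived in each slot."""
    return [1 if e else 0 for e in echoes]

msg = [1, 0, 1, 1]
states = encode_message(msg)               # surface schedule per time slot
echoes = [s == "REFLECT" for s in states]  # what the car actually observes
assert decode_echoes(echoes) == msg        # round-trips cleanly
```

The appeal of a scheme like this is that the sensor never generates its own radio signal; it only modulates energy the car has already transmitted, which is what keeps the design low-power.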
In testing, EyeDAR resolved target directions 200 times faster than existing digital radar.
The inexpensive, compact design could enhance urban safety at scale. In theory, cities could pepper the sensors across every stop sign and traffic light.
Beyond cars, this technology could empower drones, robots, and wearable devices to “see” through shared data networks.