Byron Spice | Thursday, July 31, 2025

To find an apple amidst the leaves, listen to the fruit and it will tell you where it is.
That may sound like advice from a mystic, but researchers in Carnegie Mellon University's Robotics Institute have, in fact, invented a sensing technology called SonicBoom that can find apples or other objects based on the sound they make. The technology, still in the early stages of development, could someday be used by farm robots for tasks like pruning vines or locating ripe apples hidden among the leaves.
The device could be the answer to a manipulation problem that has long befuddled agricultural robotics researchers. Farm workers can simply thrust their hands through the leaves toward what looks like an apple and use their sense of touch to grasp the fruit. But robots depend solely on cameras to guide their arms and manipulators, said Moonyoung (Mark) Lee, a fifth-year Ph.D. student in robotics.
"One of the reasons manipulation in an agricultural setting is so difficult is because you have so much clutter — leaves hanging everywhere — and that blocks a lot of visual inputs," Lee said. In an orchard, "the fruit itself can be partially occluded and the path the arm must take to reach it can be very occluded."
Adding a touch or haptic sensor to a robot's camera "is a no-brainer," Lee said, but existing sensors have come up short. Tiny, camera-based tactile sensors, encased in protective gel, can quickly wear out or suffer damage when in frequent contact with vegetation. Pressure sensors, another current option, have to be applied to large areas of the robot arm, making the approach impractically expensive.
By contrast, SonicBoom relies on contact microphones, which pick up vibrations through direct contact with an object rather than through the air, as a conventional microphone does.
Contact microphones aren't top-of-mind for most robotics researchers, Lee said, but his adviser, RI Associate Professor Oliver Kroemer, used the devices to perform classification tasks, such as identifying the properties of materials.
For this latest application, the research team — which also included RI Ph.D. student Uksang Yoo and three additional faculty members — used an array of six contact microphones placed inside a piece of PVC pipe. When the pipe touches an object, such as a tree branch, the microphones detect the resulting vibrations. By analyzing the differences in amplitude and other aspects of the audio signals detected by each microphone, the researchers could triangulate where the contact took place. SonicBoom can localize contacts with a precision between 0.43 and 2.2 centimeters.
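The article describes the localization only at a high level, so the Python sketch below is purely illustrative rather than the team's actual algorithm: it estimates where a tap landed along a pipe by comparing the vibration energy recorded by each microphone. The microphone positions and signals are made up for the example.

```python
import numpy as np

# Hypothetical positions (in cm) of six contact microphones along the pipe.
MIC_POSITIONS_CM = np.array([5.0, 15.0, 25.0, 35.0, 45.0, 55.0])

def estimate_contact_position(signals: np.ndarray) -> float:
    """Estimate where a tap occurred along the pipe.

    signals: array of shape (6, n_samples), one audio channel per microphone.
    A microphone closer to the contact generally records a stronger vibration,
    so an energy-weighted average of the microphone positions gives a crude
    localization. Returns an estimated position in cm.
    """
    energies = np.sum(signals.astype(float) ** 2, axis=1)  # per-channel energy
    weights = energies / energies.sum()                    # normalize to weights
    return float(np.dot(weights, MIC_POSITIONS_CM))

# Synthetic example: a contact near the 20 cm mark produces stronger
# signals on the nearby microphones.
rng = np.random.default_rng(0)
contact_cm = 20.0
attenuation = np.exp(-np.abs(MIC_POSITIONS_CM - contact_cm) / 15.0)
signals = attenuation[:, None] * rng.normal(size=(6, 2048))
print(f"estimated contact: {estimate_contact_position(signals):.1f} cm")
```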
The PVC pipe protects the contact microphones from damage. (It also gives the appearance of a microphone boom, inspiring the SonicBoom moniker.) Ultimately, the microphone array could be installed inside the robot arm itself, rather than as a separate module.
The researchers used a data-driven machine learning module to learn to map the signals from the microphone array to contact locations. To do so, they collected audio data from 18,000 contacts between the sensor and a wooden rod.
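The article does not detail the learned mapping, so the following is only a minimal sketch of such a data-driven step, assuming a simple spectral feature pipeline and an off-the-shelf regressor from scikit-learn. The feature extraction, model choice, and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def spectral_features(signals: np.ndarray, n_bins: int = 32) -> np.ndarray:
    """Reduce a (6, n_samples) contact recording to a fixed-length feature
    vector: the log magnitude spectrum of each channel, pooled into n_bins."""
    feats = []
    for channel in signals:
        spectrum = np.abs(np.fft.rfft(channel))
        trimmed = spectrum[: n_bins * (len(spectrum) // n_bins)]
        pooled = trimmed.reshape(n_bins, -1).mean(axis=1)
        feats.append(np.log1p(pooled))
    return np.concatenate(feats)

def train_localizer(recordings, positions_cm):
    """Fit a small regressor that maps audio features to contact location.

    recordings: list of (6, n_samples) arrays from taps at known locations.
    positions_cm: the corresponding contact positions along the pipe.
    """
    X = np.stack([spectral_features(r) for r in recordings])
    y = np.asarray(positions_cm)
    model = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=500)
    model.fit(X, y)  # learn the mapping from audio features to contact location
    return model
```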
Different materials create different vibrations, but that proved not to be an issue for the localization task. The key was the difference in vibrations detected by each microphone, not what was causing the vibrations. SonicBoom worked almost as well when it was in contact with aluminum or plastic as when it contacted the wooden rod.
For this experiment, the researchers calibrated SonicBoom to determine the location of hard or rigid objects, but Lee said changing its configuration should enable it to also sense less rigid objects, such as soft fruits and vegetables. Lee has also led subsequent research exploring the array's ability to identify the object, not just its location.
Though SonicBoom was developed for agricultural use, Lee can imagine it in other applications, such as a safety device for robots working near people or for robots explicitly designed to interact with humans. It could also be used in dark places.
"Even without a camera, this sensing technology could determine the 3D shape of things just by touching," Lee said.
In addition to Lee, Kroemer and Yoo, the research team included RI faculty members Jean Oh, Jeffrey Ichnowski, and George Kantor.
A report explaining SonicBoom appeared in the July issue of IEEE Robotics and Automation Letters. The research was supported by the National Science Foundation and the U.S. Department of Agriculture. Visit the SonicBoom project page for more information.
Aaron Aupperlee | 412-268-9068 | aaupperlee@cmu.edu