About This Project

Like bats and dolphins, some blind humans are able to echolocate — using acoustic reflections to sense, interact with, and navigate the world. However, bats and dolphins echolocate ultrasonically, at frequencies that offer higher resolution but are inaudible to humans. Inspired by advances in computing technology, we are developing a simple, low-cost device that makes ultrasonic echolocation sounds audible and allows us to explore the most useful signal properties for human observers.

What is the context of this research?

Environments like deep caves and murky water are too dark and silent for typical vision and hearing. Animals in these environments often have the ability to echolocate: they make sounds and listen for the reflections. Like turning on a flashlight in a dark room, echolocation actively “illuminates” objects and spaces with sound.

Some blind humans also echolocate, using tongue clicks or cane taps. Human echolocation isn’t an exotic skill restricted to special individuals, but an innate mode of perception that can be developed with practice and augmented by technology. Bats and dolphins, for example, echolocate at much finer resolution using ultrasonic frequencies, and when slowed to be human-audible, ultrasonic echoes are more informative than unprocessed audible echoes, at least to non-expert echolocators. Thus, in this project, we explore the benefits of assisted ultrasonic echolocation: how can a simple device make ultrasonic echoes both audible and useful to a human listener?
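
To make the slowdown idea concrete, here is a minimal sketch of time-dilation resampling, the simplest way to shift ultrasound into the audible range. It is an illustration only, not the project’s actual signal chain; the file names, capture rate, and slowdown factor are all assumptions.

```python
import soundfile as sf  # third-party library: pip install soundfile

# Assumed parameters -- illustrative, not the device's real settings.
CAPTURE_RATE = 192_000  # Hz: sample rate of a hypothetical ultrasonic recording
SLOWDOWN = 8            # playback slowdown factor

# Load a hypothetical ultrasonic echo recording.
echo, rate = sf.read("ultrasonic_echo.wav")
assert rate == CAPTURE_RATE, "expected an ultrasonic-rate recording"

# Writing the same samples out at a lower sample rate stretches the
# signal in time and divides every frequency by SLOWDOWN: a 48 kHz
# echo component, inaudible to humans, plays back at 6 kHz.
sf.write("audible_echo.wav", echo, CAPTURE_RATE // SLOWDOWN)
```

Note the trade-off this makes: an 8x slowdown also makes each echo eight times longer, which is one reason different slowdown settings may suit different tasks.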

What is the significance of this project?

For the blind and low-vision (BLV) community, everyday activities like navigating a room or commuting to work present significant challenges, especially in unfamiliar or dynamic environments. People often rely on sound for these essential activities, as it can provide information about the distal environment (beyond arm’s reach).

Blind human echolocators have well-documented perceptual and navigational advantages over non-echolocators. At the same time, although sonar technology has been maturing for over a century, adoption of sonar-based perceptual aids remains limited. We aim to reconcile these facts by identifying the factors that limit or facilitate echolocation use; improving the evidence base for artificially and naturally mediated echolocation skills; and paving the way for a potentially transformative echolocation aid that may supplement, complement, or even substitute for human instruction in resource-constrained training contexts.

What are the goals of the project?

With the project funding we're raising here, we have three main goals:

1. Recruit up to 30 blind and sighted study participants. In controlled computer-based experiments, they will identify or distinguish the echoes of various objects and scenes at different slowdown factors, pitches, and other signal settings (see the sketch after this list). This extends the lab's previous studies and our ongoing experiments, telling us which settings, alone or in combination, work best for tasks such as identification and navigation.

2. Improve the hardware, e.g., by upgrading the onboard processor or 3D-printed housings. An improved Robin, our prototype device, produces cleaner signals and can be worn in more realistic experiments, making the results more practically interpretable.

3. Publish the results in an appropriate journal or conference proceedings, to benefit the public and wider research community.
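
As a concrete illustration of the settings mentioned in goal 1, the sketch below crosses a few signal factors into a condition grid. The factors and levels shown are hypothetical assumptions for illustration; the actual design is determined by the study, not by this page.

```python
from itertools import product

# Hypothetical factors and levels -- assumptions for illustration only.
slowdown_factors = [4, 8, 16]   # playback slowdown of ultrasonic echoes
pitch_shifts = [0, -12]         # additional pitch shift, in semitones
tasks = ["identification", "discrimination"]

# A fully crossed design: every combination of settings appears in each task.
for slowdown, pitch, task in product(slowdown_factors, pitch_shifts, tasks):
    print(f"task={task:<15} slowdown=x{slowdown:<3} pitch_shift={pitch} st")
```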

Budget

Our budget serves two immediate goals: to help us recruit research participants, and to develop and maintain the hardware. As with all our research, human participants are key to this project’s success. Part of the budget would help us recruit both typically sighted and blind/low-vision participants, who may have to travel longer distances to visit our laboratory. As we collect data and feedback from participants, we also wish to refine the hardware platform with enhancements such as an updated processor, ultrasonic transducers, or power source. Successful completion of this project would be crucial to planning future developments, ideally through larger grant proposals that would use the data collected here.

Endorsed by

Human echolocation affords a unique way of understanding how the brain makes sense of its sensory inputs. Although some human echolocators achieve proficiency unaided, ultrasonic wearables promise freedom of navigation to a much larger set of users. This project bridges human psychology with engineering approaches and, through the lab's excellent hands-on mentorship, has inspired numerous University of San Francisco undergraduates to pursue research careers.
This project is important since it promises to make echolocation, a fundamental yet underappreciated Orientation & Mobility skill, more accessible to a range of people with visual impairments. The wearable it develops will serve as a valuable learning tool to help people acquire echolocation skills that may ultimately enable some of them to perform echolocation without the aid of a device.
Having taught echolocation to thousands of blind people worldwide, I understand both its transformative power and the barriers to its broader adoption. Santani’s team is pioneering something remarkable: making the higher-resolution world of ultrasonic echolocation accessible to humans through elegant technological translation. This project bridges the gap between the theoretically possible and the practically achievable, and could accelerate echolocation learning or even provide its benefits to those unable to develop the skill naturally.

Project Timeline

We are currently conducting experiments that explore how well users of our device can distinguish everyday objects and scenes. The funded project would extend this work by allowing us to improve our hardware, and to expand our participant pool to include blind and low-vision participants. Over late summer and fall, we aim to collect a full preliminary data set, in time for conference and grant deadlines that cluster from December through February 2026.

Sep 26, 2025: Project launched

Oct 15, 2025: Begin recruiting new participants; data collection

Nov 01, 2025: Begin hardware design update

Jan 31, 2026: Disseminate interim results via conference submission

Feb 16, 2026: Extend project via additional grant proposal

Meet the Team

Santani Teng
Associate Scientist
Smith-Kettlewell Eye Research Institute

Ryan Tam
Lab Assistant
Smith-Kettlewell Eye Research Institute

Pushpita Bhattacharyya
Research Associate
Smith-Kettlewell Eye Research Institute

Team Bio

The Cognition, Action, and Neural Dynamics Laboratory at Smith-Kettlewell seeks to better understand how people perceive and move throughout the world, especially when vision is unavailable. We apply neuroscience, experimental psychology, engineering, and computational approaches to our work, currently focusing on echoacoustic perception and tactile (braille) reading in blind people.

Santani Teng

Hello! I’m an Associate Scientist at Smith-Kettlewell, where I investigate auditory spatial perception, haptics, echolocation, and assisted mobility in sighted and blind persons. Previously, I completed my Ph.D. at UC Berkeley and postdoctoral work at MIT, where I remain affiliated.

Ryan Tam

Hello! I'm a lab assistant with the Cognition, Action, and Neural Dynamics Lab at the Smith-Kettlewell Eye Research Institute. I enjoy learning about the neural mechanisms underlying cognitive and behavioral functions! I am a recent graduate of the University of San Francisco with a major in Psychology and a minor in Neuroscience. In the future, I aspire to pursue a PhD in cognitive neuroscience.

Pushpita Bhattacharyya

Hello! I’m a Research Associate at Smith-Kettlewell, where I study how the brain processes touch and sound in blindness. I earned my M.S. in Cognitive Neuroscience at the University of Delaware, where I worked with materials science engineers to develop tactile aids for blind people. I aspire to create inclusive tools for people with sensory disabilities through my research.

Additional Information

Check out the Teng Lab for Cognition, Action, and Neural Dynamics at the Smith-Kettlewell Eye Research Institute

Check out our other project campaign here: Brain dynamics in braille reading: Letters to language


Project Backers

  • 28 backers
  • 130% funded
  • $5,200 total donations
  • $185.71 average donation