Hackathon • Spatial audio

ARound

Project Overview

Using spatial sounds and augmented reality to help visually-impaired people navigate their surroundings.

Challenges

Discovering new ways of environment perception by combining spatial sounds and location-based data.

Outcomes

We were able to create a working solution that ended up winning in 3 categories:

• Best AR/VR for Good
• Best Everyday Mobile Hack
• Leadership Award

*This project was created back in 2017 during a 2-day hackathon.
ARound team posing together
ARound interface element - mobile

What is it about?

ARound brings a new perspective to the possibilities of Augmented Reality. While most of AR today is centered on augmenting one's visual system, we think there's immense unrealized potential in augmenting one's sense of hearing. While current-generation AR systems can only augment visuals within a narrow field of view, spatial audio enhances the user's perception across all 360 degrees.

We believe that there are huge opportunities to create platforms based entirely on sound. And it seems like our beliefs are coming to fruition with the releases of mainstream products such as Ray-Ban Stories, Bose Frames, and spatial audio on AirPods Pro.

Team formation

Upon arriving at MIT Media Lab, I met with several candidates who reached out and were interested in taking on this challenge. One of them was Sunish Gupta, a visually impaired accessibility expert. That's when all the pieces fell into place, and we could actually make an impact. We ended up with a diverse team with various backgrounds and experiences:

Dhara Bhavsar - back-end engineer
Sunish Gupta - accessibility expert
Vedant Saran - Unity developer
Anandana Kapur - project manager
Maxym Nesmashny - product designer

A panorama of the room where our team worked during the hackathon

The room we all worked in during the MIT Hack

Meeting Sunish - visually impaired team member

One of the most crucial moments in driving the use case of our sound navigation idea was meeting Sunish - a visually impaired hackathon participant who became an important part of our team and a good friend. He told us all about what it feels like to be visually impaired and the everyday troubles it brings, especially when navigating. We pitched the idea of anchoring 3D sounds to a map; it instantly resonated with him, and he joined the team.

We were building a potentially revolutionary product and had a person who would directly benefit from it right on our team. That gave us a big boost of motivation to press on and turn the idea into reality.

ARound team posing together during hackathon
Around interface element - desktop
Web user interface
Around interface element - mobile
Mobile app prototype

Hacking our way to product

Based on our findings, we identified three main features that seemed feasible to build during the hackathon (a rough sketch of the Discovery Mode idea follows the list):

Discovery Mode, which would use the combination of GPS data and spatial sounds to create an interactive soundscape around the user to navigate in.

Breadcrumbs Mode, to record the path you are taking in order to return using the same path.

Share the Meeting Spot, which would allow a third party, usually a friend or family member of the visually impaired user, to put a custom sound pin on the map. Upon arriving in the area, the visually impaired person would be guided directly to the meeting point, again with the help of spatial sound.
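
At its core, Discovery Mode boils down to anchoring a sound to a GPS coordinate and rendering it spatially relative to the user's position and heading. Our prototype was built in Unity, but here is a minimal sketch of the same idea in Swift using AVAudioEngine; the SoundPin and SpatialSoundscape names and the flat-earth offset math are illustrative assumptions, not code from the project.

```swift
import AVFoundation
import CoreLocation

// Hypothetical types sketching the Discovery Mode idea: a sound pinned to a
// GPS coordinate, rendered as spatial audio relative to the listener.
struct SoundPin {
    let coordinate: CLLocationCoordinate2D
    let audioFile: AVAudioFile   // should be mono for 3D spatialisation
}

final class SpatialSoundscape {
    private let engine = AVAudioEngine()
    private let environment = AVAudioEnvironmentNode()
    private var players: [AVAudioPlayerNode] = []

    init() throws {
        engine.attach(environment)
        engine.connect(environment, to: engine.mainMixerNode,
                       format: engine.mainMixerNode.outputFormat(forBus: 0))
        // The listener sits at the origin; pins are positioned relative to it.
        // A real app would also update listenerAngularOrientation from the
        // compass heading; here we assume the listener faces north.
        environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)
        try engine.start()
    }

    /// Convert a GPS offset into a local (east, north) offset in metres
    /// using a simple flat-earth approximation.
    private func localOffset(from user: CLLocationCoordinate2D,
                             to pin: CLLocationCoordinate2D) -> (east: Float, north: Float) {
        let metresPerDegreeLat = 111_320.0
        let metresPerDegreeLon = 111_320.0 * cos(user.latitude * .pi / 180)
        let east = (pin.longitude - user.longitude) * metresPerDegreeLon
        let north = (pin.latitude - user.latitude) * metresPerDegreeLat
        return (Float(east), Float(north))
    }

    /// Place a pin's sound in 3D space around the user and start playing it.
    func play(_ pin: SoundPin, userLocation: CLLocationCoordinate2D) {
        let player = AVAudioPlayerNode()
        engine.attach(player)
        engine.connect(player, to: environment, format: pin.audioFile.processingFormat)

        let offset = localOffset(from: userLocation, to: pin.coordinate)
        // AVAudio3DPoint is right-handed: +x is right, -z is forward (north here).
        player.position = AVAudio3DPoint(x: offset.east, y: 0, z: -offset.north)
        player.renderingAlgorithm = .HRTFHQ   // head-related transfer function

        player.scheduleFile(pin.audioFile, at: nil, completionHandler: nil)
        player.play()
        players.append(player)
    }
}
```

As the user walks, the app would periodically recompute each pin's offset from a fresh location fix, so the soundscape stays anchored to the map rather than to the user.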

Results

We are thrilled to have built a working prototype. We were able to implement a directional spatial audio soundscape as well as a breadcrumbs feature set, and we are proud that the idea was user-validated by our visually impaired team member. Our solution ended up winning in 3 categories: Best AR/VR for Good, Best Everyday Mobile Hack, and the Leadership Award.

ARound team posing with prizes - oversized checks for $2,000 and $1,000

Have a similar project in mind?

say [email protected]