Working with local youths involved in a hip-hop education program in Pittsburgh, we explored how to raise awareness of their uplifting program and the music they create, both within the city and in neighboring communities. We created 412 Beats, an augmented reality mobile platform that helps local musicians be discovered by "sonifying" the city through an interactive, serendipitous music experience. It is also designed to bridge the gap between Pittsburgh's segregated communities and amplify the voice of underserved communities.
Our service, 412 Beats, allows musicians to link audio content to images that can then be placed in urban spaces. The system leverages computer vision and AR to naturally augment tagged images, allowing any user with a mobile camera and an internet connection to retrieve the original audio. With this project, we ultimately aim to showcase how AR, mobile phones, and new interaction techniques can empower communities to place self-generated digital content into physical space.
On the website, musicians can upload their songs and attach metadata tags such as genre, location, and mood. Each uploaded song is assigned a unique visual marker, which is placed as a pattern in the physical installation. To make the website a collaborative system, musicians can choose the site locations where they want their songs to be placed, and can then edit their pattern's placement on the installation.
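As a rough illustration of this upload flow, the sketch below pairs each uploaded song with a freshly generated marker ID. It is a minimal sketch in plain JavaScript; the function names, record fields, and example data are hypothetical stand-ins, not our production code.

```js
// Minimal sketch of the upload flow: each song record gets a unique
// marker ID that the web system later renders as a printable pattern.
// All names here (uploadSong, markerId, the fields) are illustrative.
const crypto = require('crypto');

const songs = new Map(); // markerId -> song metadata

function uploadSong({ title, artist, genre, mood, location, audioUrl }) {
  // A short random ID stands in for whatever scheme generates
  // the visual marker in the real system.
  const markerId = crypto.randomBytes(4).toString('hex');
  songs.set(markerId, { title, artist, genre, mood, location, audioUrl });
  return markerId; // used to render the rhomboid pattern
}

// Example: a musician uploads a track and receives a marker ID.
const id = uploadSong({
  title: 'Steel City Flow',
  artist: 'Arts Greenhouse student',
  genre: 'hip-hop',
  mood: 'uplifting',
  location: 'Oakland',
  audioUrl: 'https://example.com/audio/steel-city-flow.mp3',
});
console.log('Assigned marker:', id);
```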
We envisioned combining multiple patterns/images (each linked to a unique soundtrack) into a physical installation that passersby might encounter in public spaces. Aiming to keep the installation simple yet attractive, we ideated with basic shapes for scanning. We decided on rhomboid-shaped images as visual markers, as they would be easy to scan and could be produced in a home setting – the idea being that ultimately musician users themselves might create personalized designs on the web system and physically set up the installation themselves.
The mobile AR application runs in a web browser, allowing passersby to scan images within the physical installation to retrieve and experience music. After accessing the website, a camera-based interface guides users to position their phones over a visual marker. Once the scan is complete and the visual marker is recognized, the song associated with that marker begins playing on the user's phone. While a QR code would have redirected the user to a standard web page, AR lets us provide a more continuous experience: the soundtrack plays for as long as the user's camera correctly recognizes the corresponding marker, and simply moving the camera to another marker automatically plays a different song. The app also displays information about the artist and track, in case a user wants to learn more about the music being played.
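This continuous behavior can be sketched with two callbacks that stand in for the AR framework's recognition events. `onMarkerFound`, `onMarkerLost`, the `/api/songs` endpoint, and `showTrackInfo` are all hypothetical names; only the `Audio` element and `fetch` are standard browser APIs.

```js
// Playback follows whichever marker the camera currently sees.
// onMarkerFound/onMarkerLost are hypothetical stand-ins for the AR
// framework's recognition events; Audio and fetch are standard APIs.
const player = new Audio();
let currentMarkerId = null;

function onMarkerFound(markerId) {
  if (markerId === currentMarkerId) return; // already playing this track
  currentMarkerId = markerId;
  // Look up the song tied to this marker (endpoint is illustrative).
  fetch(`/api/songs/${markerId}`)
    .then((res) => res.json())
    .then((song) => {
      player.src = song.audioUrl;
      player.play();
      showTrackInfo(song);
    });
}

function onMarkerLost(markerId) {
  // Pause when the marker leaves the frame, so the soundtrack only
  // plays while the camera can still recognize the pattern.
  if (markerId === currentMarkerId) {
    player.pause();
    currentMarkerId = null;
  }
}

function showTrackInfo(song) {
  // Placeholder for the artist/track info panel described above.
  document.title = `${song.artist} – ${song.title}`;
}
```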
We were given a very open brief for this project: to design for social impact in the city of Pittsburgh. With such an open brief, we explored many ways in which we as designers could create change in the community. Our explorations led us to interviews with urban designers, social activists, public educators, and environmental activists. We also attended a hackable city workshop, where we learned about the many issues that Pittsburgh and its communities deal with. From these explorations, we created a framework on which to focus our design project:
- Bridging Connections Between Communities
- Creating Playful & Intuitive Interactions
- Truly Engaging Community Members
One of our explorations was with Arts Greenhouse, a hip-hop music education program in Pittsburgh that serves local teens through music technology classes, music recording projects, hip-hop performances, and workshops on special topics relating to hip-hop. We spent time in recording sessions with many of the students, had formal and informal conversations with them, and interviewed several of the program directors, coming to understand the core essence of the space: to empower kids from underserved communities through music, music education, and performance.
Based on our initial explorations, we brainstormed a customer journey map. We decided to create a service involving two user groups: musicians and audiences. From the musicians' point of view, our service provides a music uploading platform where unique identification markers are generated; musicians can also select and customize the installations where their songs are exhibited. For audiences, we wanted to create an intuitive and serendipitous music exploration experience that they could access easily from a smartphone. Data collected from these interactions is shared with musicians so they can track where their music is most often heard, shared, and saved, helping them know their audience better. We drafted an initial service map that traced a seamless digital and physical service design.
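To make that listening data concrete, the sketch below records one event per interaction. The `recordScan` name, the event fields, and the `/api/scan-events` endpoint are assumptions for illustration, not a spec of our system.

```js
// Illustrative scan-event logger: each time an audience member's
// camera recognizes a marker, we record where and when the track was
// heard so musicians can see where their music resonates.
// The recordScan name, event fields, and endpoint are all assumptions.
function recordScan(markerId, installationId, action /* 'heard' | 'shared' | 'saved' */) {
  const event = {
    markerId,
    installationId,
    action,
    timestamp: new Date().toISOString(),
  };
  // Fire-and-forget POST so logging never blocks playback.
  fetch('/api/scan-events', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(event),
  }).catch(() => { /* analytics should fail silently */ });
}
```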
We used a rapid prototyping method for this project, ideating and building quick models for testing. We body-stormed many creative solutions in quick succession and tested them with both user groups. Using serendipitous encounters as the take-off point, we explored how we might expose the students' music in the real world, using physical form. We focused on creating forms that would guide the audience intuitively towards further interactions. We explored many forms of technology and products, from headphones to augmented reality maps to screen-based interactions. However, because this was designed for a not-for-profit, we took cost into consideration and wanted to create an experience that Arts Greenhouse could set up on their own, something simple and versatile.
Based on the feedback and insights we gained from our rapid prototyping sessions, we went on to finalize our service map. We created two high-fidelity working prototypes of the mobile application, one in Vuforia and the other in InVision. We ran UI/UX testing, based on which we made our final application UI. For the physical installation, we built two installations that we placed around campus for live testing by both users and the musicians.
We carried out user testing with both musicians and audience members. The overall response was very positive. We gained a few insights regarding artist onboarding and installation execution:
- The musicians wanted a bit more flexibility in image creation; they would like to upload their own album art or custom artwork as visual patterns.
- They wanted to collaborate with more local services as installation venues.
Many users understood the functionality of the system simply by interacting with the prototype. Some users found the rhomboid frame more natural: for one group, the frame suggested that the phone could be moved and oriented freely using only one hand, whereas for another group the frame was confusing because they expected a square form, being more familiar with the experience of scanning a QR code.
In order to test the design and receive feedback from our users, we implemented a technical prototype of our design using Argon.js, a JavaScript framework for adding augmented reality content to web applications, developed at Georgia Tech. This was a limitation of our prototype, as one currently has to download a native Argon application that functions as a browser; however, we believe that in the near future AR will become a feature of standard mobile browsers (Chrome, Safari, IE). We purposefully used HTML, CSS, and JavaScript – the basic building blocks of the web – so audience users would not have to download an application before experiencing the installation, but could simply navigate to a website in their mobile browser. The actual image recognition is implemented using Vuforia, a computer vision SDK for target recognition.
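For context, a stripped-down version of the page wiring might look like the sketch below. `Argon.init()` is the framework's entry point; the script path and the `setupMarkerTracking` glue (standing in for the Vuforia image-target hookup) are placeholders rather than our actual prototype code.

```html
<!-- Minimal page skeleton for the browser-based prototype. The
     script path is illustrative, and setupMarkerTracking stands in
     for the Vuforia image-target hookup in the real prototype. -->
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8" />
    <title>412 Beats</title>
    <script src="argon.min.js"></script>
  </head>
  <body>
    <div id="track-info"><!-- artist and track details render here --></div>
    <script>
      // Argon.init() bootstraps the AR session inside the Argon browser.
      var app = Argon.init();
      // Hypothetical glue: load the image targets and connect the
      // marker found/lost events to the playback logic sketched above.
      setupMarkerTracking(app);
    </script>
  </body>
</html>
```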
The flexibility of our service and the ease of implementing the installation make us believe that we can expand to many cities. The service could be used by many more music education programs to spread awareness of their organizations and local artists. A future outlook would be to create a DIY toolkit for local musicians and organizations.