Today's technology lets people interact with their devices at all times, but it also makes them gaze down and lose touch with their surroundings while doing so.
This project explored alternatives to the current way of providing guidance while walking by designing in the context, rather than designing for the context. This led to several iterations of prototypes that were tested with people in the context.
The findings are showcased in the form of a multimodal guidance system called UP, which provides reassurance at every step of the way without requiring people to look down at a screen.
UP - How it Works
Explanatory illustrations
Setting the guidance
A person choosing the destination and type of guidance, with the patch (haptic guidance device) shown on the table.
Borut Kerzic
UP App
Showcasing the UP app's main page.
Patch (haptic guidance apparel)
Showcasing the inputs and outputs on the patch: tap to initiate guidance and receive specific haptic animations in order to be guided in silence.
Borut Kerzic
UP guidance blueprint
Showcasing the user/system interaction points throughout a person's journey from the start to the desired destination.
Designing in context
Showcasing how the prototypes were tested and iterated throughout the project based on people's input.
Prototyping with sound
Writing scenarios and using Amazon AWS to prototype the narrated guidance system.
Haptic prototype built
Building a prototype t-shirt with integrated vibration motors, connected to an Arduino Mini and an IR receiver, in order to guide (and observe) people from a distance through haptic interactions.
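As an illustration of how the t-shirt's firmware could work, here is a minimal Arduino sketch. It is a sketch under stated assumptions, not the project's actual implementation: it assumes the classic IRremote library (2.x API), an IR receiver on pin 11, and four vibration motors on PWM pins driven through transistors; the pin assignments, remote command codes, and pulse patterns are all hypothetical stand-ins.

```cpp
// Hypothetical firmware sketch for an IR-triggered haptic t-shirt.
// Assumes the classic IRremote library (2.x API); pins and codes are examples.
#include <Arduino.h>
#include <IRremote.h>

const int RECV_PIN = 11;                  // IR receiver data pin (assumed)
const int MOTOR_PINS[4] = {3, 5, 6, 9};   // vibration motors: left, right, front, back

IRrecv irrecv(RECV_PIN);
decode_results results;

// Play a simple "haptic animation": pulse one motor a given number of times.
void pulseMotor(int pin, int repeats) {
  for (int i = 0; i < repeats; i++) {
    analogWrite(pin, 200);                // motor on at ~78% duty cycle
    delay(300);
    analogWrite(pin, 0);                  // motor off
    delay(200);
  }
}

void setup() {
  irrecv.enableIRIn();                    // start listening for IR commands
  for (int i = 0; i < 4; i++) pinMode(MOTOR_PINS[i], OUTPUT);
}

void loop() {
  if (irrecv.decode(&results)) {
    switch (results.value) {              // hypothetical remote-control codes
      case 0xFFA25D: pulseMotor(MOTOR_PINS[0], 2); break;  // turn left
      case 0xFF629D: pulseMotor(MOTOR_PINS[1], 2); break;  // turn right
      case 0xFFE21D: pulseMotor(MOTOR_PINS[2], 1); break;  // continue straight
      case 0xFFC23F: pulseMotor(MOTOR_PINS[3], 3); break;  // stop / arrived
    }
    irrecv.resume();                      // ready for the next command
  }
}
```

A setup along these lines would let an observer walk behind a participant and send directional cues from an ordinary IR remote, which is what makes Wizard-of-Oz style guidance testing possible at a distance.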
UP guidance onboarding
Presenting the guidance to participants via onboarding animations made in After Effects.
Guiding with haptics
Guiding participants in an outdoor setting and observing their reactions.
Guiding with narration & haptics
Guiding participants in an outdoor setting with combined narration and haptics, and observing their reactions.
Nicole Waniowska - filming the testing
Project final concept explanatory video
Showcasing the interaction highlights in GIF format with subtitles.
This project was my interaction design master's thesis at Umea Institute of Design. The aim of the project was to explore beyond current interaction types and instead focus on the context in which devices are used and how we might design for it. I was heavily intrigued by the way people interact with their devices while in motion and how it affects their daily lives. To narrow down the scope of the project, I placed it in the context of guidance in an urban environment and looked into "how might we guide people while maintaining their ability to be aware of their surroundings".
I approached the project with a human-centred design lens, which led me to more than 18 in-depth interviews with people (visually impaired and blind people, millennials, teenagers, Gen Z, mobility experts, etc.). This approach helped me understand that the biggest benefit technology brings to people when used on the go is guidance.
Immersive experiences, such as being guided blindfolded through Tate Modern, being guided by a guide dog, and testing existing digital navigation systems, helped me gain first-hand experience and led to several ideas that were later designed and tested in context with people. To conduct these tests, I used several prototyping techniques; each idea was verified, discarded, or built on top of by testing with people (Wizard-of-Oz style) in the context of being guided while walking.
The project findings were showcased in the form of a multimodal guidance system (UP) that provides reassurance at every step of the way without requiring people to look down at a screen. UP guides people with visual cues or haptic feedback while they are in motion, and also lets them recall the guiding information with a simple tap gesture. The information is provided in a way that eliminates the need to stop and interact to gain route information, giving people the ability to focus on their surroundings while walking.