Group #1: Jeremy, Kasia, Vanessa
We chose option #1: to critique and improve the interactions provided by an existing product.
Product: Elevators
User/Context: Blind people using elevators
Interaction:
The only affordance provided for a blind person in an elevator is the braille below each floor-selection button. But if this person is riding alone, how do they know they are getting off on the correct floor? And if they can't reach the braille buttons because the elevator is crowded, how do they know which floor they are going to?
Solution:
We brainstormed a smartphone app that interfaces with the elevator and announces, via audio, which floor the elevator is on. This could manifest itself as a small networked device hooked into the elevator that reads its state and relays that information to the app, and from there to the person. This ensures that a blind rider is accurately informed of the elevator's current floor and allows them a greater degree of autonomy.
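To make the idea a bit more concrete, here is a rough sketch in Python of how the elevator-side device and the app might talk to each other. It assumes the device can read the current floor and that the phone is on the same local network; the port number and the announce() function are illustrative placeholders we made up for this sketch, not a real elevator or phone API.

```python
# Rough sketch: the elevator-side device broadcasts the current floor over
# the local network, and the smartphone app listens and announces changes.
# FLOOR_PORT and announce() are hypothetical names, not any real API.

import json
import socket

FLOOR_PORT = 50505  # assumed UDP port for floor broadcasts


def announce(text: str) -> None:
    # Placeholder: a real app would hand this string to the phone's
    # text-to-speech / screen-reader API; here we just print it.
    print(text)


def broadcast_floor(floor: int) -> None:
    """Run on the elevator-side device whenever the floor changes."""
    message = json.dumps({"floor": floor}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(message, ("255.255.255.255", FLOOR_PORT))


def listen_and_announce() -> None:
    """Run in the smartphone app: announce each new floor as it arrives."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", FLOOR_PORT))
        last_floor = None
        while True:
            data, _addr = sock.recvfrom(1024)
            floor = json.loads(data.decode("utf-8"))["floor"]
            if floor != last_floor:
                last_floor = floor
                announce(f"Floor {floor}")
```

In practice the device might use Bluetooth or a building Wi-Fi network instead of UDP broadcast, but the shape is the same: the elevator pushes its state, and the app turns that state into speech.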