🇬🇧 How to design for visually impaired people

This is the post I wrote after the event “Design beyond visual boundaries” for the Milan group of the Interaction Design Foundation.

Last Saturday, September 20th, IDF Milan organized its fourth event, “Design beyond visual boundaries”, in collaboration with the Italian startup Horus Technology. The main objective of the workshop was to start designing the user interface of their product.

Horus is a device that supports visually impaired people, acting like a virtual assistant. It mounts on normal glasses and interacts through bone-conduction audio and a manual controller with buttons. Horus will have two 5 MP cameras and a separate battery pack, and it won’t rely on internet or Bluetooth connections to function.

Horus’s main functions are:

– text recognition
– navigation (obstacle, road sign, and shop sign recognition)
– face and object recognition (description and memorization)

During the first part of the event, CTO Luca Nardelli introduced Horus and gave us a document he had written with Business Developer Benedetta Magri. After the introduction, the IDF Milan people fired dozens of questions at Luca and Benedetta, creating one of the most amazing brainstorming sessions I have ever taken part in.
We had a list of functions and the important mission of creating an interface we couldn’t really picture. We were going beyond our visual and intellectual boundaries!

Thankfully, Antonino Cotroneo, a visually impaired engineering student, joined us and helped us match the Horus functions to real users’ needs, feelings, problems, and so on. The opportunity to work on a real problem was amazing. Having Antonino there with us increased the value of our design process, adding a human involvement that we can’t have with any other “commercial” product.

Watch the Horus introduction and the tough question-and-answer session.

After two hours of teamwork, the three groups presented their work, each focused on a single function and a proposal for a device UI interaction pattern. Obviously, none of us had the time to analyze all the everyday-life implications. Below you can find the three teams’ full presentations, but first I want to list the most interesting design suggestions the participants came up with:

– the device should not be embarrassing to use in social contexts: avoid audio commands like “What’s in front of me?”, “Scan the room”, “Read the text”
– the device should support natural gestures tied to common glasses movements
– the device should simulate the natural reading experience: while the reader is looking at the text, Horus reads; when they look away, Horus stops, even if it already has the full text in memory (see the first sketch after this list)
– the device should understand the hierarchy and formatting of written text
– the device should mix simple non-verbal audio cues and spoken messages to map road obstacles and their positions, like the Doppler effect or a bat’s sonar (see the second sketch after this list)
– the device should have different usage modes for enabling/disabling functions or adjusting their sensitivity
– the device should understand what to recognize, notify, and save based on the user’s contextual needs
– there should be different editions of the device for sport, navigation, free time, etc.
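
To make the reading suggestion above a bit more concrete, here is a minimal sketch of how a gaze-gated reading loop could behave, assuming a hypothetical gaze signal and text-to-speech call; none of these names come from the actual Horus software.

```python
import time

class GazeGatedReader:
    """Hypothetical sketch: speak pre-recognized text only while the
    user keeps 'looking' at it, pausing (not restarting) otherwise."""

    def __init__(self, sentences):
        self.sentences = sentences  # full OCR result, already in memory
        self.position = 0           # index of the next sentence to speak

    def run(self, is_looking_at_text, speak):
        # is_looking_at_text: assumed callable that is True while the
        # cameras still frame the page; speak: assumed blocking TTS call.
        while self.position < len(self.sentences):
            if is_looking_at_text():
                speak(self.sentences[self.position])
                self.position += 1
            else:
                time.sleep(0.1)  # looked away: hold position, resume later
```

The key design choice is that looking away pauses rather than resets: Horus already holds the whole text, so the gaze controls the reading position, not the recognition.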
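The non-verbal obstacle mapping could be sketched in the same spirit: turn each obstacle’s bearing and distance into stereo pan, pitch, and beep rate, the way a parking sensor or sonar does. The ranges below are illustrative assumptions, not Horus specifications.

```python
def obstacle_to_cue(bearing_deg, distance_m, max_range_m=5.0):
    """Hypothetical sketch: map one obstacle to a non-verbal audio cue.

    bearing_deg: -90 (left) .. +90 (right) relative to the user
    distance_m:  0 .. max_range_m
    Returns (pan, pitch_hz, beeps_per_s) for a simple beep synthesizer.
    """
    closeness = max(0.0, min(1.0, 1.0 - distance_m / max_range_m))
    pan = max(-1.0, min(1.0, bearing_deg / 90.0))  # -1 left .. +1 right
    pitch_hz = 300 + 600 * closeness               # nearer -> higher pitch
    beeps_per_s = 1 + 9 * closeness                # nearer -> faster beeps
    return pan, pitch_hz, beeps_per_s

# Example: an obstacle 1 m away, slightly to the right
print(obstacle_to_cue(bearing_deg=30, distance_m=1.0))
```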

All of us understood that creating, in a few hours, an unobtrusive UI that could anticipate or simply support the user’s contextual needs is almost impossible. We tried to design different interaction patterns, but none of us found a concrete solution. Still, we enjoyed this “design mission impossible”. As IDF Milan, we are proud of our collaboration with Horus Technology, and we thank them for the opportunity to contribute to a project that will improve Antonino’s life.

Special thanks to our host, WCAP Milan, for providing a great room and connectivity.

Follow IDF Milan on Facebook, LinkedIn and Twitter for the next event.
