Earlier this week I had an appointment at the Dutch Microsoft office to talk about people with disabilities and apps. (You'll have to forgive me: I am Dutch and we don't care that much about being 100% politically correct; we'd rather get our point across than spend hours and hours thinking about the right term for a person with challenging visibility abilities: we call them blind people, so I will do that here as well. I'm not being insensitive, just Dutch.)
I have to be honest. Although I work a lot on software for people with autism, I hardly ever think about making software for people who can't, or can barely, see the screen. Or who can't hear a thing. Or who have no hands to make those nice gestures with.
But I was told that a staggeringly large number of people "suffer" from such conditions. Soon the question arose: why aren't we all paying attention to this group? Are these people less likely to buy our apps? Or is it just not economically worthwhile to redesign your app so this group will also spend their hard-earned money on the developers who built it? For me, the answer is simple yet embarrassing: I just don't think about that group. Later in the discussion it became clear how stupid that attitude is: I have a very mild form of colorblindness, something I hardly even notice but that is sometimes a bit annoying when using apps that rely on recognizing colors. Yet I never even think about designing apps for people with disabilities (another disclaimer: I am not saying that not being able to distinguish between similar shades of yellow and green is comparable to not having hands (disclaimer in a disclaimer: I CAN see the difference between every color very well, as long as they are written down in their hexadecimal RGB notation)).
The discussion that followed had me thinking. How would one make an app that is suitable for a blind person? Screen readers worked fine in the old days of command-line applications. Some websites are optimized for screen readers as well. But how do blind people operate an iPad? Or a smartphone? I find the lack of tactile feedback (i.e. not feeling keys) on those devices a huge drawback: it means I have to look at what I am doing. Now, I have that choice: a blind person doesn't.
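Part of the answer, as far as I understand it, is that modern touch devices ship with a screen reader (VoiceOver on the iPad, for example), but it can only speak about the things we developers describe to it. As a minimal sketch, assuming a UIKit app and using made-up names for the screen and the button, this is roughly what labelling a control for the screen reader looks like:

    import UIKit

    // Rough illustration (hypothetical screen, made-up names) of how an app
    // tells VoiceOver, the built-in screen reader, what a control means.
    class OrderViewController: UIViewController {

        private let payButton = UIButton(type: .system)

        override func viewDidLoad() {
            super.viewDidLoad()
            payButton.setTitle("€ 4,50", for: .normal)
            view.addSubview(payButton)

            // Without these hints the screen reader can only guess; with them
            // it can speak the purpose of the button when the user touches it.
            payButton.isAccessibilityElement = true
            payButton.accessibilityLabel = "Pay four euro fifty"
            payButton.accessibilityHint = "Completes your order"
            payButton.accessibilityTraits = .button
        }
    }

A blind user then explores the screen by touch and the device speaks each element out loud, which is exactly the kind of detail we sighted developers tend to forget to provide.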
What can we do to help people who have limited abilities? How can we make software smarter, so that everybody has access to the information they need? I have some ideas, and I invite you to come up with some of your own as well. Please leave them in the comments and we can make this world a bit more accessible.
Oh, and if you're in the Netherlands around Saturday April 6th / Sunday April 7th, I invite you to come to http://www.devcamp.nl and we can try out some of the ideas we might have. It will be fun, I promise you that.
I can't wait to see what you come up with!