Mobile Development

 

mBrailler: Braille Android Keyboard

Prior to this project, it was mere curiosity that brought me to the Accessibility settings on my devices. The interviews and research from this project truly opened my worldview to the challenges many people face, challenges that can sometimes go ignored by designers and developers.

How can the blind typing experience on Android be improved beyond the available tools?

Behance sketches: https://www.behance.net/gallery/33415917/mBrailler

GitHub: https://github.com/RandolphDukeII/mBrailler

The app in action

 

I was extremely fortunate to work with my advisors Dr. Vicki Hanson [her portfolio] and Dr. Hugo Nicolau [his portfolio], who had an idea for continuing an existing project. mBrailler continued the approach of creating gesture-controlled input for special typing functions while allowing the back of the phone to map to the traditional Braille keys.
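To make that Braille mapping concrete, here is a minimal sketch of how chords of simultaneously pressed dots could resolve to characters, assuming the standard six-dot Grade 1 Braille alphabet; the class and method names are my own illustration, not taken from the mBrailler codebase.

```java
import java.util.HashMap;
import java.util.Map;

/** Sketch: mapping six-dot Braille chords to characters.
 *  Dots 1-3 sit under one hand, dots 4-6 under the other,
 *  so a chord is the set of dots pressed at the same time. */
public class BrailleChords {

    // Encode each pressed dot (1-6) as a bit in an integer mask.
    static int mask(int... dots) {
        int m = 0;
        for (int d : dots) m |= 1 << (d - 1);
        return m;
    }

    static final Map<Integer, Character> CHORDS = new HashMap<>();
    static {
        CHORDS.put(mask(1), 'a');         // dot 1      -> a
        CHORDS.put(mask(1, 2), 'b');      // dots 1,2   -> b
        CHORDS.put(mask(1, 4), 'c');      // dots 1,4   -> c
        CHORDS.put(mask(1, 4, 5), 'd');   // dots 1,4,5 -> d
        // ...remaining letters follow standard Grade 1 Braille.
    }

    /** Resolve a chord once all fingers lift; null if unknown. */
    static Character resolve(int pressedMask) {
        return CHORDS.get(pressedMask);
    }

    public static void main(String[] args) {
        System.out.println(resolve(mask(1, 2)));  // prints: b
    }
}
```

Representing a chord as a bitmask keeps the lookup a single map access, which matters when every extra step adds latency to typing.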

Approach

Gaining a thorough understanding of the problem required several research activities. The most obvious of these was building a working mental model of the challenges blind technology users face.

Early Android Braille keyboard prototype.

I created some sketches of interactions that could use both thumbs for input [I have numerous sketches here: Behance sketches]. I chose both thumbs because both hands are required to operate the keyboard, so they would already be in place while holding the phone.

Results

Using Android Studio and the emulator to view my app.

I created a capstone project that encompassed both the altered physical keyboard and the software keyboard. The conversations I had with other accessibility software developers, blind and low-vision training staff, and other software engineers led to the creation of mBrailler.

The challenge with creating a combined software and hardware typing solution is that each portion needed to complement the other well, or it would become part of every other broken user experience. I had to create new interactions and gestures to allow for the complex editing functions. After clearly defining the editing functions made available by the built-in system clipboard, I created a table mapping related functions to one another. I then built a menu system that used text-to-speech (TTS) to read back what had been typed and which function a user had swiped to.
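As a rough illustration of that swipe-driven menu pattern, here is a hedged sketch using Android's TextToSpeech API; the function list, gesture hooks, and class names are hypothetical and not taken from the actual mBrailler implementation.

```java
import android.content.Context;
import android.speech.tts.TextToSpeech;

/** Sketch (hypothetical names): a swipe-driven editing menu that
 *  announces each function with text-to-speech as the user cycles
 *  through it, then confirms with a separate gesture. */
public class EditingMenu {

    // Clipboard-backed editing functions exposed by the menu.
    enum Function { COPY, CUT, PASTE, SELECT_ALL, DELETE }

    private final Function[] functions = Function.values();
    private int index = 0;
    private final TextToSpeech tts;

    EditingMenu(Context context) {
        // Initialize TTS; a real app would check the init status first.
        tts = new TextToSpeech(context, status -> { /* ready */ });
    }

    /** Called on a swipe: move to the next (or previous) function
     *  and read its name aloud so the user knows where they are
     *  without looking at the screen. */
    void onSwipe(boolean forward) {
        int step = forward ? 1 : functions.length - 1;
        index = (index + step) % functions.length;
        announce(functions[index].name().replace('_', ' ').toLowerCase());
    }

    /** Called on a confirmation gesture: perform the highlighted function. */
    Function onConfirm() {
        announce("activated " + functions[index].name().toLowerCase());
        return functions[index];
    }

    private void announce(String text) {
        tts.speak(text, TextToSpeech.QUEUE_FLUSH, null, "menu");
    }
}
```

Announcing the function on every swipe, and again on confirmation, gives a non-visual user the same feedback a sighted user gets from a highlighted menu item.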

An example gesture of accessing the text editing menu.

 

Impact

Just sharing some work with accessibility researchers at a conference, no biggie.

I learned from so many people while working on this project. In fact, I learned so much that I felt it important to share my work at the Rochester Institute of Technology Effective ACCESS conference, which focuses on accessibility services and research in the Upstate NY area.