Graduate Capstone
I created a multimodal braille input system for smartphones, starting with Android. The system consisted of a hardware braille keyboard for typing and a software keyboard app that supported gesture-based text editing.
How can the blind typing experience on Android be improved beyond the available tools?
I conducted rounds of secondary research on existing mobile typing challenges for blind and low-vision users, including error correction, privacy, and the cost of add-on tools.
After reviewing the gestural inputs built into Android's accessibility settings, I sketched interactions that used both thumbs for input. I chose both thumbs because both hands were already required to hold the phone and operate the keyboard, so the thumbs would naturally be in position.
Results
I created a capstone project that encompassed both the modified physical keyboard and the software keyboard. The conversations I had with accessibility software developers, blind and low-vision training staff, and other software engineers led to the creation of mBrailler.
The challenge of creating a combined software and hardware typing solution was that each part needed to complement the other well, or the system would become yet another broken user experience. I had to design new interactions and gestures to support complex editing functions. After clearly defining the editing functions available through the built-in system clipboard, I created a table mapping related functions to one another. I then built a menu system that used text-to-speech (TTS) to read back what had been typed and announce which function the user had swiped to.
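To illustrate the kind of gesture-to-function table described above, here is a minimal sketch of a lookup that pairs swipe gestures with clipboard editing functions and produces the announcement a TTS engine would speak. All names and the specific gesture pairings are hypothetical, not taken from mBrailler itself:

```java
import java.util.Map;

public class GestureMenu {
    // Hypothetical gesture set; a real keyboard app would receive these
    // from its touch-event handling.
    enum Gesture { SWIPE_LEFT, SWIPE_RIGHT, SWIPE_UP, SWIPE_DOWN }

    // One possible pairing of gestures with related clipboard functions.
    // The value is the string a TTS engine would read aloud when the
    // user swipes to that function.
    static final Map<Gesture, String> ANNOUNCEMENTS = Map.of(
        Gesture.SWIPE_LEFT,  "Cut selection",
        Gesture.SWIPE_RIGHT, "Copy selection",
        Gesture.SWIPE_DOWN,  "Paste from clipboard",
        Gesture.SWIPE_UP,    "Select all text"
    );

    // Look up the announcement for a gesture, with a spoken fallback
    // for unmapped input.
    static String announce(Gesture g) {
        return ANNOUNCEMENTS.getOrDefault(g, "Unknown gesture");
    }

    public static void main(String[] args) {
        System.out.println(announce(Gesture.SWIPE_RIGHT));
    }
}
```

On Android, the returned string would be handed to the platform's `TextToSpeech.speak` method rather than printed; keeping the mapping in a single table makes the gesture vocabulary easy to audit and revise with testers.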
Impact
I learned from many people while working on this project. I learned so much, in fact, that I felt it was important to share my work at the Rochester Institute of Technology Effective ACCESS conference, which focuses on accessibility services and research in the Upstate New York area.