Payment for the Visually Impaired

For my capstone project at Carnegie Mellon, I was part of an interdisciplinary team working to improve the quality of life for people who are blind or have low vision. In conjunction with our client, Bank of America, we engaged in a seven-month project focused on the checkout and payment process. Our team was passionate about exploring a non-visual design space as a way to create accessible solutions usable by everyone.

A process overview can be viewed on the project website, which is designed for accessibility.
Full documentation of the research and synthesis is available in our research report (PDF).

Due to current patent application restrictions, explicit details about the final product are not available here. The design process and general description of the outcome are described below.

Our Goal
To develop a universally accessible payment system that builds the confidence of visually impaired users.

My roles as Design Lead:
• Evaluated data and worked with teammates to optimize research opportunities
• Ensured universality of prototypes by coordinating decisions within the group
• Crafted (and re-crafted) wireframes and physical mockups for prototyping
• Guided the team through the synthesis process
• Communicated research to client groups
• Communicated design specifics to software team
• Acted as chief editor and primary writer for final research and design reports


We observed the shopping, checkout, and payment processes of nine visually impaired individuals (seven completely blind, two low vision). We interviewed four more users remotely, and we talked with local research experts and associates of organizations that work with the blind to develop a full understanding of the issues. Our clients also provided a strong understanding of the financial and payment spaces.

We took time to get to know each participant and to understand their individual needs and attitudes.

During an extensive synthesis process, we gathered our notes and ideas on post-its and worked at our whiteboard, sometimes in groups and sometimes splitting up, conversing, sharing our thoughts, and digging into the data to identify opportunity areas for design. We sifted through all of our data to determine the greatest needs of our users and to provide design direction for future ideation. We developed an understanding of the design space by articulating a shopping framework, which we used as a tool to focus the final form of the design.


Through brainstorming, bodystorming, crafting, and coding, we explored technological and non-technological solutions. Given the directive to think 3-5 years out, we chose to pursue multiple ideas to craft a broad vision of an ideal experience for the visually impaired. As in our preliminary research, we used the shopping space as a close metaphor for the payment process and tested new interactions there. We ran concept validation sessions with users to understand the interactions and gather feedback, which helped us assess the feasibility of each idea.

From continued competitive analysis and follow-up interviews, we brought our attention back to the transaction and payment space to build a solution. We determined that the best opportunity to make a difference would be to develop an interface from the ground up. NFC (near-field communication) is growing in adoption and is expected to be commonplace in the future. By developing an accessible solution now, we had a chance to ensure that new infrastructures would be built with accessibility in mind.


Our end goal was to present a vision for a future product 3-5 years out, so we applied our knowledge of the resources and design space to develop multiple ideas at once and determine the best interactions available. Our testing began with low-fidelity mockups and Wizard-of-Oz style simulation. Later, we moved to low- and medium-fidelity mobile mockups coupled with programmed interactions that could simulate a real transaction. For our final iteration, we planned to test in a real shopping environment. After four iteration cycles, our team presented our prototype and final report to the clients.

Our initial prototypes were split because we had two core experiences: visual and non-visual. Though our designs were cohesive from the beginning, we needed a coordinated effort to understand both experiences. Our first prototypes were paper mockups in a foam-core frame for sighted users (pictured below), and a coded setup, including a laptop, a barcode scanner, and a phone running an early version of the Android application, that approximated the experience for those who cannot see. Later versions combined what we learned from the visual interface interactions with our knowledge of the embodied interaction.


by James Mulholland, Jooyong Lee, Zhenshuo Fang, Brendan Kiu, and Nastasha Tan
Sponsored by Bank of America
Coursework for Capstone Project
Master of Human-Computer Interaction
Carnegie Mellon University, Spring 2011