Enhancing safety and ease of item searching for those blind or visually impaired
Jun 2023
(48 Hours)
UX Designer
Christine Park
Jimmy Tang
Figma,
After Effects
I entered my first Hackathon with fellow designer Christine, joining WaffleHacks 2023 under the "Accessibility Track" with the prompt: "Create a solution to address specific, real-world challenges faced by people with disabilities". With 48 hours, we competed against 124 other teams, and in the end, our efforts were recognized with an "Honorable Mention for UI/UX" award!
During the Hackathon, my teammate and I collaborated at every stage of the process, from research to the final designs. I also used Adobe After Effects to produce our screen reader prototype walkthrough.
Relying on memory and touch can be limiting, and the frustration of not being able to find items independently can be overwhelming, impacting daily routines and productivity. Moreover, seeking assistance from others is not always a viable or efficient solution.
With our short amount of time, we spent much of it trying to understand our target audience better as we had never designed for the visually impaired before. Given the constraints in accessing our users directly, we initiated our research by exploring sources like the American Foundation for the Blind. Subsequently, we visited online platforms including Reddit, Quora, and various articles to gather firsthand experiences and insights.
This allowed us to gain a better understanding of our users' common frustrations, along with how they interact with and navigate their environment on a day-to-day basis.
Through our research, we identified a few recurring themes:
01| Screen readers are essential
Users commonly use screen readers to navigate digital interfaces and apps on their phones and computers
02| Asking for help
Users expressed that they often need assistance from others when looking for items, but would rather not ask in order to maintain a sense of independence
03| Searching for items
Common methods of searching for lost items include searching in a "grid-like" manner and sweeping the surroundings
With our research insights, we set design goals to keep in mind while creating our solution. These helped us refine the primary features and priorities of our designs:
Accessibility
To accommodate our users, designs need to be accessible in all aspects
Confidence
Users should feel confident to perform actions independently with the design
Familiarity
The design should present a process that is similar to users' current methods
I suggested that we try out the accessibility functions on our own mobile devices to empathize further with our users. This gave us a deeper understanding of how visually impaired users navigate their interfaces and helped us identify common accessibility patterns.
We took a look at a variety of different tools such as Google Lens, our device's native screen reader, accessibility in the Camera app, and a few others.
With a better understanding of our users and their problems, we came up with our solution idea:
EyeSpy, an app which allows users to keep track of items and find them through an assisted camera scan.
Our app has a focus on accessibility and ease of use to allow for users to confidently search for their item independently.
Considering that our users have impaired vision, it was essential that we created a user flow that was simple and easy to follow. This prevents them from having to go through unnecessarily complex steps that would further complicate their experience.
We began our design ideation with sketches, moving on to lo-fi wireframes once we had agreed on a solid overview of the app's specifics.
While creating this app, our biggest priority was making the design as accessible as possible. This was also a primary focus during our research phase, from which we drew the following considerations.
Because our users primarily navigate their phones with screen readers, we had to ensure that the app's structure followed guidelines that make screen reader navigation easier. During our research, we found that blind and visually impaired individuals often rely on patterns when navigating, memorizing basic app layouts for standard actions.
With this information, we made sure that our designs followed a standard layout commonly found in most other apps, and used larger button styles to give screen reader users bigger touch targets.
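To make this concrete: if the app were built natively on iOS, the layout could look something like the SwiftUI sketch below. The view and button names are hypothetical (our prototype lived in Figma), but it shows the conventional top-to-bottom structure and large touch targets we aimed for; Apple's guidelines recommend at least 44x44 points for tappable elements.

```swift
import SwiftUI

// Hypothetical sketch: a conventional top-to-bottom layout with large,
// clearly labeled buttons so screen reader users get big touch targets.
struct HomeView: View {
    var body: some View {
        VStack(spacing: 16) {
            Text("EyeSpy")
                .font(.largeTitle)

            // Applying the frame to the label keeps the whole area tappable;
            // 60pt comfortably exceeds Apple's 44x44pt minimum.
            Button(action: { /* start a room scan */ }) {
                Text("Scan Room")
                    .frame(maxWidth: .infinity, minHeight: 60)
                    .background(Color.yellow)
                    .foregroundColor(.black)
                    .cornerRadius(12)
            }

            Button(action: { /* open the inventory list */ }) {
                Text("My Items")
                    .frame(maxWidth: .infinity, minHeight: 60)
                    .background(Color.yellow)
                    .foregroundColor(.black)
                    .cornerRadius(12)
            }
        }
        .padding()
    }
}
```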
While researching color accessibility for the visually impaired, we learned that contrast is key when determining the best color choices. People with visual impairments have an easier time reading light text on darker backgrounds, and high-contrast pairings such as black and yellow also improve the overall experience.
We used the "Color Contrast" plugin in Figma to ensure that the colors passed AAA WCAG standards.
For individuals with limited vision, it's crucial that the app offers dynamically scalable views for text legibility. The app should seamlessly adjust in size to the user's readability preferences, making images and text more accessible at every level of visual impairment.
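On iOS, this is what Dynamic Type provides: system text styles scale automatically with the user's preferred content size, and @ScaledMetric lets custom dimensions (like image thumbnails) scale along with it. A minimal SwiftUI sketch, with hypothetical view names:

```swift
import SwiftUI

// Hypothetical sketch: text and images that grow with the user's
// preferred content size instead of being fixed at design-time values.
struct ItemRow: View {
    // Scales the thumbnail in step with the user's Dynamic Type setting.
    @ScaledMetric(relativeTo: .body) var thumbnailSize: CGFloat = 44

    let name: String

    var body: some View {
        HStack {
            Image(systemName: "photo")
                .resizable()
                .frame(width: thumbnailSize, height: thumbnailSize)
            // A system text style (.body) respects Dynamic Type automatically,
            // unlike a hard-coded point size.
            Text(name)
                .font(.body)
        }
    }
}
```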
As mentioned previously, users who are blind or visually impaired rely greatly on screen readers when navigating their phones. Screen readers read out buttons, text, and other objects on the screen, allowing users to interact with the interface without needing to see it.
Alternative text labels are often disregarded during interface design, making it hard for those who use screen readers to navigate the app. Because of this, it was important for us to put descriptive alternative text on buttons and images to assist our users.
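In a native build, these descriptions would map onto the platform's accessibility APIs. A minimal SwiftUI sketch (hypothetical names, not our actual screens) of what a descriptive label and hint look like on an image and a button:

```swift
import SwiftUI

// Hypothetical sketch: descriptive labels and hints so a screen reader
// announces what an element is for, not just "button" or "image".
struct ScanControls: View {
    var body: some View {
        VStack(spacing: 24) {
            Image("lastScanPhoto")
                .accessibilityLabel("Photo of your living room from the last scan")

            Button(action: { /* begin a camera scan */ }) {
                Image(systemName: "camera.viewfinder")
            }
            .accessibilityLabel("Scan room")
            .accessibilityHint("Starts a camera scan to locate your saved items")
        }
    }
}
```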
Fun fact: I added alternative text to all the images on this portfolio website as a result of this project!
I created a walkthrough of our "room scan" feature that simulates navigating the app with a screen reader, showing further examples of descriptive alternative text. (Sound recommended)
Locate and identify lost items in your surroundings through an accessible room scanning function, featuring audio guidance.
Take a photo of any item you want to keep track of and add details for a descriptive overview of your belonging.
Quickly and easily find your items through the inventory list. Navigate to your essentials, or search for a specific item through the app.
After a sleepless night and many hours of additional work, we ended up receiving an "Honorable Mention for UI/UX" award out of 124 teams!
With more time, I would have liked to test our designs with our target user group for further refinement. Along with this, interviewing blind and visually impaired individuals directly for more personal insights into navigation would have greatly helped our app's design.
Design for accessibility
Working on this project has taught me so many things about accessibility within design, which I am very grateful for. I had never designed an app specifically for the blind and visually impaired, so every step of the research and ideation process really helped me learn how to make designs more accessible for those with vision problems.
Teamwork during time constraints
With only 48 hours to work on this project, it was essential to collaborate and communicate efficiently with my teammate. We had to lay out clear goals in our short amount of time and work together to ensure we finished our project on time. We spent a bit too much time on research, which resulted in us submitting the project with only 3 minutes to spare. Moving forward, I will try to improve my time management!