Earlier this month, I attended a meetup with other designers, developers and user experience (UX) specialists from across the Chicago area. The session was about building quality accessibility practices into your project processes, and it was fascinating.
The event speakers were a front-end software engineer and a quality assurance engineer, and they each spoke about how addressing accessibility plays a role in their day-to-day responsibilities.
As a senior UX architect here at Duo, I've always cared about accessibility, and I've written about topics like designing for a universal accessible experience and integrating ADA compliance into your development process. There's an accessibility topic I've been thinking about for a while that I haven't heard many clients address, and the conversations at the meetup only further fueled my interest.
The topic is mobile accessibility.
In the posts I've written in the past, the accessibility issues we've discussed have been largely platform agnostic; when a primary platform was discussed, it was the desktop. These days, however, it's frankly irresponsible not to consider mobile whenever you're talking about websites and users' digital behavior.
After all, as of 2017, approximately 80% of Internet users owned a mobile phone (according to comScore). Now consider this: 91% of users with disabilities use wireless devices.
When it comes to mobile accessibility, the same best practices for accessibility in general still apply, but there are four additional elements you should consider as you begin designing and building out your site.
- Zooming
The issue of zooming is most pressing for users with low vision or tunnel vision (no peripheral vision), although it's something to think deliberately about for the entire user population. There are two ways to handle zooming on a mobile screen: within the website itself, or through the phone's interface.
Now, it may seem easy to just let the user rely on their phone's zoom function, but here's the problem: you don't know the extent of the zoom or what will ultimately appear on their screen. You lose control of the user experience, because, well, you don't know what the user experience will be.
Say, for example, that your user is filling out a form on your site. They choose to zoom in on the screen to make sure they're entering the correct information in the correct location. When they reach the end of the form, they hit submit. Now, all of a sudden, their screen turns black. If they weren't zoomed in to the site, they would see that a pop-up appeared saying "Thank you for submitting your form. Please email us if you have any additional questions." They don't see that message, though, because it's off the screen in a different area of the page - an area that is not visible when zoomed in or using a screen reader.
If your site is text heavy, or if your text content is the reason people are coming to your site, then you should give them the opportunity to zoom within your own site. That way, you can control how the content appears at different views.
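Whichever route you take, make sure you don't accidentally block the phone's native zoom. A common culprit is the viewport meta tag; a minimal sketch of one that keeps pinch-to-zoom available:

```html
<!-- Keeps the phone's pinch-to-zoom available to the user.
     Avoid adding user-scalable=no or maximum-scale=1 to the content
     attribute, since those disable native zoom entirely. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```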
- Contrast
Like zooming, adjusting the contrast of a page is something that can be built into a website or handled natively by the phone. This option gives users the opportunity to invert colors so that text can be read more easily.
This is another area where your content will dictate whether you want to give users the opportunity to tweak contrast within your site or rely on their phone to make its best estimation at an improved appearance.
Additionally, this idea of contrast is something that should be addressed as your organization (or the design team within your organization) is creating its color palette, as well as its logo. You want to make sure your brand colors not only look good together, but contrast each other in a way that will allow them to stand on their own - and not need to be manipulated.
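If you do want the site itself to respond to a user's contrast preference, CSS exposes a media query for exactly that. A minimal sketch, with illustrative color values:

```css
/* Default palette; the specific hex values here are made up for
   illustration. */
:root {
  --text-color: #333333;
  --bg-color: #ffffff;
}

/* When the user has asked their device for higher contrast, bump the
   text to pure black against white. */
@media (prefers-contrast: more) {
  :root {
    --text-color: #000000;
  }
}

body {
  color: var(--text-color);
  background-color: var(--bg-color);
}
```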
- Screen readers
I've written about desktop screen readers on multiple occasions here on this blog, but mobile devices have screen readers as well. Apple has its VoiceOver screen reader. Android has Google TalkBack. Each makes it easier for the visually impaired to understand the content on their screen. A lot of work goes into constantly improving these tools, and I've heard from a number of people within the accessibility community that low- and no-sighted users appreciate those efforts and attention to detail.
As you are building out your site, it's important to understand how these native tools work and how they interpret your content. This is something you should consider at the outset of a project, but it's also something to remain aware of throughout the project's duration. The best way to do that is through manual testing: take an iPhone and a Google Pixel and see how each one's screen reader presents your content.
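A large part of what those screen readers announce comes straight from your markup. A sketch of the difference semantic HTML makes, with illustrative labels:

```html
<!-- A native button is announced as a button along with its label;
     a bare <div> with a click handler often is not. -->
<button type="submit">Submit form</button>

<!-- If an icon-only control is unavoidable, give it an accessible
     name so the screen reader has something to say. -->
<button type="button" aria-label="Close dialog">&#10005;</button>
```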
- Image handling
Screen readers can't describe pictures. They can't interpret data. They read text, and that's it. That's why it's critical to use the "alt" attribute in your code; it provides a text-based description of an image, allowing the screen reader to read aloud what is pictured.
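A quick sketch of alt text in practice; the file names and chart here are made up for illustration:

```html
<!-- Meaningful images get a description of what they convey. -->
<img src="quarterly-signups.png"
     alt="Line chart showing sign-ups rising steadily over the quarter">

<!-- Purely decorative images get an empty alt, which tells the
     screen reader to skip them rather than read the file name. -->
<img src="divider-flourish.png" alt="">
```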
You also need to be careful about hiding images; this will sound weird, but you need to make sure pictures are not hidden in a way that's solely visual. You can't just move an image off the screen or set its opacity to zero, because the element is still there, the screen reader will still find it, and that can lead to all sorts of confusion for the user. The safest approach is to hide the element with CSS in a way that removes it from the accessibility tree entirely, such as display: none.
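The distinction can be sketched in CSS, assuming illustrative class names:

```css
/* Hidden visually AND from screen readers: the element is removed
   from the page and the accessibility tree entirely. */
.fully-hidden {
  display: none;
}

/* Hidden visually ONLY: still present in the accessibility tree, so
   a screen reader will still announce it - usually the confusing
   case described above. */
.visually-only-hidden {
  position: absolute;
  left: -9999px;
  opacity: 0;
}
```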
These are four aspects of accessibility to consider. I'm sure there are countless more, and like I said, the same rules for accessibility in general still apply.
Do you have a question about mobile accessibility? Shoot us a note and let us know. We'd love to talk.