In this, the second article in our series on website accessibility, we will walk through the audit process we use with our clients. We will take you through each step using the TEN7.com site as a test case and show you how we arrived at a prioritized list of initiatives for improving our site. Perhaps you’ve heard about other site accessibility initiatives, and you’d like to take the first steps toward bringing your site into conformance. You can get started right here.

Where To Start

When clients ask us to make their website accessible, we generally recommend starting with a Site Accessibility Audit. The audit findings become the basis for planning your accessibility initiative. Depending on the client’s specific needs and known user population, our process for a site accessibility audit usually includes:

  • Running accessibility audit tools and reviewing reports
  • Assessing accessibility of the site using screen readers, contrast checkers, keyboard-only navigation, etc.
  • Reviewing site analytics for high-traffic pages
  • Making a prioritized list of accessibility improvements

For illustration purposes, let’s step through the audit process and findings for our own site, ten7.com.

For step one, we used AChecker, an open-source WCAG 2.0 audit tool developed by the Inclusive Design Research Centre at the University of Toronto. It lets you enter your site’s base URL and select a level of conformance (A, AA or AAA). For an initial accessibility initiative, Level AA is a good place to start.

The audit report for the TEN7 site reveals one known problem and 124 potential problems. This is not unusual: there are many judgment calls involved in evaluating accessibility, and automated tools typically find a handful of known deficiencies and scads of potential problems that must be evaluated by a human. Keep in mind that our site was built by developers who are well-versed in standards-based HTML5 markup (we’re a web development agency, after all!) and with Drupal 8, which was created with accessibility in mind. Your results may vary.

In our case, the known problem is an HTML element with an id attribute that is not unique, which can be confusing for screen readers. This falls under Success Criterion 4.1.1 Parsing (Level A), so it is something that needs to be fixed. The report cites the line of code in question, a handy reference for our development team.
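
A duplicate id is easy to introduce and easy to fix. Here is a hypothetical sketch of the kind of markup that triggers this failure; the element names and id values are illustrative, not our actual code:

```html
<!-- Problem: two elements share the same id, which can confuse
     screen readers and other tools that look elements up by id -->
<nav id="main-menu">…</nav>
<div id="main-menu">…</div>

<!-- Fix: give every id on the page a unique value -->
<nav id="main-menu">…</nav>
<div id="main-menu-overlay">…</div>
```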

The potential problems requiring evaluation can be grouped according to the 12 WCAG 2.0 accessibility guidelines. Here are some examples:

  • For guideline 1.1, “Provide text alternatives for any non-text content,” potential issues include images that may need long descriptions, unnecessary alt text on images that may be decorative, and alt text that may not convey the same meaning as the image.
  • For guideline 1.3, “Create content that can be presented in different ways without losing information or structure,” potential issues include visual lists that may not be properly marked up and text that refers to items by shape, size or relative position alone.
  • For guideline 1.4, “Make it easier for users to see and hear content,” we have alerts for images that may contain text with poor contrast, images that contain text not reproduced in the alt text, and information conveyed in images that may need to be available in another form, such as text or markup.
  • For guideline 2.1, “Make all functionality available from a keyboard,” potential issues include scripts whose user interfaces may not be keyboard accessible.
  • For guideline 2.3, “Do not design content in a way that is known to cause seizures,” all scripts that may cause screen flicker are listed.
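
To make the guideline 1.1 judgment calls concrete, here is a hedged example of the two main cases an auditor distinguishes; the file names and descriptions are invented for illustration:

```html
<!-- Informative image: alt text should convey the same meaning as the image -->
<img src="team-photo.jpg" alt="The TEN7 team gathered around a conference table">

<!-- Decorative image: empty alt text tells screen readers to skip it entirely -->
<img src="divider-swirl.png" alt="">
```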

I could go on, but this is enough to give you a flavor of the issues an automated tool like AChecker highlights. Each alert cites a line of code, which enables evaluation by our development team. The alerts that turn out to be genuine barriers to accessibility will be added to our wishlist of improvements. If sorting through your own WCAG 2.0 audit results becomes overwhelming, it may be worth consulting a web agency with accessibility experience.

From Technical Accessibility to Usable Accessibility

Guidelines are just guidelines: following them to the letter is no guarantee that your site will be accessible to all, which is why we have to go beyond WCAG audits. If you want to experience your site the way a person with limited sight does, a few trials with a screen reader will be instructive.

The most commonly used screen readers, based on the latest survey of screen reader users, are JAWS (Windows), ZoomText (Windows), Window-Eyes (Windows, free with Microsoft Office) and NVDA (Windows), with use of ZoomText and Window-Eyes growing quickly. Still, according to this 2015 survey, JAWS is used by 30% of screen reader users. Don’t have access to JAWS? Spend a few minutes watching this video, which demonstrates how JAWS is used to navigate a web page.

Testing screen readers is tricky because two factors interact: which screen reader is paired with which browser. The possible combinations are too numerous to test them all. To winnow down the options, we would normally consult with the client, who knows their users best. Since we are our own client in this case, we chose to test ChromeVox with Chrome, VoiceOver with Safari, and JAWS with Internet Explorer.

Google Chrome is the most frequently used browser of all. In our testing we used the free ChromeVox screen reader installed as a Chrome extension. We found the following issues:

  1. The screen reader could not access the menu.
  2. The screen reader jumped around the page content in some non-intuitive ways. For example, on the home page:
    • It moved from the main title to the logo in the upper right corner, then across to the hamburger menu (described only as “internal link”)
    • After the header content, it skipped subheads and body text in favor of button links
    • For blog posts, it read taxonomy tags first, then the article title
  3. Social media links were described generically as “link list” items.

As a Mac-based shop, we also tested with the built-in screen reader, VoiceOver, using the built-in browser, Safari. It provided a much better experience: VoiceOver reached all the page elements and read them in a fairly logical order. Still, the screen reader could not access the menu, a major shortcoming, and all the social media icons in the footer were ignored.

JAWS is still the most commonly used screen reader. We tested version 18 with Internet Explorer 11 in a Windows environment. When directed to read the whole page, JAWS skipped over the top menu bar. It seemed to get stuck on the page heading, reading it several times, but it progressed through the rest of the page smoothly, in the intended order and with surprising speed. In the footer, the social media links were described unhelpfully as “link TEN7 interactive” six times.

The differences among these experiences highlight an important consideration for sites that choose to optimize for certain technologies. In these situations, it may be helpful to publish an accessibility statement on your site explaining which assistive technologies the site was optimized for.

Sometimes aesthetic decisions can make a page difficult to discern for people with low vision who access it without screen readers. We assessed our home page with a contrast checker (Color Contrast Analyzer, a Chrome extension created by the IT Accessibility Office at NC State University) set for Level AA, Medium Bold and Large Non-Bold Text (3:1). It produced a mask showing the areas with sufficient contrast between text and background according to the WCAG 2.0 contrast guidelines (see image below).

Any items not outlined have contrast that is too low for the selected conformance level. The checker revealed that text over images, colored text and reversed text in buttons were all too low in contrast. We then ran the contrast checker on a blog post using the Level AA, Small Non-Bold Text (4.5:1) setting and found that author names, taxonomy labels and link text did not have sufficient contrast.
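
If you want to check a specific color pair yourself, the contrast ratio a checker reports can be computed directly from the formulas in the WCAG 2.0 spec. Here is a minimal sketch in JavaScript, assuming 8-bit sRGB color values; the function names are our own, but the constants come from the spec’s definition of relative luminance:

```javascript
// Relative luminance of an sRGB color, per the WCAG 2.0 definition
function relativeLuminance([r, g, b]) {
  const lin = (channel) => {
    const s = channel / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio between two colors: (lighter + 0.05) / (darker + 0.05)
function contrastRatio(colorA, colorB) {
  const [lighter, darker] = [relativeLuminance(colorA), relativeLuminance(colorB)]
    .sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

// Black on white yields roughly 21:1, the maximum possible ratio;
// Level AA requires at least 4.5:1 for small text and 3:1 for large text
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // "21.0"
```

A quick check like this is handy when a designer proposes a new text color and you want an answer faster than re-running a full page scan.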

For examples of websites before and after accessibility improvements, see the annotated demonstration web pages the W3C has compiled in its Before and After Demonstration.

Prioritize High Traffic Pages

It is helpful to prioritize changes that will improve accessibility on the most commonly viewed pages. Reviewing our site’s Google Analytics data, using a view that excludes internal traffic, is helpful for this. The data shows that these pages are visited most often: Home, Blog, What We Do, Who We Are, Contact Us, Careers and Blog Posts. Clearly, we will want to prioritize changes to these pages.

Consider Your Business Objectives

The audit process outlined above will produce a long list of potential improvements. How you prioritize that list depends on your objectives for improving your site’s accessibility. Are you legally required to achieve a WCAG 2.0 conformance level? If so, the whole WCAG audit list should be tackled. Do you have a specific group of users whose access is a priority? If so, you might prioritize the improvements that will aid that group above the other potential improvements.

For the TEN7 site, our priorities are more general. We don’t have a particular need to achieve WCAG 2.0 conformance, and we’d like to prioritize implementation of best practices on high traffic pages. That leads us to a list of initiatives that looks like this:

General

  • Make sure descriptive alt text exists for all content-related images
  • Make sure all in-line text links use descriptive text
  • Improve descriptions for social media icons in footer
  • Add a sitemap
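
For the social media icons, one common fix is an aria-label on each link, so a screen reader announces something like “TEN7 on Twitter” instead of a generic “link.” A hedged sketch, with hypothetical class names and URL:

```html
<!-- The aria-label gives the link an accessible name; aria-hidden
     keeps the purely decorative icon out of the screen reader's way -->
<a href="https://twitter.com/example" aria-label="TEN7 on Twitter">
  <span class="icon icon-twitter" aria-hidden="true"></span>
</a>
```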

Home

  • Make menu and menu bar screen-reader friendly
  • Make the order of page elements more obvious to screen-readers (heading tags, ARIA labels)
  • Improve contrast of text over images, colored text and reversed text in buttons
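
As a sketch of what the menu and reading-order fixes might look like in markup; the element names and labels here are illustrative, not our production code:

```html
<!-- A labeled landmark gives screen readers a named entry point to the menu,
     and aria-expanded/aria-controls describe the hamburger toggle's state -->
<nav aria-label="Main menu">
  <button aria-expanded="false" aria-controls="menu-items">Menu</button>
  <ul id="menu-items">…</ul>
</nav>

<!-- Heading levels that descend in order convey the page structure,
     so a screen reader user can jump between sections predictably -->
<h1>TEN7</h1>
<h2>What We Do</h2>
<h3>Design</h3>
```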

Blog

  • Make the order of page elements more obvious to screen-readers
  • Improve the contrast of author names, taxonomy labels, and link text

What We Do

  • Make the order of page elements more obvious to screen-readers (heading tags, ARIA labels)
  • Improve contrast of icons, company logos and Your Point Person section
  • Add better alt-text for company logos

Who We Are

  • Improve contrast of Our Team and See Yourself Among Us sections
  • Make the order of page elements more obvious to screen-readers

Contact Us

  • Make sure forms are screen-reader friendly (ARIA form attributes), provide prompts for what to enter in each field, and identify required fields
  • Forms should be usable for keyboard-only users
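
A hedged sketch of what a screen-reader-friendly form field can look like; the field names and hint text are invented for illustration:

```html
<form>
  <!-- An explicit label associates the prompt with the field -->
  <label for="contact-email">Email address (required)</label>
  <input type="email" id="contact-email" name="email"
         required aria-required="true"
         aria-describedby="email-hint">
  <!-- aria-describedby points at extra guidance the reader announces -->
  <p id="email-hint">We will only use this address to reply to your message.</p>
  <button type="submit">Send</button>
</form>
```

Because these are standard elements with explicit labels, the form also works for keyboard-only users: each field is reachable with Tab and the button activates with Enter.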

Careers

  • Improve contrast of small text blocks

Blog Posts

  • Improve the contrast of author names, taxonomy labels, author bio and in-line link text
  • Add descriptive alt-text for images

Conclusion

We hope this post has helped to illustrate the planning process for an accessibility initiative. Our next post in this series will explore the technical side of implementing the priorities identified through our audit process. Stay tuned as we improve the accessibility of our site. If we can help you improve your site’s accessibility, then please reach out!