One of the more significant challenges in accessibility testing with an automated tool is getting a sense of how a page actually performs for users. The truth is, the only truly reliable claim a tool can make is that it found some number of instances where the code failed the tool's specified tests. Even assuming all of those tests are reliable, the end user is still left with little more than a list of issues.

Recently one of our customers asked, "How do we know if we're compliant or not?" Although we've already discussed compliance in a previous post, we recognize that this is something users are often concerned with. Our gut reaction is to say that you're not "compliant" if you have any errors, but that's admittedly not very practical. Determining compliance is a lot trickier than that, and in some cases there may even be exemptions that apply that no tool can be aware of.