Accessibility auditing is so funny to propose to academics in HCI.
Engineering researchers hate it because it makes evaluation really slow.
Behavioral researchers hate it because it isn't fully empirical.
Design researchers hate it because it tells them what is "standard."
This core activity, the main methodology across every major industry for capturing access barriers in technology, is disliked by all three of the main camps of researchers in human-computer interaction.
The academic research vs industry divide here is immense.
Heck, even industry research relies heavily on a close relationship to auditing, QA, standards, etc.
Academic HCI researchers are broadly opposed to these sorts of things.
But auditing can teach each of these 3 camps something:
- Some evaluation is qualitatively better when it isn't automated.
- Some knowledge is valuable and usable even if it isn't empirical.
- Some "standards" are more about interrogating your materials than reaching an outcome.
I genuinely believe that accessibility auditing is a para-methodology that all HCI researchers (engineering, behavioral, and design) should at least try once.
It works beautifully alongside all of their own existing methods (hence "para"-methodology).
Auditing is a powerful tool for a creator because of how close and careful it requires the auditor to be to a design. Sure, some of it can be automated. But most of it cannot.
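To make the automation claim concrete, here is a minimal sketch (using only Python's standard library; the class name and messages are hypothetical, not from any real tool) of the kind of check that *can* be automated, flagging images with missing or empty alt attributes. What it cannot do is judge whether a description is meaningful; that still takes a close human reading.

```python
from html.parser import HTMLParser

class AltTextAudit(HTMLParser):
    """Hypothetical sketch of the automatable slice of an audit:
    flag <img> elements whose alt attribute is missing or empty."""

    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if alt is None:
                self.findings.append("img missing alt attribute")
            elif not alt.strip():
                self.findings.append("img has empty alt attribute")

snippet = '<p>Revenue chart:</p><img src="q3.png"><img src="logo.png" alt="">'
audit = AltTextAudit()
audit.feed(snippet)
print(audit.findings)
# A machine catches these omissions; only a human can tell whether a
# description that *is* present actually conveys the chart's message.
```

This is exactly why the manual, close-reading part of auditing cannot be replaced: the heuristic is mechanical, but the judgment is not.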
Close interaction with design materials using a set of low-level (but open) heuristics is important.
In a similar vein to the methods divide of close/distant reading in the humanities, auditing is a practice of closeness.
The knowledge produced here is deeply valuable and more importantly: something that the creator can use to make something better. It is usable!
Is a set of accessibility heuristics or criteria (like WCAG or Chartability) the ultimate set of perfect knowledge? No!
It is obviously beneficial for catching access barriers. But it is also useful for generating a common starting point for accessible design practice.
Anyway, this thread is just me lamenting how narrowly academics produce and criticize knowledge and how auditing actually adds an immensely important dimension of consideration that they mostly neglect.
Even some accessibility research projects fail to follow standards like WCAG.
Getting on the same starting page is good, especially if everyone acknowledges that the starting page is not the ending page and much more work needs to be done.
• • •
I'd like to think I was annoying enough to make this happen, but I know that they have been really working to get this in place for a while.
I hope that the right people see this post and apply!
Special thanks to Stuart and Abby and the folks I know working on this behind the scenes (I won't tag them in case they get an influx of questions heh).
This is a really cool chance for someone to influence a whole industry of practitioners! And make the news better for PWD!
"What are examples of accessible, effective chart descriptions?"
I get asked this question a LOT, so I am going to make a thread here to easily link to later.
(This thread covers both simple and complex use-cases for human-authored descriptions.)
Feel free to share/bookmark!
First, I use "descriptions" rather than "alt text" because alt text:
- is a specific technology
- limits the potential to layer/chunk complex information
- is non-visual (only available via screen reader).
Tables, titles, and data-interfaces don't always use "alt" and are good for everyone!
The first set of examples is Benetech's DIAGRAM Center guidelines. These guidelines are research backed, super comprehensive, and easy to apply!
There are nearly 20 examples of different kinds of charts/diagrams.
This is the exact sort of language a good design system uses when talking about accessibility. It's about providing a base tool to work with; it provides no guarantees.
Turns out my old team is HIRING!! Literally the best job ever.
Your coworkers are the coolest people ever, your boss genuinely cares about the important stuff, the upper management is supportive enough to let us open source!
I would love to know what other folks think about this!
I'd argue that science isn't really about being the "best" either.
If anyone gets close to whatever "best" is, it's probably whoever synthesizes standards and research in collaboration with people with disabilities to make something specific that is usable and accessible.
Is there a place where I can look up each WCAG criterion and any research studies that influenced its creation?
There are plenty of studies *on* WCAG. But what research studies (if any) were used to *inform* WCAG?
One of the tricky things about telling researchers they should make their work accessible to the same standards adopted by international law is that they will sometimes ask whether those standards were research-backed.
This is especially important when writing papers that mention accessibility standards: I want to make assertions, and the way we build trust in claims is through citations to research.
Citations to WCAG are sometimes met with "this isn't empirical."
I am tempted to just say I work with data visualizations and leave out the accessibility part. Mentioning it makes a lot of folks think accessibility is some extra topic or feature, like Voronoi diagrams or storytelling.
But everyone should be doing accessibility work. It is part of visualization.
It's part of the job! Everyone should make their charts, graphs, figures, interactives, and interfaces accessible.
Scientists, designers, engineers, analysts: this should be on all your minds!
The bad news: you can only make something as accessible as your tools and environments allow.
So a lot of the responsibility falls to tools like Tableau (which rely heavily on community-provided accessibility) or journals like @ieeevis/@ieee_tvcg that still use inaccessible PDFs.