Automating the work of complying with those guidelines could make the web more welcoming. But more than 600 accessibility experts have put their names to a document asking website operators not to use such automation tools, including AccessiBe. Signers include contributors to W3C guidelines and employees of Microsoft, Apple, and Google. “Automated detection and repair of accessibility problems is not reliable enough to bring a site into compliance,” the document says, accusing some vendors of “deceptive marketing.”
The document was started by Karl Groves, founder of accessibility consultancy Tenon.io, who provided a withering 35-page analysis of AccessiBe’s software for Murphy’s lawsuit against Eyebobs. Groves says he surveyed about 1,000 pages from 50 websites using the startup’s technology and found a median of 2,300 violations of W3C guidelines per site. He calls that a significant undercount, because most of the guidelines can be checked only by expert, manual analysis. “Artificial intelligence doesn’t work like that yet,” he says.
In his report on AccessiBe, Groves cited an image of a model wearing a white dress for sale on an ecommerce site. The alternative text provided, apparently generated by AccessiBe’s technology, was “Grass nature and summer.” In other cases, he reported, AccessiBe failed to properly add labels to forms and buttons.
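The gap Groves describes is easy to see in a toy example. Automated tooling can reliably flag an image whose alt attribute is missing or empty, but it cannot judge whether alt text that *is* present (such as “Grass nature and summer” on a photo of a dress) actually describes the image. A minimal sketch in Python, using only the standard library (hypothetical illustration, not AccessiBe’s or Tenon’s actual tooling):

```python
from html.parser import HTMLParser


class AltTextChecker(HTMLParser):
    """Flag <img> tags with a missing or empty alt attribute.

    This is the kind of check automation handles well. Judging whether
    existing alt text is *accurate* still requires human review.
    """

    def __init__(self):
        super().__init__()
        self.violations = []  # src values of images lacking usable alt text

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            alt = attr_map.get("alt")
            if alt is None or not alt.strip():
                self.violations.append(attr_map.get("src", "<unknown>"))


checker = AltTextChecker()
checker.feed('<img src="dress.jpg"><img src="logo.png" alt="Eyebobs logo">')
print(checker.violations)  # ['dress.jpg']
```

Note that the second image passes this check even though a machine-generated caption like “Grass nature and summer” would too; the checker verifies presence, not meaning.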
On the homepage of its website, AccessiBe promises “automated web accessibility.” But support documents warn customers that its machine learning technology may not accurately interpret webpage features if it “hasn’t encountered these elements enough before.”
AccessiBe’s community relations manager, Joshua Basile, a quadriplegic paralyzed below his shoulders, says that since he joined the company early this year it has engaged more with disability advocacy groups and clarified that it offers “manual remediation” alongside automatic fixes. “It’s an evolving technology and we’re getting better and better,” he says.
In a statement, AccessiBe’s head of marketing, Gil Magen, said the company had analyzed Eyebobs’ website and found it complied with accessibility standards. AccessiBe offers clients assistance with litigation but Eyebobs declined, the statement said.
In its own statement, Eyebobs said it “is no longer working with AccessiBe nor will we in the future.”
Although the Eyebobs settlement, to be finalized next year, doesn’t include an admission that its site had problems, it requires the company to pay for an external expert audit and to dedicate one or more staff members to accessibility work. “Eyebobs is committed to ADA compliance and supporting all visitors who come to our website,” director of marketing Megan McMoInau says.
Haben Girma, a deafblind disability rights lawyer, says she hopes the Eyebobs suit will discourage companies from using AccessiBe or similar tools. She believes tech companies or regulators like the US Federal Trade Commission should take action against inaccurate marketing of accessibility tools. “Governments, Google, and social media companies can stop the spread of misinformation,” she says.
Experts critical of automated accessibility tools don’t generally argue that the technology is wholly worthless. Rather, they say that placing too much trust in the software risks causing harm.
A 2018 paper by employees of W3C praised the potential of using AI to help people with poor vision or other needs but also warned of its limitations. It pointed to a Facebook project using machine learning to generate text descriptions for images posted by users as an example. The system won an award from the American Foundation for the Blind in 2017. But its descriptions can be hard to interpret. Sassy Outwater-Wright, director of the Massachusetts Association for the Blind and Visually Impaired, noticed that the system sometimes displayed a preoccupation with body parts—“two people standing, beard, feet, outdoor, water”—that she dubbed the “beard quandary.”