An agent has nothing to process if there's no text on the page.
There are few things less accessible than web applications that depend on bleeding-edge JavaScript which might not get executed successfully. There are few things more accessible than text in an HTML web page on an HTML web site. If we really cared about accessibility for the differently abled, we wouldn't be making such locked-down, proprietary websites. But of course it's not really "we" doing this of our own free will; it's corporations and their profit motives telling their human parts to make bad websites. And since that's where the money is in web dev, it's become the default for all web devs and even shows up in institutional and government websites.
I don't think corps are going to change, and I don't think the answer is something other than websites, as the author suggests. We just need actual websites, with web pages, with text.
pverheggen · 7h ago
> There are few things less accessible than web applications that depend on bleeding-edge JavaScript which might not get executed successfully. There are few things more accessible than text in an HTML web page on an HTML web site.
It's not that simple: you can create highly conformant SPAs if you want, there are just more issues you have to watch out for. And even the best semantic HTML and ARIA will need JavaScript for things like content replacement - for example, there are valid accessibility reasons why no one uses form submission for table filtering and sorting anymore. JS is also required for implementing keyboard interactions for WAI-ARIA patterns, which is not something you get out of the box with plain HTML.
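To make that concrete, here is a minimal sketch (just an illustration, not code from the APG) of the kind of keyboard wiring a WAI-ARIA pattern needs: a roving tabindex for the toolbar pattern, assuming a hypothetical <div role="toolbar"> wrapper containing plain <button> children. None of this comes for free with static HTML.

    // Roving tabindex for the WAI-ARIA toolbar pattern: arrow keys move focus
    // between buttons, and only one button is in the tab order at a time.
    // Assumes hypothetical markup: <div role="toolbar"> containing <button> children.
    function initToolbar(toolbar: HTMLElement): void {
      const buttons = Array.from(toolbar.querySelectorAll<HTMLButtonElement>('button'));
      buttons.forEach((b, i) => (b.tabIndex = i === 0 ? 0 : -1));

      toolbar.addEventListener('keydown', (e: KeyboardEvent) => {
        if (e.key !== 'ArrowRight' && e.key !== 'ArrowLeft') return;
        const current = buttons.indexOf(document.activeElement as HTMLButtonElement);
        if (current === -1) return;
        const next = (current + (e.key === 'ArrowRight' ? 1 : -1) + buttons.length) % buttons.length;
        buttons[current].tabIndex = -1; // previous button leaves the tab order
        buttons[next].tabIndex = 0;     // next button becomes the tab stop
        buttons[next].focus();
        e.preventDefault();
      });
    }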
Ultimately, the determining factor for a11y is not your choice of front-end technology, but rather whether you're willing to make it a priority. Janky SPA sites have not made it one, and I don't think an MPA is going to fix that. Raising the bar on quality is really what's needed.
electroly · 4h ago
> there are valid accessibility reasons why no one uses form submission for table filtering and sorting anymore
What are they? I was only aware of this being a performance and general polish issue--what's the accessibility tie-in? I still use form submission for this.
avtar · 3h ago
I guess it depends on how much care you're taking with maintaining the client's position and focus post-submission/page load, and then also announcing what data changed on the page. And if someone with cognitive disabilities has to reorient themselves on the now-updated page, that's probably not going to result in a great experience.
Example table for comparison: https://www.w3.org/WAI/ARIA/apg/patterns/table/examples/sort...
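As a sketch of what that looks like in code - not tied to any particular framework, and the element id here is hypothetical - the usual approach after a client-side sort or filter is to keep focus on the control the user activated and announce the new state through a live region:

    // After re-rendering rows client-side, keep the user's focus where it was
    // and announce what changed via an aria-live region, so screen reader users
    // don't have to reorient themselves in the updated page.
    // Assumes a hypothetical <div id="table-status" aria-live="polite"> in the page.
    function applySort(header: HTMLButtonElement, column: string, rowCount: number): void {
      // ...sort the data and re-render the table body here, then:
      header.setAttribute('aria-sort', 'ascending'); // reflect the new sort state
      header.focus();                                // focus stays on the activated control
      const status = document.getElementById('table-status');
      if (status) {
        status.textContent = `${rowCount} rows, sorted by ${column} ascending`;
      }
    }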
> An agent has nothing to process if there's no text on the page
All of the major labs' flagship models are multimodal, and the post is specifically about processing the non-text parts of the page.
superkuh · 8h ago
A good point; I was unclear. I meant it in the sense that if the JS doesn't run fully and successfully, there won't be text, or images, or even source markup - just a list of remote JS CDN references in the source, if anything. The page will mostly just be blank, with nothing for the multimodal models to do either.
Etheryte · 5h ago
Why would the JavaScript not run? Crawlers have been executing scripts for more than a decade now.
superkuh · 5h ago
Because what JavaScript is changes rapidly and they always have to change to keep up, and we all agree that that amount of work is difficult. So parts of JS fail and don't run as expected over time, even with constant maintenance.
Etheryte · 3h ago
This argument makes no sense; in that case the page would be broken for regular users all the same. Scrapers don't need to use some inferior browser and then pick up the pieces - they can literally use the same browser all the regular users do.
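For illustration, a minimal sketch of that approach (assuming Playwright; the URL is a placeholder): a scraper drives the same Chromium engine regular users run and reads the rendered text.

    // Render a page with a real browser engine and extract the visible text,
    // the same way a regular user's browser would see it.
    import { chromium } from 'playwright';

    async function fetchRenderedText(url: string): Promise<string> {
      const browser = await chromium.launch();
      const page = await browser.newPage();
      await page.goto(url, { waitUntil: 'networkidle' }); // let the SPA finish rendering
      const text = await page.innerText('body');
      await browser.close();
      return text;
    }

    fetchRenderedText('https://example.com').then(console.log);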