The accessibility paradox

The web is growing up, little by little, step by step. It used to be a barren place governed by no one in particular, until a few people reared their heads and started organizing things. Those people were the forefathers of what we know today as "the web standards movement": a group of people defending the ideals of the web, fighting for universal standards and accessible content.

Chances are that you, the reader of my blog, feel like you are a part of this community. Maybe not actively, but I'm sure you're trying your best to make the web a better place. The standards community is without a doubt a Good Thing™ and has an ever-expanding influence on the web.

[image: Escher weirdness]

But as is the fate of many movements that start off small, grow bigger and slowly spin out of control, there are times when it's wise to look back and make sure that the current actions still represent the initial goals. Even though the standards community was built around the ideal of a web that is accessible to everyone, some of its efforts have backfired and actively hurt this very ideal.

Accessibility gone bad

Of course I'm not saying anything new here. We already know that if you over-engineer your site, you might actually be hurting its accessibility instead of improving it. There are two common variations of this problem.

The first variation is a direct result of a lack of knowledge about how people with disabilities interact with the web. Excessively long alt descriptions or filenames on images that require no extra information, heaps of skip links or unnecessary hidden headings can actually slow down browsing for people using screen readers. These efforts are well-intended but often misguided, because the actual benefits are not always fully understood.

The second variation is less obvious, and lies in the misconception that accessibility is targeted solely at people with a disability. Some people put so much effort into accessibility measures that they seem to forget there's an audience that views the website through a regular browser (probably still the largest portion of our target audience). Such sites often lack any decent functional design (e.g. fully liquid layouts) and are often plain eyesores.

But there is more, seemingly hidden from the consciousness of the standards community, as it has received little to no attention. And in a strange way, it even seems to be promoted by the community. Bear with me.

The facts

Some time ago Microsoft dropped a tiny bomb. They announced the inclusion of a new meta tag in IE8. This tag is meant to indicate the browser version for which a certain page is developed. For example, when you define a page as developed for IE7, the browser will render the page according to the IE7 rendering rules, even when using a newer browser version. This measure was proposed to stop pages from breaking whenever a newer version of a specific browser is released.
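For illustration, the snippet below follows Microsoft's published X-UA-Compatible proposal as it stood at the time; the exact `content` values were still subject to change:

```html
<!-- Lock the page to IE7-style rendering, even in IE8 and beyond -->
<meta http-equiv="X-UA-Compatible" content="IE=EmulateIE7" />

<!-- Or always opt in to the most recent rendering engine available -->
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
```

Placed in the document head, the first form freezes rendering at IE7's engine; the second asks the browser to always use its newest engine.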

Although the inclusion of this tag caused quite a stir within the community, there seemed to be one main issue on which everyone agreed: the default handling of the meta tag. When the tag is not included in a page, the browser will render that page according to IE7's standards instead of picking the most current rendering engine available. Only by including the meta tag could newer versions of the rendering engine be used.

Not a good thing according to many, as IE7 is not exactly known for its support of web standards. Even the people backing Microsoft's decision didn't prefer this implementation, but merely empathized with Microsoft's reasons. There was much rejoicing when Microsoft announced that it would revert its decision and implement the meta tag the way the community had proposed. Victory! ... or not?

The dilemma

I've been following this debate from afar, and something struck me when I went over Microsoft's decision. At first I felt quite stunned, like most people. After that I felt quite happy, because Microsoft had shown goodwill towards the standards community. But the more I considered the decision, the more I felt like they had reached out to the community without much critical reflection.

Let's take a closer look at what we have been asking of Microsoft. When they first proposed their meta-switch solution, they put the pressure on the shoulders of us web developers. We had to work the magic to render our pages according to the rules of the latest rendering engine. If not, our pages would still render the way IE7 rendered them, bugs included. Annoying, but not very difficult to overcome.

By reverting that initial decision, Microsoft is now putting the pressure on the people who hardly know what they're doing in the first place: the people who just want their content published on the web. They know little to nothing about meta tags, standards or cross-browser issues. They just know their site as it shows up in their browser, often made with sub-standard authoring tools. They are also the people who write extremely inaccessible code, so I guess that serves them right ...

The least accessible content

It's quite difficult to predict the net effect of this decision on non-professionals. They might benefit from it, as the original implementation would have created an ever-growing gap between the rendering of their pages in IE7 and in modern browsers, making it harder to render pages neatly everywhere. On the other hand, their sites will keep breaking whenever newer versions of a browser are released.

But that is not really the point. What surprises me is that nobody even seems to consider those people; all we seem to think about are web standards. My fear is that these people are less involved with the web and less inclined to look for solutions when they face a problem, so chances are that their disappointment will sap their drive to keep publishing content on the web.

For us web developers the meta tag is nothing more than a little quirk, and those who aren't aware of it have an obligation to look for answers, as that is their job. Web amateurs have no such obligation and should be able to spend their time getting their knowledge out there on the web. They shouldn't be bothered with web standards; that is our job. So it surprises me that we are actively asking Microsoft to bother them with it.

I am not foreseeing a huge drop in published content on the web, but I'm sure this will affect some people. It would be a shame to alienate them, as they often have very valuable and rich information on obscure and niche subjects. This is part of what makes the web great, and somehow we seem to be discouraging it because the way they publish their content isn't accessible to everyone.

The question is: would we rather have 100 sites with totally accessible content, or 200 sites of which half are only accessible to a majority? Because let's be honest: designs might break, sites might be plain ugly and the source code might be a downright mess, but as long as people can get what they're after, most of them won't ever complain.

The future

We can only hope that site-building tools improve quickly, or we'll lose people who have interesting content to offer but lack the skills to offer it in an accessible manner. Personally, I think this is a distorted view of "the web that is accessible to all"; apparently the community has a different opinion.

In this article I have referred to the "community" as the group of web professionals promoting both accessibility and web standards. Web standards are obviously a way to improve accessibility on the web, but at the same time, web standards driven to their extremes can hurt web accessibility just as easily, by alienating people from the web, preventing them from publishing their content and thus from participating in it.

So let us revise what we are striving for. Do we want an accessible web, or do we just want all content on the web to follow the standards? Are we really supporting an ideal or are we just looking for an easier job?