Thursday, 26 June 2008

W3C: 29 out of 30 Sites Fail

W3C makes the following statement:

“W3C primarily pursues its mission through the creation of Web standards and guidelines. Since 1994, W3C has published more than 110 such standards, called W3C Recommendations. W3C also engages in education and outreach, develops software, and serves as an open forum for discussion about the Web. In order for the Web to reach its full potential, the most fundamental Web technologies must be compatible with one another and allow any hardware and software used to access the Web to work together. W3C refers to this goal as “Web interoperability.” By publishing open (non-proprietary) standards for Web languages and protocols, W3C seeks to avoid market fragmentation and thus Web fragmentation.”


So why are so many sites not following these standards? In a recent piece of work, we looked at 30 random websites and checked them for W3C compliance. We found that only 1 of the 30 passed the check.
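
For anyone who wants to repeat the exercise, here is a minimal sketch of how such a batch check might be scripted today against the W3C's validation service, using the Nu Html Checker's documented JSON interface. The endpoint and the example site list are assumptions for illustration, not the exact tooling used for our review.

```python
# Minimal sketch: batch-check a list of URLs against the W3C Nu Html Checker
# (https://validator.w3.org/nu/) and report each site's error count.
# The site list below is a placeholder, not the 30 sites from our review.
import json
import time
import urllib.parse
import urllib.request

CHECKER = "https://validator.w3.org/nu/"

def error_count(url):
    """Ask the checker to validate `url` and return the number of errors."""
    query = urllib.parse.urlencode({"doc": url, "out": "json"})
    request = urllib.request.Request(
        CHECKER + "?" + query,
        headers={"User-Agent": "w3c-batch-check-sketch"},
    )
    with urllib.request.urlopen(request) as response:
        messages = json.load(response).get("messages", [])
    return sum(1 for m in messages if m.get("type") == "error")

if __name__ == "__main__":
    sites = ["https://example.com/"]  # replace with the sites under review
    for site in sites:
        errors = error_count(site)
        verdict = "PASS" if errors == 0 else f"FAIL ({errors} errors)"
        print(f"{site}: {verdict}")
        time.sleep(2)  # be polite to the shared validator service
```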

What was perhaps more surprising was that around one third of the sites reviewed had an error count of less than 30. Would it not be reasonable to expect that an organisation so close to complying with the foremost standards on the web would go the extra mile and achieve compliance? I must therefore assume that these organisations are unaware that they are nearly W3C compliant and are developing their websites in ignorance, relying solely on their development agencies to abide by their own standards, some of which may happen to coincide with the W3C's.

So what problems are organisations that are not W3C compliant going to face? Foremost, transferring development from one agency to another becomes more complex. Rather than being able to hand over code that is well written and widely understood, the site owner may become tied to a particular development agency, because they are the only ones who understand the code. Rebuilding a website from scratch is often far too expensive to consider, resulting in a reliance on one supplier.

Secondly, these standards, once adopted, facilitate more effective and efficient crawling by web robots, which gather the information that search engines use. Poorly written code translates into hard work for robots and poor understanding by search engines, lowering the chances of the site being found and reducing traffic to it. Reduced traffic means reduced sales.

Thirdly, for disabled users, the assistive tools they rely on to surf the web are hindered by poor code, making the site harder to use. We have already discussed elsewhere on the blog the importance of making web usage easy, yet here we find another source of poor user experience. Some organisations have even had lawsuits filed against them for failing to meet their obligations to disabled users.
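
As a concrete illustration of how validity and accessibility overlap, the sketch below flags one common error, images without alt text, which both fails W3C validation and leaves screen-reader users with nothing to announce. The sample markup is invented for illustration.

```python
# Minimal sketch: flag <img> tags that lack an alt attribute, using only
# the standard library. The HTML sample here is invented for illustration.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collect the src of every <img> tag that has no alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and "alt" not in attributes:
            self.missing.append(attributes.get("src", "<no src>"))

page = '<p><img src="logo.png"><img src="chart.png" alt="Sales chart"></p>'
checker = MissingAltChecker()
checker.feed(page)
print("Images missing alt text:", checker.missing)  # ['logo.png']
```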

Lastly, the site is less likely to work well on other platforms used to surf the web, such as mobile phones and handheld devices. This again restricts the use of the site, barring individuals who do not come online through the usual means.

My last point on the W3C for the moment, so as not to be taken for a complete hypocrite: when I checked a couple of blogs, including my own, the error counts were in the hundreds. I am not yet certain whether this can be remedied, but I can assure you that we will be looking into it and will let you know the results.

In summary, get your site W3C checked. Know that you are giving all users the best possible chance to use your site, have a good experience and possibly even generate a sale or two.
