I like the idea of the W3C. They’re the group responsible for managing some of the most widely used standards in computing. Among the hits:
* HTML (and XHTML)
* CSS
* XML
* The DOM
The strange thing is that all of these standards are several years old. On top of that, there is a slew of other W3C standards that have never seen real adoption. Their success rate is dismal. Yet, despite this, the W3C still has the stones to act like they are doing everyone a favor by setting the beat for the semantic web.
RDF has been around for years and years, yet no one really uses it. Hell, I’ve even tried to understand what RDF is all about and see if it could actually be useful, but to no avail. That hasn’t stopped the W3C though. OWL, RDFS, SPARQL, and GRDDL are just a few of the RDF-connected standards that aim to bring semantic information to the web and have failed to take off. While the W3C has been drafting, recommending, and re-versioning its ivory-tower specs, people out there have been busy actually creating the semantic web.
One only has to peruse LinkedIn, Cork’d, Upcoming, and a slew of other sites to see *gasp* semantic information exposed, ready to be consumed. In about two years, the microformats community has done what the W3C has only proposed to do, and it shows no signs of slowing down.
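To see why microformats caught on, here’s a minimal sketch of an hCard, the microformat for contact info (the class names come from the published hCard spec; the person and URL are made up):

```html
<!-- hCard: ordinary HTML plus agreed-upon class names -->
<div class="vcard">
  <a class="fn url" href="http://example.com/">Jane Doe</a>
  <span class="org">Example Corp</span>
  <span class="tel">+1-555-0100</span>
</div>
```

No new syntax, no new parser, no ontology: any page author can publish this today, and any crawler that knows the class names can consume it. That low barrier to entry is exactly what the W3C’s RDF stack lacks.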
In fact, microformats have become so popular that they’ve inspired, SURPRISE, another working draft from the W3C: RDFa. From the RDFa abstract:
“Current web pages, written in HTML, contain significant inherent structured data. When publishers can express this data more completely, and when tools can read it, a new world of user functionality becomes available, letting users transfer structured data between applications and web sites. An event on a web page can be directly imported into a user’s desktop calendar. A license on a document can be detected so that the user is informed of his rights automatically. A photo’s creator, camera setting information, resolution, and topic can be published as easily as the original photo itself, enabling structured search and sharing.”
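For comparison, the event case from that abstract looks something like this in RDFa (a rough sketch based on the working draft’s iCalendar examples; the event details are invented, and the draft’s attribute names have shifted between revisions):

```html
<!-- RDFa: the semantics ride along in attributes, keyed to an external vocabulary -->
<div xmlns:cal="http://www.w3.org/2002/12/cal/ical#" typeof="cal:Vevent">
  <span property="cal:summary">W3C Tech Plenary</span>
  <span property="cal:dtstart" content="2008-02-20">February 20th</span>
</div>
```

Note the difference in spirit: instead of a handful of well-known class names, you get namespace prefixes and a pointer to an RDF vocabulary, with all the machinery that implies.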
If that’s not enough arrogance, think about the whole SOAP & WS-* fiasco. It’s pretty safe to say that SOAP, as a proposed standard for web service communications, has failed. Craig Andera recently wrote about “The Failure of SOAP” and echoed something I have noticed for a while: SOAP is just too complex for almost every scenario except the edge cases requiring extensive security, and WSDL is maddening to deal with. In fact, there’s even a rumor that SOAP et al. were made so complex specifically so that tool vendors could step in and supply easy-to-use tools to encapsulate that complexity. I know, it’s a big stretch, seeing as the W3C is made up of a lot of tool vendors…
Oh, let’s not even start to dive into the plethora of WS-* “standards” that no one actually uses.
What does this have to do with semantics? Well, let no one say that the W3C is not persistent. They’ve just introduced a new proposed recommendation: SAWSDL.
From the abstract:
“This document defines a set of extension attributes for the Web Services Description Language and XML Schema definition language that allows description of additional semantics of WSDL components. The specification defines how semantic annotation is accomplished using references to semantic models, e.g. ontologies. Semantic Annotations for WSDL and XML Schema (SAWSDL) does not specify a language for representing the semantic models. Instead it provides mechanisms by which concepts from the semantic models, typically defined outside the WSDL document, can be referenced from within WSDL and XML Schema components using annotations.”
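In practice, that means sprinkling annotation attributes onto your existing WSDL and XML Schema. A minimal sketch (the `modelReference` attribute and namespace come from the SAWSDL spec; the ontology URI is made up):

```xml
<!-- SAWSDL: a schema element annotated with a pointer to an external ontology concept -->
<xs:element name="OrderRequest"
    xmlns:xs="http://www.w3.org/2001/XMLSchema"
    xmlns:sawsdl="http://www.w3.org/ns/sawsdl"
    sawsdl:modelReference="http://example.org/ontology#Order"/>
```

So the fix for WSDL being maddening is, apparently, WSDL plus annotations pointing at ontologies defined somewhere else entirely.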
I’m quietly shaking my head.