Collecting Friends

With the recent explosion of social networking sites, something has been puzzling me. First, consider this excerpt from Robert Scoble’s blog:

“Twitter is still ahead, but growing far slower. Just today I added another 120 friends to Pownce for a total of 839. More than 4,400 on Twitter and more than 1,200 on Facebook.”

My question is this: Is there any value actually garnered from adding an obscenely large number of random people as your friends on various social sites?

Honestly, if a social networking site is meant to enhance your life through the discovery of new interests, music, recommendations, and so on, is that easily done by wading through thousands of people?

At what point does the quality of a social connection come into play? And if you have thousands of social connections, how can you possibly assess the quality of those connections?

Arrogance in the W3C Thought Process

I like the idea of the W3C. They’re the group responsible for managing some of the most widely used standards in computing. Among the hits:

* HTML (and XHTML)
* XML
* CSS
* SVG
* XSL

The strange thing is that all of these standards are several years old. Plus, there’s a slew of other standards that have never been used or adopted. Their success rate for standards is dismal. Yet, despite this, the W3C still has the stones to act like they’re doing everyone a favor by setting the beat for the semantic web.

Exhibit A:

RDF has been around for years and years, yet no one really uses it. Hell, I’ve even tried to understand what RDF is all about and see if it could actually be useful, but to no avail. That hasn’t stopped the W3C, though. OWL, RDFS, SPARQL, and GRDDL are just a few of the RDF-connected standards that aim to bring semantic information to the web and have failed to take off in the semantic space. While the W3C has been drafting, recommending, and re-versioning its ivory-tower specs, people out there have been busy actually creating the semantic web.

One only has to peruse LinkedIn, Cork’d, Upcoming, and a slew of other sites to see *gasp* semantic information exposed, ready to be consumed. Using microformats, the semantic community has accomplished in about two years what the W3C has only proposed to do. Plus, it shows no signs of slowing down.
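To make that concrete, here’s a minimal sketch of an hCard microformat (the name, company, and URL are made up for illustration). It’s just plain HTML with agreed-upon class names, and any hCard-aware parser can lift the contact information right out of the page:

    <div class="vcard">
      <span class="fn">Jane Doe</span>,
      <span class="org">Example Corp</span>
      <a class="url" href="http://example.com/jane">example.com/jane</a>
    </div>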

In fact, microformats have become so popular that they’ve inspired, SURPRISE, another working draft from the W3C: RDFa. From the RDFa abstract:

“Current web pages, written in HTML, contain significant inherent structured data. When publishers can express this data more completely, and when tools can read it, a new world of user functionality becomes available, letting users transfer structured data between applications and web sites. An event on a web page can be directly imported into a user’s desktop calendar. A license on a document can be detected so that the user is informed of his rights automatically. A photo’s creator, camera setting information, resolution, and topic can be published as easily as the original photo itself, enabling structured search and sharing.”

That sounds an awful lot like the purpose of microformats. In fact, take a look at the RDFa examples. You’ll notice they look a lot like microformats.
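For comparison, here’s roughly how the same contact information would be marked up under the RDFa draft, sketched with the FOAF vocabulary (again, the name and URL are hypothetical):

    <div xmlns:foaf="http://xmlns.com/foaf/0.1/" about="#jane">
      <span property="foaf:name">Jane Doe</span>
      <a rel="foaf:homepage" href="http://example.com/jane">example.com/jane</a>
    </div>

Same idea, same approach of layering attributes onto existing HTML, just blessed with a different set of namespaces.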

Exhibit B:

If that’s not enough arrogance, think about the whole SOAP & WS-* fiasco. It’s pretty safe to say that SOAP, as a proposed standard for web service communications, has failed. Craig Andera recently wrote about “The Failure of SOAP” and echoed something that I have noticed for a while: SOAP is just too complex for almost every scenario except the edge cases requiring extensive security. WSDL is maddening to deal with. In fact, there’s even a rumor that SOAP et al. were made so complex specifically so that tool vendors could step in and supply easy-to-use tools to encapsulate that complexity. I know, it’s a big stretch of the imagination, seeing as how the W3C is made up of a lot of tool vendors…
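To illustrate the complexity gap, here’s a sketch of what a trivial request looks like as a SOAP 1.1 call versus plain HTTP (the service, operation, and URLs are all made up):

    POST /stockService HTTP/1.1
    Content-Type: text/xml; charset=utf-8
    SOAPAction: "http://example.com/stock/GetPrice"

    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <GetPrice xmlns="http://example.com/stock">
          <Symbol>AAPL</Symbol>
        </GetPrice>
      </soap:Body>
    </soap:Envelope>

The plain-HTTP equivalent is a single line:

    GET /stock/AAPL/price HTTP/1.1

And that’s before WSDL and the rest of the alphabet soup enter the picture.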

Oh, and let’s not even dive into the plethora of WS-* “standards” that no one actually uses.

What does this have to do with semantics? Well, let no one say that the W3C is not persistent. They’ve just introduced a new proposed recommendation:

Semantic Annotations for WSDL and XML Schema

From the abstract:

“This document defines a set of extension attributes for the Web Services Description Language and XML Schema definition language that allows description of additional semantics of WSDL components. The specification defines how semantic annotation is accomplished using references to semantic models, e.g. ontologies. Semantic Annotations for WSDL and XML Schema (SAWSDL) does not specify a language for representing the semantic models. Instead it provides mechanisms by which concepts from the semantic models, typically defined outside the WSDL document, can be referenced from within WSDL and XML Schema components using annotations.”
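The mechanism boils down to an extension attribute. Here’s a rough sketch of an XML Schema element annotated with the spec’s modelReference attribute (the element name and ontology URI are hypothetical):

    <xs:element name="OrderRequest"
        xmlns:xs="http://www.w3.org/2001/XMLSchema"
        xmlns:sawsdl="http://www.w3.org/ns/sawsdl"
        sawsdl:modelReference="http://example.org/purchasing#PurchaseOrder"/>

In other words, it’s a pointer from a WSDL or XML Schema component to a concept in some external semantic model, the same sort of RDF-style model that, as noted above, almost no one actually publishes.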

I’m quietly shaking my head.

Parallels 3.0 Thoughts

When Parallels announced version 3.0 of their Desktop for Mac product, I was eager to hear what new features would be included. When I got an email offering me an additional discount to upgrade before June 6th ($39.99 before June 6th vs. $49.99 after), I was torn. On the one hand, I am married to running Windows on my MacBook Pro. I develop day in and day out in .NET, so I need to run Visual Studio 2005. On the other hand, I haven’t been impressed with the more recent builds, and with VMware Fusion development proceeding at a rapid pace, I wasn’t sure I wanted to dump another chunk of money into Parallels.

In the end, I decided to give 3.0 a shot and see if Parallels can blow me away with their new features.

After running 3.0 for almost a month, my verdict is decidedly “meh”.

I don’t play 3D games, so that support didn’t matter to me at all. I could only hope that those enhancements would also fix the excessive graphics driver flushes that I talked about before. It doesn’t seem like they did.

The new SmartSelect functionality also didn’t appeal to me, because I already did everything in OS X and had no desire to use more Windows functionality.

I’m only interested in one thing: speed. Unfortunately, Parallels is quickly losing me on that front by adding in all sorts of features that I won’t use.

In fact, overall, 3.0 seems slower and buggier than the previous 2.5 builds. I’m not the only one who thinks so. A cursory glance at the Parallels forums reveals a lot of people with similar complaints.

I have to say, without a significantly upgraded build, I’m not going to pay for another Parallels release. In fact, as soon as VMware Fusion goes gold, I’m going to give it a serious look.

Get your act together, Parallels.