Sudo Geek Humor

I started laughing hysterically when I saw this:

And then a little part of me died when I realized that I was laughing at such geek humor…

However, since I am typing this at 8:00 pm, in the middle of a 15+ hour coding session, perhaps I need to remind myself that I am such a geek.


Visual Studio.NET development on OS X w/ Parallels

EDITED x3: I’ve completely moved over to VMware Fusion and have dropped Parallels like a rock. Check out my posts here and here.

EDITED x2: I have removed a comment pretending to be me.

EDITED: MSDN behind-the-scenes post about the death of VirtualPC

At the beginning of June, I decided to treat myself to a new 15-inch MacBook Pro. I have been yearning for a new Mac ever since selling my old 12-inch G4 Powerbook. I now work from home and no longer have a company-provided laptop, so instead of upgrading an old desktop, I decided to splurge. The one hitch in my plan was the fact that I still did a fair amount of .NET development. Sure, there is Mono for OSX, but that’s a far cry from Visual Studio.NET on Windows XP. Clearly, I would need a solution that allows me to run Windows XP on my MacBook Pro.

Previously, I had been able to do some .NET development on OSX using Microsoft Virtual PC. In light of Apple’s transition to the Intel platform, Microsoft decided to discontinue VirtualPC for the Mac. VirtualPC is an emulation-based virtual machine solution, and Microsoft felt it would be too difficult to move it to the Intel platform because of how closely it was tied to the x86-to-PowerPC translation engine that previous versions relied on. Also, I am sure that it is not a coincidence that this announcement from Microsoft came a day after VMware announced a beta of their product that ran on the new Intel Macs.

However, before there was a VMware beta, a company named Parallels came out with their product, Parallels Desktop, also for the new Intel Macs. Both Parallels and the new VMware beta offer greater speeds than Virtual PC because both are virtualization solutions, as opposed to Virtual PC’s emulation-based approach. Since Parallels was out first, that’s what I am currently using.

Before we delve into Parallels, I wanted to mention an alternative for running another operating system on your MacBook: Apple’s Boot Camp. However, I have chosen to dismiss this as a viable option because it requires you to actually boot into the other operating system and run it natively. Since the main reason I bought a MacBook Pro was to run OSX again, this isn’t really an attractive solution.

My MacBook is the lower-end 15-inch model; the only significant differences between the two 15-inch MacBooks are increased storage, a minor speed bump, and double the graphics memory. Since I am not into playing games, the graphics memory really didn’t matter to me. I also upgraded to 2GB of RAM, since I knew I would be using Parallels a lot, and if there’s one truth about ALL virtual machines, it’s that they LOVE memory.

After 2+ months of working with Parallels every day, I can honestly say that it’s a pleasant working environment. Everything that I can do in OSX, I do. Everything else, I run in a Windows XP instance within Parallels. This includes:

* Visual Studio .NET 2005 & 2003
* SQL Server Enterprise
* ASP.NET & IIS
* Adept SQL Diff
* RedGate ANTS Profiler
* Firefox
* Other misc .NET utilities

That’s not a minor list of apps. SQL Server runs two databases, one of which is several hundred MB in size. With the exception of some stability problems that Visual Studio.NET 2005 has with Visual Basic projects, I haven’t experienced any crashes or lockups, nor have I been unhappy with the speed of Parallels.

However, that’s not to say that there can’t be improvements. Already, some people have released hacks and “tweaks” for Parallels. One app in particular, PDTweaker, makes two minor changes to Parallels concerning how the virtual machine is cached and how often data is flushed to your OSX file system. At the time of this writing, Parallels has released a new beta that comes with several speed and stability improvements as well as increased device support. I am using the beta right now, and I have seen an increase in speed, but I can’t comment on the increased device support. The beta also includes the caching change from PDTweaker; however, it is unknown whether it also includes the other PDTweaker change.

One area that I would like to see the folks at Parallels improve is graphics speed, as detailed on the PDTweaker page. Apparently, the current implementation of the graphics driver flushes WAY more often than it needs to.

Quote:

“With Quartz Debug still active, now open Notepad (or, if you have it, Microsoft Visual Studio). Start typing. You’ll see the entire window flash for every single character! Let’s do the math here to show how bad this is. Rather than flushing roughly 16×16 = 256 pixels to draw a character, on my MacBook it’s actually flushing as many as 1280×800 = 1,024,000 pixels instead! That’s a whopping 4000 times as much data being moved around than is actually necessary.”


You can see how something like this could be a problem for someone who types in a code editor most of the day.
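
To spell out the arithmetic from that quote, here’s a quick sketch ( the 16×16 glyph size is the quote’s rough estimate, and 1280×800 is the MacBook Pro’s native resolution ):

```python
# Back-of-the-envelope math from the PDTweaker quote above: pixels flushed
# per keystroke by the current graphics driver vs. pixels actually changed.

glyph_pixels = 16 * 16        # ~256 pixels change when one character is drawn
screen_pixels = 1280 * 800    # 1,024,000 pixels flushed when the whole window repaints

print(screen_pixels / glyph_pixels)  # 4000.0 -- about 4000x more data than necessary
```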

Before wrapping this up, I wanted to spend some time on one of the trickiest areas of Parallels use: memory allocation. More specifically, how much memory to allocate to Parallels Desktop itself, as well as to each individual virtual machine. Memory allocation is somewhat of a catch-22. On the one hand, you want your virtual machine to run as fast as possible. On the other, you don’t want to hurt the performance of the native OS, which, BTW, is what’s running Parallels Desktop.

So what’s the answer? Unfortunately, it depends on what you do. Anywhere from 256MB to 512MB should be enough for basic tasks. However, since this post is about Visual Studio.NET development, I’ll talk about my setup and my experiences in that light. At first it seems logical to split my computer’s memory right down the middle: 1GB for Parallels and 1GB for OSX. I have found this to be inefficient. Everything non-Visual Studio.NET related I run in OSX, and some of those applications can be memory hogs too. For example, during my typical work day I have the following OSX applications open:

* Entourage ( Mac Outlook )
* Camino Browser
* Colloquy ( IRC client )
* Adium ( Multi-client IM client )
* NetNewsWire ( RSS Aggregator w/ 250+ feeds )
* iTunes ( duh… )
* One or more Microsoft Office applications ( Word, PowerPoint or Excel )
* Several dashboard widgets

Now, one can see how running all those apps while also dedicating 1GB of memory to Parallels Desktop can slow everything down to a crawl.

Next I tried 512MB for Parallels. That gave me satisfying performance in OSX, yet left Parallels very sluggish and frustrating. I found the sweet spot to be 800MB for Parallels Desktop, after initially upping it to 768MB. Dedicating 800MB to Parallels left me with enough memory in OSX for my day-to-day use, but also gave me a pretty snappy .NET development environment in Parallels.
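
For what it’s worth, the trade-off is just arithmetic. Here’s a rough sketch of how I think about the budget ( the OSX application footprint below is my own ballpark guess, not a measured number ):

```python
# Rough memory budget for a 2GB MacBook Pro. The OSX footprint is a guessed
# ballpark for the application list above, not a measurement.

total_mb = 2048
osx_apps_mb = 1100   # guess: Entourage, Camino, Adium, NetNewsWire, iTunes, Office...

for vm_mb in (512, 768, 800, 1024):
    headroom_mb = total_mb - osx_apps_mb - vm_mb
    print(f"{vm_mb}MB for Parallels leaves ~{headroom_mb}MB of OSX headroom")
```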

I toyed around with the idea of removing the paging file from my Windows XP instance, the logic being that I’d rather have Parallels swapping to my OSX virtual memory file than have the virtual machine swapping to its own paging file. However, my results weren’t stellar: Windows XP ran out of memory as soon as I loaded up Visual Studio.NET. I am sure some of that has to do with the VB.NET issues mentioned above ( the VS.NET instance occupies 350MB+ of memory ). So for the time being, I am still using a paging file in Windows XP.

All in all, I think that Parallels is a very viable option for doing .NET development on OSX and I am very happy with my new development environment.

'Where I Work' photos finally up with the help of .Mac Photocasting

Since I’m a virtual worker, I work from my home most of the time. However, that can get very boring, so I grab my laptop and head out to somewhere with free wi-fi. A while ago I blogged about taking photos of the different places I work from and posting them.

Well, I finally got my setup up and running. One of the nifty new features of my .Mac account is Photocasting. In previous versions of iPhoto, it was easy to publish photo albums via your iDisk: simply select an album, pick a theme, create some funny captions, and you’re all done. Cool, but not very flexible. If you wanted to add more photos, you had to redo everything.

Now with Photocasting, you simply publish an album to your .Mac account, and that’s it. Any changes to your album, or photos you add, will automatically be published. Other iPhoto users can subscribe to your Photocasted albums, and they appear right alongside their own albums.

What if you’re not an iPhoto user? Well, you can still subscribe to the feed using your favorite RSS reader.
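
If you’d rather script it than use a desktop reader, a few lines of Python with the feedparser library will do ( the feed URL below is a placeholder, not my real one ):

```python
# Minimal sketch: fetch a Photocast's RSS feed and list its entries.
# Requires the third-party feedparser package; the URL is a placeholder.
import feedparser

feed = feedparser.parse("http://example.com/where-i-work.rss")  # hypothetical URL
for entry in feed.entries:
    print(entry.title, entry.link)
```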

You can find the RSS feed for my ‘Where I Work’ photos under the group ‘Photos’ on the side of my blog. For those of you reading via a feed, I’ll reproduce it here:

“Where I Work”

Anyone have any ideas on where in Chicago I should work next?

ZapThink Semantic paper offers no answers *UPDATED*

ZapThink has published another research paper, this time focusing on SOA, Semantics, and Ontologies. While this paper looks very interesting on the surface, upon further review it’s a wee bit of a sham.

For starters, it’s 4 pages long and costs $195.00 ( or 200 ZapCredits, whatever those are ). That’s almost $50 a page for what is categorized as a presentation. So you know at least one, if not all, of those pages is based on a PowerPoint slide deck.

Secondly, and perhaps more disturbing, is the fact that, by ZapThink’s own admission, this “paper” offers no answers. Quote:

“This presentation does not provide answers, but does provide more food for thought for those thinking about building loosely coupled, composable Services in environments of heterogeneity.”

I am sure that the people who published this synopsis meant to say that the paper identifies some major issues for people to think about. However, I can’t help but be drawn to the fact that, after reading the abstract, it sounds like people will have more questions after viewing the presentation than before, and they’ll be $195 poorer.

On the other hand, I too can offer no answers about the SOA direction and semantic future of your company, yet I only charge $100.

Such a deal. Act now.

Updated – Ron from ZapThink has informed me that the content has been updated and that it should have been a subscriber-only download. See Ron’s comment below.


Virtual Machines: Virtualization vs. Emulation

I just wanted to write down some quick notes about virtual machines. Virtual machines have long been an effective and efficient way to run multiple computers without requiring all of the space and hardware. They create a completely isolated operating system installation within your normal operating system. For this posting, I’ll define two words: host and guest. The host is the native computer that you are running, and the guest is the virtual machine instance that is running on the host. Virtual machines are implemented in two ways: emulation and virtualization.

Emulation involves emulating the virtual machine’s hardware and architecture. Microsoft’s VirtualPC is an example of an emulation-based virtual machine. It emulates the x86 architecture and adds a layer of indirection and translation at the guest level, which means VirtualPC can run on different chipsets, like the PowerPC, in addition to x86. However, that layer of indirection slows down the virtual machine significantly.

Virtualization, on the other hand, involves simply isolating the virtual machine within memory. The host passes the guest virtual machine’s instructions directly to the native hardware for execution. Without the translation layer, the performance of a virtualized machine is much faster and approaches native speeds. However, since the native hardware is used, the chipset of the virtual machine must match the host’s. Usually, this means the Intel x86 architecture. VMware is an example of this type of application for Windows.
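
To make the distinction concrete, here’s a toy sketch ( illustrative only; not how any real hypervisor is implemented ): the emulator pushes every guest operation through a translation step, while the virtualizer lets ordinary operations run natively and only traps the privileged ones.

```python
# Toy contrast between emulation and virtualization. Real VMs work on machine
# instructions; here the "guest program" is just a list of named operations.

guest_program = ["add", "load", "store", "io_write"]  # io_write stands in for a privileged op
PRIVILEGED = {"io_write"}

def emulate(program):
    # Emulation: every op, privileged or not, goes through a translation layer.
    for op in program:
        host_op = "host_" + op              # translate the guest op to a host action
        print("emulator interprets:", host_op)

def virtualize(program):
    # Virtualization: ordinary ops run directly; privileged ops trap to the monitor.
    for op in program:
        if op in PRIVILEGED:
            print("VMM traps and handles:", op)
        else:
            print("runs natively on the CPU:", op)

emulate(guest_program)
virtualize(guest_program)
```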

Regardless of the approach, virtual machines provide a way to run separate logical instances of an OS on one physical computer. Some uses of this include:

1. Development test bed – Creating separate virtual machines for development environments enables you to customize an environment for a particular project without messing with your actual laptop. Good for all those WebSphere or BizTalk installs.

2. Testing Environment – The ability to recreate a customer specific environment, including all relevant hot fixes & service packs, proves to be invaluable when debugging production problems.

3. General Sandbox – Ever want to try a piece of software, but were afraid it would ruin your computer? Most virtual machines have the ability to roll back in-memory changes to preserve a particular state of the virtual machine. This provides a good safety net.

These are just some of the uses of virtual machines. I am sure people will figure out more.


Solving problems isn't always a function of caffeine

A friend works for a local internet-based company here in Chicago. They don’t have the funds to bring in a full-time IT person, so I do odd side jobs for them and get paid an hourly fee.

This company specializes in exam preparation and individual tutoring for the bar exam and the LSAT. Much of their business is based on their website, which allows students to customize content for their particular interests. This content can either be downloaded as MP3 files or sent out on audio CDs.

Yesterday, I did something that enabled them to cut some of their costs almost in half and provide a more streamlined experience for their customers.

Hmm… “cut costs” and “streamlined experience”? Sounds right up my alley, considering I make my living building enterprise software to do just that.

So I must have built a super new front end for their customers? Nope.

Surely, I reduced the amount of code that their site uses, and reduced maintenance costs? Not even close.

I got it! I must have come up with a new vision & strategy for expanding their current offerings and acquiring a larger market share? HA

What did I do then? I showed them how to burn multiple audio tracks onto one CD.

Since most customers choose to receive audio CDs, they send out ~150 audio CDs with every order. Most, if not all, of the CDs contain a single audio track. Now the number of CDs needed can be slashed, along with the cost of shipping.

Simple? Absolutely.

“Boring” and “Uninteresting” compared to other more “sexy” technology solutions? Yup.

Vital to their business? Yes!

And yet, I was able to have a dramatic impact on a core business function, content distribution in this case, without writing a single line of code.

I got into this business to help companies just like this, by using technology to make things more efficient. Sometimes, it’s easy to lose sight of this and fall into an academic state of building software for software’s sake, when in fact, we should be solving problems first.


Blog feed updated

Sorry for the deluge of phantom posts in my feed. I am in the process of moving my blog to a new server and moving the RSS feed to FeedBurner.
For anyone reading this who hasn’t already updated their links, please subscribe to my new feed, found here.
Thanks.

Not another The Long Tail book review

**Update:** I’m an idiot. See bolding below.

I just finished reading the review copy of The Long Tail, and in exchange for the review copy, I had to write a review of it. In short, The Long Tail is a must-read for anyone wanting to understand a shift in our economy. For the past several decades, we’ve had the idea of “hits” beaten into our heads. Now, for at least some slices of our economy, this is being turned on its head. Several years from now, The Long Tail could take its place among the most revolutionary business books of our time.

However, there are already tons of reviews everywhere that talk about how groundbreaking the book is, so I wanted to take a different approach. I picked some passages from the book that pointed out other ideas that, while not as large or revolutionary as The Long Tail concept itself, could turn out to be just as enlightening.

Onto the quotes:

“This is why niches are different. One person’s noise is another’s signal. If a producer intends something to be absolutely right for one audience, it will, by definition, be wrong for another. The compromises necessary to make something appeal to everyone mean that it will almost certainly not appeal perfectly to anyone - that’s why they call it the lowest common denominator. ( Page 118 )”

This is something that I have known for a long time. People who know me know my aversion to all things that are immensely popular. I’ve always annoyed friends with my spiel of “If something is THAT popular, it can’t be that good.” It’s good to see this concept explained a little more succinctly.

“As the tail gets longer, the signal-to-noise ratio gets worse. Thus, the only way a consumer can maintain a consistently good enough signal to find what he or she wants is if the filters get increasingly powerful ( Page 119 )”

Technorati, del.icio.us, etc. are attempting to be these filters for the internet. I am curious to see if any of them are going to make the jump and try to monetize the filters they are building. Whoever can do this will make a LOT of money.

“It’s hard to overstate how fundamental to economics the notion is that you can’t have it all for free - the entire discipline is oriented around studying trade-offs and how they’re made. Adam Smith, for instance, created modern economics by considering the trade-offs between time, or convenience, and money. He discussed how a person could live near town, and pay more for rent of his home, or live farther away and pay less, paying the difference out of his convenience. And since then, economics has been all about how to divide finite pies. ( Page 144 )”

To me, this could end up being bigger than The Long Tail itself. I mean, imagine a technological advance that disproved Einstein’s Theory of Relativity. The idea that consumers would not have to make compromises when making financial and economic decisions is a huge fundamental shift in the way we make decisions.

“But for DJs, the important information is in the label, not the track. Indie record labels are like tags, providing a clue about what hyper-specialized micro-genre a track is likely to be. Labels are a way to allow DJs to cheaply and efficiently find tracks that are likely to satisfy their audience expectations. In this sense, labels lay the infrastructure for the later aggregation of decentralized information that takes place on the dance floor. ( Page 179 )”

I’ve written about the use of filters before. I think this is an interesting use of a filter in real life, but I would like to hear more examples. Perhaps book publishers? Take computer books, for example. O’Reilly has developed a reputation for a certain type of technical book: lots of in-depth information on cutting-edge topics. Springer, on the other hand, focuses more on the academic side of computing. I am sure that after having read about this, I’ll begin to notice more and more examples of these real-world filters.

“In short, we’re seeing a shift from mass culture to massively parallel culture. Whether we think of it this way or not, each of us belongs to many different tribes simultaneously, often overlapping ( geek culture and LEGO ), often not ( tennis and punk-funk ). We share some interests with our colleagues and some with our families, but not all of our interests. Increasingly, we have other people to share them with, people we never met or even think of as individuals ( e.g. blog authors or playlist creators ). ( Page 184 )”



This is becoming more and more pronounced. Boundaries are being broken down, and people are becoming more and more open to other types of people. Labels like computer geek, jock, and punk are going away, and people are starting to enjoy the company of those who are different from them, moving ever so slowly toward the center of some sort of social Venn diagram. It’s pretty much a bigger version of those people in high school who never really had a core group of friends and instead were friends with all sorts of people.

In short, I recommend picking up a copy of The Long Tail and forming your own opinions as to how valid the author’s conclusions are. Whatever you conclude, you have to admit that our economy is changing, that the internet is a large reason why, and that in the end the consumer will be better off. We’ll have more choice and lower prices for the goods that we purchase.
