Very cool…. The first thing I thought of when I read it was Homer Simpson’s “His name is like my name” quote….
I know I have blogged about how much I love virtual machines before, but this time it’s different. Before, I was talking about virtual machines as servers. However, in the past few weeks, I have fallen in love with virtual machines as development environments.
Being a consultant, I travel to different clients and have to perform different functions. This usually means installing software, hooking up to different networks, etc… All of this involves messing with my machine. As most other consultants know, after a while, your machine ends up being the Tara Reid of laptops: clearly been handled, abused, and misused, resembling a shell of its former self.
Well, while working at my current client the other week, it occurred to me that I should just use VirtualPC and a bunch of different images for the different things I work on. I did this after installing VS.NET ( 2003 & 2005 Beta 2 ), BizTalk Server 2004, two dozen or so development utilities, and a new enterprise SOA package that I was evaluating. Needless to say, this slowed my machine down A LOT. All for stuff that I would only occasionally use, and only a package at a time. I would never be evaluating the SOA package while doing BizTalk development, for example.
So, one weekend I cleared off my laptop and installed the bare bones:
- Windows XP
- Office 2003
- Lotus Notes
- Virtual PC
That’s it. First, I set up an initial Windows XP installation and, here’s the key, marked it as read only. I am going to use this as my parent HD, and base all my other vm’s off of this image. You can do this in VPC by creating the subsequent hard drives as ‘differencing disks’. That way you save space and can reuse vm’s in new ways without ruining your original vm’s.
So I have a base Windows XP hard disk, and I create a slew of other “base” images based off of this “Alpha” image:
- Visual Studio.NET 2003
- Visual Studio.NET 2005
- Windows XP Sandbox
- SOA Package Evaluation
The Sandbox image has undo disks enabled, and I use it as a clean container to run “sketchy” and possibly dangerous programs on. After I am done, I simply throw out the undo disk, and I have my sandbox back. The SOA Package Evaluation vm is used for a package evaluation I am working on. I didn’t want to install all ~1GB of their software on my clean box, because there are always some remnants left over after an uninstall.
The VS.NET 2003 box is a little different. I used THAT as a base image for two more images:
- Default Development
- BizTalk Development
Default Development is where I do my day to day development. Plain, vanilla, with all my dev tools on it. Nice…. BizTalk Development is a vm which I use as the base for, you guessed it, ANOTHER vm:
- <Client> BizTalk Development
Why would I ever want to do such a thing? Well, the nice part is that I now have a fully functioning BizTalk development environment that is on the client’s network domain and everything. Let’s see a show of hands from everyone that has heard the old “it’s ’cause you’re not on our domain” or “you have to join our AD domain” excuses. So, with that vm, no more security problems or annoying junk left over on my machine. When I am done with the client, I simply delete the vm. No muss, no fuss.
Now, for those who got lost, this is what my 8 vm’s look like:
- Windows XP Parent
- ——> Visual Studio.NET 2003
- ————> Default Development
- ————> BizTalk Development
- ——————> <Client> BizTalk Development
- ——> Visual Studio.NET 2005
- ——> Windows XP SandBox
- ——> SOA Package Eval
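For the curious, the mechanics behind that tree are simple. Virtual PC 2004 itself creates differencing disks through its New Virtual Disk wizard, but on newer versions of Windows the same thing can be done with diskpart. The file paths below are made up purely for illustration:

```text
:: Mark the parent VHD read only so the differencing children stay valid
attrib +r "C:\VMs\XP-Parent.vhd"

:: Then, inside diskpart (Windows 7 / Server 2008 R2 and later):
create vdisk file="C:\VMs\VS2003.vhd" parent="C:\VMs\XP-Parent.vhd"
```

Each child records only the blocks that differ from its parent, which is why you can stack them several levels deep like I did without burning a full disk image per vm.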
Now, here’s the nice part of all of this. The part where everything comes together: vm’s can suspend their state, just like a laptop, for example. So, to demonstrate the usefulness of this, let’s say I am working on some Spring.Net code and, all of a sudden, a BizTalk bug comes in from the client. Well, I don’t want to keep all my utils open from the Spring.Net development, because BizTalk needs a lot of resources and I don’t want to slow it down. But I also don’t want to lose where I was with Spring.Net. So, what’s the answer? Well, I can just shut down my development vm, saving its state. Once that’s shut down, I boot up the client BizTalk vm, whose state was also saved, and I am good to go on fixing bugs. When I am done, I do the reverse, and I am back to Spring.Net development.
I didn’t think it would make that big of a difference, but it really does. If you’re a consultant, you owe it to yourself to create a similar setup. It will save you hours of debugging phantom problems caused by leftover software packages. The biggest drawback to all of this is the memory requirement. You should have a LOT of RAM, at least 1GB, and 1.5GB would be ideal. If you’re not running something like BizTalk Server, then you might be able to get away with a little less, but I wouldn’t go under 1GB.
I have been quoted, and often mocked, for my stance that the Internet is still in its infant stages. Most people I share that thought with disagree, saying that the Internet went through its growing up period during the dot-com bubble, and now it’s becoming mature and stable.
With all due respect, I think those people are wrong.
Of course, the Internet is just like any other technological advance humans have come up with in the past 200–300 years or so. We initially use a new invention in the same way we used the previous invention. In fact, we often refer to the new invention in terms of the old one. This can be seen in a handful of examples:
- Automobile = horseless carriage
- Computer = electronic typewriter
- Pen = better pencil
- Telephone = voice telegram
It is exactly this history of progression that has me believe that we have yet to see how the Internet will really change our lives.
Right now, most people don’t use the Internet for anything other than to perform normal day to day functions. If the Internet were to go down, most people’s lives would still go on. Notice that I said Internet and not LAN. If a company’s LAN goes down, it can render the whole company useless. But if the Internet went down, we just wouldn’t be able to book plane tickets online, get directions to locations, or download music. These are all things we could do without the Internet, however. Granted, as with the above inventions, they are much easier to do with the new invention than without, but they’re still possible.
In Jeff Hawkins’ book, On Intelligence, which I refer to in this post, he talks about a similar conclusion he has come to regarding Artificial Intelligence. He makes the same arguments that I make above about previously invented technologies, and compares them to the current state of AI and how people think about applying AI to problems we have already solved, just to solve them faster. He is excited for the possible uses for AI that we have not even considered yet.
I am excited for the possible uses for the Internet that we have not even considered yet.
Ok, so a bunch of people ask me about the music I listen to. I also notice that a lot of people click on the little links at the bottom of my blog posts that describe what I am currently listening to, but are surprised when it’s not found in iTunes.
Well, that’s mostly because the music I listen to is not really that mainstream. I listen to a lot of underground electronic, hip hop, trip-hop, etc….. That kind of stuff is not really found on mainstream music sites. I really dislike most mainstream music. It just doesn’t have the substance of other music. But occasionally, I will listen to something that is a little more mainstream because it’s interesting for other reasons. But I digress….
Well, here is my little gift to all those who don’t have access to the same underground channels I do. It’s a diss track from The Game to 50 Cent & G Unit. For those not familiar, I will recap the story so far: The Game & 50 Cent used to be friends and cohorts. Something happened ( no doubt involving someone “fronting” and almost certainly some form of “disrespect” ). With the friendship now gone, the verbal & physical assaults started. The Game released a track called “300 Bars” calling out everyone in G Unit, 50 Cent included. That, in and of itself, isn’t that interesting. What’s interesting is that the song’s title, “300 Bars”, is actually the length of the song. 300 Bars! I don’t know a whole lot about music, but 300 bars of music is a LONG time, translating to about 15 minutes of straight rapping. That’s pretty impressive.
There are a ton of quality raps in there, as well as some great beats. Go ahead and give it a listen.
Ok, so just to have a little fun today, I decided to pose this question to a group of friends. Then I thought, why not post it? So here it is:
What is the BEST Will Ferrell cameo? Your choices are:
- Wedding Crashers
- Starsky & Hutch
- Jay & Silent Bob Strike Back
- Austin Powers Series
One of the points he makes is how we can use abstraction mapping to work at different levels of abstraction than we need. He uses the age-old example of a compiler taking in a higher level GL ( e.g. C# ) and producing lower level code ( e.g. IL ).
Upon reading this, I wondered if code smells exist in generated code.
Well, let me back up. First, I wondered if our common practices need to be supported in a GL before we step up the level of abstraction. Take refactoring, for example. Would people be willing to move to a higher level GL, say while doing model driven development, if they wouldn’t be able to ‘refactor’ it?
This brings me back to code smells. I wondered if, in the case of refactoring, it would be necessary to be able to refactor high level abstractions if they produced lower level code smells. For example, let’s say I had a template that takes in a set of variables and produces several C# class files, according to the input variables. If there is duplication in the resulting C# files, that’s a classic code smell. BUT, since the C# code is simply the result of an abstraction mapping, should it be allowed to slide if we couldn’t change the source abstraction?
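To make that concrete, here is a toy sketch of the situation. Everything in it is hypothetical (the template, the class names, the property); it just shows how a single template stamps identical code into every generated file:

```python
# A toy code generator: one template, several generated "C# class files".
# The class names and the Id property are made up purely for illustration.

TEMPLATE = """public class {name}
{{
    private string _id;

    public string Id
    {{
        get {{ return _id; }}
        set {{ _id = value; }}
    }}
}}
"""

def generate(class_names):
    """Map each input variable to a generated source file (name -> contents)."""
    return {name: TEMPLATE.format(name=name) for name in class_names}

files = generate(["Customer", "Order"])

# Every line except the class declaration is duplicated between the two
# generated files -- a textbook "duplicated code" smell at the C# level.
# But the duplication is an artifact of the abstraction mapping: the only
# place to fix it is the template, one abstraction up.
shared = [line for line in files["Customer"].splitlines()
          if line in files["Order"].splitlines()]
print(len(shared))  # prints 9
```

If you can edit the template, the smell is actionable there; if the template (the source abstraction) is out of your hands, the duplication in the output is arguably just noise.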
People have already come up with code smells that span an abstraction, but I am unaware of code smells that traverse abstractions. I wonder if this will become more of a topic among agile developers, given the current initiatives by companies to introduce model driven development?