comments edit

Here’s the code

<img src="maintenance.gif" />

And the result

Cat working on the server

At least this seems to be my experience with Twitter lately.

Look, I can understand the need to save costs with outsourcing, but you get what you pay for. It’s no wonder that Twitter is down so often given they’re using cats for their server maintenance.

At least pay a little extra and hire these guys!

comments edit

UPDATE: Problem solved thanks to some members of the IIS 7 team!

I am at my wit’s end trying to get IIS 7 to work on my Vista Ultimate box, and I have tried everything I can think of.

I’ve tried every step in the following tutorial, Where did my IIS7 Server go? Troubleshooting “server not found” errors. I also tried every step in this post on troubleshooting “service unavailable” errors. Trust me when I say I went through every one of these steps twice. Rob Conery can back me up on this because he watched me do so when I shared my desktop with him via GoToMeeting.

As far as I can tell, IIS 7 or http.sys must be corrupted somehow and the only thing left for me to try is to repave my machine and reinstall Vista. Unless of course one of my dear readers has an insight that will help me solve this, or knows someone who does.

The Problem

I’m running Vista Ultimate which has IIS 7 installed. When I navigate to http://localhost/ or http://localhost/iisstart.htm, I get an HTTP Error 503 Service Unavailable message.

What I’ve Tried

Confirmed that Skype is not listening on port 80 (Skype tries to listen on port 80 by default).

1. Confirmed that App Pools were configured correctly and started.

2. Ran the following command: appcmd list apppools, which produced the output:

APPPOOL "DefaultAppPool" (MgdVersion:v2.0,MgdMode:Integrated,state:Started) APPPOOL "Classic .NET AppPool" (MgdVersion:v2.0,MgdMode:Classic,state:Started)

3. Confirmed that the website was started.

4. Ran the following command: netstat -a -o -b, which produced the output:

TCP [::]:80 METAVERSE:0 LISTENING 4
Can not obtain ownership information

The 4 there is the PID, which I confirmed to be System, as in the NT Kernel & System.

5. Confirmed bindings were configured in IIS Manager and set for localhost on port 80.

6. Found no error messages in the event log under either System or Application.

7. Made sure that my user account and the Network Service account both have access to the c:\inetpub\wwwroot directory.

8. Tried browsing to http://MYCOMPUTERNAME/, http://localhost:80/, http://127.0.0.1/, etc. (I was getting desperate).

9. Tried changing the default AppPool’s managed pipeline mode to Classic.

10. Tried changing the default AppPool’s .NET framework version to No Managed Code (recall that I am trying to request a static HTML page).

11. Confirmed that I was able to ping localhost.

12. Tried to telnet localhost 80 and then issue the command GET /, and received the same message.

13. Double-checked that all the Handler Mappings were enabled.

14. Made sure Anonymous Authentication was enabled. Heck, I tried it with them all enabled and tried it with only Windows Authentication enabled.

15. Confirmed that Authorization Rules has one rule: Mode=allow, Users=All Users.

16. Enabled Failed Request Tracing. Nothing showed up in the logs.

17. Uninstalled and reinstalled IIS 7.

18. Tried pulling my hair out, rending my garments, and sacrificing chickens^1^.

Any ideas?

So there you go. I’ve tried everything I can think of and now I appeal to you for help.
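Incidentally, the telnet check from the list above can be reproduced with a few lines of code, which rules out browser quirks entirely. Here’s a hypothetical sketch using a raw socket (RawHttpCheck is my name for it, not a real tool); whatever status line comes back here is coming from http.sys itself:

```csharp
using System;
using System.IO;
using System.Net.Sockets;

// Hypothetical sketch: send a bare HTTP GET over a raw socket and return the
// status line. If this prints "HTTP/1.1 503 Service Unavailable", the error
// is coming from the HTTP stack itself, not from anything browser-related.
class RawHttpCheck
{
    public static string GetStatusLine(string host, int port)
    {
        using (TcpClient client = new TcpClient(host, port))
        {
            StreamWriter writer = new StreamWriter(client.GetStream());
            StreamReader reader = new StreamReader(client.GetStream());
            writer.Write("GET / HTTP/1.1\r\nHost: " + host + "\r\nConnection: close\r\n\r\n");
            writer.Flush();
            // The first line of the response is the status line.
            return reader.ReadLine();
        }
    }

    static void Main()
    {
        try
        {
            Console.WriteLine(GetStatusLine("localhost", 80));
        }
        catch (SocketException)
        {
            Console.WriteLine("Nothing is listening on port 80 at all.");
        }
    }
}
```
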

The funny thing is that this works on my Vista box at work and I compared every setting in IIS. This confirms in my mind that something got fubar’d. But I hesitate to repave my machine just yet in the hope that someone out there has some definitive answers for me.

^1^ No actual chickens were harmed in troubleshooting this problem.

comments edit

First week on the job and I’ve already got the keys to the company blog. I just posted my first post at koders.com announcing the latest set of site updates.

One thing that I was surprised to learn this week, though it really shouldn’t surprise me, is that Koders uses an open source search engine to create the full-text index. More specifically, it uses Lucene.NET, a port of the Java Lucene project.

I’m familiar with Lucene.NET because the Subtext and RSS Bandit projects both use it for searching (though I was not the one to implement it in either case). As far as I know, it pretty much is the de-facto standard for open source search software on the .NET platform.

Of course Lucene.NET is only part of the Koders code search picture. It provides the full-text indexing, but if you use the Koders search engine, you’ll notice that there is some level of semantic analysis on top of the text index. Otherwise, you wouldn’t be able to search for method names and class names and such, not to mention the syntax highlighting when viewing code.

I’m still learning about all the layers and extensions Koders built on top of Lucene.NET. As I said in the post, those are probably topics I can’t write about in too much detail.

Screenshot of project browser

One thing I should point out is that code search is only part of the picture. Koders also has a pretty sweet code repository browser (click on thumbnail for larger view).

My favorite open source project is now in the index and can be viewed here.

On a side note, I recently talked about Search Driven Development in the theoretical sense, but have been able to put it to good use already at Koders.com. In the great developer tradition of dogfooding, I needed to look at some code from home before I had my VPN setup. It was nice to be able to login to Koders internal Enterprise Edition and find the snippet of code I needed.

comments edit

Real quickly, check out our brand spanking new build server. Notice anything different? No? Good! Hopefully everything is working just fine, but faster.

As you know, I’m ever the optimist. What’s that trite phrase, “When the crap hits the fan, make lemonade”? Or something like that.

So in this tragedy becomes triumph story, the bricking of my tiny little home built build server caused me to start thinking of a more permanent solution. In steps Eric Kemp, Rob Conery’s right hand man (in the clean sense of the idiom) on the Subsonic team.

He converted the VMWare image to run on Virtual Server and is hosting our virtual build server on a pretty hefty machine. Finally, I can shut down the virtual machine running on my desktop.

Eric ended up moving the server twice before settling on a final location. Eric, if you have a moment, remind me what the specs are on that baby.

So now it is time for me to step in with my part of the bargain, which is to help the Subsonic team get a continuous integration setup going. Now that their code is hosted in a Subversion repository, this will be a lot easier than it would’ve been before.

Even so, I’ll probably look at using CIFactory and hopefully enlist the help of a Software Configuration Management Ecowarrior, aka Jay Flowers.

comments edit

Simone Chiaretta, a member of the Subtext development team (among other open source projects), has been quite busy lately. I recently mentioned the Vista Sidebar Gadget for CruiseControl.NET he published. He also was recently in a video interview by MindBlog. Go Simo!

Loading...

The post that caught my eye recently is how to make a Gmail-like loading indicator with ASP.NET Ajax. This is a nice demonstration of how to use the ASP.NET Ajax library to simulate various styles of user interface.

Personally though, I’m not a fan of this particular loading indicator at the page level. When I have my browser fully expanded, I sometimes don’t notice it. However, it works great when constrained to a box, such as the corner of a login box.

comments edit

At the end of an average eight-hour workday, the fingers have walked 16 miles over the keys and have expended energy equal to the lifting of 1 1/4 tons. - DataHand

Ergonomic Diagram

And that’s just for a piddly eight-hour workday! What about developers who go home and write more code for fun?

Around three years ago I wrote a post titled The Real Pain of Software Development Part 1 in which I talked about my experience with Repetitive Stress Injury (RSI). Because of the pain of typing at that time, I never really got around to writing part 2 until now.

So why am I bringing up this subject after all these years? Yesterday, I was reminded again of just how seriously developers need to take ergonomics in their daily work lives.

As I wrote in that post, years ago, I suffered from a lot of pain when writing code. I started going to Physical and Occupational therapy which helped immensely.

During that time I went out and bought the best chair I could find, despite the high cost. (Note that the best chair for me might not be the best chair for you due to differences in body types. Jeff Atwood mentions the Steelcase Leap as his favorite in a post that highlights the importance of a good chair to a developer’s productivity.)

Along with the chair, I bought a good keyboard, a keyboard tray with an articulating arm, and a trackball and made sure it was configured in a way that was comfortable for me and ergonomically sound.

And over time, I got better. In fact, I got much better.

At that point I vowed to never to skimp on the tools I needed as a developer. As Jeff points out in his Programmer’s Bill of Rights,

It’s unbelievable to me that a company would pay a developer $60-$100k in salary, yet cripple him or her with terrible working conditions and crusty hand-me-down hardware. This makes no business sense whatsoever. And yet I see it all the time. It’s shocking how many companies still don’t provide software developers with the essential things they need to succeed.

I realized that not every company will follow through on this reasonable advice, so I took it upon myself to make sure I have what I need personally.

In the past several years, I’ve worked a lot more from home due to my open source contributions. In the past two years, I worked full time from home. I made sure that I had the best quality setup possible for me, better than any other employer ever provided. To the items I mentioned before, I added two very bright and sharp LCD monitors, which reduce eye strain. This setup is finely tuned for me and the way I work.

Which brings me back to yesterday. It was my first day of work at my new company and naturally, the workstation has not had the opportunity to be configured by me, through no fault of my employer. My requests for a new keyboard, trackball, and keyboard tray were all approved. I plan to bring in my own chair because I have two Neutral Posture chairs at home. I don’t screw around with my equipment any more.

Unfortunately, after just one day I am having trouble sleeping due to pain in my hand and back. Shoot, I need to get up in three hours!

Yeah, I know what you are thinking. What a freaking wuss!

But no amount of macho posturing changes the fact that some people, for better or worse, are more prone to these types of injuries than others.

I wish I could remember his name, but there’s a world-renowned software developer who cannot type for himself. He has others type for him. I’m sure someone will remind me.

What my Occupational therapist taught me is that my recovery is based on a delicate balance. Upsetting that balance can bring back a lot of pain. In my case, I will never fully be free from pain while working. But through therapy, I learned techniques to reduce the pain as well as deal with it better.

As fortune would have it, this was around the time I received a private office, so it was less of a spectacle when I would get on the floor with a foam roller and do my stretches every couple of hours.

The pain had gotten so bad, there were days I could barely type a line of code, instead finding ways to be productive without coding. I started to wonder about my future, or lack thereof, as a software developer.

My recovery allowed me to not only work through a normal day productively, but actually start putting in extra coding effort in the evenings as I started to contribute to RSS Bandit and eventually start the Subtext Project.

I became even more productive than before, working day and night at the computer. A stark contrast to when I could barely type a single method of code. I could never have started Subtext if it weren’t for the therapy. For that, I am eternally grateful to the occupational/physical therapists at UCLA medical center.

So if you experience a lot of pain while developing, don’t be a hero. Be smart about it and seek information and help. The pain could be the thing holding you back from your potential.

comments edit

Fortune Magazine published an article in which they describe how Microsoft claims that free software, such as Linux, violates 235 of its patents.

Some key snippets (emphasis mine):

Microsoft is pulling no punches: It wants royalties. If the company gets its way, free software won’t be free anymore.

Microsoft General Counsel Brad Smith and licensing chief Horacio Gutierrez sat down with Fortune recently to map out their strategy for getting FOSS users to pay royalties. Revealing the precise figure for the first time, they state that FOSS infringes on no fewer than 235 Microsoft patents.

In the meantime, with Microsoft seemingly barred from striking pacts with distributors, only one avenue appears open to it: paying more friendly visits to its Fortune 500 customers, seeking direct licenses.

If push comes to shove, would Microsoft sue its customers for royalties, the way the record industry has?

“That’s not a bridge we’ve crossed,” says CEO Ballmer, “and not a bridge I want to cross today on the phone with you.”

The article points out that Microsoft doesn’t make any specific patent claims. They simply break the number down into categories.

But he does break down the total number allegedly violated - 235 - into categories. He says that the Linux kernel - the deepest layer of the free operating system, which interacts most directly with the computer hardware - violates 42 Microsoft patents. The Linux graphical user interfaces - essentially, the way design elements like menus and toolbars are set up - run afoul of another 65, he claims. The Open Office suite of programs, which is analogous to Microsoft Office, infringes 45 more. E-mail programs infringe 15, while other assorted FOSS programs allegedly transgress 68.

I understand that Microsoft is a business and has a duty to its shareholders and the right to protect its patents. But the question I ask here is whether this is a good idea. Is this really a good strategy to meet its fiduciary duty? Will this keep its shareholders, customers, and developers happy?

At the moment, I don’t think this is a wise move. I’m not a patent expert so there is no point in me arguing on the validity of their patent claims, so I won’t. I will try to reserve judgment concerning the areas I am not well informed about. Instead, I want to focus on the negative perception of such a move for Microsoft and the potential effects of that.

On the face of it, given the recent pact between Novell and Microsoft, this seems like a transparent attempt to scare other Linux distributors into forming their own pacts with Microsoft. This perception of bullying certainly doesn’t help Microsoft’s image, which had been on the mend in the last few years.

Not only that, but by executing such a move, Microsoft will find it hard to avoid the charge of hypocrisy by its critics considering how Microsoft itself has been the victim of patent trolling in various lawsuits such as the EOLAS case and the more recent Vertical Computer Systems lawsuit over .NET.

Given the recent Supreme Court rulings around patent trolling, this announcement seems especially poorly timed. Microsoft appears to be trying to have it both ways, fighting against silly patents on the one hand while threatening enforcement of patents with the other.

Again, I do not know that Microsoft is engaging in hypocrisy or patent trolling. I am merely focused on the perception of their actions. I can’t claim that Microsoft is patent trolling because I don’t know whether these patent claims are legitimate. And that’s part of the problem. Nobody knows yet.

By not addressing specifics, Microsoft is not opening its patents to challenge. So while they might not be patent trolling in this case, they certainly are (whether intentionally or not) creating an environment of fear, uncertainty, and doubt for the ecosystems that have grown around these open source projects.

This comes at a time when Microsoft seemed to really be warming to Open Source. At the Mix07 conference, I got the sense that developers and program managers at Microsoft are really starting to embrace Open Source Software and how it fits into the Microsoft ecosystem.

This may be indicative of a disconnect within the Microsoft ranks. It seems that the new wave of Microsoft employees, especially many of their developers and Program managers, see Open Source as a way to enhance and generate value around the Microsoft .NET development platform. Meanwhile, it appears that the old wave of Microsoft executives and its legal department cannot look beyond the potential threat that OSS might be to Microsoft and smell opportunity.

This points to another potential negative effect of this strategy - potential developer discontent within and outside the company. Microsoft’s recent attempt to stifle Office 2007 ribbon look-alikes pushed Mike Gunderloy from the ranks of being a relative proponent of Microsoft to deciding to completely leave the Microsoft universe. Could this announcement have a similar effect on other developers and hurt Microsoft’s ongoing competition with Google in retaining the best and the brightest? It remains to be seen.

As I said in a recent interview, I firmly believe that the .NET platform is a fantastic environment for developing open source software. I also believe that the myriad of open source projects built with .NET benefits Microsoft immensely. So this recent news about Microsoft’s patent fight just leaves me scratching my head, not because I don’t think they have the right to defend their patents, but because I wonder if it’s really the smart thing to do at this time.

In any case, at the very least, I hope they don’t use the RIAA as a model of strategic brilliance and start suing their own customers. Now that, I can unilaterally claim, would be a bad move.

What are your thoughts on this? Am I off base and uninformed, or do I make a good case here?

comments edit

With the announcement of the 1.9.5 release of Subtext, I thought I should talk about the new tagging and tag cloud feature. You can see it in action in the sidebar of my site.

A Tag

To implement tagging, we followed the model I wrote about before. Tags do not replace categories in Subtext. Instead, we adopted an approach using Microformats.

We see categories as a structural element and navigational aid, whereas we see tags as meta-data. For example, in the future, we might consider implementing sub-categories like WordPress does.

The other reason not to implement tags as categories is that most people create way more tags than categories and blog clients are not well suited to deal with a huge number of categories.

To create a tag, simply use the rel-tag microformat. For example, use the following markup…

<a href="http://technorati.com/tag/ASP.NET" rel="tag">ASP.NET</a>

…to tag a post with ASP.NET.

Please note that according to the microformat, the last section of the URL defines the tag, not the text within the anchor element. For example, the following markup…

<a href="http://technorati.com/tag/Subtext" rel="tag">Blog</a>

…tags the post with Subtext and not Blog.

Also note that the URL does not have to point to technorati.com. It can point to anywhere. We just take the last portion of the URL according to the microformat.
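In code, that rule boils down to grabbing the last path segment of the href. Here’s a rough sketch of the idea (RelTag is a hypothetical helper for illustration, not Subtext’s actual implementation):

```csharp
using System;

// Hypothetical sketch of the rel-tag rule: the tag is the last segment of
// the URL path (URL-decoded), regardless of the anchor text or the domain.
class RelTag
{
    public static string Parse(string href)
    {
        // Trim a trailing slash, then take everything after the last '/'.
        string path = new Uri(href).AbsolutePath.TrimEnd('/');
        return Uri.UnescapeDataString(path.Substring(path.LastIndexOf('/') + 1));
    }

    static void Main()
    {
        // The domain doesn't matter; only the last path segment does.
        Console.WriteLine(RelTag.Parse("http://technorati.com/tag/ASP.NET")); // ASP.NET
        Console.WriteLine(RelTag.Parse("http://example.com/tag/Subtext"));    // Subtext
    }
}
```
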

comments edit

Subtext Submarine Logo

It is with great pleasure and relief that I announce the release of Subtext 1.9.5. Between you and me, I’m just happy to get this release out before the kid is born.

As with most point releases, this is primarily a bug fix release, but we found time to introduce a few nice new features - most notably support for tagging and Identicons.

New Features

  • Content Tagging and Tag Cloud - for more details, refer to this post
  • Identicon Support - Uses the Identicon Handler project on CodePlex.
  • MyBrand Feedburner Support - Updated our FeedBurner implementation to support custom FeedBurner URLs
  • Upgrade to Lightbox 2.0
    • If you referenced the default lightbox skin in your custom skin, please reference this post by Simone to understand how to update the skin.
  • Author CSS Class - The CSS class of “author” is added to comments left by the owner of a blog (the owner must be logged in when leaving the comment for this to work). This allows custom skin authors to highlight comments by authors.
  • Credits Page - In the Admin section, we give credit where credit is due, displaying a list of the open source products we make use of in building Subtext.
  • Implemented ASP.NET AJAX - We replaced the MagicAjax panel with the ASP.NET Ajax libraries. Keep in mind that this requires a few new Web.config configuration sections, so be careful when merging your Web.config changes.

Bug Fixes

Clean Installation

A clean install of Subtext is relatively painless. Essentially copy all the files to your webserver, create a database if you don’t already have one, update web.config to point to your database, and you’re all set. For more details on the setup instructions, read the Clean Installation Instructions

Upgrading

Upgrading from a previous 1.9 version is relatively straightforward. For the safest approach, follow the upgrade instructions here.

I’ve written a command line tool for upgrading Subtext, but it isn’t production ready yet. Our goal is to make the upgrade process as seamless as possible in future versions. If you’d like to help with that, we’d love to have your contributions!

Download

As always, you can download the latest release here. The install file contains just the files you need to deploy to your webserver. The source file contains full source code.

Thanks!

As always, many thanks go out to the many Subtext contributors and community members who helped make this latest release possible. Subtext keeps on getting better because of the community involvement. Just take a look at the improvements to our Continuous Integration and Build Server as an example.

If you’d like to contribute, we’re always looking for help. Great positions are open!

What’s Next?

I’m wrapping up a project for a client in which I was able to implement multiple authors per blog. Hopefully, this means that Subtext 2.0 won’t take as long to release as 1.9.5 did.

We are constantly improving our development process and refactoring the code. The big push for 2.0 is to get the Plugin Model and custom MembershipProvider rock solid and to refactor and clean up the code heavily.

comments edit

Today is my last day of work as a VelocIT employee, a company I helped start and had (and still have) high hopes for as employee #1.

No, I’m not being fired for blogging too much or embezzling funds. No, there wasn’t a big falling out with partners in the company throwing books at each other and screaming expletives. Unfortunately, nothing dramatic and tabloid-worthy like that happened at all.

I simply lost interest in being a consultant, and I blame Subtext. Micah Dylan, the CEO and Founder of VelocIT and my good friend, and I often talked about the idea that there are two general types of developers (I’m sure there are many more).

  1. Developers who are easily bored and love to learn about new businesses and business models. Staying on one project forever would cause these devs to go insane. They love the excitement of jumping from client to client and project to project.
  2. Developers who love to craft and hone a single code-base through multiple versions. These devs are fine sticking with a project for a long time and get enjoyment in watching the application take form over the years.

For a long time, I’ve been more firmly in camp #1 with tendencies towards #2. But over the past couple of years working on Subtext, I’ve never gotten bored with working on the same code, and I realized I have been in camp #2 for a good while now.

Sure, I do get excited about learning new technologies all the time, but now it is in the context of how they will help me make Subtext a better blog engine.

Not only that, I found that what I most love about the Subtext project is not just the craft of developing an application over multiple versions, but the joy in building a community around that project.

Maybe this is because with Subtext, my “clients” are other developers. I understand developers better than I do other clients because their pain is often my pain. I just don’t have the same pains that a Director of Marketing does (well actually I kind of do with Subtext, but I don’t have any budget to address those pains so I ignore the pain).

My heart just hasn’t been in consulting for a good while now, but I couldn’t leave while we were struggling along at the brink of going out of business. So I pushed on, helped land a big client, and now it looks like VelocIT is close to having more projects on its hands than employees! So if you love consulting and software development, send Jon Galloway your resume.

I will still be involved with VelocIT in a limited capacity. Discussions are still underway, but I hope to remain on as a Board Member and shareholder. The team we’ve assembled at VelocIT is among the best and brightest I have ever worked with. I love working with them and working from home. I will certainly miss all of that.

So where am I going next?

I’ll be taking a position with Koders Inc. as the Product Manager of the Koders.com website, an Open Source code search engine. I think this will be a good fit for me due to my passion for open source software.

My goal is to help developers, as much as possible, become more productive via search driven development and the services that naturally extend from that.

Naturally, the best way to do that is to provide relevant search results. But beyond that, I believe that building an active community around the site via tools, widgets, and APIs that developers can use in their own projects will also be very important in being a useful resource for developers. Koders is for coders and developers.

I’ll be relying on your feedback regarding the site’s usability and how well it helps you to be more productive to help me do my job. In other words, I’m going to take my lazy butt and try and ride your coattails in order to do my job well. Is that genius or what? ;)

One thing I really like about the site so far is the project browser. Check out the browser for the MbUnit project. Wouldn’t it be nice to integrate that into your project homepage, your CruiseControl.NET build, or even replace the CodePlex code browser with that? (hint, hint, CodePlex).

In any case, wish me luck. This is probably the most difficult job change ever for me since it’s not just a job that I’m leaving, and not just a job that I’m joining.

One funny part of this I won’t tell you yet. But you’ll laugh when you hear the name we chose for our son, which we chose before all this happened.

comments edit

If you downloaded Subtext last night and tried to edit keywords in the admin section, you might have run into a syntax error. I’ve fixed the download, so if the file you downloaded is named

SubText-1.9.5-INSTALL.zip

(notice the all caps “INSTALL”) you have nothing to worry about.

If you downloaded

SubText-1.9.5-Install.zip

Then you might want to replace the EditKeywords.aspx file in the Admin folder with this one.

My apologies. I thought I had tested every page in the admin before releasing, but it was late and I must have missed this one. I never use that page day to day so my dogfooding attempts surely missed it.

code, tdd comments edit

If you’ve worked with unit test frameworks like NUnit or MbUnit for a while, you are probably all too familiar with the set of assertion methods that come built into these frameworks. For example:

Assert.AreEqual(expected, actual);
Assert.Between(actual, left, right);
Assert.Greater(value1, value2);
Assert.IsAssignableFrom(expectedType, actualType);
// and so on...

While the list of methods on the Assert class is impressive, it leaves much to be desired. For example, I needed to assert that a string value was a member of an array. Here’s the test I wrote.

[Test]
public void CanFindRole()
{
  string[] roles = Roles.GetRolesForUser("pikachu");
  bool found = false;
  foreach (string role in roles)
  {
    if (role == "Pokemon")
      found = true;
  }
  Assert.IsTrue(found);
}

Ok, so that’s not all that terrible (and yes, I could write my own array contains method, but bear with me). But still, if only there was a better way to do this.

Well, I obviously wouldn’t be writing about this if there wasn’t. It turns out that MbUnit has a rich collection of specialized assertion classes that help handle the grunt work of writing unit tests. These classes aren’t as well known as the straightforward Assert class.

As an example, here is the previous test rewritten using the CollectionAssert class.

[Test]
public void CanFindRole()
{
  string[] roles = Roles.GetRolesForUser("pikachu");
  CollectionAssert.Contains(roles, "pokemon");
}

How much cleaner is that? CollectionAssert has many useful assertion methods. Here’s a small sampling.

CollectionAssert.AllItemsAreNotNull(collection);
CollectionAssert.DoesNotContain(collection, actual);
CollectionAssert.IsSubsetOf(subset, superset);
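There’s no magic here, by the way. Each of these helpers just wraps the loop you’d otherwise write by hand. As a rough stand-in (this is my sketch of the idea, not MbUnit’s actual implementation), Contains and IsSubsetOf amount to something like:

```csharp
using System;
using System.Collections;

// A hand-rolled sketch of the idea behind CollectionAssert.Contains and
// CollectionAssert.IsSubsetOf. MbUnit's real implementation is richer
// (better failure messages, more overloads), but this is the gist.
class MiniCollectionAssert
{
    public static void Contains(IEnumerable collection, object expected)
    {
        foreach (object item in collection)
            if (Equals(item, expected))
                return;
        throw new Exception("Assertion failed: collection does not contain " + expected);
    }

    public static void IsSubsetOf(IEnumerable subset, IEnumerable superset)
    {
        // Every item of the subset must appear in the superset.
        foreach (object item in subset)
            Contains(superset, item);
    }

    static void Main()
    {
        string[] roles = { "Pokemon", "Trainer" };
        Contains(roles, "Pokemon");              // passes silently
        IsSubsetOf(new[] { "Trainer" }, roles);  // passes silently
        Console.WriteLine("All assertions passed");
    }
}
```
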

Here is a list of some of the other useful specialized assert classes.

  • CompilerAssert - Allows you to compile source code
  • ArrayAssert - Methods to compare two arrays
  • ControlAssert - Tons of methods for comparing Windows controls
  • DataAssert - Methods for comparing data sets and the like
  • FileAssert - Compare files and assert existence
  • GenericAssert - Compare generic collections
  • ReflectionAssert - Lots of methods for using reflection to compare types, etc…
  • SecurityAssert - Assert security properties such as whether the user is authenticated
  • StringAssert - String specific assertions
  • SerialAssert - Assertions for serialization
  • WebAssert - Assertions for Web Controls
  • XmlAssert - XML assertions

Unfortunately, the MbUnit wiki is sparse on documentation for these classes (volunteers are always welcome to flesh out the docs!). But the methods are very well named and using Intellisense, it is quite easy to figure out what each method of these classes does.

Using these specialized assertion classes can dramatically cut down the amount of boilerplate test code you write to test your methods.

Keep in mind that if you need the option to port your tests to NUnit in the future (not sure why you’d want to once you have a taste of MbUnit), you are better off sticking with the Assert class, as it has parity with the NUnit implementation. These specialized assertion classes are specific to MbUnit (and one good reason to choose MbUnit for your unit testing needs).

comments edit

Not too long ago I mentioned that a power surge bricked the Subtext Build Server. What followed was a comedy of errors on my part in trying to get this sucker back to life. Let my sleep deprived misadventures be a cautionary tale for you.

My first assumption was that the hard drive failed, so I ordered a new Hard Drive.

Lesson #1: If you think your hard drive has failed, it might not be a bad idea to actually test it if you can. Don’t just order a new one!

I have my main desktop machine I could have used to test the drive, but due to my sheer and immense laziness, I didn’t just pop the drive in there as a secondary drive to test it out. I just ordered the drive and moved on to other tasks.

Days later, the drive arrived and I popped it in and started to install Ubuntu on the machine. When I got to the disk partitioning step, I noticed that it found a disk, so I went ahead and formatted the drive and installed Ubuntu. Sweet! But when I rebooted, the server could not find the drive. Huh?

The Scream - Edvard
Munch Lesson #2: When installing an Operating System on a machine, make sure to unplug any external USB or Firewire drives.

Yep, I formatted my external hard drive and installed Ubuntu on that. The Ubuntu installation process recognized my firewire drive and offered that as an available drive to partition and install. Ouch!

At this point, I realized that the machine was not detecting my brand new hard drive, though I could hear the drive spin up when I powered on the machine. I figured it might be a problem with the SATA cable, so I ordered a new one.

Lesson #3: In the spirit of lesson 1, why not just temporarily pull a SATA cable from your other machine, if you have one?

I thought the SATA cables were all inaccessible and would be a pain to pull, but I didn’t bother to check. It was in fact easy to grab one. In my defense, I figured having extra SATA cables on hand wouldn’t be a bad idea anyway, and they are cheap.

So I plugged the SATA cable that I knew to be good into the box, and still it wouldn’t recognize the hard drive. At this point it seems pretty clear to me that the drive controller on the motherboard is fried. Any suggestions on how to fix this are welcome, if it is even possible.

In any case, after a good night of sleep, I started doing the right thing. I plugged the old drive into my desktop and sure enough, I can copy all its files onto my main machine.

I installed VMWare server and the build server is now up and running on my main desktop for the time being. Woohoo!

As a side note, I tried to use this VMDK (VMWare) to VHD (Virtual PC) Converter (registration required) so I wouldn’t have to install VMWare Server on my machine, but it didn’t seem to work. Has anyone had good luck converting a VMWare hard disk into a Virtual PC hard disk?

Long story short, do not under any circumstances let me anywhere near your hardware. At least the build server is back up and working fine. It is officially time to subscribe to mozy.com. I’m exhausted. Good night.

comments edit

My friend Scott Hanselman is on a mission to raise $50,000 and then some for the American Diabetes Association to help fund the search for a cure.

Team Hanselman Fight
Diabetes If you don’t know Scott, you should definitely subscribe to his blog.

His blog has a wealth of information on software development, diabetes, .NET, and other topics.

He’s given much to the community over the years through his blog, podcast, Open Source projects, etc….

Interesting little tidbit: When I first started blogging, I set a goal for myself. Setting a goal was my way of taking blogging more seriously. I surveyed the blog landscape and said to myself, “Hmmm, this Hanselman dude has a pretty popular blog. I want my blog to be as well known as that guy’s someday,” not realizing at the time just how big a target I had chosen. His blog was an inspiration for mine, and many others I’m sure.

I’m far from reaching that goal, but along the way I have become friends with Scott (as well as a friendly rival with our competing open source projects, though we cross-contribute more than we compete). This past Mix07 I finally met the guy in person and he’s a laugh a minute.

Give now and have your contribution matched 7 times! Scott and his coworker Brian Hewitt have started a 48 hour blog matching challenge. From Wednesday May 9th at Noon PST through Friday, May 11 at Noon PST, contributions will be matched by seven bloggers, myself included.

Here’s the donation page.

comments edit

I am a total noob when it comes to working with Linux. The only experience I have with Unix is from college, when I used to pipe the manual of trn via the write command to unsuspecting classmates. If I remember correctly, this is how you do it.

man trn | write username

This would fill the user’s screen with a bunch of text. Always good for a laugh.

Normally, the write command informs the receiver who originated the message. I remember there was some way to hide who I was when sending the message, but I have since forgotten that trick. Good times!

As usual, I digress.

I recently decided to try out Ubuntu to see what all the fuss was about. My notes here apply to Virtual PC 2007.

Downloading Ubuntu and Setting up the VPC

To start, download the iso image from the Ubuntu download site. I downloaded the 7.04 version first since I assumed the bigger the version number, the better, right? As we’ll see, that isn’t always the case.

For completeness, I also installed 6.06.

When creating a new Virtual PC, make sure to bump the RAM up to at least 256 MB. Also make sure there is enough disk space. I tried to skimp and had a problem with the install. If in doubt, use the default value for disk space.

Installing Ubuntu

After creating a new Virtual PC machine, select the CD menu and then Capture ISO Image and browse for the iso image you downloaded.

Virtual PC Capture ISO Image
Menu

When the Ubuntu menu comes up, make sure to select Start Ubuntu in Safe Graphics Mode. I’ll explain why later.

Ubuntu startup
screen

At this point, Ubuntu boots up and if you’re a total noob like me, you might think “Wow! That was a fast install!”.

It turns out that this is Ubuntu running off the CD. I must’ve been tired at the time because this confounded me for a good while, as every time I rebooted, I lost all the progress I was making. ;) The next step is to really perform the install.

The Mouse Capture Issue

If you’re running Ubuntu 7.04, you might run into an issue where you can’t use the mouse in the VPC. This is due to a bug in some Linux distros where it cannot find PS/2 mice, which is the type that VPC emulates.

This post has a workaround for dealing with this issue by using the keyboard until these distros are fixed. Heck, this might be a great feature of 7.04 since it forces you to go commando and learn the keyboard shortcuts.

Ubuntu 6.06 does not suffer from this problem, so it may be a better starting point if you and the mouse have a great rapport.

The Real Install

At this point, you are ready to start the install. From the top level System menu, select Administration | Install.

Starting the
Install

The installation process asks you a few simple questions and doesn’t take too long.

The Bit Depth Issue

Earlier I mentioned making sure to start Ubuntu in Safe Graphics Mode. The reason for this is that the default bit depth property for Ubuntu is 24, which Virtual PC does not support. If you fail to heed this advice, you’ll see something like this. Kind of looks like that All Your Base Are Belong to Us video.

Ubuntu without the proper display
settings

Fortunately, I found the fix in this post on Phil Scott’s blog (Phils rule!). I bolded the essential elements.

Once I was in there, I found the configuration file for the graphics card in /etc/X11. So type in cd /etc/X11, although I certainly hope even the most hardened of MS-centric people can figure that out :). Once in there I opened up xorg.conf using pico (so type in pico xorg.conf - isn’t this fun?). Browse down to the screen section. Oops, looks like the defaultDepth property is 24, which VirtualPC doesn’t support. I changed this to 16 and hit CTRL-X to exit (saving when prompted of course). Typed in reboot and awaaaaaaay we go.

When I ran through these steps, I found that I had to use the sudo command (runs the command as a super user) first. For example:

sudo pico xorg.conf

Your results may vary. Speaking of sudo, have you seen my t-shirt from XKCD.com?
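To put the whole fix in one place, the sequence I ended up running looks like this (paths are standard for Ubuntu; the Screen section is trimmed down to the one entry that matters):

cd /etc/X11
sudo pico xorg.conf

Then, inside xorg.conf, find the Screen section and change the depth:

Section "Screen"
  ...
  DefaultDepth 16
EndSection

Save with CTRL-X, type reboot, and you should get a readable display.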

Virtual Machine Additions for Linux

At this point, you’ll probably want to install the Virtual Machine Additions. Unfortunately, the additions only work for Windows and OS/2 guest operating systems.

However, you can go to the Connect website and download Virtual Machine Additions for Linux. It took me a while to find the actual download link because various blog posts only mentioned the Connect site and not the actual location.

Ubuntu isn’t listed among the supported distributions, but I’ll let you know if it works.

Now What?

So now I have Ubuntu running in a virtual machine. It comes with Open Office, Firefox, etc… preinstalled. My next step is to install VMWare and MonoDevelop and start tinkering around. Any suggestions on what else I should check out?

UPDATE: Perhaps I should use VMWare 6 instead since it supports multi-monitor in a virtual machine. That’s hot!

code, tdd comments edit

Although I am a big fan of Rhino Mocks, I typically favor State-Based over Interaction-Based unit testing, though I am not totally against Interaction Based testing.

I often use Rhino Mocks to dynamically create Dummy objects and Fake objects rather than true Mocks, based on this definition given by Martin Fowler.

  • Dummy objects are passed around but never actually used. Usually they are just used to fill parameter lists.
  • Fake objects actually have working implementations, but usually take some shortcut which makes them not suitable for production (an in memory database is a good example).
  • Stubs provide canned answers to calls made during the test, usually not responding at all to anything outside what’s programmed in for the test. Stubs may also record information about calls, such as an email gateway stub that remembers the messages it ’sent’, or maybe only how many messages it ’sent’.
  • Mocks are what we are talking about here: objects pre-programmed with expectations which form a specification of the calls they are expected to receive.

Fortunately Rhino Mocks is well suited to this purpose. For example, you can dynamically add a PropertyBehavior to a mock, which generates a backing member for a property. If that doesn’t make sense, let’s let the code do the talking.

Here we have a very simple interface. In the real world, imagine there are a lot of properties.

public interface IAnimal
{
  int Legs { get; set; }
}

Next, we have a simple class we want to test that interacts with IAnimal instances. This is a contrived example.

public class SomeClass
{
  private IAnimal animal;

  public SomeClass(IAnimal animal)
  {
    this.animal = animal;
  }

  public void SetLegs(int count)
  {
    this.animal.Legs = count;
  }
}

Finally, let’s write our unit test.

[Test]
public void DemoLegsProperty()
{
  MockRepository mocks = new MockRepository();
  
  //Creates an IAnimal stub    
  IAnimal animalMock = (IAnimal)mocks.DynamicMock(typeof(IAnimal));
  
  //Makes the Legs property actually work, creating a fake.
  SetupResult.For(animalMock.Legs).PropertyBehavior();
  mocks.ReplayAll();
    
  animalMock.Legs = 0;
  Assert.AreEqual(0, animalMock.Legs);
    
  SomeClass instance = new SomeClass(animalMock);
  instance.SetLegs(10);
  Assert.AreEqual(10, animalMock.Legs);
}

Keep in mind here that I did not need to stub out a test class that inherits from IAnimal. Instead, I let RhinoMocks dynamically create one for me. The SetupResult.For(animalMock.Legs).PropertyBehavior() line modifies the mock so that the Legs property exhibits property behavior. Behind the scenes, it’s generating something like this:

public int Legs
{
  get {return this.legs;}
  set {this.legs = value;}
}
int legs;

At this point, you might wonder: what is the point of all this? Why not just create a test class that implements the IAnimal interface? It isn’t that many more lines of code.

Now we get to the meat of this post. Suppose the interface was more realistic and looked like this:

public interface IAnimal
{
  int Legs { get; set; }
  int Eyes { get; set; }
  string Name { get; set; }
  string Species { get; set; }
  //... and so on
}

Now you have a lot of work to do to implement this interface just for a unit test. At this point, some readers might be squirming in their seats ready to jump out and say, “Aha! That’s what ReSharper|CodeSmith|Etc… can do for you!”

Fair enough. And in fact, the code to add the PropertyBehavior to each property of the IAnimal mock starts to get a bit cumbersome in this situation too. Let’s look at what that would look like.

SetupResult.For(animalMock.Legs).PropertyBehavior();
SetupResult.For(animalMock.Eyes).PropertyBehavior();
SetupResult.For(animalMock.Name).PropertyBehavior();
SetupResult.For(animalMock.Species).PropertyBehavior();

Still a lot less code to maintain than implementing each of the properties of the interface. But not very pretty. So I wrote up a quick utility method for adding the PropertyBehavior to every property of a mock.

/// <summary>
/// Sets all public read/write properties to have a 
/// property behavior when using Rhino Mocks.
/// </summary>
/// <param name="mock"></param>
public static void SetPropertyBehaviorOnAllProperties(object mock)
{
  PropertyInfo[] properties = mock.GetType().GetProperties();
  foreach (PropertyInfo property in properties)
  {
    if (property.CanRead && property.CanWrite)
    {
      //Invoke the getter so the property access becomes the "last call"...
      property.GetValue(mock, null);
      //...then attach the PropertyBehavior to that call.
      LastCall.On(mock).PropertyBehavior();
    }
  }
}

Using this method, this approach now has a lot of advantages over explicitly implementing the interface. Here’s an example of the test, now exercising another property as well.

[Test]
public void DemoLegsProperty()
{
  MockRepository mocks = new MockRepository();
  
  //Creates an IAnimal stub    
  IAnimal animalMock = (IAnimal)mocks.DynamicMock(typeof(IAnimal));
  UnitTestHelper.SetPropertyBehaviorOnAllProperties(animalMock);
  mocks.ReplayAll();
    
  SomeClass instance = new SomeClass(animalMock);
  instance.SetLegs(10);
  Assert.AreEqual(10, animalMock.Legs);
  animalMock.Eyes = 2;
  Assert.AreEqual(2, animalMock.Eyes);
}

Be warned, I didn’t test this with indexed properties. It only applies to public read/write properties.
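If you do want to defend against indexed properties rather than just warn about them, reflection makes the check easy. This is an untested tweak to the if statement above, using the standard PropertyInfo.GetIndexParameters method:

if (property.CanRead && property.CanWrite
    && property.GetIndexParameters().Length == 0)
{
  property.GetValue(mock, null);
  LastCall.On(mock).PropertyBehavior();
}

Indexed properties report one or more index parameters, so this simply skips over them.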

Hopefully I can convince Ayende to include something like this in a future version of Rhino Mocks.

comments edit

Thought I’d post a few pics from mix with some notes. Click on any for a larger view.

Phil Jeff and
Jon

This first one is of the three amigos, not to mention coauthors. That is me on the left sporting a Subtext shirt, Jeff Atwood in the middle, complete with CodingHorror sticker, and Jon Galloway on the right.

Scott Hanselman and Rob
Conery

That’s Scott Hanselman (who runs that other .NET open source blog engine) on the left and Rob Conery (of Subsonic fame) on the right. The joke here is that Scott is standing on some stairs because Rob Conery is a giant.

ScottGu Miguel and
Me

Sometimes, the best parts of conferences happen outside of the sessions. A few of us were sitting around having drinks when we spotted Scott Guthrie walking by. Not content to just let him be on his merry way, as that would be the polite thing to do, we called him over and he proceeded to regale us with stories and walked us through some of the namespaces and such of Silverlight.

ScottGu, as he is known, is a total class act and I was happy to finally meet him in person.

Sam
Phil

Here I am regaling Sam Ramji with an obviously hilarious joke. The picture is not, you know, staged whatsoever. No, not at all.

Sam is Director of Platform Technology Strategy and runs the Open Source Software Lab at Microsoft. Didn’t know there was someone in charge of Open Source at Microsoft? Neither did I until meeting him. A few of us had his ear during a dinner. Hopefully we’ll see some interesting things come out of it.

Tantek and
Phil

Tantek Çelik was walking by and noticed my XKCD t-shirt which sports a unix joke and had to take a picture. Not many people got the joke.

John
Lam

John Lam prepares for the Dynamic Language Runtime session with Jim Hugunin. This was one of my favorite sessions. One of the demonstrations was an application that allowed them to evaluate script dynamically using Silverlight. The neat part was they could switch languages, for example from Ruby to Python, and still evaluate properties of objects they had declared in the previous language. Hot!

I got a chance to hang out more with John at Pure and really enjoyed his perspective on Microsoft and child rearing.

Jeff Phil Miguel
Jon

Jeff, Jon, and I intently watch as Miguel de Icaza gives us a demo of Mono. The rotating cube desktop is pretty sweet.

Miguel
Jeff

Jeff cannot conceal his Man Crush on Miguel.

Scott
Stanfield

Scott Stanfield, CEO of Vertigo, on the right playing Guitar Hero, the addict. Tonight he and I cleaned up at Spanish 21, winning over $300 each. This surprised the guy at the craps table who informed us that Spanish 21 is a terrible game in terms of odds. But aren’t they all?

comments edit

Just a couple of notes while I have a break during the conference. I’ll try to find some time to write about my impressions of the technologies when I’ve had time to reflect.

In the meanwhile, allow me to tell a story about the Italia soccer jersey I wore on Sunday. It was a gift from a friend and I figured it fit the theme of staying at the Venetian. Get it? Italy!?

On Sunday, when Jon arrived in L.A. from SD, we went to brunch with my wife before leaving for Las Vegas. We decided to go to a nice French brunch place, La Dijonaise. Already some of you must see the conflict brewing.

Here I am, walking into a French restaurant wearing an Italian soccer jersey. The guy at the door took one look at me and told me, in a deeply French accent, “No no no. You cannot come in here.”

Eric Kemp, Miguel De Icaza, Jon Galloway, John Osborn,
Me

I figured he was joking, but it took me a moment to realize why this guy I had never met was joking with me, as he pointed to my shirt. Silly me.

comments edit

Yesterday, while hanging out in the so called “BlogZone”, Tim Heuer pulled me aside for a short audio interview on the topic of Subtext and Open Source, two things I love to talk about. Good luck getting me to shut up once you get me started. ;)

This was a surprise for me as the last time I was interviewed was by a reporter for my college paper, after my soccer team used the school paper to dry windows for a fundraising car wash. I told the reporter that the paper was good for drying windows because it doesn’t leave streaks. I was merely relaying what someone told me when they went to grab the papers, but my teammates all congratulated me for sticking it to the paper. Funny how that works out sometimes.

Back to the present, I cringed while listening to the interview as I learned I’m much less eloquent than I hoped I would be in such a situation. Apparently I suffer from the “You Know” disease that Atwood suffers from. This is simply due to my nervousness at being interviewed, along with the fact that we were in a very noisy room surrounded by a lot of distractions (yes, this is me making excuses).

Not only that, there’s a point in the interview where I seem to lose focus and stammer. That’s because Scott Hanselman was calling me and I wasn’t sure whether to stop and give him directions to the BlogZone or continue. As you can hear, I continue and he found it just fine.

Unfortunately, there’s a lot more I would’ve liked to have said. Upon being asked whether the community has chipped in on Subtext, I started off with the example of recent commits related to the build server and mentioned a couple of people. I was just getting warmed up and didn’t get a chance to mention many others who have contributed. I apologize, but the interview probably would’ve gone on for hours if I had the proper time to express my appreciation to the Subtext developers and community.

The lesson learned for me is to slow down, take a deep breath, and don’t be afraid to take a moment to collect my thoughts. Don’t be afraid of dead air when speaking publicly.

In any case, Tim, I enjoyed being interviewed. I personally think you have a talent for it and would have done a much better job than the painful interview we were subjected to during the keynote. Seriously, they should’ve had you up there asking Ray and Scott questions.

In case you didn’t know, Tim contributed what is probably the most popular skin to Subtext, Origami.