personal comments edit

Greg Duncan is now my hero for this find. This just proves that I am still unable to let go of the 80s.

Unfortunately, some of the flagship songs of certain groups are missing, replaced by other videos. For example, instead of “Pour Some Sugar on Me” from Def Leppard we get “Photograph.” Likewise, where is “Love Shack” from the B-52s?

Some notable videos (both good and bad):

There are too many to list.

blogging comments edit

This is the blog of a chap who is riding a teched-out motorcycle (with an onboard computer he calls his “motocompy”) from San Diego to my home state of Alaska. I helped him get his Subtext blog set up so he can blog the entire trip.

His rig is tricked out with Wi-Fi, a webcam, a mini-ITX system with a DVD player, a power inverter, GPS, a 7” touch-screen, etc., all stored in a tank bag attached to his bike via magnets.

Check out this blog post to see a couple videos where he describes his setup.

He also happens to be my coworker’s (Jon Galloway) brother.

humor comments edit

World Cup Widow. Leading up to the World Cup, there was a lot of attention paid to how the World Cup would affect the wives of football-crazy men. Some takes were on the humorous side, such as this list of rules for women, while others attempted to help women avoid becoming a World Cup widow.

Fortunately for me, my wife enjoys watching the World Cup. What’s not to like with such pretty boys as Beckham, Kaká, and Cristiano Ronaldo on the pitch? The real widow appears to be my blog. As I pass by the office on the way to the television, I can hear her whimper, “Another day of neglect. I understand. Go on. Don’t worry about me.”

I figured I would spare the world yet another person’s biased and uninformed opinion on the events of the World Cup. But today I thought I would take a few moments to say something.

I had the good fortune to go in for jury duty this past Thursday. It was good fortune because I was not called into a trial, so I was able to sit back and watch a couple of World Cup matches with other dedicated soccer fans. When the noon lunch break came, everybody left except for a few of us, who had pizza delivered to the courthouse.

Today I enjoyed the Brazil win and will watch Korea play France next. Heavily favored Brazil is not looking so heavily favored so far. Unless they pick up their play, I think Argentina is the team to beat. Ghana looked wonderful against the Czech Republic. The U.S. is going to have their work cut out for them. They will be very lucky if they advance. At least they showed some grit against Italy.

code, comments edit

Today I ran across some code in a third-party open source library that used the following function to retrieve the form ID.

public static string GetPageFormID(Control page)
{
    string id = null;

    foreach (Control con in page.Controls)
    {
        if (con is HtmlForm)
            id = con.ClientID;
    }
    return id;
}

Which gets called like so:

Control page = HttpContext.Current.Page;
string formID = GetPageFormID(page);

Unfortunately this didn’t work for me because I don’t have the form declared as a direct child of the page. Instead the page contains a user control which contains the form. This is a common scenario when using a MasterPage (in my case an ASP.NET 1.1 backported master page control). When looking for the form, the function should search recursively like so:

public static string GetPageFormID(Control page)
{
    string id = null;

    foreach (Control con in page.Controls)
    {
        if (con is HtmlForm)
            return con.ClientID;

        id = GetPageFormID(con);
        if (id != null)
            return id;
    }
    return id;
}

This will search the entire control hierarchy until it finds the HtmlForm. In the most common case, it will find it without having to recurse. But for crazy folks like me who always look for ways to be different, this will do the trick. Luckily this was an open source library I was using so I was able to fix the code and send the authors a patch.
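Since the real method depends on System.Web types, here is a toy reconstruction (with stand-in Control and HtmlForm classes, not the actual ASP.NET ones) showing the recursive version finding a form nested inside a user control, which is exactly the MasterPage scenario described above:

```csharp
using System;
using System.Collections.Generic;

// Stand-ins for the System.Web.UI types, just to exercise the recursion.
class Control
{
    public string ClientID;
    public List<Control> Controls = new List<Control>();
}

class HtmlForm : Control { }

class Demo
{
    public static string GetPageFormID(Control page)
    {
        foreach (Control con in page.Controls)
        {
            if (con is HtmlForm)
                return con.ClientID;

            // Recurse into child controls (the fix described above).
            string id = GetPageFormID(con);
            if (id != null)
                return id;
        }
        return null;
    }

    static void Main()
    {
        // Page -> user control -> HtmlForm, as in the MasterPage scenario.
        var page = new Control();
        var userControl = new Control();
        var form = new HtmlForm { ClientID = "aspnetForm" };
        userControl.Controls.Add(form);
        page.Controls.Add(userControl);

        Console.WriteLine(GetPageFormID(page)); // aspnetForm
    }
}
```

The non-recursive original would return null here, since the form is not a direct child of the page.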

personal comments edit

World Cup Fever!

Hence the lack of blogging. Last weekend, I went on a camping trip with my coworkers Micah and Jon in the Los Padres National Forest. I was fighting a bad cold at the time and am only now getting over it.

But a new fever has taken its place. In part, I was feeling pukey after the horrible showing by the U.S. against the Czech Republic. The only bright side to that game was the play of Oguchi Onyewu and Claudio Reyna.

Today during lunch I will watch Brazil play the beautiful game. Since I don’t have cable I will have to watch it on Univision (a former employer) and brush up on my Spanish. At least I know what GOOOOOOOOOOOOOOOL means!

subtext comments edit

If you are hosting multiple blogs on a single installation of Subtext, the recent Subtext 1.5 release unfortunately introduces a security bug that allows an admin of one blog to log in to another blog. The fix has already been posted to SourceForge as part of the Subtext 1.5.1 release.

If you already upgraded to Subtext 1.5, you only need to update the Subtext.Framework.dll file in the bin directory. The fix was a one line code change. I apologize for the inconvenience and for the mistake. Please spread the word.

Tags: Subtext

comments edit

IMPORTANT UPDATE: There was a security bug in Subtext 1.5 for multiblog setups that allows the admin of one blog to log in to another blog. If you are only using a single blog setup, you have nothing to worry about. For multiblog setups, please upgrade to Subtext 1.5.1. The change is a single line change in Subtext.Framework.dll, so if you have already upgraded to Subtext 1.5, you can simply overwrite the old Subtext.Framework.dll file with the new one instead of copying every file from the installation package. Sorry for the inconvenience. Download Subtext 1.5.1.

Subtext Logo After much hard work, the Subtext team is proud to announce the release of Subtext 1.5, dubbed the Nautilus R&R Edition. This is primarily a bugfix release with a load of bug fixes, but there are a couple of significant new features as well as a bit of developer candy thrown in.

Release Notes

Rather than list all the bug fixes and new features here, I will point you to the release notes published online.

I will point out a few key changes.


Subtext now supports an HtmlEditorProvider. You can swap out the html editor used in the admin section by implementing a provider for your editor of choice. Providers that ship with Subtext include the FreeTextBoxProvider (default), FCKeditorProvider, and PlainTextProvider for you angle bracket jockeys.

For more information on how to change the text editor, read this.

Javascript BlogInfo object

By default, Subtext now emits a JavaScript object useful for client scripts to access information about the blog. This object is intentionally simple but may be expanded as needed. The variable name for this object is subtextBlogInfo. The following is an example of its usage.

var imagesPath = subtextBlogInfo.getImagesVirtualRoot();

Improved Skin configuration

If you write custom skins, the Skins.config file has a few new features. Here is a sample skin configuration that highlights several new features.

<SkinTemplate SkinID="Haackify" Skin="Haackify">
    <Scripts>
        <Script Src="~/Admin/Resources/Scripts/niceForms.js" />
        <Script Src="~/scripts/lightbox.js" />
        <Script Src="~/scripts/XFNHighlighter.js" />
        <Script Src="~/scripts/ExternalLinks.js" />
        <Script Src="~/scripts/LiveCommentPreview.js" />
        <Script Src="~/scripts/AmazonTooltips.js" />
    </Scripts>
    <Styles>
        <Style href="~/scripts/XFNHighlighter.css" />
        <Style href="~/scripts/lightbox.css" />
        <Style href="niceforms-default.css" />
        <Style href="print.css" media="print" />
    </Styles>
</SkinTemplate>

Notice that you can reference files using the tilde (~) syntax. Subtext now comes with several script files in the base Scripts directory that can be shared across skins. For example, if you want to add the LightBox JS effect to your skin, just reference ~/scripts/lightbox.js in the Script section and ~/scripts/lightbox.css in the Style section, as in the above example.

The root Scripts directory is a central repository for Subtext client script files that are accessible to all skins. Typically skins will include their own scripts. But occasionally the Subtext team will make popular scripts available to all skins. Some of these scripts have been modified to use the subtextBlogInfo object mentioned before.

Also notice that Style elements may now specify a media attribute. Thus you can add a stylesheet to a skin specifically for printing.

New Skins and Skin Controls

Check out the new Submarine skin designed by our friends at TurboMilk. We have also added several new skin controls, such as the Previous/Next control, which renders links to the previous and next blog posts. Note that not every blog uses every skin control, but it is quite easy to add such a control to the skin of your choice.

CSS Based Printing

Nearly every skin in Subtext now implements CSS based printing.

Upgrading from Subtext 1.0

Subtext 1.5 has an automatic web-based upgrade wizard that will upgrade your schema and stored procedures if you are currently running Subtext 1.0. We have made changes to web.config so if you have made any customizations, you will need to merge these changes yourself which should not be too hard. Please read the upgrade instructions carefully before upgrading.

If you are uncomfortable upgrading your database schema automatically, you can try the manual schema upgrade process outlined here. The steps in that process are the same ones that the automatic wizard takes on your behalf.


If you are installing Subtext for the first time, the web-based installation works as before. Just follow the instructions here.

Next Stop, Daedalus!

The next version of Subtext is code named Daedalus. It will simply be a straight upgrade to ASP.NET 2.0. We hope for a quick turnaround, as we don’t plan to add a lot of features in this iteration. We just want to get up and running on ASP.NET 2.0. Afterwards we will start heavy work on Subtext 2.0, Poseidon. This will be a more ambitious release.

Please note that we may introduce a lot of breaking changes for skins in Subtext 2.0. We will try to keep you informed so that you have advance warning on how to upgrade your custom skins.

comments edit

In the Subtext 1.5 release announcement, I mentioned we had a few new skins. I thought I would post a couple of screenshots of the Submarine skin to give you a sense of what it looks like.

And here is a detail shot.

One problem with this skin is that it is fairly narrow for a fixed width skin. Hopefully we can fix that in the next version and offer a few more intriguing skins.

Be sure to check out these skins as well: KeyWest, Semagogy, Piyo, and WPSkin. I think some of them are new, but I cannot remember.

Also, if you have some sweet design skills, we are always looking for new skins to add to the default installation.

comments edit

Found a useful nugget in Richter’s recent CLR via C# book that I want to share with you. But first, some background.

Sometimes when I write a catch block, I don’t really have any plans for the caught exception. The following is a contrived example that is somewhat realistic.
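A minimal sketch of the shape being described, using the DoSomething, DoSomethingElse, and SomeException names from the discussion, with an outer handler added so the rethrown exception has somewhere to land:

```csharp
using System;

class SomeException : Exception { }

class Program
{
    static void DoSomething()
    {
        throw new SomeException();
    }

    static void DoSomethingElse()
    {
        Console.WriteLine("DoSomethingElse ran");
    }

    static void Main()
    {
        try
        {
            try
            {
                DoSomething();
            }
            catch (SomeException)
            {
                // Runs only when DoSomething() throws SomeException,
                // unlike a finally block, which would always run.
                DoSomethingElse();
                throw; // propagate up the call stack unchanged
            }
        }
        catch (SomeException)
        {
            Console.WriteLine("handled further up the call stack");
        }
    }
}
```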


In the above code, I only want DoSomethingElse() to execute if DoSomething() throws an exception of type SomeException. I can’t put DoSomethingElse() in a finally block because then it would always get called and not just when the exception is thrown. I don’t need to do anything with SomeException because I am propagating it up the callstack via the throw keyword to let some other method handle it.

But now, as I am stepping through the code in the debugger, I may actually want to examine SomeException when the debugger reaches the line DoSomethingElse(). Typically I would have to rewrite the code like so:

catch(SomeException e)

Just so I can examine the exception now stored in the variable e. This is plain dumb and Richter points out why in a little tip in his book. You can use the debugger variable $exception provided by the Visual Studio.NET Debugger to examine the exception in a catch block. I wish I had known about this a while ago.

humor comments edit

As you may well know, today is June 6, 2006, or in shorthand notation 6/6/06, the mark of the beast. As the Church Lady would say, “mmmmm, isn’t that special?”

Dana Carvey as the Church Lady

This day reminds me of a song I heard a loooong time ago called “Sweat Loaf” by the Butthole Surfers (pleasant name). It starts off with a nice conversation between an all-American boy and his all-American dad.

  1. Son

    Daddy?
  2. Dad

    Yes, son?

  3. Son

    What does regret mean?

  4. Dad

    Well, a funny thing about regret is that it’s better to regret something you have done than to regret something you haven’t done… Oh, and if you see your mother this weekend, remember to tell her: SATAN, SATAN, SATAN!

So in commemoration of this numerologically significant day, be sure to show the evil horns gesture to somebody today and include a little head banging. Our President will lead us with a showing of the horns.

Bush Horns

I always knew he was in with the Devil

tdd comments edit

If you’ve read my blog for any length of time, you know that I am a fan of the MbUnit generative unit test framework.

What I haven’t been a fan of is linking to the MbUnit website. If you want to refer someone to NUnit, you simply link to its site. But if you want to refer someone to MbUnit, you have to type out this monstrosity:

So in addition to complaining about it, I decided to do something about it. I purchased a domain and pointed it at their website. Now when I mention MbUnit in a post, I can spare my fingers a few keystrokes.

One other issue I hope this helps solve is that the MbUnit website comes up second in Google search results; first is its old hosting location. Hopefully everyone will start updating their links to point to the new domain instead and help solve that issue.

open source, blogging comments edit

Recently I highlighted a site named DotNetKicks, which is like Digg but targeted to .NET technology. In particular, I thought it was a smart move for them to share their advertising revenue with those who submit stories.

Well to make it easier to kick stories from the convenience of your favorite RSS Aggregator, I wrote an IBlogExtension plugin so that you can submit/kick/unkick stories from RSS Bandit or any RSS Aggregator that supports the IBlogExtension plugin model (.NET 1.1 must also be on the machine).

Just download and unzip the extension to your plugins directory. The default location for RSS Bandit would be c:\Program Files\RssBandit\plugins.

Once you have it installed, restart RSS Bandit and right click on any feed item and select the DotNetKick This - Configure… menu option.

Context Menu

This will bring up the plugin configuration dialog. You should leave the URLs as they are; I left them configurable in case the URL ever changes. Just enter your DotNetKicks username and password and click OK.

This will save your username and password in an xml settings file with the password heavily encrypted.

Now you can right click on a story to submit it to DotNetKicks. If the story hasn’t been submitted, you will get the following dialog.

Submit a Story Dialog

This form is pretty self-explanatory.

If a story has already been submitted, you will see the following dialog, which allows you to kick or unkick it (essentially adding your vote to the story or removing it). (Editor’s note: at the time of this writing, the unkick function was not working.)

Kick/Unkick a story dialog

The API for DotNetKicks was published today on Gavin Joyce’s wiki. This was quite a turnaround, as I emailed him on Friday asking if there was a web-based API. We went back and forth formulating the API, and he said he would work on it over the weekend. This morning, he sent me the URL to his wiki page describing the API! Much of the API was inspired by an existing web API.

If you want to learn to write an IBlogExtension from scratch, check out my tutorial here. In this case, I was able to get a jumpstart by using Dare’s excellent plugin as a starting point.

Another plugin I wrote a while ago is the improved w.bloggar plugin for RSS Bandit that should hopefully be included in the next release of RSS Bandit.

Once again, in case you missed the first link to this DotNetKicks plugin, [DOWNLOAD] it.

blogging comments edit

I recently learned about DotNetKicks due to a referral in my referrer logs. It is essentially a Digg knockoff, but tightly focused on the .NET community, which makes it a nice complement to Digg.

Recently, security expert Bruce Schneier wrote a piece on how security can be improved by aligning interest with capability. One example he gives is how some retail stores promise refunds if you don’t get a receipt. This keeps the employee working the register honest by aligning the store owner’s security interest with the interests of the customers. The owner is effectively hiring the customers to keep employees honest.

In a similar manner, DotNetKicks has done the same thing to promote its own growth. If you have an AdSense account, you can submit your AdSense ID and earn 50% of the advertising revenue on the site for all stories that you submit. This aligns story submitters’ interests with DotNetKicks’ interest and should go a long way toward ensuring that more stories are submitted.

Of course, the signal-to-noise ratio of submitted stories may go down as a result, but that is hardly a concern because only the stories that the community deems interesting, via the voting system, percolate to the front page and have any chance of really generating advertising revenue. The question I have is: what incentive do users have to kick stories, apart from the community benefits?

code comments edit

Scott Allen writes about a Branching and Merging primer (doc) written by Chris Birmele. It is a short but useful, tool-agnostic look at branching and merging in the abstract. This is a nice complement to my favorite tutorial on source control, Eric Sink’s Source Control HOWTO.

Another useful resource on branching strategies is Steve Harman’s guide to branching with CVS.

The primer takes a tool-agnostic look at branching and points out several branching models. One thing to keep in mind is that not every model makes use of your source control tool’s branching feature. In particular, let’s take a closer look at the Branch-per-Task model. This model is almost universally in use via what I call implicit branches, which are private and not shared among other team members.

Using a pessimistic locking source control system like Visual SourceSafe (VSS), every time you check out a file (which grants you an exclusive lock on that file), you are implicitly making a branch as soon as you edit that file. However, this is not a branch that VSS recognizes. It is merely a branch by virtue of the fact that the code on your system is not the same as the code in the repository. Also consider that other team members may be making changes to other files in the same codebase, perhaps files containing classes on which the file you are working on depends. So when you check that file back in, you are performing an implicit merge.

This type of implicit branching pretty much maps to the primer’s Branch-per-Task model of branching. Optimistic locking source control systems such as CVS and Subversion make this implicit branching and merging a bit more explicit. Rather than checking out a file, you typically update your local desktop with the latest version from the repository and just start working on files. There is no need to exclusively lock files by checking them out which only gives you the illusion of safety.

When you are ready to commit your changes back into the system, you typically get latest again and merge any changes that other team members may have committed into your local workspace. Finally, you commit your local changes (assuming everything builds) and resolve any merge conflicts (which are not very likely, since you just pulled all changes from the repository into your local workspace, unless there is a lot of repository activity going on).

The point here is to recognize that the implicit branching model (branch-per-task) is almost certainly already in use in your day to day work. It is not necessary to employ your source control’s branching feature to employ this branching model, unless you need multiple developers working on that single task. In that case, you would create an explicit branch for that task so that it can be shared. However, keep in mind that when multiple developers work on an explicit branch, the branching and merging model for that individual branch will look like the implicit branch-per-task model as I described.

comments edit

I have been looking for this for a looong time.

Jon Galloway writes about two sites created by Dan Vine that allow you to enter a URL and see a screenshot of the website as rendered by IE7 or Safari. This is extremely useful for testing web designs on alternate platforms. Much cheaper than buying a Mac to test your work in Safari.

I also want to commend Dan for such a clean design. Each site is like a well-written function: it does one thing and it does it well. A joy to use, really.

ieCapture iCapture

One thing to note, iCapture produces a screenshot of the entire page, potentially producing a tall image. At the time of this writing, ieCapture takes a screenshot of the actual browser window, which doesn’t show the entire document.

comments edit

Greg Young takes my Testing Mail Server to task and asks the question, what does it test that a mock provider doesn’t?

It is a very good question and as he points out in his blog post on the subject, it seems like a lot of overhead for very little benefit. For the most part, he is right.

In my defense, and as Greg points out, I would not start with such a test when writing email functionality. I would start with the mock email provider and follow the typical Red-Green TDD pattern of development. However, there are cases where this approach does not test enough, and this testing server was necessitated by some real-world scenarios I ran into.

For example, in some situations, it is very important to understand the exact format of the raw SMTP message that is sent. Some systems actually use email from server to server to kick off automated tasks. In that situation, it helps to know that the SMTP message is formatted as expected by the receiving server. For example, you may want to make sure that the appropriate headers are sent and that the message is not a multi-part message. This approach lets you get at the raw SMTP message in a way that the mock provider approach cannot.

A more common issue is when sending mass mailings, such as newsletters, to subscribers. At one former employer, we had real difficulty getting our emails to land in our users’ mailboxes despite adhering to applicable SPAM laws and only mailing to subscribed users.

It turns out that actually landing a mass mailing, even to users who want the email, is very tough when dealing with Hotmail, Yahoo, and AOL accounts. Something as seemingly innocuous as the X-Mailer header value can trigger the spam filters.

In this case, this very much falls under the rubric of an integration test, as I am testing the actual mailing component in use. But I am not only testing the particular mailing component; I am also testing that my code uses the mailing component correctly.

So in answer to the question, “Where’s the red bar?” The red bar comes into play when I write my unit test and assert that the X-Mailer header is missing. The green bar comes into play when I make sure to remove the header. I could probably test this with a mock object as well, but I have been burnt by mailing components that did not remove the X-Mailer header but simply set its value to blank, when I really intended it to be removed. That is not something a mock object would have told me.
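The blank-versus-removed distinction is easy to miss, so here is a small self-contained sketch (a hypothetical helper, not part of the testing mail server itself) of a check that treats an X-Mailer header set to an empty value as still present:

```csharp
using System;

class XMailerCheck
{
    // Returns true if an X-Mailer header appears anywhere in the
    // header block of a raw SMTP message. A header "set to blank"
    // (e.g. "X-Mailer: ") still counts as present, which is exactly
    // the case a mock-based test would have missed.
    static bool HasXMailerHeader(string rawMessage)
    {
        // Headers end at the first blank line.
        string headers = rawMessage.Split(
            new[] { "\r\n\r\n" }, 2, StringSplitOptions.None)[0];
        foreach (string line in headers.Split(
            new[] { "\r\n" }, StringSplitOptions.None))
        {
            if (line.StartsWith("X-Mailer:", StringComparison.OrdinalIgnoreCase))
                return true;
        }
        return false;
    }

    static void Main()
    {
        string blankValue = "From: a@example.com\r\nX-Mailer: \r\n\r\nbody";
        string removed = "From: a@example.com\r\n\r\nbody";
        Console.WriteLine(HasXMailerHeader(blankValue)); // True
        Console.WriteLine(HasXMailerHeader(removed));    // False
    }
}
```

Asserting that this returns false against the RawSmtpMessage captured by the testing server is the red-bar test described above.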

comments edit

In spirit, this is a follow-up to my recent post on unit-testing email functionality.

This probably doesn’t apply to those of you who have reached O/RM nirvana with such tools as NHibernate, etc. ADO is probably just a distant memory, not unlike those vague embarrassing recollections of soiling yourself in a public location long ago.

But I digress.

For the rest of us, we sometimes need to get knee-deep in ADO. For example, even though Subtext abstracts away the data access via the provider model, I still want to test the data provider itself, right?

Subtext has a series of methods that each call a stored procedure and return an IDataReader. The returned data reader is passed to another class, which populates entity objects using the data reader.

In my unit tests, it would be nice to have a means to attach a data reader to an in-memory object structure rather than directly to the database. That is where my StubDataReader class comes into play.

It implements the IDataReader interface and provides a quick and dirty way to create an in-memory object structure. By quick and dirty I mean that you do not need to build out a table schema first. The code to set up the stub data reader is quite simple.

Single Result Set

If you are dealing with a Data Reader that should only return one result set (which seems to be the vast majority of cases), then setting it up would look like this:

DateTime testDate = DateTime.Now;
StubResultSet resultSet 
   = new StubResultSet("col0", "col1", "col2");
resultSet.AddRow(1, "Test", testDate);
resultSet.AddRow(2, "Test2", testDate.AddDays(1));
StubDataReader reader = new StubDataReader(resultSet);

//Advance to first row.
reader.Read();

// Assertions            
Assert.AreEqual(1, reader["col0"]);
Assert.AreEqual("Test", reader["col1"]);
Assert.AreEqual(testDate, reader["col2"]);

In the above snippet, I create an instance of StubResultSet with a list of the column names. I then make a couple of calls to AddRow. Notice that AddRow takes in a param array of object instances. This is the quick and dirty part. Since the StubDataReader doesn’t require setting up a schema beforehand, it will not validate that the objects added to the columns of the rows are of the correct type. It just doesn’t have that information. But this isn’t all that important, since this class is specifically for use in unit testing scenarios.

Multiple Result Sets

Not everyone realizes this, but you can iterate over multiple result sets with a single data reader instance. Simulating that scenario is quite easy.

DateTime testDate = DateTime.Now;
StubResultSet resultSet 
   = new StubResultSet("col0", "col1", "col2");
resultSet.AddRow(1, "Test", testDate);

StubResultSet anotherResultSet 
   = new StubResultSet("first", "second");
anotherResultSet.AddRow((decimal)1.618, "Foo");
anotherResultSet.AddRow((decimal)2.718, "Bar");
anotherResultSet.AddRow((decimal)3.142, "Baz");

StubDataReader reader 
   = new StubDataReader(resultSet, anotherResultSet);

//Advance to first row.
reader.Read();

Assert.AreEqual(1, reader["col0"]);

//Advance to second ResultSet.
Assert.IsTrue(reader.NextResult(), "Expected next result set");

//Advance to first row.
reader.Read();
Assert.AreEqual((decimal)1.618, reader["first"]);
Assert.AreEqual("Foo", reader["second"]);

In this snippet, I create two StubResultSet instances and pass them to the constructor of the StubDataReader. Afterwards, you can see that the code makes sure to test that NextResult functions properly.

The code snippets above are excerpts from the unit tests I wrote for this code. Although this code is more complete than the mail server example, there are a couple of methods that haven’t been well tested because I have never run into a situation in which I needed them. I put in various comments, so feel free to improve this and let me know about it. This code is within the UnitTests.Subtext project in the Subtext solution in our Subversion repository.

You can download the code here, but as before, I do not guarantee I will update the link to have the latest code. You can access our Subversion repository for the latest.

comments edit

My wife received a free day at the Glen Ivy Hot Springs Spa from our friends Dan and Judy (the same Dan to whom my last non-geek post was dedicated).

So the four of us headed over there yesterday for a day of relaxation. The day consisted of sitting in a stinky hot sulfur mineral jacuzzi bath, then swimming in the pool, taking a nap, eating lunch, and finally covering ourselves from head to toe in mud.

I was a little too aggressive in covering myself in mud, slathering it on and getting plenty of it in my eyes. I didn’t pay attention to the memo to rub it around the eyes and not in the eyes. You don’t say!

I remember as a kid always being admonished about getting too muddy when playing outside. Now as an adult, I pay for the experience. Must be some form of latent rebellion.

comments edit

So you are coding along, riding that TDD high, when you reach the point at which your code needs to send an email. What do you do now?

You might consider writing something that looks like:

EmailMessage email = new EmailMessage();
email.FromAddress = new EmailAddress(from);
email.AddToAddress(new EmailAddress(to));
email.Subject = subject;
email.BodyText = message;

SmtpServer smtpServer = new SmtpServer(smtpServerName, port);
smtpServer.Send(email);

But you, being a TDD god(dess), know better and quickly refactor that into some sleek code that uses an EmailProvider. This ensures that your code is not tied to any specific email implementation and makes unit testing your code easier: just swap out your concrete email provider for a unit test specific email provider. Now your code looks like:

EmailProvider.Instance().Send(to, from, subject, message);
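As a rough sketch of the provider swap (IEmailProvider and TestEmailProvider are illustrative names here, not the actual Subtext types), the idea looks like this:

```csharp
using System;
using System.Collections.Generic;

// Illustrative provider abstraction, not the actual Subtext API.
interface IEmailProvider
{
    void Send(string to, string from, string subject, string message);
}

// A unit test specific provider: records messages instead of
// touching a real SMTP server.
class TestEmailProvider : IEmailProvider
{
    public readonly List<string> Sent = new List<string>();

    public void Send(string to, string from, string subject, string message)
    {
        Sent.Add(to + "|" + subject);
    }
}

class Demo
{
    static void Main()
    {
        // Calling code only ever sees the interface, so the concrete
        // provider can be swapped without changing that code.
        IEmailProvider provider = new TestEmailProvider();
        provider.Send("to@example.com", "from@example.com", "Hello", "body");

        Console.WriteLine(((TestEmailProvider)provider).Sent[0]);
    }
}
```

In the real codebase the concrete provider would be chosen by configuration rather than instantiated directly.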

But a nagging thought still pulls at the edge of your consciousness: “Shouldn’t I unit test my concrete email provider and actually make sure the email gets sent correctly?”

I certainly think so. As for the semantic arguments around whether this really constitutes an Integration Test as opposed to a Unit Test, please don’t bore me with your hang-ups. Either way, it deserves a test and what better way to test it than using something like MbUnit or NUnit.

Wouldn’t it be nice to test your email code like so?

DotNetOpenMailProvider provider = new DotNetOpenMailProvider();
NameValueCollection configValue = new NameValueCollection();
configValue["smtpServer"] = "";
configValue["port"] = "8081";
provider.Initialize("providerTest", configValue);

TestSmtpServer receivingServer = new TestSmtpServer();
receivingServer.Start("", 8081);

provider.Send("phil@example.com",   // to address (asserted below)
              "sender@example.com", // from address (illustrative)
              "Subject to nothing", 
              "Mr. Watson. Come here. I need you.");

// So Did It Work?
Assert.AreEqual(1, receivingServer.Inbox.Count);
ReceivedEmailMessage received = receivingServer.Inbox[0];
Assert.AreEqual("phil@example.com", received.ToAddress.Email);

That there code starts up a mail server, sends an email to it, and then checks that the mail server received the email. It also quickly checks the to address.

This is a snippet of an actual unit test within the Subtext codebase.

A long while ago I discovered a wonderful .NET based freeware mail server written by Ivar Lumi. I decided to write a wrapper specifically for unit testing scenarios. I added the TestSmtpServer to a new project named Subtext.UnitTesting.Servers in the Subtext VS.NET solution.

The wrapper parses incoming SMTP messages and adds a ReceivedEmailMessage instance to the Inbox custom collection. This makes it easy to quickly examine the email messages sent via SMTP in your unit test.

As this is a very early draft, there are some key limitations. I have yet to implement multi-part messages and attachments in the object model. I also punted on dealing with multiple to addresses. However, the ReceivedEmailMessage class does have a RawSmtpMessage property you can examine. For now, it works very well for simple text based emails.

Over time, I hope to implement these more complicated testing features as the need arises. However, if you find this useful and would like to contribute, please do!

If you want to view the latest code, check out these instructions for downloading the latest Subtext code using Subversion.

Or you can simply download this one project here, though keep in mind that I will be updating this project, but not necessarily this link to the project.

Since the project is specifically for unit testing purposes, I went ahead and embedded the unit tests for this server within the project itself using MbUnit references. However, you can simply swap out the assemblies and references to use NUnit if that is your preference.