
A few weeks ago I felt burned out and was in sore need of a vacation. I suggested to my wife that we take the kids somewhere and she sagely noted that taking the kids anywhere at their ages is not a vacation. Perhaps I should go somewhere on my own.

Like a deserted lighthouse! Did I mention she’s the best wife ever?

I decided to take a week off and mix a bit of “staycation” with my vacation. Spend a few days at home and maybe a couple of days away.

The first thing I did was call up Hanselman who was game for a trip to Los Angeles for E3. Unfortunately that fell through when we couldn’t score tickets (yes, we’re cheap). We did find some guy on Craigslist who could score us cheap tickets, but his badges looked hand-drawn and made me a wee bit suspicious. I seriously doubt E3 uses Comic Sans as their badge font.

After much deliberation, I decided to take a trip to the nearby San Juan Island for two nights. The plan was to go sea kayaking, but based on the glowing reports of my co-worker Brad Wilson, I also decided to try ziplining.

Kenmore Air

There are two ways to get to San Juan Island: take a ferry or fly. Well, you could also try to swim, but I wouldn’t much recommend it. A ferry is much cheaper, but it’ll take you three hours to get there. Not only that, the view doesn’t compare to that of an airplane that takes off and lands on water. So I went with a sea plane, figuring taking off and landing on water would be really fun.

The plane was cozy (aka small) which allowed me to sit right behind the cockpit and enjoy a panoramic view during the flight to the island. The landing was whisper soft and I barely noticed it.

So I wonder what this button does.

I even had the opportunity to help out by holding the mooring rope for the pilot while he got in to start the plane and take off. I thought about holding onto the rope for an exhilarating ride, but my better judgment kicked in.

You’re going to tell me when to let go right? RIGHT?!

Friday Harbor House

It was a short walk to the Friday Harbor House, my lodgings for the trip. I splurged a bit for a room with a view overlooking the harbor, affording me a view of the incoming sea planes and ferries. I also had a view of the folks dining in the garden just outside the restaurant downstairs under my balcony.

The view of the harbor from my room.

And their view was supplemented by a direct view into my room, keen as I was to keep my window open. Fortunately (for them or for me, I can’t decide), I noticed this before having the occasion to create a very awkward moment.

Unlike most vacations, I actually did spend a lot of time in my room curled up with a good book while enjoying a nice view, so the extra money for the view was well spent.

As a bonus, I learned that the restaurant just under my room was considered by many on the island to be one of the better restaurants.


Ziplining

If you find yourself on San Juan Island, I highly recommend setting aside a bit of time (around three hours) to go ziplining with Zip San Juan. It’s a small operation run by a husband and wife team who exude competence while being extremely friendly. Pat even fashions himself to be a bit of a comedian. I’ll only say I was indeed entertained on the drive there and back.

You better believe I’m not letting go of this rope.

The group I ended up with was a rambunctious extended family on a trip from their home state of Ohio. They graciously treated me as part of their family which apparently means enduring sarcastic quips for the duration of the trip.

Flinging yourself off of a platform bolted 50 feet up a tree while harnessed to a cable is surprisingly fun. Unless you’re deathly afraid of heights. We ran through eight different lines, each more interesting than the previous.

Sea Kayaking

For sea kayaking I went with Discovery Sea Kayaks. They are an outfit that is committed to keeping the size of their tours small. My tour had five folks, not counting the guide. I saw another group with what looked like 12 people.

A tour starts with a brief tutorial on safety and proper paddling before heading out in a set of double kayaks. We set out on the west side of the island with a nice view of Canada across the sound. We didn’t catch any whales but did see jellyfish, sea lions, and some Dall’s porpoises.

Who said anything about having to actually paddle?

I was fortunate to be sharing a kayak with the guide, which ensured I wouldn’t have any problems hearing him as he regaled us with interesting stories about the island’s history and our natural surroundings.

I spent two days and two nights on the Island before returning home feeling much refreshed. I look forward to the next opportunity to take such a trip and highly recommend it if you feel yourself being weighed down by work.


As a web guy, I’ve slung more than my fair share of angle brackets over the tubes of the Internet. Soon after its release, the Razor syntax became my favorite way of generating those angle brackets. But its usefulness is not limited to just the web.

The ASP.NET team designed Razor to generate HTML markup without being tightly coupled to ASP.NET. This opens up the possibility of using Razor in many contexts other than just a web application.

For example, the help documentation for NuGet.exe is written in Markdown and produced by NuGet.exe itself: NuGet.exe reflects over its own commands and uses a Razor template to generate the properly formatted output.

The check-in that enabled this caught my eye and prompted me to write this blog post as it’s a very clean approach. I’ll show you how to do the same thing in no time at all.


The first step is to install the RazorGenerator extension from the Visual Studio Extension Gallery.

If you haven’t used the Extension Gallery before, within Visual Studio click on the Tools > Extension Manager menu option to launch the Extension Manager dialog. Select the Online tab and type in “RazorGenerator” (without the quotes) in the upper right search bar.

Make sure to install the one named “Razor Generator” (not to be confused with “Razor Single File Generator for MVC”).


Create your application

For my sample application, I created a simple console application and added a reference to the following assemblies:

  • System.Web.WebPages.dll
  • System.Web.Helpers.dll
  • System.Web.Razor.dll

I then added a new text file and named it RazorTemplate.cshtml. You can name yours whatever you want of course.

Make sure to set the Custom Tool for the CSHTML file to be “RazorGenerator”. To do that, simply right click on the file and select the Properties menu option. Type in “RazorGenerator” (sans quotes) in the field labeled Custom Tool.


I added the following code within the CSHTML file:

@* Generator : Template TypeVisibility : Internal *@
@functions {
  public dynamic Model { get; set; }
}
@foreach (var item in Model) {
  <li>@item.Name (@item.Id)</li>
}

That first line is a generator declaration. It’s required by the Razor Generator. I chose to make the generated template class internal.

The next line starts a functions block. I specify a property for the template named Model in there. If you’re not a fan of the dynamic keyword, please don’t freak out. At least not yet.

I simply chose a dynamic property for the purposes of demonstration, but I could have just as easily made it a strongly typed property. Well, not quite as easily, as I would have had to create another type first. But you get the idea.

In fact, I could have added multiple properties to this template if I so desired. Properties and methods added here will show up in the generated template class.

The next section is simply the usual Razor syntax markup you know and love, written against the property I defined. In case you’re out of practice with Razor, be sure to check out the C# Razor Syntax Quick Reference I wrote a while back.

Render the template

Now all we need to do is instantiate the template, populate the properties we defined in the template with real values, and we’re done!

So what exactly are we instantiating? The steps we took up until now result in the Razor file generating a template class. If you expand the CSHTML file, you can see the generated class.


That’s the class we need to instantiate. Here’s some code I added in Program.cs that makes use of this generated template class.

class Program {
    static void Main(string[] args) {
        var template = new RazorTemplate {
            Model = new[] {
                new {Name = "Scott", Id = 1},
                new {Name = "Steve", Id = 2},
                new {Name = "Phil", Id = 3},
                new {Name = "David", Id = 4}
            }
        };
        Console.WriteLine(template.TransformText());
    }
}

The code is very straightforward. It simply creates an instance of the RazorTemplate class and sets the Model property (which is the property I defined within the template) to an array of anonymous objects.

Again, for demonstration purposes, I’m using a dynamic property to access anonymous objects. You can just as well pass in and render strongly typed properties.

After instantiating the template instance, we simply call the TransformText method on it and write the response to the console.
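Assuming nothing else is in the template, the console output should look roughly like this (modulo whitespace):

```
<li>Scott (1)</li>
<li>Steve (2)</li>
<li>Phil (3)</li>
<li>David (4)</li>
```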


Easy as stepping on a Lego block in the dark!

Note that using Razor as a general text templating language might not always produce the best results. It was heavily geared towards rendering markup (aka angle brackets), which it’s very good at. Your mileage may vary when attempting to render other types of textual output.

In a following post, I’ll show you a cool way I’m using this technique for a library I’ve been working on meant to demonstrate some cool internals of ASP.NET MVC.

Some of what I’ve shown here has been shown before in the context of ASP.NET MVC. Those other posts are worth reading as well. For example…

I hope you find this useful for your text templating needs!

By default, ASP.NET MVC leverages Data Annotations to provide validation. The approach is easy to get started with and allows the validation applied on the server to “float” to the client without any extra work.

However, once you get localization involved, using Data Annotations can really clutter your models. For example, the following is a simple model class with two properties.

public class Character {
  public string FirstName { get; set; }
  public string LastName { get; set; }
}

Nothing to write home about, but it is nice, clean, and simple.  To make it more useful, I’ll add validation and format how the properties are displayed.

public class Character {
  [Display(Name="First Name")]
  public string FirstName { get; set; }
  [Display(Name="Last Name")]
  public string LastName { get; set; }
}

That’s busier, but not horrible. It sure is awfully Anglo-centric, though. I’ll fix that by making sure the property labels and error messages are pulled from a resource file.

public class Character {
  [StringLength(50, ErrorMessageResourceType = typeof(ClassLib1.Resources),
    ErrorMessageResourceName = "Character_FirstName_StringLength")]
  public string FirstName { get; set; }

  [StringLength(50, ErrorMessageResourceType = typeof(ClassLib1.Resources),
    ErrorMessageResourceName = "Character_LastName_StringLength")]
  public string LastName { get; set; }
}

Wow! I don’t know about you, but I feel a little bit dirty typing all that in. Allow me a moment as I go wash up.

So what can I do to get rid of all that noise? Conventions to the rescue! By employing a simple set of conventions, I should be able to look up error messages in resource files as well as property labels without having to specify all that information. In fact, by convention I shouldn’t even need to use the DisplayAttribute.

I wrote a custom PROOF OF CONCEPT ModelMetadataProvider that supports this approach. More specifically, mine is derived from the DataAnnotationsModelMetadataProvider.

What Conventions Does It Apply?

The nice thing about this convention-based model metadata provider is that it allows you to specify as little or as much of the metadata as you need, and it fills in the rest.

Providing minimal metadata

For example, the following is a class with one simple property.

public class Character {
  public string FirstName { get; set; }
}

When displayed as a label, the custom metadata provider looks up the resource key {ClassName}_{PropertyName} and uses the resource value as the label. For example, for the FirstName property, the provider uses the key Character_FirstName to look up the label in the resource file. I’ll cover how the resource type is specified later.

If a value for that resource is not found, the code falls back to using the property name as the label, but splits it using Pascal/Camel casing as a guide. Therefore in this case, the label is “First Name”.

The error message for a validation attribute uses a resource key of {ClassName}_{PropertyName}_{AttributeName}. For example, to locate the error message for a RequiredAttribute, the provider finds the resource key Character_FirstName_Required.
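As a sketch of what these conventions amount to (the class and method names below are mine for illustration, not the actual provider code):

```csharp
using System.Text.RegularExpressions;

// Hypothetical helper illustrating the convention-based resource keys.
public static class ConventionSketch {
    // Character + FirstName => "Character_FirstName" (label lookup)
    public static string LabelKey(string className, string propertyName) {
        return className + "_" + propertyName;
    }

    // Character + FirstName + RequiredAttribute => "Character_FirstName_Required"
    public static string ErrorKey(string className, string propertyName, string attributeName) {
        return LabelKey(className, propertyName) + "_"
            + Regex.Replace(attributeName, "Attribute$", string.Empty);
    }

    // Fallback when no resource entry exists: "FirstName" => "First Name"
    public static string SplitPascalCase(string propertyName) {
        return Regex.Replace(propertyName, "([A-Z])", " $1").Trim();
    }
}
```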

Partial Metadata

There may be cases where you can provide some metadata, but not all of it. Ideally, the metadata that you don’t supply is inferred based on the conventions. Going back to the previous example:

public class Character {
  [Required(ErrorMessageResourceType = typeof(MyResources.Resource))]
  [StringLength(50, ErrorMessageResourceName = "StringLength_Error")]
  [Display(Name = "First Name")]
  public string FirstName { get; set; }
}

Notice that the first attribute only specifies the error message resource type. In this case, the specified resource type overrides the default resource type, but the resource key is still inferred by convention (aka Character_FirstName_Required).

In contrast, notice that the second attribute, a StringLengthAttribute, only specifies the resource name and doesn’t specify a resource type. In this case, the specified resource name is used to look up the error message using the default resource type. As you might expect, if the ErrorMessage property is specified, it takes precedence over the conventions.

The DisplayAttribute works slightly differently. By default, the Name property is used as a resource key if a resource type is also specified. If no resource type is specified, the Name property is used directly. In the case of this convention-based provider, an attempt to look up a resource value using the Name property as a resource key always occurs before falling back to the default behavior.


One detail I haven’t covered yet is what resource type is used to find these messages? Is that determined by convention?

Determining this by convention would be tricky so it’s the one bit of information that must be explicitly specified when configuring the provider itself. The following code in Global.asax.cs shows how to configure this.

ModelMetadataProviders.Current = new ConventionalModelMetadataProvider(
  requireConventionAttribute: false,
  defaultResourceType: typeof(MyResources.Resource)
);

The model metadata provider’s constructor has two arguments used to configure it.

Some developers will want the conventions to apply to every model, while others will want to be explicit and have models opt in to this behavior. The first argument, requireConventionAttribute, determines whether the conventions only apply to classes with the MetadataConventionsAttribute applied.

The explicit folks will want to set this value to true so that only classes with the MetadataConventionsAttribute applied to them (or classes in an assembly where the attribute is applied to the assembly) will use these conventions.

The attribute can also be used to specify the resource type for resource strings.

The second argument specifies the default resource type to use for resource strings. Note that this can be overridden by any attribute that specifies its own resource type.

Caveats, Issues, Potholes

This code is something I hacked together and there are a few issues to consider that I could not easily work around. First of all, the implementation has to mutate properties of attributes. In general, this is not a good thing to do because attributes tend to be global. If other code relies on the attributes having their original values, this could cause issues.

I think for most ASP.NET MVC applications (in fact most web applications period) this will not be an issue.

Another issue is that the conventions don’t work for implied validation. For example, if you have a property of a simple value type (such as int), the DataAnnotationsValidatorProvider supplies a RequiredValidator to validate the value. Since this validator didn’t come from an attribute, it won’t use my convention based lookup for its error messages.

I thought about making this work, but the hooks I need to do it without a large amount of code don’t appear to be there. I’d have to write my own validator provider (as far as I can tell) or register my own validator adapters in place of the default ones. I wasn’t up to the task just yet.

Try it out

  • NuGet Package: To try it in your application, install it using NuGet: Install-Package ModelMetadataExtensions
  • Source Code: The source code is up on GitHub.

It only feels like yesterday that we shipped ASP.NET MVC 3 followed by a release of updated Visual Studio tooling for ASP.NET MVC 3. But we’re not ones to sit on our hands for long and are busy at work on ASP.NET MVC 4.

In fact, almost immediately after shipping ASP.NET MVC 3, we started working through our backlog of bugs at the same time that we started general planning for the next major version.

Today, I’ve published the result of that planning in the form of a high-level roadmap for ASP.NET MVC 4.

There’s an important disclaimer I want to highlight in the roadmap:

It’s important to understand that we are in the early stages of development on ASP.NET MVC 4 and that this roadmap is a planning document for the next release. It is not a specification of what is to come. We hope to implement most or all of the features listed here, but there are no guarantees. Plans can change. And you can help change them! Please visit our forums to provide feedback on our plans so that we have a better picture of what you want to see in the next release.

This roadmap is more detailed than roadmaps that we’ve written in the past. My hope is that it provides enough of a taste of the features to come that we can get feedback even earlier on features that we have yet to implement.

One of the cool new features I want to highlight is the feature we’re calling “Recipes”. In brief, a recipe is scaffolding on steroids. These are bits of UI delivered via NuGet for accomplishing common tasks. We put a few ideas in the roadmap, but would love to hear more ideas.

Not included in the roadmap are the many cool enhancements to Razor and other features being considered for the next version of ASP.NET Web Pages that ASP.NET MVC developers will get for free! Erik Porter (aka @humancompiler) and his team are hard at work on those features, so I won’t spoil the surprise.

UPDATE: We started a UserVoice site for ASP.NET MVC features.

ASP.NET MVC 3 introduced the ability to bind an incoming JSON request to an action method parameter, which is something I wrote about before.

For example, suppose you have the following class defined (keeping it really simple here):

public class ComicBook {
  public string Title { get; set; }
  public int IssueNumber { get; set; }
}

And you have an action method that accepts an instance of ComicBook:

public ActionResult Update(ComicBook comicBook) {
  // Do something with comicBook and return an action result
}

You can easily post a comic book to that action method using JSON.

Under the hood, ASP.NET MVC uses the DefaultModelBinder in combination with the JsonValueProviderFactory to bind that value.

A question came up on an internal mailing list recently (and I’m paraphrasing here): “Why not cut out the middle man (the value provider) and simply deserialize the incoming JSON request directly to the model (ComicBook in this example)?”

Great question! Let me provide a bit of background to set the stage for the answer.

Posting Content to an Action

There are a couple of different content types you can use when posting data to an action method.


You may not realize it, but when you submit a typical HTML form, the content type of that submission is application/x-www-form-urlencoded.

As you can see in the screenshot below from Fiddler, the contents of the form are posted as a set of name/value pairs separated by ampersands. The name and value within each pair are separated by an equals sign.

By the time you typically interact with this data (outside of model binding), it’s in the form of a dictionary-like interface via the Request.Form name/value collection.

The following screenshot shows what such a request looks like using Fiddler.
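For example, posting the ComicBook from earlier as a form submission produces a request along these lines (a hand-written illustration, not the actual Fiddler capture):

```
POST /home/update HTTP/1.1
Content-Type: application/x-www-form-urlencoded

Title=Groo&IssueNumber=101
```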


When content is posted in this format, the DefaultModelBinder calls into the FormValueProvider asking for a value for each property of the model. The FormValueProvider is a very thin abstraction over the Request.Form collection.


Another content type you can use to post data is application/json. As you might guess, this is simply JSON encoded data.

Here’s an example of a bit of JavaScript I used to post the same content as before but using JSON. Note that this particular snippet requires jQuery and a browser that natively supports the JSON.stringify method.

<script type="text/javascript">
    $(function () {
        var comicBook = { Title: "Groo", IssueNumber: 101 };
        var comicBookJSON = JSON.stringify(comicBook);
        $.ajax({
            url: '/home/update',
            type: 'POST',
            dataType: 'json',
            data: comicBookJSON,
            contentType: 'application/json; charset=utf-8'
        });
    });
</script>

When this code executes, the following request is created.


Notice that the content is encoded as JSON rather than form url encoded.
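For comparison, the same ComicBook posted as JSON carries a body along these lines (again, a hand-written illustration):

```
POST /home/update HTTP/1.1
Content-Type: application/json; charset=utf-8

{"Title":"Groo","IssueNumber":101}
```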

JSON is a serialization format, so in theory we could deserialize that post straight to a ComicBook instance. Why don’t we do that? Wouldn’t it be more efficient?

To understand why, let’s suppose we did use serialization and walk through a common scenario. Suppose someone submits the form and enters a string instead of a number for the IssueNumber field. You’d probably expect to see the following.


Notice that the model binding was able to determine that the Title was submitted correctly, but that the IssueNumber was not.

If our model binder deserialized JSON directly into a ComicBook, it would not be able to make that determination, because serialization is an all or nothing affair. When serialization fails, all you know is that the format didn’t match the type. You don’t have access to the granular details needed to provide property level validation. So all you’d be able to show your users is an error message stating that something went wrong; good luck figuring out what.

The Solution

Instead, what we really want is a way to bind each property of the model one at a time so we can determine which fields are valid and which ones are in error. Fortunately, the DefaultModelBinder already knows how to do that when working with the dictionary-like IValueProvider interface.

So all we need to do is figure out how to expose the posted JSON encoded content via the IValueProvider interface. As I wrote before, Jonathan Carter had the bit of insight that provided the solution to this problem. He realized that you could have the JSON value provider deserialize the incoming JSON post to a dictionary. Once you have a dictionary, it’s pretty easy to implement IValueProvider and the DefaultModelBinder already knows how to bind those values to a type while providing property level validation. Score!
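Here’s a simplified sketch of that idea using JavaScriptSerializer (the class and method names here are mine for illustration; the real JsonValueProviderFactory is more involved):

```csharp
using System;
using System.Collections.Generic;
using System.Web.Script.Serialization;

// Illustrative sketch only -- not the actual MVC source.
public static class JsonDictionarySketch {
    public static Dictionary<string, string> Demo() {
        var json = "{\"Title\":\"Groo\",\"IssueNumber\":\"not a number\"}";
        var serializer = new JavaScriptSerializer();

        // Deserializing straight to ComicBook would throw here, telling us
        // nothing about which property was bad. Instead, deserialize to a
        // loose dictionary of values first...
        var values = (Dictionary<string, object>)serializer.DeserializeObject(json);

        // ...then convert one property at a time, recording errors per key.
        var errors = new Dictionary<string, string>();
        object raw;
        int issueNumber;
        if (values.TryGetValue("IssueNumber", out raw)
            && !int.TryParse(Convert.ToString(raw), out issueNumber)) {
            errors["IssueNumber"] = "The value is not a valid number.";
        }
        // "Title" converts fine, so only IssueNumber ends up with an error,
        // which is exactly the property-level detail model binding needs.
        return errors;
    }
}
```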

Value Provider Aggregation

The answer I provided only tells part of the story of why this is implemented as a value provider. There’s another aspect that was illustrated by my co-worker Levi. Sadly, for someone so gifted intellectually, he has no blog, so I’ll paraphrase his words here (with a bit of verbatim copying).

As I mentioned earlier, value providers provide an abstraction over where values actually come from. Value providers are responsible for aggregating the values that are part of the current request, e.g. from the Form collection, the query string, JSON, etc. They basically say “I don’t know what a ‘FirstName’ is for or what you can do with it, but if you ask me for a ‘FirstName’ I can give you what I have.”

Model binders are responsible for querying the value providers and building up objects based on those results.  They basically say “I don’t know where directly to find a ‘FirstName’, ‘LastName’, or ‘Age’, but if the value provider is willing to give them to me then I can create a Person object from them.”

Since model binders aren’t locked to individual sources (with some necessary exceptions, e.g. HttpPostedFile), they can build objects from an aggregate of sources. If your Person type looks like this:

public class Person {
  public int Id { get; set; }
  public int Age { get; set; }
  public string FirstName { get; set; }
  public string LastName { get; set; }
}

And a client makes a JSON POST request to an action method (say, with the URL /person/edit/1234) with the following content:

{
  "Age": 30,
  "FirstName": "John",
  "LastName": "Doe"
}

The DefaultModelBinder will pull the Id value from the RouteData and the Age, FirstName, and LastName values from the JSON when building up the Person object. Afterwards, it’ll perform validation without having to know that the various values came from different sources.

Even better, if you wrote a custom Person model binder and made it agnostic as to the current IValueProvider, you’d get the correct behavior on incoming JSON requests without having to change your model binder code one tiny iota.  Neither of these is possible if the model binder is hard-coded to a single provider.

TL;DR Summary

The goal of this post was to provide a bit of detail around an interesting aspect of how ASP.NET MVC turns strings sent to a web server into strongly typed objects passed into your action methods.

Going back to the original question, the answer is simply: we use a value provider for JSON to enable property level validation of the incoming post, and also so that model binding can build up an object by aggregating multiple sources of data without having to know anything about those sources.

In May, we released a tools update for ASP.NET MVC 3 in nine languages other than English. Today I got the good news that ASP.NET MVC 3 documentation is also now available in those nine languages, which arguably is even more helpful to those learning and using ASP.NET MVC.

Our team is constantly working to improve the quality of our documentation and having docs available in multiple languages is a big part of that work.

Unfortunately, translation to Klingon is still not on the roadmap.

Update: For those wondering where the English documentation is, it’s here:


The moon goes around the earth and when it comes up on the other side, Hark! There’s a new release of NuGet! Well, this time it was more like one and a half revolutions, but I’m happy nonetheless to announce the release of NuGet 1.4.

A big thank you goes out to the many external contributors who submitted patches to this release! Your enhancements are much appreciated!

I’ve written up much more details about what’s in this release in the NuGet 1.4 Release Notes, but I’ll highlight a few choice things in this blog post.

NuGet Self-Update Notification Check

One thing you may notice immediately if you’re running NuGet 1.3 today is that the NuGet dialog itself notifies you that there’s a new version of NuGet available.

NuGet Update Check

Note: The check is only made if the Online tab has been selected in the current session.

This feature was actually added in NuGet 1.3, but obviously would not be visible until today, now that NuGet 1.4 is available.

Managing Packages Across The Solution

A lot of work in this release went into managing packages across the solution. If you’re a command-line junkie, the Package Manager Console’s Update-Package command now supports updating all packages in all projects as well as a single package across all projects.
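For example, from the Package Manager Console (the package name here is hypothetical):

```
PM> Update-Package              # updates all packages in all projects
PM> Update-Package MyLogger     # updates one package across all projects
```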

The NuGet dialog can also be launched at the solution level which makes it easy to choose a set of projects to install a package into, rather than installing a package into a project one at a time. This was a common request for those working on a large multi-project solution.

NuGet Project

What’s Next?

This blog post is just a tiny sampling of what’s new. Again, check out the release notes for more details.

We’re going to try to do better at keeping a roadmap of the next couple of releases hosted on the front page. For now, it’s very high level and general because we really only fully plan one iteration ahead.

However, we do have an idea of some of the big themes we want to focus on:

  • Simple package creation: We constantly want to lower the bar for creating and sharing code from inside and outside of Visual Studio.
  • NuGet in the Enterprise: This includes CI scenarios outside of Visual Studio, authenticated feeds, etc.
  • Flexible packaging: Includes things like assemblies that are deployed but not referenced, and vice versa.
  • Developer Workflow: We’re looking at common workflows that don’t match our own expectations and how we can support them. This also includes workflows we do know about such as the use of pre-release packages etc.

In general though, I think we can sum up all of these themes in one big theme: Make NuGet Better!

Get Involved!

If you have great ideas for NuGet, please get involved in the discussions. We try to be very responsive and we do accept external contributions as Joshua Flanagan learned and wrote about in his blog post, An opportunity for a viable .NET open source ecosystem.

Then, remembering my last experience, I figured I would at least start a discussion before giving up for the night. To my surprise, the next day it was turned into an issue – this isn’t just another Microsoft Connect black hole. After hashing out a few details, I went to work on a solution and submitted a pull request. It was accepted within a few days. Aha! This is open source. This is how it’s supposed to work. This works.

Onward to NuGet 1.5!


No, I’m not talking about my mental age.

My son turned four this past week which means I’m four years into my world domination plan. One of the gifts we gave my son was a toolbox with plastic toys so we can train him on building the mega-lasers and fortresses needed to take over the world. Turns out that before you start dominating the world, you have to start taking baby steps. And then toddler steps. And then 4-year old running terror steps.

In these past four years, I’ve learned a lot. For example, you can get a four year old to believe anything. That’s come in useful as a tool for manipulating my child to do my bidding. We were at the mall one day and he asked about a picture of a zombie with rotting teeth and without skipping a beat, my wife and I told him that’s what happens when you don’t wash your face and brush your teeth. There’s still complaining when I brush his teeth, but he’s at the sink at “two” on a three count.

Meta Blogging

The point of this Random Friday series was to get back into the flow of blogging. But I am concerned that my blog will turn entirely into random Friday blog posts. I guess it’s incentive for me to try and post some substance once a week as well. Or turn the knob down a bit and make this an every-other random Friday. The Fridays in between would be perfectly ordered and predictable Fridays and not worth blogging about.

Friday Appreciation

And for the weekly thing I appreciate, it’s the sport of Soccer. Or Football as the rest of the world calls it. It’s interesting to me how quickly my British friends jump on me when I call it “Soccer”.

Bloody ‘ell! It’s called Football you stupid Amurrrrrican.

Which is ironic because the name Soccer comes from Association Football, which was invented, where else? England. The term was shortened to “Assoc”, which forms the basis of the word Soccer. In the words of those cheesy Anti-Drug PSAs, I learned it by watching you!

My summer soccer season started this Monday and we won our first game 5 – 4. Sadly, we went from 1st place in our division in the Winter season to last place last spring, which means we were relegated to a lower division for the summer season. Hopefully we can pull ourselves back up because things are a bit chippier down here.

After each game, I’ve somehow become the chronicler of our great deeds and I write up a game report to send out to my teammates, full of timeless sports classics such as Boom Goes The Dynamite. It’s safe to say I have no future in sports writing. I toyed with the idea of posting the write-ups to my blog, but I didn’t want to turn my blog solely into a collection of Random Friday write-ups and tales of old men playing Soccer.

I’m on vacation next week so things should be quiet for me here. Unless past behavior is any indication, in which case I’ll blog a thousand technical posts. Thanks for reading. Smile, mvc comments edit

UPDATE: I have an example Really Empty project template up on GitHub you can look at. I improved on this technique a bit in that one.

When you create a new ASP.NET MVC 3 project, the new project wizard dialog contains several options for different MVC project templates:

There’s a lot of white space in that dialog. To many of you, all that unsullied territory smells like opportunity. When I talk about this dialog, I go to great pains to tell folks that, yes, you too can extend it and add your own project templates in there.

If you wanted to, you could have your own ASP.NET MVC 3 project template configured exactly the way you want. Hate the default template? Make your own!

The only problem is, I keep telling you that you can extend it, but sadly I never told you how. But that’s about to change!

I don’t expect that a large number of people will want to do this, which is one reason we haven’t spent a large amount of time making it easy (though that may change in the future). But for the few of you impatient masochists who want to add your own custom templates now, this blog post will walk you through the hacking around it takes to make it happen.

Imitation is the sincerest form of productivity

The easiest way to get started is to simply copy and modify an existing project template. For example, I looked in the following directory:

C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\ProjectTemplates\CSharp\Web\1033

on my machine and stole *ahem* borrowed the project template named MvcWebApplicationProjectTemplatev3.01.cshtml.zip. Note that the 1033 folder is for English (en-US) templates. For other languages, you may need to look in a different folder.

I then renamed it to MyProjectTemplate.cshtml.zip and extracted its contents into a folder so I could make some modifications.


When you extract the contents, you’ll want to rename the .vstemplate file to match the name of the template you chose. In my case, I renamed MvcWebApplicationProjectTemplatev3.01.cshtml.vstemplate to MyProjectTemplate.cshtml.vstemplate.

Open up the .vstemplate file in Notepad and make sure to change the TemplateID element value to something unique.
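For orientation, the top of a .vstemplate file looks roughly like the following (trimmed down; the element names come from the Visual Studio template schema, but the TemplateID value here is just an example you’d replace with your own unique ID):

```xml
<VSTemplate Version="3.0.0" Type="Project"
    xmlns="http://schemas.microsoft.com/developer/vstemplate/2005">
  <TemplateData>
    <Name>My Project Template</Name>
    <Description>A customized ASP.NET MVC 3 project template</Description>
    <ProjectType>CSharp</ProjectType>
    <TemplateID>MyCompany.MyProjectTemplate.CSharp</TemplateID>
    <!-- ...remaining TemplateData elements from the original template... -->
  </TemplateData>
  <!-- ...TemplateContent, listing every file the template contains... -->
</VSTemplate>
```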

You can change any of the contents of the template folder now, but be very careful to make sure that any additions or deletions of content are reflected in the .vstemplate file. That file is a manifest of all the files within the VSIX package that makes up the project template. Also make sure that the .csproj file reflects those changes as well, to ensure any new files you add to the template are properly referenced in the project.

Pre-installed NuGet packages

UPDATE: The upcoming NuGet 1.5 release will provide support for this feature in a way that doesn’t require the following harsh warning. Marcin Dobosz has a blog post detailing the feature.

Warning: I probably shouldn’t show you this next section and some of my co-workers may chide me on this. But if you promise to be responsible and pay close attention to the information and context I’m about to show you, I’ll do it anyway and trust you not to inundate us with support calls when this blows your hand off.

The ASP.NET MVC 3 Tools Update includes very limited support for project templates that include NuGet packages. We originally wanted it to be very extensible, but ran out of time and imposed some severe limitations on the feature, hence the caution.

If you scroll to the bottom of the .vstemplate file, you’ll notice the following section:

        <package id="jQuery" version="1.5.1" />
        <package id="jQuery.vsdoc" version="1.5.1" />
        <package id="jQuery.Validation" version="1.8.0" />
        <package id="jQuery.UI.Combined" version="1.8.11" />
        <package id="EntityFramework" version="4.1.10331.0" />
        <package id="Modernizr" version="1.7" />

That is the list of NuGet packages that the MVC 3 project template installs when you invoke the project template.

But as I mentioned, there are two major limitations:

  • The package must exist in the %ProgramFiles%\Microsoft ASP.NET\ASP.NET MVC 3\Packages folder. MVC 3 doesn’t go searching online for them.
  • The version attribute of the <package> element is required and must be an exact match.

If you are fine with these limitations, you can modify this section of your custom project template to install the NuGet packages you care about. Just make sure they exist in the MVC 3 packages folder like I mentioned.
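For example, assuming you’ve dropped a hypothetical MyCompany.WebHelpers.1.0.0.nupkg into that folder, you could add it alongside the default entries:

```xml
<package id="MyCompany.WebHelpers" version="1.0.0" />
```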

Once you are done making your changes, zip up the contents of the folder with the same file name you had before.

Registering your project template

At this point, all you need to do is copy the project template to the right location and add the appropriate registry entries. For extra credit, you can write an installer (MSI) that does all this for you.

The place to copy your template is the same place I mentioned previously, C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\ProjectTemplates\CSharp\Web\1033

Once the template is there, you’ll need to set up the correct registry settings.


Since I’m lazy, I put these registry settings in a .reg file to make it easy to install. You’ll just need to modify the settings within the .reg file to match your project template.


Windows Registry Editor Version 5.00

"Title"="My Project Template"
"Description"="This is the coolest project template EVAR MADE."



Windows Registry Editor Version 5.00

"Title"="My Project Template"
"Description"="This is the coolest project template EVAR MADE."


The important things to note are the options in the second registry section:

  • Path – Relative path from the ProjectTemplates folder. For C# projects, enter “CSharp\\Web”. For VB.NET use “VisualBasic\\Web”
  • SupportsHTML5 – Whether or not the project template supports HTML5. If set to 1, the HTML5 checkbox is enabled. That checkbox sets a project template variable, $usehtml5$. You can look at the default /Views/Shared/_Layout.cshtml inside of MvcWebApplicationProjectTemplatev3.01.cshtml.zip for an example of this.
  • SupportsUnitTests – This allows you to associate a unit test project template with your project template.
  • Template – the name of your project template file.
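Putting those options together, the values in that second section might look something like this sketch (the exact registry key path and value types should be copied from whatever the default MVC 3 templates register on your machine):

```reg
"Title"="My Project Template"
"Description"="This is the coolest project template EVAR MADE."
"Path"="CSharp\\Web"
"Template"="MyProjectTemplate.cshtml.zip"
"SupportsHTML5"=dword:00000001
"SupportsUnitTests"=dword:00000000
```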

The last step is to run the command devenv /installvstemplates to force Visual Studio to recognize the project templates.

I wrote a batch file, install.bat, which, combined with the .reg file, automates these steps.

cd %~dp0
regedit.exe /s project-template.reg
xcopy MyProjectTemplate.cshtml.zip "C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\ProjectTemplates\CSharp\Web\1033" /Y
devenv /installvstemplates

For your convenience, I packaged up the necessary files in a zip file. Unzip the file, and run install.bat and you’ll see a new project template when you create a new ASP.NET MVC 3 project.


Pretty cool, eh?

By the way, I’m working on a book about ASP.NET MVC 3 with Brad Wilson, Jon Galloway, and K. Scott Allen. We’ve re-written large portions of the book in light of the new features that were released in ASP.NET MVC 3. If you’re interested, feel free to pre-order our book!

humor, personal comments edit

It’s that time of year at Microsoft when managers are busily writing reviews of their reports and preparing for the big stack ranking.

Yesterday, my manager sent out an email asking his reports to email him with their accomplishments in the past year to help jog his memory. This arms him with important information when he goes to the mat for us arguing why we’re more deserving of a higher ranking than some other manager’s sad report.

Here was my response. In the past year, I…

  • Escaped from a black hole. Twice. I forgot my jacket in there and had to go get it.
  • Discovered an albino polar bear.
  • Proved Fermat’s Last Theorem as well as his penultimate theorem.
  • Found Waldo. And made him change out of that ridiculous shirt and hat. Discovered he’s really Harry Potter.
  • Unified gravity and quantum mechanics while watching Jersey Shore to make it a challenge.
  • Invented cold fusion as well as luke warm fusion.
  • Passed the Turing test using a computer made of Legos.
  • Finished World of Warcraft on an Atari 2600.
  • Counted all the elements within an uncountable set of Hilbert spaces.
  • Hunted a magical unicorn with my bare hands. Drank its blood. And now I’m magical.

Not to be outdone, Steve Sanderson replied with the following:

Oh yeah? That’s nothing. I successfully submitted an expense claim using the online expenses tool.

Color me impressed!

Friday Appreciation

And for the weekly thing I appreciate, this week I appreciate the word hyperbole. It sounds like hyperbola, but is a totally different thing. Be sure not to confuse the two.

What I did above is hyperbole, while this is a hyperbola.


Have a nice weekend!, mvc, nuget, code comments edit

At the risk of getting punched in the face by my friend Miguel, I’m not afraid to admit I’m a fan of responsible use of dependency injection. However, for many folks, attempting to use DI runs into a roadblock when it comes to ASP.NET HttpModules.

In the past, I typically used “Poor Man’s DI” for this. I wasn’t raised in an affluent family, so I guess I don’t have as much of a problem with this approach as others do.

However, when the opportunity for something better comes along, I’ll take it, Daddy Warbucks. I was refactoring some code in Subtext when it occurred to me that the new ability to register HttpModules dynamically using the PreApplicationStartMethodAttribute could come in very handy.

Unfortunately, the API only allows for registering a module by type, which means the module requires a default constructor. However, as with many problems in computer science, the solution is another layer of indirection.

In this case, I wrote a container HttpModule that itself calls into the DependencyResolver feature of ASP.NET MVC 3 in order to find and initialize the http modules registered via your IoC/DI container. The approach I took is very similar to one that Mauricio Scheffer blogged about a while ago.

using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Mvc;
using HttpModuleMagic;
using Microsoft.Web.Infrastructure.DynamicModuleHelper;

[assembly: PreApplicationStartMethod(typeof(ContainerHttpModule), "Start")]
namespace HttpModuleMagic
{
  public class ContainerHttpModule : IHttpModule
  {
    public static void Start()
    {
      // Registers this module without any web.config entry.
      DynamicModuleUtility.RegisterModule(typeof(ContainerHttpModule));
    }

    Lazy<IEnumerable<IHttpModule>> _modules
      = new Lazy<IEnumerable<IHttpModule>>(RetrieveModules);

    private static IEnumerable<IHttpModule> RetrieveModules()
    {
      return DependencyResolver.Current.GetServices<IHttpModule>();
    }

    public void Dispose()
    {
      foreach (var module in _modules.Value)
      {
        var disposableModule = module as IDisposable;
        if (disposableModule != null)
        {
          disposableModule.Dispose();
        }
      }
    }

    public void Init(HttpApplication context)
    {
      foreach (var module in _modules.Value)
      {
        module.Init(context);
      }
    }
  }
}
The code is pretty straightforward, though there’s a lot going on here. At the top of the file we use the PreApplicationStartMethodAttribute, which allows the http module to register itself! Just reference the assembly containing this code and you’re all set to go. No mucking around with web.config!

Note that this code does require that your application has the following two assemblies in bin:

  1. System.Web.Mvc.dll 3.0
  2. Microsoft.Web.Infrastructure.dll 1.0

The nice part about this is after referencing this assembly, I can simply register the Http Modules using my favorite DI container and I’m good to go. For example, I installed the Ninject.Mvc3 package and added the following Subtext http module bindings:


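As a sketch of what those bindings look like with the Ninject.Mvc3 package (the module class names below are hypothetical stand-ins, not actual Subtext types), inside the RegisterServices method the package generates:

```csharp
private static void RegisterServices(IKernel kernel)
{
    // Each binding adds an IHttpModule that ContainerHttpModule
    // will discover via DependencyResolver and initialize.
    kernel.Bind<IHttpModule>().To<BlogRequestModule>();
    kernel.Bind<IHttpModule>().To<CompressionModule>();
}
```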
There is one caveat I should point out. You’ll notice that when the container http module is disposed, Dispose is called on each of the registered http modules.

This could be problematic if you happen to register them in singleton scope. In my case, all of my modules are stateless and the Dispose method is a no-op, which in general is a good idea unless you absolutely need to hold onto state.

If your modules do hold onto state and need to be disposed of, you’ll have to be careful to scope your http modules appropriately. It’s possible for multiple instances of your http module to be created in an ASP.NET application.

DI for a Single Http Module

Just in case your DI container doesn’t support registering multiple instances of a type (in other words, it doesn’t support the DependencyResolver.GetServices call), or it can’t handle the scoping properly and your http module holds onto state that needs to be disposed at the right time, I wrote another class for registering an individual module while still allowing your DI container to hook into the creation of that one module.

In this case, you won’t be using DI to register the set of http modules. But you will be using it to create instances of the modules that you register.

Here’s the class.

using System;
using System.Web;
using System.Web.Mvc;

namespace HttpModuleMagic
{
  public class ContainerHttpModule<TModule>
    : IHttpModule where TModule : IHttpModule
  {
    Lazy<IHttpModule> _module = new Lazy<IHttpModule>(RetrieveModule);

    private static IHttpModule RetrieveModule()
    {
      return DependencyResolver.Current.GetService<TModule>();
    }

    public void Dispose()
    {
      _module.Value.Dispose();
    }

    public void Init(HttpApplication context)
    {
      _module.Value.Init(context);
    }
  }
}
This module is much like the other container one, but it only wraps a single http module. You would register it like so:


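For example, using the WebActivator package, the registration might be sketched like this (ModuleBootstrapper and MyModule are made-up names for illustration):

```csharp
[assembly: WebActivator.PreApplicationStartMethod(
    typeof(MyApp.ModuleBootstrapper), "Start")]

namespace MyApp
{
    public static class ModuleBootstrapper
    {
        public static void Start()
        {
            // Register the single-module container; the DI container
            // supplies the actual MyModule instance at runtime.
            Microsoft.Web.Infrastructure.DynamicModuleHelper.DynamicModuleUtility
                .RegisterModule(typeof(HttpModuleMagic.ContainerHttpModule<MyModule>));
        }
    }
}
```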
In this case, you’d need to set up your own PreApplicationStartMethod attribute or use the WebActivator.

And of course, I created a little NuGet package for this.

Install-Package HttpModuleMagic

Note that this requires that you install it into an application with the ASP.NET MVC 3 assemblies.

personal comments edit

I’m reading through the archives of a blog where the author posts something random every Friday (yesterday was Thursday, and tomorrow is Saturday). His Friday posts are completely unrelated to the main theme and content of his blog.

I like that idea a lot. I don’t blog as much as I used to mostly because I feel the need to spend so much time on each blog post. A lot of the posts I write take a bit of research and experimentation before I’m ready to post them.

But a random thought? I can pull one of those out of my ascot any day of the week, and twice on Friday. But I’ll only do it once.

And yes, thanks for asking, but the thought has occurred to me that I already have another medium where I post random thoughts 7 days a week, Twitter (I’m @haacked on Twitter).

But my twist on this is that every Friday, I’ll post something random, funny, amusing, or whatever to this blog, and I’ll use more than 140 characters. But I’ll always end the post with something I appreciated either during the week or in general.

This Friday’s random thought is about starting a random thought Friday blog series and whether this will end up being a one post series like the rest. So there, I’ve done that part.

And the thing I appreciate this past week is how nice it was to take a day off and spend it with my wife. Oh, and Instagram. I appreciate Instagram a lot. Perhaps too much. I’ll try easing off the Tilt-Shift from now on.


Here’s where I violate my wife’s privacy and post a picture of her on a ferry to Bainbridge Island on my blog. We had a really nice outing on Tuesday., mvc, code comments edit

When you build an ASP.NET MVC 3 application and are ready to deploy it to your hosting provider, there are a set of assemblies you’ll need to include with your application for it to run properly, unless they are already installed in the Global Assembly Cache (GAC) on the server.

In previous versions of ASP.NET MVC, this set of assemblies was rather small. In fact, it was only one assembly, System.Web.Mvc.dll, though in the case of ASP.NET MVC 1.0, if you didn’t have SP1 of .NET 3.5 installed, you would have also needed to deploy System.Web.Abstractions.dll and System.Web.Routing.dll.

But ASP.NET MVC 3 makes use of technology shared with the new ASP.NET Web Pages product such as Razor. If you’re not familiar with ASP.NET Web Pages and how it fits in with Web Matrix and ASP.NET MVC, read David Ebbo’s blog post, How WebMatrix, Razor, ASP.NET Web Pages, and MVC fit together.

If your server doesn’t have ASP.NET MVC 3 installed, you’ll need to make sure the following set of assemblies are deployed in the bin folder of your web application:

  • Microsoft.Web.Infrastructure.dll
  • System.Web.Helpers.dll
  • System.Web.Mvc.dll
  • System.Web.Razor.dll
  • System.Web.WebPages.Deployment.dll
  • System.Web.WebPages.dll
  • System.Web.WebPages.Razor.dll

In this case, it’s not as simple as looking at your list of assembly references and setting Copy Local to True as I’ve instructed in the past.

As you can see in the following screenshot, not every assembly is referenced. Not all of these assemblies are meant to be programmed against, so it’s not necessary to actually reference each of them. They just need to be available on the machine, either from the GAC or in the bin folder.


But the Visual Web Developer team has you covered. They added a feature specifically for adding these deployable assemblies. Right click on the project and select Add Deployable Assemblies and you’ll see the following dialog.


When building an ASP.NET MVC application, you only need to check the first option. Ignore the fact that the second one says “Razor”. “ASP.NET Web Pages with Razor syntax” was the official full name of the product we simply call ASP.NET Web Pages now. Yeah, it’s confusing.

Note that there’s also an option for SQL Server Compact, but that’s not strictly necessary if you’ve installed SQL Server Compact via NuGet.

So what happens when you click “OK”?


A special folder named _bin_deployableAssemblies is created and the necessary assemblies are copied into this folder. Web projects have a built-in build task that copies any assemblies in this folder into the bin folder when the project is compiled.

Note that this dialog did not add any assembly references to these assemblies. That ensures that the types in these assemblies don’t pollute Intellisense, while still being available to your deployed application. If you actually need to use a type in one of these assemblies, you’re free to reference them.

So here’s the kicker. If you’re building a web application, and you need an assembly deployed but don’t want it referenced and don’t want it checked into the bin directory, you can simply add this folder yourself and put your own assemblies in here.

If you’ve ever run into a problem where an ASP.NET MVC site you developed locally doesn’t work when you deploy it, this dialog may be just the ticket to fix it.

open source, nuget comments edit

Most developers I know are pretty anal about the formatting of their source code. I used to think I was pretty obsessive compulsive about it, but then I joined Microsoft and faced a whole new level of OCD (Obsessive Compulsive Disorder). For example, many require all using statements to be sorted and unused statements to be removed, which was something I never cared much about in the past.

There’s no keyboard shortcut that I know of for removing unused using statements. Instead, right click in the editor and select Organize Usings > Remove and Sort in the context menu.


In Visual Studio, you can specify how you want code formatted by launching the Options dialog via Tools > Options and then selecting the Text Editor node. Look under the language you care about and you’ll find multiple formatting options providing hours of fun fodder for religious debates.


Once you have the settings just the way you want them, you can select Edit > Advanced > Format Document (or simply use the shortcut CTRL+K, CTRL+D) to format the document according to your conventions.

The problem with this approach is it’s pretty darn manual. You’ll have to remember to do it all the time, which, if you really have OCD, is probably not much of a problem.

However, for those who keep forgetting these two steps and would like to avoid the wrath of nitpicky code reviewers (try submitting a patch to NuGet to experience the fun), you can install Power Commands for Visual Studio via the Visual Studio Extension Manager, which provides an option to both format the document and sort and remove using statements every time you save the document.

I’m actually not a fan of having using statements removed on every save because I save often and it tends to remove namespaces containing extension methods that I will need, but haven’t yet used, such as System.Linq.

Formatting Every Document

Also, if you have a large solution with many collaborators, the source code can start to drift away from your OCD ideals over time. That’s why it would be nice to have a way of applying formatting to every document in your solution.

One approach is to purchase ReSharper, which I’m pretty sure can reformat an entire solution and adds a lot more knobs you can tweak for the formatting.

But for you cheap bastards, there are a couple of free approaches you can take. One approach is to write a macro, like Brian Schmitt did. His doesn’t sort and remove using statements, but that’s a one-line addition.

Of course, the approach I was interested in trying was to use Powershell to do it within the NuGet Package Manager Console. A couple nights ago I was chatting with my co-worker and hacker extraordinaire, David Fowler, way too late at night about doing this and we decided to have a race to see who could implement it first.

I knew I had no chance unless I cheated so I wrote this monstrosity (I won’t even post it here, I’m so ashamed). David calls it “PM code”, which in this case was well deserved as it was simply a proof of concept, but also because it’s wrong: it doesn’t traverse the files recursively. But hey, I was first! And I at least gave him the code needed to actually format the document.

It was very late and I went to sleep knowing in the morning, I’d see something elegant from David. I was not disappointed as he posted this gist.

He wrote a generic command named Recurse-Project that recursively traverses every item in every project within a solution and calls an action against each item.

That allowed him to easily write Format-Document which leverages Recurse-Project and automates calling into Visual Studio’s Format Document command.

function Format-Document {
  param([parameter(ValueFromPipelineByPropertyName = $true)][string[]]$ProjectName)
  Process {
    $ProjectName | %{
      Recurse-Project -ProjectName $_ -Action {
        param($item)
        if($item.Type -eq 'Folder' -or !$item.Language) { return }
        $win = $item.ProjectItem.Open('{7651A701-06E5-11D1-8EBD-00A0C90F26EA}')
        if ($win) {
          Write-Host "Processing `"$($item.ProjectItem.Name)`""
          [System.Threading.Thread]::Sleep(100)
          $win.Activate()
          $item.ProjectItem.Document.DTE.ExecuteCommand('Edit.FormatDocument')
          $item.ProjectItem.Document.DTE.ExecuteCommand('Edit.RemoveAndSort')
          $item.ProjectItem.Document.Close([EnvDTE.vsSaveChanges]::vsSaveChangesYes)
        }
      }
    }
  }
}

Adding Commands to NuGet Powershell Profile

Great! He did the work for me. So what’s the best way to make use of his command? I could add it to a NuGet package, but that would require installing the package into every solution where I wanted to use the command. That’s not very usable. NuGet doesn’t yet support installing PS scripts at the machine level, though it’s something we’re considering.

To get this command available on my machine so I can run it no matter which solution is open, I need to set up my NuGet-specific Powershell profile as documented here.

The NuGet Powershell profile script is located at:


The easiest way to find the profile file is to type $profile within the NuGet Package Manager Console. The profile file doesn’t necessarily exist by default, but it’s easy enough to create it. The following screenshot shows a session where I did just that.


The mkdir -Force (Split-Path $profile) command creates the WindowsPowerShell directory if it doesn’t already exist.

Then simply attempting to open the script in Notepad prompts you to create the file if it doesn’t already exist. Within the profile file, you can change PowerShell settings or add new commands you might find useful.

For example, you can cut and paste the code in David’s gist and put it in here. Just make sure to omit the first example line in the gist which simply prints all project items to the console.

When you close and re-open Visual Studio, the Format-Document command will be available in the NuGet Package Manager Console. When you run the command, it will open each file and run the format command on it. It’s rather fun to watch as it feels like a ghost has taken over Visual Studio.

The script has a Thread.Sleep call for 100ms to work around a timing issue when automating Visual Studio. It can take a brief moment after you open the document before you can activate it. It doesn’t hurt anything to choose a lower number. It only means you may get the occasional error when formatting a document, but the script will simply move to the next document.

The following screenshot shows the script in action.


With this in place, you can now indulge your OCD and run the Format-Document command to clean up your entire solution. I just ran it against Subtext and now can become the whitespace Nazi I’ve always wanted to be.

open source, personal comments edit

Almost two years ago, I announced the launch of Let Me Bing That For You, a blatant and obvious rip-off of the Let Me Google That For You website.

The initial site was created by Maarten Balliauw and Juliën Hanssens in response to a call for help I made. It was just something we did for fun. I’ve been maintaining the site privately always intending to spend some time to refresh the code and open source it.

Just recently, I upgraded the site to ASP.NET MVC 3, refactored a bunch of code, and moved the site to AppHarbor.

Why AppHarbor?

I’ve heard such good things about how easy it is to deploy to AppHarbor, so I wanted to try it out firsthand, and this small little project seemed like a perfect fit.

I had been working on the code in a private Mercurial repository, so it was trivial to push it to a BitBucket repository. From there, it’s really easy to integrate the BitBucket account with AppHarbor.

So now, my deployment workflow is really easy when working on this simple project:

  1. Make some changes and commit them into my local HG (Mercurial) repository. I have my local repository syncing to all my machines using Live Mesh.
  2. At some point, when I’m ready to publish the changes, I run the hg push command on my repository.
  3. That’s it! AppHarbor builds my project and if all the unit tests pass, it deploys it live.

I’m not planning to spend a lot of time on Let Me Bing That For You. It’s just a fun little side project that allows me to play around with ASP.NET MVC 3, jQuery, etc. If you want to look at the source, or contribute a patch, check it out on BitBucket.

nuget, open source comments edit

It’s a common refrain you hear when it comes to documentation for open source projects: it typically sucks! In part because nobody wants to work on docs, but also because good documentation is challenging to write.

What is good documentation in the first place? The following is a list of some qualities that make for great documentation. This list is by no means complete. Good docs are…

  • Written for the right audience
  • Comprehensive and accurate
  • Easily browsable and searchable
  • Written in a clear and concise language
  • Laid out in a readable format
  • Versioned with the source code

While it’s challenging to write and maintain great documentation, my co-worker Matthew was up to the challenge of building a simple Markdown based system to help us manage our documentation. Read about our new docs site in his blog post, Introducing NuGet Docs: Community Driven Documentation.

Our goal in the long run is to have a great set of docs for NuGet with help from the community. So if you’re interested in helping out, please visit our NuGet Docs project page and let us know. It’s a separate project with its own Mercurial repository, so we can give a lot more people write access directly to the repository.

So please, if you’re looking for a low-commitment, easy way to get a toe in the waters with open source in general or with NuGet, consider helping us with our docs. It’s a great way to get started with OSS. It’s how I got my start a long time ago, contributing docs to RSS Bandit. mvc, comments edit

In April we announced the release of ASP.NET MVC 3 Tools Update which added Scaffolding, HTML 5 project templates, Modernizr, and EF Code First Magic Unicorn Edition.

Today, just shy of one month later, I’m happy to announce that this release is now available in nine other languages via the Web Platform Installer (Web PI).

We’ve also included release notes translated into the nine languages as well.

The best way to install the language specific version of ASP.NET MVC 3 is via the Web Platform installer because it will chain in the full installer. If you install the language specific version directly from the Download Details page, you’ll need to run two installers, the full installer and then the language pack installer., mvc comments edit

ASP.NET MVC project templates include support for precompiling views, which is useful for finding syntax errors within your views at build time rather than at runtime.

In case you missed the memo, the following steps outline how to enable this feature.

  • Right click on your ASP.NET MVC project in the Solution Explorer
  • Select Unload Project in the context menu. Your project will show up as unavailable.
  • Right click on the project again and select Edit ProjectName.csproj.

This will bring up the project file within Visual Studio. Search for the entry <MvcBuildViews> and set the value to true. Then right click on the project again and select Reload Project.
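The relevant entry sits inside one of the project file’s PropertyGroup elements:

```xml
<PropertyGroup>
  <MvcBuildViews>true</MvcBuildViews>
</PropertyGroup>
```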

Compiling in a build environment

If you search for MvcBuildViews on the web, you’ll notice a lot of people having problems when attempting to build their projects in a build environment. For example, this StackOverflow question describes an issue when compiling MVC on a TFS Build. I had an issue when trying to deploy an ASP.NET MVC 3 application to AppHarbor.

It turns out we had a bug in our project templates in earlier versions of ASP.NET MVC that we fixed in ASP.NET MVC 3 Tools Update.

But if you created your project using an older version of ASP.NET MVC including ASP.NET MVC 3 RTM (the one before the Tools Update), your csproj/vbproj file will still have this bug.

To fix this, look for the following element within your project file:

<Target Name="AfterBuild" Condition="'$(MvcBuildViews)'=='true'">
  <AspNetCompiler VirtualPath="temp" PhysicalPath="$(ProjectDir)\..\$(ProjectName)" />
</Target>

And replace it with the following.

<Target Name="MvcBuildViews" AfterTargets="AfterBuild" Condition="'$(MvcBuildViews)'=='true'">
  <AspNetCompiler VirtualPath="temp" PhysicalPath="$(WebProjectOutputDir)" />
</Target>

After I did that, I was able to deploy my application to AppHarbor without any problems.

Going back to the StackOverflow question I mentioned earlier, notice that the accepted answer is not the best answer. Jim Lamb provided a better answer and is the one who provided the solution that we use in ASP.NET MVC 3 Tools Update. Thanks Jim!

nuget, code, open source comments edit

Not too long ago, I posted a survey on my blog asking a set of questions meant to gather information that would help the NuGet team make a decision about a rather deep change.

You can see the results of the survey here.

If there’s one question that got to the heart of the matter, it’s this one.


We’re considering a feature that would allow only a single version of each package per solution. As you can see from the responses to the question, that would fit what most people need just fine, though a small number of folks might run into problems with this behavior.

One variant of this idea would allow multiple package versions if the package doesn’t contain any assemblies (for example, a JavaScript package like jQuery).
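As a concrete illustration (the package names and versions here are hypothetical, not taken from the spec), today two projects in one solution can end up with different versions of the same package in their respective packages.config files:

```xml
<!-- ProjectA\packages.config -->
<packages>
  <package id="Newtonsoft.Json" version="4.0.1" />
</packages>

<!-- ProjectB\packages.config -->
<packages>
  <package id="Newtonsoft.Json" version="4.0.2" />
</packages>
```

Under the proposed behavior, both projects would be consolidated onto a single version of the package instead.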

Thanks again for filling out the survey. We think we have a pretty good idea of how to proceed at this point, but there’s always room for more feedback. If you want to provide more feedback on this proposed change, please review the spec here and post your thoughts in our discussion forum in the thread dedicated to this change.

The spec describes what pain point we’re trying to solve and shows a few examples of how the behavior change would affect common scenarios, so it’s worth taking a look at.

comments edit

On a personal level, NuGet has been an immensely satisfying project to work on. I’ve always enjoyed working on open source projects with an active community in my spare time, but being able to do it as part of my day job is really fulfilling.

And I don’t think I’m alone in this, as evidenced by this tweet from my co-worker Matt Osborn, who has been contributing to NuGet on his own time since the early days.

Matthew M. Osborn (osbornm) on Twitter - Google

A big part of the satisfaction comes from being able to collaborate with members of the community, aka you, in a deeper manner than before, which includes accepting contributions.

If you go to the Ohloh open source community site, you can see a list of contributors to NuGet. As you might expect, the top five contributors are Microsofties who work on NuGet as part of their day job. But three of the top ten are external contributors from the community.

NuGet Contributors - Ohloh - Google

It looks like 21 of the 36 contributors are external. Take these numbers with a slight grain of salt because we use a distributed version control system and it appears some developers are counted twice because they used a different email address on a different computer.

Note to those developers! Create an account on Ohloh and claim those check-ins! Ohloh will provide a merged view of your contributions.

Contributions come in all sizes. We’ve had folks come in and “scratch an itch” with single commits adding things like support for WiX or the .NET Micro Framework. Such commits form a key pillar of open source software, as Linus Torvalds noted when discussing a Microsoft patch to Linux:

I agree that it’s driven by selfish reasons, but that’s how all open source code gets written! We all “scratch our own itches”.

Other contributions took a lot of work among multiple community members, such as the effort to fix proxy issues within NuGet. We didn’t have the ability to test the wide range of proxy servers people have in the wild. Fortunately, several folks in our forums worked on this and tested daily builds until we got it working in Package Explorer. This fix will soon be rolled into NuGet proper. Thanks!

As with most open source projects, commits do not tell the full story of a community’s contributions to a project. In some cases, these folks were involved in a lot of design and verification work that ended up being perhaps one commit.

Our discussion boards are full of active participants telling us we’re doing it wrong, or doing it right, or what we need to do. And that’s great! The commitment of their time to help us shape a better project is greatly appreciated. Even those who come in and criticize the product are making a noteworthy contribution as they’ve taken the time to give us food for thought. As they say, indifference is worse than hate and we’ve found a lot of folks who are not indifferent.

Getting Results!

I think all this community contribution to NuGet is a big factor in the success of NuGet. With your help (and a few recent tweaks to their popularity algorithm), we’ve become the #1 most popular extension on the Visual Studio Extension Gallery website.

Visual Studio Gallery - Google

If you enjoy using NuGet and have a moment, consider visiting the site and rating NuGet.

Moving Forward

As well as I think NuGet is doing, I’m by no means satisfied. In fact, I’m probably one of the most critical people when it comes to where NuGet is today as compared to where I’d like NuGet to be.

My team is very small. If we’re going to make even more progress than we have so far, we’re going to need to cultivate contributors, both drive-by and consistent. That seems to me like the best way to scale out our development.

If you have tips on how best to do that, do let me know! In the meantime, I’ll brainstorm some ideas on how we can encourage more people to participate in the development of NuGet.