asp.net, asp.net mvc, razor

Within a Razor view, you have access to a base set of properties (such as Html, Url, Ajax, etc.) each of which provides methods you can use within the view.

For example, in the following view, we use the Html property to access the TextBox method.

@Html.TextBox("SomeProperty")

Html is a property of type HtmlHelper and there are a large number of useful extension methods that hang off this type, such as TextBox.

But where did the Html property come from? It’s a property of System.Web.Mvc.WebViewPage, the default base type for all Razor views. If that last phrase doesn’t make sense to you, let me explain.

Unlike many templating engines or interpreted view engines, Razor views are dynamically compiled at runtime into a class and then executed. The class that they’re compiled into derives from WebViewPage. For long-time ASP.NET users, this shouldn’t come as a surprise because this is how ASP.NET pages work as well.
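To make this concrete, a view containing the markup above gets compiled into something along these lines. The class name here is made up and the generated code is simplified (Razor emits more plumbing than this), but the shape is accurate:

public class _Page_Views_Home_Index_cshtml : System.Web.Mvc.WebViewPage<dynamic> {
  public override void Execute() {
    // Literal markup in the .cshtml file becomes WriteLiteral calls and
    // @-expressions become Write calls in the generated Execute method.
    WriteLiteral("<p>");
    Write(Html.TextBox("SomeProperty"));
    WriteLiteral("</p>");
  }
}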

Customizing the Base Class

HTML 5 (or is it simply “HTML” now) is a big topic these days. It’d be nice to write a set of HTML 5 specific helper extension methods, but you’d probably like to avoid adding even more extension methods to the HtmlHelper class because it’s already getting a little crowded in there.

html-extensions

Perhaps what we need is a new property we can access from within Razor. How do we do that?

What we need to do is change the base type for all Razor views to something we control. Fortunately, that’s pretty easy. When you create a new ASP.NET MVC 3 project, you might have noticed that the Views directory contains a Web.config file.

Look inside that file and you’ll notice the following snippet of XML.

<system.web.webPages.razor>
    <host factoryType="System.Web.Mvc.MvcWebRazorHostFactory, 
    System.Web.Mvc, Version=3.0.0.0, 
    Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
  <pages pageBaseType="System.Web.Mvc.WebViewPage">
    <namespaces>
      <add namespace="System.Web.Mvc" />
      <add namespace="System.Web.Mvc.Ajax" />
      <add namespace="System.Web.Mvc.Html" />
      <add namespace="System.Web.Routing" />
    </namespaces>
  </pages>
</system.web.webPages.razor>

The thing to notice is the <pages> element, which has the pageBaseType attribute. The value of that attribute specifies the base page type for all Razor views in your application. You can change the base type for your views simply by replacing that value with the name of your own class. While it’s not strictly required that your class derive from WebViewPage, doing so is the easiest approach.

Let’s look at a simple example of this.

public abstract class CustomWebViewPage : WebViewPage {
  public Html5Helper Html5 { get; set; }

  public override void InitHelpers() {
    base.InitHelpers();
    Html5 = new Html5Helper<object>(base.ViewContext, this);
  }
}

Note that our custom class derives from WebViewPage, but adds a new Html5 property of type Html5Helper. I’ll show the code for that helper here. In this case, it pretty much follows the pattern that HtmlHelper does. I’ve left out some properties for brevity, but at this point, you can add whatever you want to this class.

public class Html5Helper {
  public Html5Helper(ViewContext viewContext, 
    IViewDataContainer viewDataContainer)
    : this(viewContext, viewDataContainer, RouteTable.Routes) {
  }

  public Html5Helper(ViewContext viewContext,
     IViewDataContainer viewDataContainer, RouteCollection routeCollection) {
    ViewContext = viewContext;
    ViewData = new ViewDataDictionary(viewDataContainer.ViewData);
  }

  public ViewDataDictionary ViewData {
    get;
    private set;
  }

  public ViewContext ViewContext {
    get;
    private set;
  }
}

First, let’s write a simple extension method that takes advantage of this new property, so we can get the benefits of all this work.

public static class Html5Extensions {
    public static IHtmlString EmailInput(this Html5Helper html, string name, 
        string value) {
        var tagBuilder = new TagBuilder("input");
        tagBuilder.Attributes.Add("type", "email");
        tagBuilder.Attributes.Add("name", name);
        tagBuilder.Attributes.Add("value", value);
        return new HtmlString(tagBuilder.ToString());
    }
}

Now, if we change the pageBaseType to CustomWebViewPage, we can recompile the application and start using the new property within our Razor views.
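For example, assuming CustomWebViewPage lives in a namespace called MyApp (substitute your own namespace), the <pages> element in the Views\Web.config file becomes:

<pages pageBaseType="MyApp.CustomWebViewPage">
  <namespaces>
    <add namespace="System.Web.Mvc" />
    <add namespace="System.Web.Mvc.Ajax" />
    <add namespace="System.Web.Mvc.Html" />
    <add namespace="System.Web.Routing" />
  </namespaces>
</pages>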

Html5Helpers

Nice! We can now start using our new helpers. Note that if you try this and don’t see your new property in Intellisense right away, try closing and re-opening Visual Studio.
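For example, a view can now call the helper through the new property. The name and value here are just placeholder sample data:

@Html5.EmailInput("ContactEmail", "someone@example.com")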

What about Strongly Typed Views?

What if I have a Razor view that specifies a strongly typed model like so:


@model Product
@{
    ViewBag.Title = "Home Page";
}

<p>@Model.Name</p>

The base class we wrote wasn’t a generic class, so how’s this going to work? Not to worry. This is the part of Razor that’s pretty cool. We can simply write a generic version of our class and Razor will inject the model type into that class when it compiles the Razor code.

In this case, we’ll need a generic version of both our CustomWebViewPage and our Html5Helper classes. I’ll follow a pattern similar to the one implemented by HtmlHelper<T> and WebViewPage<T>.

public abstract class CustomWebViewPage<TModel> : CustomWebViewPage {
  public new Html5Helper<TModel> Html5 { get; set; }

  public override void InitHelpers() {
    base.InitHelpers();
    Html5 = new Html5Helper<TModel>(base.ViewContext, this);
  }
}

public class Html5Helper<TModel> : Html5Helper {
  public Html5Helper(ViewContext viewContext, IViewDataContainer container)
    : this(viewContext, container, RouteTable.Routes) {
  }

  public Html5Helper(ViewContext viewContext, IViewDataContainer container, 
      RouteCollection routeCollection) : base(viewContext, container,
      routeCollection) {
    ViewData = new ViewDataDictionary<TModel>(container.ViewData);
  }

  public new ViewDataDictionary<TModel> ViewData {
    get;
    private set;
  }
}

Now you can write extension methods of Html5Helper<TModel> which will have access to the model type much like HtmlHelper<TModel> does.
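For example, here’s a sketch of a strongly typed helper that derives both the name and the value of the input from a model expression, much like the built-in HtmlHelper<TModel> extensions do. The class and method names are mine, not anything shown above, and the code assumes the System.Web.Mvc and System.Linq.Expressions namespaces are imported:

public static class Html5ModelExtensions {
  public static IHtmlString EmailInputFor<TModel>(this Html5Helper<TModel> html,
      Expression<Func<TModel, string>> expression) {
    // Derive the input name from the expression, e.g. m => m.Email yields "Email".
    string name = ExpressionHelper.GetExpressionText(expression);

    // Pull the current value out of the strongly typed view data.
    var metadata = ModelMetadata.FromLambdaExpression(expression, html.ViewData);
    string value = metadata.Model as string ?? String.Empty;

    var tagBuilder = new TagBuilder("input");
    tagBuilder.Attributes.Add("type", "email");
    tagBuilder.Attributes.Add("name", name);
    tagBuilder.Attributes.Add("value", value);
    return new HtmlString(tagBuilder.ToString(TagRenderMode.SelfClosing));
  }
}

With that in place, a view based on CustomWebViewPage<Product> could write @Html5.EmailInputFor(m => m.SupportEmail), assuming the model has such a property.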

As usual, if there’s a change you want to make, there’s probably an extensibility point in ASP.NET MVC that’ll let you make it. The tricky part, of course, is finding the correct point in some cases.


It pains me to say it, but ASP.NET MVC 3 introduces a minor regression in routing from ASP.NET MVC 2. The good news is that there’s an easy workaround.

The bug manifests when you have a route with two consecutive optional URL parameters and you attempt to use the route to generate an URL. The incoming request matching behavior is unchanged and continues to work fine.

For example, suppose you have the following route defined:

routes.MapRoute("by-day", 
        "archive/{month}/{day}",
        new { controller = "Home", action = "Index", 
            month = UrlParameter.Optional, day = UrlParameter.Optional }
);

Notice that the month and day parameters are both optional.

routes.MapRoute("by-day", 
        "archive/{month}/{day}",
        new { controller = "Home", action = "Index", 
            month = UrlParameter.Optional, day = UrlParameter.Optional }
);

Now suppose you have the following view code to generate URLs using this route.

@Url.RouteUrl("by-day", new { month = 1, day = 23 })
@Url.RouteUrl("by-day", new { month = 1 })
@Url.RouteUrl("by-day", null)

In ASP.NET MVC 2 the above code (well actually, the equivalent to the above code since Razor didn’t exist in ASP.NET MVC 2) would result in the following URLs as you would expect:

  • /archive/1/23
  • /archive/1
  • /archive

But in ASP.NET MVC 3, you get:

  • /archive/1/23
  • /archive/1

In the last case, the value returned is null because of this bug. The bug occurs when two or more consecutive optional URL parameters don’t have values specified for URL generation.

Let’s look at the workaround first, then we’ll dive deeper into why this bug occurs.

The Workaround

The workaround is simple. To fix this issue, change the existing route to not have any optional parameters by removing the default values for month and day. This route now handles the first URL, where month and day were specified.

We then add a new route for the other two cases, but this route only has one optional month parameter.

Here are the two routes after we’re done with these changes.

routes.MapRoute("by-day", 
        "archive/{month}/{day}",
        new { controller = "Home", action = "Index"}
);

routes.MapRoute("by-month", 
        "archive/{month}",
        new { controller = "Home", action = "Index", 
            month = UrlParameter.Optional}
);

And now, we need to change the last two calls to generate URLs to use the by-month route.

@Url.RouteUrl("by-day", new { month = 1, day = 23 })
@Url.RouteUrl("by-month", new { month = 1 })
@Url.RouteUrl("by-month", null)

Just to be clear, this bug affects all the URL generation methods in ASP.NET MVC. So if you were generating action links like so:

@Html.ActionLink("sample", "Index", "Home", new { month = 1, day = 23 }, null)
@Html.ActionLink("sample", "Index", "Home", new { month = 1}, null)
@Html.ActionLink("sample", "Index", "Home")

The last one would be broken without the workaround just provided.

The workaround is not too bad if you happen to follow the practice of centralizing your URL generation. For example, the developers building http://forums.asp.net/ ran into this problem as well during the upgrade to ASP.NET MVC 3. But rather than having calls to ActionLink all over their views, they have calls to methods that are specific to their application domain such as ForumDetailUrl. This allowed them to work around this issue by updating a single method.

The Root Cause

For the insanely curious, let’s look at the root cause of this bug. Going back to the original route defined at the top of this post, we never tried generating an URL where only the second optional parameter was specified.

@Url.RouteUrl("by-day", new { day = 23 })

This call really should fail because we didn’t specify a value for the first optional parameter, month. If it’s not clear why it should fail, suppose we allowed this to succeed: what URL would it generate? /archive/23? Well, that’s obviously not correct, because when a request is made for that URL, 23 will be interpreted to be the month, not the day.

In ASP.NET MVC 2, if you made this call, you ended up with /archive/System.Web.Mvc.UrlParameter/23. UrlParameter is a class introduced by ASP.NET MVC 2, which ships on its own schedule outside of the core ASP.NET Framework. What that means is we added this new class which provided this new behavior in ASP.NET MVC, but core routing didn’t know about it.

The way we fixed this in ASP.NET MVC 3 was to make the ToString method of UrlParameter.Optional return an empty string. That solved this bug, but uncovered a bug in core routing where a route with optional parameters having default values behaves incorrectly when two of them don’t have values specified during URL generation. Sound familiar?
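To make that concrete, the class boils down to something like this in ASP.NET MVC 3 (a simplified sketch, not the exact framework source):

public sealed class UrlParameter {
  // The singleton used as a route default to mark a URL parameter as optional.
  public static readonly UrlParameter Optional = new UrlParameter();

  private UrlParameter() {
  }

  // In ASP.NET MVC 3 this returns an empty string rather than the type name,
  // which is the change that triggered the behavior described above.
  public override string ToString() {
    return String.Empty;
  }
}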

In hindsight, I think it was a mistake to take this fix because it caused a regression for many applications that had worked around the bug. The bug was found very late in our ship cycle and this is just one of the many challenging decisions we make when building software that sometimes don’t work out the way you hoped or expected. All we can do is learn from it and let the experience factor into the next time we are faced with such a dilemma.

The good news is we have bugs logged against this behavior in core ASP.NET Routing so hopefully this will all get resolved in the next core .NET framework release.

nuget, open source, code

Today I’m pleased to announce the release of NuGet 1.1 to the VS Extension Gallery and CodePlex. If you have NuGet 1.0 installed, just launch the VS Extension Manager (via Tools | Extension Manager menu) and click on the Updates tab.

If you don’t see any updates, make sure to enable automatic detection of available updates.

Extension Manager

If you are running VS 2010 SP1 Beta, you might run into the following error message when attempting to upgrade to NuGet 1.1 if you have an older version installed.

Visual Studio Extension Installer

The workaround is to simply uninstall NuGet and then install it from the VS Extension Gallery.

It turns out that our previous VSIX was signed with an incorrect certificate and our updated VSIX is signed with the correct certificate. VS 2010 SP1 now compares and verifies that the certificates of the old and new VSIX match during an upgrade.

If you don’t have NuGet installed, click the Online tab and type in “NuGet” (sans quotes) to find it.

nuget-in-vs-gallery

The VSIX and updated command line tool (used to create and publish packages) are also available on CodePlex.com.

What’s New in 1.1?

Much of the work in this release was focused on bug fixes. Now that CodePlex.com supports directly linking to filtered views of the issue tracker, I can provide you with a link to all the issues fixed in 1.1.

In this post, I’ll highlight some of the new features.

Recent Packages Tab

One of the first changes you might notice is that we have a new tab in the dialog that shows packages that you’ve installed recently. Click the screenshot below for a larger view.

NuGet-Recent-Packages

The Recent Packages tab shows the last 20 packages that you’ve directly installed. This often comes in handy when you tend to use the same packages over and over again in multiple projects. Right now, the list simply shows the most recently used, but there has been discussion about perhaps changing the behavior to sort by the packages used most often. Feel free to chime in if you want the behavior changed.

By the way, you can also use PowerShell within the Package Manager Console to get this same information with the -Recent flag to Get-Package.
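For example, running the following in the Package Manager Console lists those recently installed packages:

Get-Package -Recent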

nuget-ps-recent

Progress Bar During Installation

When you install a package, you’ll now notice a progress bar dialog that shows up with output from installing the package.

Installing

The dialog is meant to give an indication of progress, but also gets out of your way immediately when the installation is complete so you’re not stuck clicking a bunch of Close buttons. But what happens if you actually want to review that output?

Package Manager Output Window

NuGet 1.1 also posts that output to the Output window now. When you go to the Output window, you’ll need to select output from the Package Manager to see that output as in the screenshot.

nuget-output

This allows you to review what changes a package made at your leisure after the fact.

Dependency Resolution Algorithm

NuGet 1.1 includes an update to our dependency resolution algorithm which is described in David Ebbo’s blog post on this topic in the section titled “NuGet 1.1 twist”.

Support for F# Project Types

If you are using F#, this one’s for you.

PowerShell Improvements

Thanks to our newest core contributor, Oisin Grehan, a PowerShell MVP who really knows his stuff, NuGet 1.1 has a lot of improvements to the PowerShell Console and scripts. I have to admit, a lot of it is over my head as I’m no PowerShell guru, but we’re now much more compliant with PowerShell conventions. Or so I’ve been told. Oisin has been driving a lot of improvements with our PowerShell support.

We also now execute commands within the PowerShell Console asynchronously. This means that a long running command won’t freeze the rest of Visual Studio while it runs.

And many others!

There were a lot of other tweaks, bug fixes, and minor improvements that were not worth mentioning here, but they are all listed in our release notes.

Breaking Changes?

There are some minor changes that hopefully won’t break 99.9% of you. If you recall, we made our PowerShell scripts fit with PowerShell conventions. If you have a package that calls one of these commands, your package might need to be updated. Here’s the list of changes we made:

  • Removed List-Package. Use Get-Package instead.
  • Get-ProjectNames was removed. Use Get-Project instead and examine the Name property.
  • Add-BindingRedirects was renamed to Add-BindingRedirect.

What’s next?

Our hope is to have a monthly point release, though we may adjust some iterations to be longer as needed. To see what we’re planning for the 1.2 release, check out this link of issues for 1.2 (note that by the time you read this, some of these features might already be implemented). We’re constantly refining our planning so nothing is set in stone.

For a small taste of what’s coming in 1.2, check out this video by David Ebbo showing a streamlined workflow for creating packages.

Get Involved!

I bet many of you have some great ideas on what we should and shouldn’t do for NuGet. We’d love to have you come over and share your great ideas in our discussion list. Or if you’re looking for other ways to contribute, check out our guide to contributing to NuGet.

asp.net, nuget, open source

Over a decade ago, Tim Berners-Lee, creator of the World Wide Web, let the world know that cool URIs don’t change with what appears to be a poem, though it doesn’t rhyme and it’s not haiku.

What makes a cool URI?
A cool URI is one which does not change.
What sorts of URI change?
URIs don’t change: people change them.

In a related article, URL as UI, usability expert Jakob Nielsen lists the following criteria for a usable site:

  • a domain name that is easy to remember and easy to spell
  • short URLs
  • easy-to-type URLs
  • URLs that visualize the site structure
  • URLs that are “hackable” to allow users to move to higher levels of the information architecture by hacking off the end of the URL
  • persistent URLs that don’t change

The permanence of URLs is a fundamental trait of the web that seems to run counter to one of the benefits of using a feature like ASP.NET Routing. For example, one benefit of routing is you can change a route from {controller}/{action}/{id} to {controller}/{id}/{action} and have every URL in your site corresponding to that route automatically be updated.

This is very nice during development when you’re still fleshing out your URLs and haven’t committed to anything, but once you’ve published your site, changing a route URL violates the sacred trait of URL permanence.

This is exactly where I find myself with Subtext. All of our existing URLs end with the .aspx extension, a practice which Jon Udell convincingly argued is harmful. In the upcoming version of Subtext, we’re moving to extensionless URLs by building upon the great support built into ASP.NET 4 and Routing.

I could simply change our routes to remove the .aspx extension, but that would break nearly every existing URL in every blog running on Subtext. So much for URL permanence, right?

There’s a Better Way

Rather than changing routes, what I really want is a way to simply redirect the existing route to a new route. This is pretty easy, but there are a few caveats to keep in mind that make it non-trivial.

  1. Since you don’t want to generate URLs for the old route, the legacy route should never be selected for URL generation. It’s only for matching incoming requests.
  2. The legacy route should be registered after the new URL to ensure it doesn’t accidentally match and supersede the new URL.

I wrote a library that provides a RedirectRoute and a simple extension method for registering a RedirectRoute that satisfies these conditions. Let’s look at an example of how it would be used.

Let’s suppose we have the following route defined and the site has been published to the web.

routes.MapRoute("old", "foo/{controller}/{action}/{id}");

But later, we decide we want all such URLs to start with /bar instead and we want to re-order the id and action segments of the URL.

Here’s an example of how we can do that using this new library.

var route = routes.MapRoute("new", "bar/{controller}/{id}/{action}");
routes.Redirect(r => r.MapRoute("old", "foo/{controller}/{action}/{id}"))
      .To(route);

This snippet registers the new route and then passes it to the To method of the RedirectRoute returned by the Redirect extension method. The RedirectRoute delegates to the old route to match incoming requests. With this in place, every request matching the old route will be redirected to the new route.

Thus a request for /foo/home/index/123 will be redirected to /bar/home/123/index.

Why The Lambda Expression?

To fully understand what’s going on under the hood, I need to explain why the API takes in a lambda expression rather than simply taking in two routes, old route and new route.

Let’s suppose that the API did just that, simply accepted two routes. Here’s what a naïve attempt to use the method might look like.

var newRoute = routes.MapRoute("new", "bar/{controller}/{id}/{action}");
var oldRoute = routes.MapRoute("old", "foo/{controller}/{action}/{id}");
routes.Redirect(oldRoute).To(newRoute);

Hopefully it’s immediately apparent why this is not good. The old route is mapped before the redirect route. So the redirect route will never be matched. 

The MapRoute extension method not only creates a route, it also adds it to the route collection. We could have manually created the old route without adding it, but that’s a pain if you’re already using the MapRoute method to create the route. Or, we could have done this:

var newRoute = routes.MapRoute("new", "bar/{controller}/{id}/{action}");
var throwAway = new RouteCollection();
var oldRoute = throwAway.MapRoute("old", "foo/{controller}/{action}/{id}");
routes.Redirect(oldRoute).To(newRoute);

Requiring the user of the API to create a throwaway route collection is ugly when the API itself could do it for you. Hence the lambda expression argument to Redirect. Internally, the method creates a throwaway route collection and calls the expression against that instead of against the main route collection.
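Here’s a rough sketch of how that extension method can be put together. The RedirectRouteBuilder type and its members are stand-in names I’m using for illustration, and I’m assuming a RedirectRoute constructor that takes the source and target routes; the actual RouteMagic implementation may differ in its details:

public static class RedirectRouteExtensions {
  public static RedirectRouteBuilder Redirect(this RouteCollection routes,
      Func<RouteCollection, Route> routeMapper) {
    // Map the legacy route into a throwaway collection so it is never
    // registered on its own and never participates in URL generation.
    var throwAway = new RouteCollection();
    Route sourceRoute = routeMapper(throwAway);
    return new RedirectRouteBuilder(routes, sourceRoute);
  }
}

public class RedirectRouteBuilder {
  private readonly RouteCollection _routes;
  private readonly Route _sourceRoute;

  public RedirectRouteBuilder(RouteCollection routes, Route sourceRoute) {
    _routes = routes;
    _sourceRoute = sourceRoute;
  }

  public RedirectRoute To(Route targetRoute) {
    // Assumes RedirectRoute exposes a (sourceRoute, targetRoute) constructor.
    var redirectRoute = new RedirectRoute(_sourceRoute, targetRoute);

    // The redirect route is added after the new route was mapped, so the new
    // route wins when matching incoming requests.
    _routes.Add(redirectRoute);
    return redirectRoute;
  }
}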

Implementation Details

I won’t post the full source here, but the implementation details are pretty simple. Here’s the implementation of GetRouteData which is the method called when matching incoming requests.

public override RouteData GetRouteData(HttpContextBase httpContext) {
    // Use the original route to match
    var routeData = SourceRoute.GetRouteData(httpContext);
    if (routeData == null) {
        return null;
    }
    // But swap its route handler with our own
    routeData.RouteHandler = this;
    return routeData;
}

Notice that I use the source route, which is the old route passed into the redirect route, to match the request, but I swap the route handler with the redirect route. RedirectRoute also implements IRouteHandler. It was a little implementation shortcut I took which happens to work fine in this case.

The implementation of GetVirtualPath is even simpler.

public override VirtualPathData GetVirtualPath(RequestContext requestContext, 
    RouteValueDictionary values) {
    // Redirect routes never generate an URL.
    return null;
}

We never want to generate a URL to the old route, so this method always returns null.

As mentioned, RedirectRoute implements IRouteHandler, so we should look at its implementation.

public IHttpHandler GetHttpHandler(RequestContext requestContext) {
  var requestRouteValues = requestContext.RouteData.Values;

  var routeValues = AdditionalRouteValues.Merge(requestRouteValues);

  var vpd = TargetRoute.GetVirtualPath(requestContext, routeValues);
  string targetUrl = null;
  if (vpd != null) {
    targetUrl = "~/" + vpd.VirtualPath;
    return new RedirectHttpHandler(targetUrl, Permanent, isReusable: false);
  }
  return new DelegateHttpHandler(
    httpContext => httpContext.Response.StatusCode = 404, false);
}

Notice that we make use of the DelegateHttpHandler which is something I wrote about a while ago.
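If you haven’t seen it, DelegateHttpHandler is little more than an IHttpHandler that wraps a delegate. A minimal version looks something like this (a sketch based on how it’s used above; the real implementation may differ slightly):

public class DelegateHttpHandler : IHttpHandler {
  private readonly Action<HttpContext> _action;
  private readonly bool _isReusable;

  public DelegateHttpHandler(Action<HttpContext> action, bool isReusable) {
    _action = action;
    _isReusable = isReusable;
  }

  public bool IsReusable {
    get { return _isReusable; }
  }

  public void ProcessRequest(HttpContext context) {
    // Simply invoke the wrapped delegate against the current request.
    _action(context);
  }
}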

Where to get it?

All the code I showed here is now part of the RouteMagic library I blogged about recently. I’ve updated the package, so all you need to do is Install-Package RouteMagic within NuGet.

asp.net, code, asp.net mvc, open source

Over the past couple of years, I’ve written several blog posts on ASP.NET Routing where I provided various extensions to routing. Typically such blog posts included a zip download of the binaries and source code to allow readers to easily try out the code.

But that’s always been a real pain, and most people don’t bother. Now there’s a better way to share such code. Moving forward, I’ll be using NuGet packages as a means of sharing my code samples.

In the case of my routing extensions, I’ve compiled them into a solution I call RouteMagic (source is available on GitHub). This solution includes two packages, RouteMagic.Mvc (extensions specific to ASP.NET MVC Routing) and RouteMagic (more general ASP.NET Routing extensions). The RouteMagic.Mvc package depends on the RouteMagic package.

These packages are available in the NuGet feed!

After installing the RouteMagic.Mvc package, you’ll have the routing extensions from those blog posts available to you.

The source code for the solution contains the following projects:

  • RouteMagic
  • RouteMagic.Mvc
  • RouteMagic.Demo.Web (ASP.NET MVC Web application used to demo these features)
  • UnitTests

This is just a pet project I put together based on various blog posts I’ve written. I’d love to see some of these ideas eventually make it into the Framework. But until then, you’ll probably see these things make it into Subtext for sure!

asp.net, asp.net mvc

Ni hao ma!

Hot on the heels of the RTM release of ASP.NET MVC 3, we now have localized versions of ASP.NET MVC in 9 languages! The installation links within the Web Platform Installer were updated. If you want to download the installer yourself, you can go to the English download page and select your language, or click on one of the nine languages below:

If you speak one of these nine languages, you can now develop with ASP.NET MVC in your native language. Salud!

nuget

Last night I got a little treat in the mail from the kind folks at StickerMule. I really appreciate how they support open source projects with such great stickers.

NuGet-Stickers-550x365 Look at all those little NuGets!

Just in time, as I have a few events coming up where I can hand some out, such as Web Camps Argentina/Brazil, the MVP Summit, and, if I’m selected, Mix 2011.

After that, I need to figure out the best way to give the rest out since I won’t be travelling a whole lot this year. We’ll need to have a Nerd Dinner or something. I’ll give some to Scott Hanselman since he travels a lot too.

asp.net mvc, open source, nuget

For those of you who enjoy learning about a technology via screencast, I’ve recorded a video to accompany and complement this blog post. The screencast shows you what this package does, and the blog post covers more of the implementation details.

A key feature of any package manager is the ability to let you know when there’s an update available for a package and let you easily install that update.

For example, when we deployed the release candidate for NuGet, the Visual Studio Extension Manager displayed the release in the Updates section.

Extension Manager displaying NuGet as an available update

Likewise, NuGet lets you easily see updates for installed packages. You can either run the List-Package -Updates command:

list-package-updates

Or you can click on the Updates node of the Add Package dialog:

updates-tab

This feature is very handy when using Visual Studio to develop software such as Subtext, an open source blog engine I run in my spare time. But I started thinking about the users of Subtext and the hoops they jump through to upgrade Subtext itself.

Wouldn’t it be nice if Subtext could notify users when a new version is available and let them install it directly from the admin section of the running website completely outside of Visual Studio? Why yes, that would be nice.

NuGet to the Rescue!

Well my friends, that’s where NuGet comes into play. While most people know NuGet as a Visual Studio extension for pulling in and referencing libraries in your project, there’s a core API that’s completely agnostic of the hosting environment, whether it be Visual Studio, PowerShell, or other. That core API is implemented in the assembly NuGet.Core.dll.

This assembly allows us to take advantage of many of the features of NuGet outside of Visual Studio such as within a running web site!

The basic concept is this:

  1. Package up the first version of a website as a NuGet package.
  2. Install this package in the website itself. I know, crazy talk, right?
  3. Add a custom NuGet client that runs inside the website and checks for updates to the one package that’s installed.
  4. When the next version of the website is ready, package it up and deploy it to the package feed for the website. Now, the users of the website can be notified that an update is available.

A brief note about step #2, because this is going to be confusing. When I say install the package in the website, I mean to contrast that with installing a package into your Web Application Project for the website.

When you install a package into your Web Application Project, you use the standard NuGet client within Visual Studio. But when you deploy your website, the custom NuGet client within the live website will install the website package into a different location. In the example I’ll show you, that location is within the App_Data\packages folder.

The AutoUpdate Package

Earlier this week, I gave an online presentation to the Community For MVC (C4MVC) user’s group on NuGet. During that talk I demonstrated a prototype package I wrote called AutoUpdate. This package adds a new area to the target website named “Installation”. It also adds a nuspec file to the root of the application to make it easy to package up the website as a NuGet Package.

The steps to use the package are very easy.

  1. Install-Package AutoUpdate.
  2. In Web.config, modify the appSetting PackageSource to point to your package source. In my demo, I just pointed it to a folder on my machine for demonstration purposes. But this source is where you would publish updates for your package.
  3. In the Package Manager Console, run the New-Package script (this packages up the website into a .nupkg file).
  4. Copy the package into the App_Data\Packages folder of the site.
  5. When you are ready to publish the next version as an update, increment the version number in the nuspec file and run the New-Package script again.
  6. Deploy the updated package to the package source.
  7. Now, when your users visit /installation/updates/check within the web site, they’ll be notified that an update is available and will be able to install the update.

The Results

Let’s see the results of installing the AutoUpdate package, and I’ll highlight some of the code that makes the package work. The following screenshot shows a very basic sample application I wrote.

home-page

The homepage here has a link to check for updates which links to an action within the area installed by the AutoUpdate package. That action contains the logic to check for updates for this application’s package.

Clicking on that link requires me to login first and then I get to this page:

update-available

As I mentioned in the steps before, I packaged up the first version of the application as a package and “installed” it into the App_Data folder.

That yellow bar above is the result of an asynchronous JSON request to see if an update is available. It’s a little redundant on this page, but I could have it show up on every page within the admin as a notification.

Under the Hood

Let’s take a look at the controller that responds to that asynchronous request.

public ActionResult Check(string packageId) {
  var projectManager = GetProjectManager();
  var installed = GetInstalledPackage(projectManager, packageId);
  var update = projectManager.GetUpdate(installed);

  var installationState = new InstallationState {
    Installed = installed,
    Update = update
  };

  if (Request.IsAjaxRequest()) {
    var result = new { 
      Version = (update != null ? update.Version.ToString() : null), 
      UpdateAvailable = (update != null)
    };
    return Json(result, JsonRequestBehavior.AllowGet);
  }

  return View(installationState);
}

The logic here is pretty straightforward. We grab a project manager. We then grab a reference to the current installed package representing this application. And then we check to see if there’s an update available. If there isn’t an update, the GetUpdate method returns null. There are a couple of methods here that I wrote that we need to look at.

The first method very simply retrieves a project manager. I encapsulated it into a method since I call it in a couple different places.

private WebProjectManager GetProjectManager() {
  string feedUrl = @"D:\dev\hg\AutoUpdateDemo\test-package-source";
  string siteRoot = Request.MapPath("~/");

  return new WebProjectManager(feedUrl, siteRoot);
}

There are a couple of things to note here. I hard-coded the feedUrl for demonstration purposes to point to a directory on my machine. This is a nice demonstration that NuGet can simply treat a directory containing packages as a package source.

For your auto-updating web application, that should point to a custom feed you host specifically for your website. Or, point it to the official NuGet feed and put your website up there. It’s up to you.

This method returns an instance of WebProjectManager. This is a class that I had to copy from the System.Web.WebPages.Administration.dll assembly because it’s marked internal. I don’t know why it’s internal, so I’ll see if we can fix that. It’s not my fault, so please direct your hate mail elsewhere.

What is the web project manager? Well, the WebMatrix product, which includes the ASP.NET Web Pages framework, includes a web-based NuGet client for simple web sites. This allows packages to be installed into a running website. I’m just stealing that code and re-purposing it for my own needs.

Now, we just need to use the project manager to query the package source to see if there’s an update available. This is really easy.

private IPackage GetInstalledPackage(WebProjectManager projectManager, 
    string packageId) {
  var installed = projectManager.GetInstalledPackages("AutoUpdate.Web")
    .Where(p => p.Id == packageId);

  var installedPackages = installed.ToList();
  return installedPackages.First();
}

What’s really cool is that we can just send a LINQ query to the server. Because we’re running OData on the server, it’ll run that query on the server and send us back the packages that fulfill the query.

That’s all the code necessary to check for updates. The next step is to write an action method to handle the upgrade. That’s pretty easy too.

public ActionResult Upgrade(string packageId) {
  var projectManager = GetProjectManager();
  var installed = GetInstalledPackage(projectManager, packageId);
  var update = projectManager.GetUpdate(installed);
  projectManager.UpdatePackage(update);

  if (Request.IsAjaxRequest()) {
    return Json(new { 
      Success = true, 
      Version = update.Version.ToString()
    }, JsonRequestBehavior.AllowGet);
  }
  return View(update);
}

This code starts off the same way that our code to check for the update does, but instead of simply returning the update, we call projectManager.UpdatePackage on the update. That method call updates the website to the latest version.

The rest of the method is simply concerned with returning the result of the upgrade.

Try it Yourself

If you would like to try it yourself, please keep one big caveat in mind: this is rough, proof-of-concept quality code. I hope to shape it into something more robust over time and publish it in the main package feed. Until then, I’ll post it here for people to try out. If there’s a lot of interest, I’ll post the source on CodePlex.com.

So with that in mind, give the AutoUpdate package a try

install-package AutoUpdate

and give me some feedback!

UPDATE: I upgraded the project to target ASP.NET MVC 4 and posted the source on GitHub. I have no idea if it still works, so please do submit pull requests if you find bugs you would like to have fixed.

asp.net, asp.net mvc, code, open source, nuget

The changing of the year is a time of celebration as people reflect thoughtfully on the past year and grow excited with anticipation for what’s to come in the year ahead.

Today, there’s one less thing to anticipate as we announce the final release of ASP.NET MVC 3 and NuGet 1.0!

double-rainbow

Oh yeah, this never gets old.

Install it via Web Platform Installer or download the installer directly to run it yourself.

Here are a few helpful resources for learning more about this release:

Those links will provide more details about what’s new in ASP.NET MVC 3, but I’ll give a quick bullet list of some of the deliciousness you have to look forward to. Again, visit the links above for full details.

  • Razor view engine which provides a very streamlined syntax for writing clean and concise views.
  • Improved support for Dependency Injection
  • Global Action Filters
  • jQuery based Unobtrusive Ajax and Client Validation.
  • ViewBag property for dynamic access to ViewData.
  • Support for view engine selection in the New Project and Add View dialog
  • And much more!

For those of you wishing to upgrade an ASP.NET MVC 2 application to ASP.NET MVC 3, check out Marcin Dobosz’s post about our ASP.NET MVC 3 project upgrader tool. The tool itself can be found on our CodePlex website.

NuGet 1.0 RTM

Also included in this release is the 1.0 release of NuGet. I’ll let you in on a little secret though, if you upgraded NuGet via the Visual Studio Extension Gallery, then you’ve been running the 1.0 release for a little while now.

If you already have an older version of NuGet installed, the ASP.NET MVC 3 installer cannot upgrade it. Instead launch the VS Extension manager (within Visual Studio go to the Tools menu and select Extension Manager) and click on the Updates tab.

Just recently we announced the Beta release of our NuGet Gallery. Opening the door to the gallery will make it very easy to publish packages, so what are you waiting for!?

At this point I’m obligated to point out that everything about NuGet is open source and we’re always looking for contributors. If you’re interested in contributing, but are finding impediments to it, let us know what we can improve to make it easier to get involved. Here’s the full list of OSS projects that make up the NuGet client and the server piece:

Show Me The Open Source Code!

As we did with ASP.NET MVC 1.0 and ASP.NET MVC 2, the source for the ASP.NET MVC 3 assembly is being released under the OSI certified Ms-PL license. The Ms-PL licensed source code is available as a zip file at the download center.

If you’d like to see the source code for ASP.NET Web Pages and our MVC Futures project, we posted that on CodePlex.com too.

What’s Next?

So what’s next? Well, you can probably count as well as I can, so it’s time to get planning for ASP.NET MVC 4 and NuGet 2.0 in full gear. Though this time around, with NuGet now available, we have the means to easily distribute a lot of smaller releases throughout the year as packages, with the idea that many of these may make their way back into the core product. I’m sure you’ll see a lot of experimentation in that regard.

nuget, open source

As David Ebbo blogged today, the NuGet Gallery is now open to the public. The goal of the NuGet Gallery is to be the hub for NuGet users and package authors alike. Users should be able to search and discover packages with detailed information on each one and eventually rate them. Package authors can register for an API key and upload packages.

We’re not quite where we want to be with the gallery, but we’re moving in the right direction. If you want to see us get there more quickly, feel free to lend a hand. The gallery is running on fully open source code!

In this blog post, I wanted to cover step by step what it takes to create and upload a package.

Create Your Package

Well the first step is to create a package so you have something to upload. If you’re well acquainted with creating packages, feel free to skip this section, but you may learn a few tips if you stick with it.

I’ll start with a simple example that I did recently. The XML-RPC.NET library by Charles Cook is very useful for implementing XML-RPC Services and clients. It powers the MetaWeblog API support in Subtext. As a courtesy, I recently asked Charles if he would mind if I created a NuGet package for his library for him, to which he said yes!

So on my machine, I created a folder named after the latest 2.5 release, xmlrpcnet.2.5.0. Here’s the directory structure I ended up with.

package-folder-structure
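In plain text, the layout is roughly this (the assembly file name is illustrative; use whatever the library actually ships):

xmlrpcnet.2.5.0\
    xmlrpcnet.nuspec
    lib\
        net20\
            CookComputing.XmlRpcV2.dll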

By convention, the lib folder is where you place assemblies that will get added as referenced assemblies to the target project when installing this package. Since this assembly only supports .NET 2.0 and above, I put it in the net20 subfolder of the lib folder.

The other required file is the .nuspec file, which contains the metadata used to build the package. Let’s take a look at the contents of that file.

<?xml version="1.0" encoding="utf-8"?> 
<package> 
  <metadata> 
    <id>xmlrpcnet</id> 
    <version>2.5.0</version> 
    <authors>Charles Cook</authors> 
    <owners>Phil Haack</owners>
    <description>A client and server XML-RPC library for .Net.</description> 
    <projectUrl>http://www.xml-rpc.net/</projectUrl>
    <licenseUrl>http://www.opensource.org/licenses/mit-license.php</licenseUrl>
    <tags>xml-rpc xml rpc .net20 .net35 .net40</tags>
    <language>en-US</language> 
  </metadata> 
</package>

There are a couple of things I want to call out here. Notice that I specified Charles Cook in the authors element, but put my own name in the owners element. Authors represent the authors of the library within the package, while the owner typically represents the person who created the package itself. This allows people to know who to contact if there’s a problem with the package vs a problem with the library within the package.

In general, we hope that most of the time, the authors and the owners are one and the same. For example, someday I’d love to help Charles take ownership of his packages. Until that day, I’m happy to create and upload them myself.

If somebody creates a package for a library that you authored and uploads it to NuGet, assume it’s a favor they did to get your library out there. If you wish to take ownership, feel free to contact them and they can assign the packages over to you. This is the type of thing we’d like to see resolved by the community and not via some policy rules on the gallery site. This is a case where the gallery could do a lot to make this sort of interaction easier, but does not have such features in place yet.

With this in place, it’s time to create the package. To do that, we’ll need the NuGet.exe console application. Copy it to a utility directory and add it to your path, or copy it to the parent folder of the package folder.

nuget-dir

Now, open a command prompt and navigate to the directory and run the nuget pack command.

nuget pack path-to-nuspec-file

Here’s a screenshot of what I did:

nuget-pack

Pro tip: What I really did was add a batch file I call build.cmd in the same directory that I put the NuGet.exe file. The batch file contains a single line:

for /r %%x in (*.nuspec) do nuget pack "%%x" -o d:\packages\

What that does is run the nuget pack command on every .nuspec file within the current directory and its subdirectories. I have a folder that contains multiple packages that I’m working on and I can easily rebuild them all with this batch file.

Ok, so now we have the package, let’s publish it! But first, we have to create an account on the NuGet Gallery website.

Register and Upload

The first step is to register for an account at http://nuget.org/Users/Account/Register. Once you have an account, click on the Contribute tab. This page gives you several options for managing packages (click to enlarge).

contribute-tab

To upload your package, click on the Add New Package link.

upload-package

Notice there are two options. At this point, you can simply browse for the package you created, upload it, and you’re done. In a matter of a few minutes, it should appear in the public feed.

The second option allows you to host your package file in a location other than the NuGet gallery such as CodePlex.com, Google Code, etc. Simply enter the direct URL to the package and when someone tries to install your package, the NuGet client will redirect the download request to the external package URL.

Submit From The Command Line

Ok, that’s pretty easy. But you’re a command line junkie, right? Or perhaps you’re automating package submission.

Well you’re in luck, it’s pretty easy to submit your package directly from the command line. But first, you’re going to need an API key.

Visit the My Account page (http://nuget.org/Contribute/MyAccount) and make a note of your API key (click image below to enlarge it).

nuget-gallery-api-key

Be sure to keep that API key secret! Don’t give it out like I just did. If you do happen to accidentally leak your API key, you can click the Generate New Key button, again like I just did. You didn’t really think I’d let you know my API key, did you?

Now, using the same NuGet.exe command line tool we downloaded earlier, we can push the package to the gallery using the nuget push command.

nuget push path-to-nupkg api-key -source http://packages.nuget.org/v1/

Here’s a screenshot of the exact command I ran.

publishing-nupkg

Shoot! There I go showing off my secret API key again! I better regenerate that.

As you can see, this command uploaded my package and published it to the feed. I can login and visit the Manage My Contributions page to see this package and even make changes to it if necessary.

Moving Forward

We’re still working out the kinks in the site and hopefully, by the time you read this blog post, this particular issue will be fixed. Also, we’re planning to update the NuGet.exe client and make the NuGet gallery the default source so that the -source flag is not required.

As David mentioned, the site was primarily developed as a CodePlex.com project by the Nimble Pros in a very short amount of time. There are two major components to the site. There’s the front-end Orchard Gallery built as an Orchard module. This powers the gallery website that you see when you visit http://nuget.org/. There’s also the back-end gallery server which hosts the OData feed used to browse and search for packages as well as the WCF service endpoint for publishing packages.

Each of these components is an open source project, which means if you really wanted to, you could take the code and host your own gallery website. Orchard will be using the same code to host its own gallery of Orchard modules.

Also, these projects accept contributions! I personally haven’t spent much time in the code, but I hope to find some free time to chip in myself.

asp.net mvc, asp.net, code

In part 1 of this series, we looked at the scenario for grouping routes and how we can implement matching incoming requests with a grouped set of routes.

In this blog post, I’ll flesh out the implementation of URL Generation.

URL Generation Implementation

URL generation for a group route is tricky, especially when using named routes because the individual routes that make up the group aren’t in the main route collection.

As I noted before, the only route that’s actually added to the route table is the GroupRoute. Thus if you supply a route name for one of the child routes (such as “r1”) during URL generation, you’ll get a null URL.

Interestingly enough, in this case, if you don’t use named routes when using URL generation, everything works just fine. However, since I heartily recommend using named routes all the time, I should cover that situation.

So what we need to do here is supply two route names during URL generation. One for the group route, and one for the child route. How do we supply the child route name? We’re going to have to supply it in the route values. Here’s an example of generating an URL in this manner:

@Html.RouteLink("Hello World Child", "group", new { __RouteName = "hello-world3" }) 

Note that the second parameter, “group”, refers to the route name for the GroupRoute that we registered. The route value __RouteName is passed into the GroupRoute so that it can look in its own collection of routes for the matching child route.

In the following code sample, I’ve highlighted the essential part of the URL generation logic within the GroupRoute class.

public override VirtualPathData GetVirtualPath(RequestContext 
    requestContext, RouteValueDictionary values) {
  string routeName = values.GetRouteName();
  var virtualPath = ChildRoutes.GetVirtualPath(requestContext, 
    routeName, values.WithoutRouteName());
  if (virtualPath != null) {
    string rewrittenVirtualPath = 
      virtualPath.VirtualPath.WithoutApplicationPath(requestContext);
    string directoryPath = VirtualPath.WithoutTildePrefix(); // remove tilde
    rewrittenVirtualPath = rewrittenVirtualPath.Insert(0, 
    directoryPath.WithoutTrailingSlash());
    virtualPath.VirtualPath = rewrittenVirtualPath.Remove(0, 1);
  }

  return virtualPath;
}

The code grabs the route name for the child route from the supplied route values. Notice that I’m using an extension method I wrote in my last blog post.
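For completeness, those extension methods boil down to something like the following sketch (the real implementations may differ in detail):

public static class RouteValueDictionaryExtensions {
  private const string RouteNameKey = "__RouteName";

  // Pulls the child route name (if any) out of the supplied route values.
  public static string GetRouteName(this RouteValueDictionary routeValues) {
    if (routeValues == null) {
      return null;
    }
    object routeName;
    routeValues.TryGetValue(RouteNameKey, out routeName);
    return routeName as string;
  }

  // Returns a copy of the route values without the route name entry so it
  // doesn't leak into the generated URL as a query string parameter.
  public static RouteValueDictionary WithoutRouteName(
      this RouteValueDictionary routeValues) {
    var copy = new RouteValueDictionary(routeValues);
    copy.Remove(RouteNameKey);
    return copy;
  }
}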

The block of code after the highlighted portion rewrites the virtual path back to the full virtual path for the parent GroupRoute. This ensures that the virtual path that’s eventually returned to the caller will actually work, since the individual routes within the group don’t have a clue that they’re within a group.

In a follow-up blog post, I’ll wrap up this series and provide access to the full source code.

asp.net mvc, asp.net, code

I gave a presentation to another team at Microsoft yesterday on ASP.NET MVC and the Razor view engine and someone asked if there was a reference for the Razor syntax.

It turns out, there is a pretty good guide about Razor available, but it’s focused on covering the basics of web programming using Razor and inline pages and not just the Razor syntax.

So I thought it might be handy to write up a really concise quick reference about the Razor syntax.

Each entry below shows the syntax name, a Razor sample, and the Web Forms equivalent (or remarks where there is no direct equivalent).

Code Block

Razor:

@{ 
  int x = 123; 
  string y = "because.";
}

Web Forms:

<%
  int x = 123; 
  string y = "because."; 
%>

Expression (Html Encoded)

Razor:

<span>@model.Message</span>

Web Forms:

<span><%: model.Message %></span>

Expression (Unencoded)

Razor:

<span>@Html.Raw(model.Message)</span>

Web Forms:

<span><%= model.Message %></span>

Combining Text and markup

Razor:

@foreach(var item in items) {
  <span>@item.Prop</span> 
}

Web Forms:

<% foreach(var item in items) { %>
  <span><%: item.Prop %></span>
<% } %>

Mixing code and Plain text

Razor:

@if (foo) {
  <text>Plain Text</text> 
}

Web Forms:

<% if (foo) { %> 
  Plain Text 
<% } %>

Using block

Razor:

@using (Html.BeginForm()) {
  <input type="text" value="input here">
}

Web Forms:

<% using (Html.BeginForm()) { %>
  <input type="text" value="input here">
<% } %>

Mixing code and plain text (alternate)

Razor:

@if (foo) {
  @:Plain Text is @bar
}

Same as above.

Email Addresses

Razor:

Hi philha@example.com

Razor recognizes basic email format and is smart enough not to treat the @ as a code delimiter.

Explicit Expression

Razor:

<span>ISBN@(isbnNumber)</span>

In this case, we need to be explicit about the expression by using parentheses.

Escaping the @ sign

Razor:

<span>In Razor, you use the 
@@foo to display the value 
of foo</span>

@@ renders a single @ in the response.

Server side Comment

Razor:

@*
This is a server side 
multiline comment 
*@

Web Forms:

<%--
This is a server side 
multiline comment
--%>

Calling generic method

Razor:

@(MyClass.MyMethod<AType>())

Use parentheses to be explicit about what the expression is.

Creating a Razor Delegate

Razor:

@{
  Func<dynamic, object> b = 
   @<strong>@item</strong>;
}
@b("Bold this")

Generates a Func<T, HelperResult> that you can call from within Razor. See this blog post for more details.

Mixing expressions and text

Razor:

Hello @title. @name.

Web Forms:

Hello <%: title %>. <%: name %>.

NEW IN RAZOR v2.0/ASP.NET MVC 4

Conditional attributes

Razor:

<div class="@className"></div>

When className = null:

<div></div>

When className = "":

<div class=""></div>

When className = "my-class":

<div class="my-class"></div>

Conditional attributes with other literal values

Razor:

<div class="@className foo bar"></div>

When className = null (notice the leading space in front of foo is removed):

<div class="foo bar"></div>

When className = "my-class":

<div class="my-class foo bar"></div>

Conditional data-* attributes (data-* attributes are always rendered)

Razor:

<div data-x="@xpos"></div>

When xpos = null or "":

<div data-x=""></div>

When xpos = "42":

<div data-x="42"></div>

Boolean attributes

Razor:

<input type="checkbox" checked="@isChecked" />

When isChecked = true:

<input type="checkbox" checked="checked" />

When isChecked = false:

<input type="checkbox" />

URL Resolution with tilde

Razor:

<script src="~/myscript.js"></script>

When the app is at /:

<script src="/myscript.js"></script>

When running in a virtual application named MyApp:

<script src="/MyApp/myscript.js"></script>

Notice in the “mixing expressions and text” example that Razor is smart enough to know that the ending period is literal text punctuation and not meant to indicate that it’s trying to call a method or property of the expression.

Let me know if there are other examples you think should be placed in this guide. I hope you find this helpful.

UPDATE 12/30/2012: I’ve added a few new examples to the table of new additions to Razor v2/ASP.NET MVC 4 syntax. Razor got a lot better in that release!

Also, if you want to know more, consider buying the Programming ASP.NET MVC 4 book. Full disclosure, I’m one of the authors, but the other three authors are way better.

humor, blogging

Dear Reader, I apologize for not blogging much lately. I know, total #fail, but I’ve been so f***ing busy lately. I thought I would start off this new year right with a top ten list FTW!

Without further ado, I present my list of the top 10 blogging clichés of 2010. These are things yours truly would never ever do, right?

stones-on-a-beach

  2. The random photo: Starting off the list is a very common one that even gets the very best bloggers, including a random photo in the blog post completely irrelevant to the topic at hand as if trying to meet a stock photography quota for the month. Perhaps you own stock in the stock xchng (or perhaps I should!). At least try to make the photo slightly relevant so it adds something to the post.
  3. “Dear Reader”:Is there a more patronizingly boorish phrase to use to refer to those reading your blog than “Dear Reader”. Do you know if the person reading your blog is dear? Seriously, he or she could be a total prick who’s only redeeming quality is that he or she clicks on your AdSense link so you can buy a cup of coffee in two years. Do you realize that the person reading the blog might be me? I’m a total jerk and I don’t click on your ad links, but you just complimented me. Ha! When I read a blog post that uses the phrase “Dear Reader” what I see in my head is “Dear random person that I hope clicks on my ads”. I’ve used it at least five times. Don’t be like me.
  4. Apologizing for not posting regularly: I have a dirty little secret, nobody gives a flying f*** whether or not you’re posting regularly (unless you’re Randall Munroe, then we absolutely do care), so stop apologizing for it. They’re all using some sort of RSS aggregator in the first place so they didn’t even have a clue to how lame you are up until the point that just reminded them. Great job Sport!
  5. Using f*** when you really meant to say “fuck”: It’s just a fucking word! If you really mean to use it for emphasis, just fucking use it! Nobody, in the history of humanity, ever lost his sight, hearing, or sanity from reading the word fuck. Not to mention that you’re not fooling anyone when you type f***. Do you really have such a low opinion of your readers that you think they’re sitting there thinking “Gee, I wonder wonder what word that could be? Good thing he asterisked the fuck out of that word because I might go blind if he had spelled it out.”
  6. Overusing the word “fuck”: Whoa nelly! Just because it’s ok to unmask those asterisks from time to time doesn’t mean we should go overboard here. Slowly back away from the “F” key. The word is meant to be very lightly sprinkled to pack a powerful punch when you need it. It’s not meant to be poured liberally like salt in a futile attempt to salvage taste from your awful cooking.
  6. “Wah wah, I’m so busy.”: You know what, we’re all fucking busy, so shut your pie hole about it already.
  7. “Hinting at a super secret project you can’t reveal just yet.”: Yeah yeah, we get it. You know something we don’t know so you’re going to rub our faces in it like a bad little doggy who just did a no-no. Bad doggy! This may even be the reason you’re “so busy”. Well I have news for you…wait for it. Wait for it. Nobody cares! Maybe your project really is the next big thing since that little plastic triangle thingy that holds the pizza box up away from the cheese. Really, that thing is awesome! Maybe your project is better than that, but if you can’t talk about it yet, you’re just wasting bandwidth. Once again, shut your pie hole until you can talk about it.
  8. Top ten lists, for all values of “ten”: Top ten (or eight, or eleven, or any number) lists are a cop out. You know it, I know it, and your readers know it. Top ten lists are what happens when a blogger is in the middle of writing a blog post apologizing for not posting regularly and thinks, “What the f*** am I apologizing for?! I know, I’ll write a top ten list of the varieties of lint I found in my belly button.” Yeah, you’ll make the front page of Reddit, but at what cost of your soul, dear reader? What cost?
  9. Name Dropping: So just last year I was chatting with my friends Jeff Atwood (aka CodingHorror), George Clooney, and Miguel De Icaza (aka Mr. Mono) about how lame it is to name drop. There’s nothing lamer than that except for name dropping about fictitious events that never happened. Seriously, nobody is going to change their opinion of how lame you are just because you happened to have seen the neighbor of the third cousin of Bruce Schneier from across the conference hall floor (but if you did, high five right atcha!).
  10. “FTW!”: Yes, we all know you’re so hard core and like to express your enthusiasm while simultaneously tweaking your nose at the powers that be, but seriously now. You’re all growed up and it’s time to lay this one to rest, in the same way you no longer play with your GI Joes except when the wife has the kids at her mother’s. One exception to this rule, it’s perfectly fine to use it on Twitter, but only because of its brevity and only until we come up with something better.

Yes, some of these clichés were also noted back in 2007 as reported in CodingHorror, but apparently nobody got the memo as they were still going strong in 2010.

Now it is 2011 and I’ve made a New Year’s resolution to avoid such blogging clichés. How am I doing so far?

Probably about as well as my resolution to stop procrastinating, which I made after the new year, so I’m not off to a good start on that one either. Winking smile

Before you flame me about this blog post, this was all in good fun. I love top 10 lists for binary values of 10!

personal

It’s the end of the year and it’s time for the annual year in review blog post. I know what you’re thinking, but don’t worry, you remembered to turn off the stove before leaving.

You’re also probably thinking, “do we really need an end of year blog post from everyone?”. I asked on Quora, and the answer is a definitive no, we don’t. This is my absolutely unnecessary self indulgent end of year blog post. I wouldn’t have it any other way, would you?

I didn’t want to settle with an ordinary “Ho hum bore you to tears 2010 Year in Review Blog Post”. Mainly because I hate to see you cry. Really. Please stop. What I wanted was nothing short of nuclear supernova fireworks, dancing panda bears, and double rainbow fish that shoot sparks out their butts. In short, a blow your monocle off spectacular end of year review blog post!

Unfortunately, I waited too long and everybody was booked. So all I could get was this dancing panda.

This has been a great but somewhat crazy year for me. It’s kind of insane to think that in the same year we released the RTM for ASP.NET MVC 2, we also released not one, but count ‘em, two release candidates for ASP.NET MVC 3!

Also in this year, we started a new open source package manager, NuGet, which accepts external contributions. My how the pendulum swings over here.

What I Liked In 2010

2010 presented me with a lot of things to like.

  • Podcast: In general, I feel the same way Scott Hanselman feels about Podcasts when he said “Sorry folks, PodCasting = Verbal Incontinence. I’m just not feeling it.” This makes it deliciously ironic that he’s involved in my favorite podcast of 2010, This Developer’s Life. Started by my friend Rob Conery, it’s a podcast I describe as focusing on the more human side of software, and it’s pretty much the only podcast I subscribe to.
  • Video: RSA Animate – Drive: The surprising truth about what motivates us. I absolutely love how this video uses a unique animation style to present on the topic of motivation.
  • Book: I didn’t read a lot of books in 2010, but one that stood out was Rocket Surgery Made Easy: The Do-It-Yourself Guide to Finding and Fixing Usability Problems, which is Steve Krug’s worthy follow-up to Don’t Make Me Think. You owe it to the users of your software to read and absorb these books.
  • Blog: Nothing really did it for me this year. As always, I enjoyed CodingHorror, but I found I read a much wider variety of blog posts due to Twitter and unfortunately didn’t remember to bookmark my favorite of the year. I’ll try better next year. Smile
  • Gadgets: This is a tie between my iPhone 4 and my Kinect. I am on my phone all the time. Perhaps too much. Ok, definitely too much. I now have a WP7 Samsung Focus at home, but I haven’t had any time to play with it yet. We recently got a Kinect and I am totally hooked on Dance Central. The part that really hooked me is that they show where your score ranks among the scores of your friends on X-Box live after each dance. I’m determined to destroy my friends with my fresh moves.
  • My Kids: Seriously, they’re cool! (I like my wife too, but she doesn’t like her picture plastered on the web. My kids get no choice.) Winking smile

Me-and-the-kids

Memorable Moments/Trips of 2010

Here are some memorable moments from 2010 that I happened to blog about. I’m sure there are many other memorable moments I forgot to blog about.

  • February I visited Austin Texas for the first time. I spoke at the Austin .NET User Group as well as at Dell and enjoyed the fine taste of Rudy’s BBQ. I spent every moment reminding Texans that if Alaska were split in half to make two states, Texas would be the third largest state in the union.
  • March We released ASP.NET MVC 2 which was shortly followed with the source code release of System.Web.Mvc.dll under the Ms-PL license. I was allowed exactly two breaths of relief before getting started on ASP.NET MVC 3. In the same month, I attended the Mix 10 conference in Las Vegas and met John Resig and Douglas Crockford for the first time. I gave two talks, one of them co-presented with Scott Hanselman which is always a good time.
  • April I attempted an April Fool’s joke. I’ve been told many times I should leave comedy to the experts, but I don’t see the fun in that.
  • June The Subtext team and I released version 2.5 of Subtext. The pace of work on Subtext has really slowed down as work has kept me busier and busier, but we’re still improving it and getting closer to another release. Also in June, I took my son to Alaska to visit his grandparents as a little vacation. While there, I spoke at the Alaska .NET User Group. No moose were in attendance.
  • July We released the first preview of ASP.NET MVC 3. This was the first public disclosure of the new streamlined Razor syntax for views. Nobody was cut in the process.
  • August After a five year absence, I returned to Black Rock City for the Burning Man festival. As best as I can tell, I did not die while there.
  • September I presented at Web Camp L.A. In the evening, I visited my old friends and we went curling. Yeah, seriously.
  • October We publicly announced the NuGet package manager project (it was initially called NuPack). This was a project I had been working on at the same time as ASP.NET MVC 3. To me, this project was very significant as it’s an open source project that accepts external contributions! But don’t worry, we also reject some too.
  • December We released the second release candidate for ASP.NET MVC 3.

My Top 3 Posts (Using the Ayende Formula)

With Subtext 2.5, the Ayende Formula, as we call it, is now integrated into the Subtext admin dashboard which makes compiling this list very easy!

Interestingly enough, this list doesn’t correspond to the posts that I think are most interesting. But that’s typically the case. I have bad taste.

  • ASP.NET MVC 2 Optional URL Parameters Covers using UrlParameter.Optional for optional routing parameters in ASP.NET MVC 2 (36 comments, 40,542 web views, 23,818 aggregator views).
  • Sending JSON to an ASP.NET MVC Action Method Argument This post covers posting JSON to an action method in ASP.NET MVC 2. Some of the code presented in this post is now built-in with ASP.NET MVC 3. (39 comments, 37,668 web views, 21,899 aggregator views)
  • Razor View Syntax This post gave a little bit of the backstory about the new Razor syntax our team built for ASP.NET MVC as well as a new product, ASP.NET Web Pages. (43 comments, 27,640 web views, 23,234 aggregator views)

My Top 3 Posts (Using the Haack Formula)

The Haack Formula just involves me arbitrarily picking my personal favorites.

Enough About Me, What about You?

Yes, I get self absorbed. But let’s set that aside a moment and talk about you (insomuch as it pertains to me Me ME!). How you doin’?

According to Google Analytics:

  • Hello Visitors! 1,308,414 of you (absolute unique visitors) made 1,972,239 visits to my blog from 219 countries/territories. Most of you came from the United States (676,050) followed by the United Kingdom (170,814) with India (140,732) in third place.
  • Browser of choice: 40.94% of you use Firefox, 29.2% use IE, and 23.07% of you use Chrome. Probably not even worth mentioning, but 3.43% of you use Safari and 2.32% of you use Opera. Opera?!
  • Operating System: Not surprisingly, 90.64% of you are on Windows. 5.11% on a Mac and 2.62% on Linux. The mobile devices are a tiny percentage, but I would imagine that’ll pick up a lot next year.
  • What you read: The blog post most visited in 2010 was written in 2009, Using jQuery Grid with ASP.NET MVC with 93,715 page views.
  • How’d you get here: As usual, most of you came here via Google search results (1,120,983), which probably means most of you aren’t reading this. Winking smile In second place, many of you came here directly (254,533) via typing the URL to this blog in an address bar (or clicking on a bookmark, etc.). StackOverflow.com had a strong showing coming in third with 85,002 visits referred.

Podcasts/Videos

This next list is more for me than for you as I was curious about the various interviews and talks I gave online. It’s the kind of list I wouldn’t expect anyone but myself and my mother to be interested in. In fact, I’m pretty sure my mother doesn’t care.

Suffice to say, there were plenty of opportunities for me to be aghast at the sound of my own voice.

Well, if you’ve read this far, wow! You must really be bored on your holiday break. But seriously, if you’re a regular reader of my blog, thanks for sticking around! I think 2011 is going to be an interesting year as well and I’m looking forward to it.

Let me know if you have suggestions on how I can make my blog suck less.

asp.net, asp.net mvc, code

A lot has been written about how to get ASP.NET MVC running on IIS 6 with extensionless URLs. Up until now, the story hasn’t been very pretty. When running ASP.NET MVC on ASP.NET 4, it gets a lot easier.

To be fair, the part that makes it easier has nothing to do with ASP.NET MVC 3 and everything to do with a little known new feature of ASP.NET 4 creatively called the ASP.NET 4 Extensionless URL feature. ASP.NET MVC 3 requires ASP.NET 4 so it naturally benefits from this new feature.

If you have a server running IIS 6, ASP.NET 4, and ASP.NET MVC 3 (or even ASP.NET MVC 2; I haven’t tried ASP.NET MVC 1.0), your website should just work with the default extensionless URLs generated by ASP.NET MVC applications. No need to configure wildcard mappings or *.mvc mappings. In fact, you don’t even need to set RAMMFAR to true. RAMMFAR is our pet name for the runAllManagedModulesForAllRequests setting within the system.webServer section of web.config. You can feel free to set this to false.

<modules runAllManagedModulesForAllRequests="false"/>
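
For context, that line lives inside the system.webServer section of your application’s root web.config. A minimal sketch (element names as in a default ASP.NET 4 project):

<configuration>
  <system.webServer>
    <modules runAllManagedModulesForAllRequests="false" />
  </system.webServer>
</configuration>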

When installing ASP.NET 4, this feature is enabled by default. So if you have a hosting provider still using IIS 6, but one that does have ASP.NET 4 installed, then this should work for you, unless…

How does this work?

To be honest, it’s a bit of voodoo magic as far as I can tell and there are a lot of caveats when it comes to using this feature with IIS 6. There are edge cases where it can cause problems. This is why Thomas Marquardt, the implementor of the feature, wrote a blog post on how to disable ASP.NET 4.0 Extensionless URLs just in case.

In that blog post, he describes the bit of magic that makes this work.

Here’s how the v4.0 ASP.NET extensionless URL features works on IIS 6.  We have an ISAPI Filter named aspnet_filter.dll that appends “/eurl.axd/GUID” to extensionless URLs.  This happens early on in the request processing.  We also have a script mapping so that “*.axd” requests are handled by our ISAPI, aspnet_isapi.dll.  When we append “/eurl.axd/GUID” to extensionless URLs, it causes them to be mapped to our aspnet_isapi.dll, as long as the script map exists as expected.  These requests then enter ASP.NET where we remove “/eurl.axd/GUID” from the URL, that is, we restore the original URL.  The restoration of the original URL happens very early.  Now the URL is extensionless again, and if no further changes are made…

He also has a list of conditions that must be true for this feature to work. If any one of them is false, then you’re back to the old extensionfull URLs with IIS 6.
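
Purely to illustrate the round trip Thomas describes (this is my own conceptual sketch, not the actual ASP.NET implementation, and all of the names below are made up), the transformation looks roughly like this:

// Conceptual sketch of the eurl.axd trick described above; not real ASP.NET code.
public static class ExtensionlessUrlSketch {
    // Placeholder marker; the real GUID is generated by ASP.NET itself.
    const string Marker = "/eurl.axd/0f6a84a8c2d24b6e9d1f3c5a7b9e2d4c";

    // The aspnet_filter.dll ISAPI filter appends the marker so that IIS 6's
    // "*.axd" script map routes the request to aspnet_isapi.dll.
    public static string Append(string path) {
        return System.IO.Path.HasExtension(path) ? path : path + Marker;
    }

    // Early in the ASP.NET pipeline the marker is stripped off again,
    // restoring the original extensionless URL.
    public static string Restore(string path) {
        int index = path.IndexOf("/eurl.axd/", System.StringComparison.OrdinalIgnoreCase);
        return index >= 0 ? path.Substring(0, index) : path;
    }
}

So a request for /home/about briefly becomes /home/about/eurl.axd/GUID on its way through IIS 6, and comes out the other side as /home/about again.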

I’m Getting a 403 Forbidden

This is not technically related, but when I tried to test this out to confirm the behavior, I ran into a case where every request was giving me a 403 Forbidden error message. Here’s how I fixed it.

In IIS Manager, I right clicked on the Web Services Extension node and selected the menu option labeled Allow all Web Service extensions for a specific application:

iis6-allowing-extensions

In the resulting dialog, I chose the ASP.NET v4.0.30319 option.

iis6-allow-web-service

To double check that everything was configured correctly, I looked at the properties for my website and ensured that Scripts were enabled.

iis6-home-directory-tab

I also clicked on the Configuration… button and made sure that *.axd was mapped to the proper ASP.NET ISAPI DLL (aspnet_isapi.dll).

iis6-isapi

iis6-extension-mapping

With all that in place, I was able to run a standard ASP.NET MVC web application and make requests for /, /home/about/, etc. without any problems!

personal, asp.net mvc, asp.net

Along with James Senior, I’ll be speaking at a couple of free Web Camps events in South America in March 2011.

argentina_flag Brazil_flag

Buenos Aires, Argentina – March 14-15, 2011

São Paulo, Brazil – March 18-19, 2011

The registration links are not yet available, but I’ll update this blog post once they are. Update: registration is open! Register for Argentina. Register for Brazil. For a list of all upcoming Web Camps events, see the events list.

If you’re not familiar with Web Camps, the website provides the following description, emphasis mine:

Microsoft’s 2 day Web Camps are events that allow you to learn and build websites using ASP.NET MVC, WebMatrix, OData and more. Register today at a location near you! These events will cover all 3 topics and will have presentations on day 1 with hands on development on day 2. They will be available at 7 countries worldwide with dates and locations confirming soon.

Did I mention that these events are free? The description neglects to mention NuGet, but you can bet I’ll talk about that as well. Smile 

I’m really excited to visit South America (this will be my first time) and I hope that schedules align in a way that I can catch a Fútbol/Futebol game or two. I also need to brush up on my Spanish for Argentina and learn a bit of Portuguese for Brazil.

One interesting thing I’m learning is that visiting Brazil requires a Visa (Argentina does not) which is a heavyweight process. According to the instructions I received, it takes a minimum of 40 business days to receive the Visa! Wow. I’m sure the churrascaria will be worth it though.

Good thing I’m getting started on this early. Hey Brazil, I promise not to trash your country. So feel free to make my application go through more quickly.

Tags: webcamps, aspnetmvc, asp.net, brazil, argentina

code, nuget

We could have done better. That’s the thought that goes through my mind when looking back on this past year and reflecting on NuGet.

Overall, I think we did pretty well with the product. Nobody died from using it, we received a lot of positive feedback, and users seem genuinely happy to use the product. So why start off with a negative review?

It’s just my way. If you can’t look back on every project you do and say to yourself “I could have done better”, then you weren’t paying attention and you weren’t learning. For example, why stop at double rainbows when we could have gone for triple?

leaning

When starting out on NuGet, we hoped to accomplish even more in our first full release. Like many projects, we have iteration milestones which each culminate in a public release. Ours tended to be around two months in duration, though our last one was one month.

Because we were a bit short staffed in the QA department, at the end of each milestone our one lone QA person, Drew Miller, would work like, well, a mad Ninja on fire trying to verify all the fixed bugs and test implemented features. Keep in mind that the developers do test out their own code and write unit tests before checking the code in, but it’s still important to manually test code with an eye towards thinking like a user of the software.

This, my friends, does not scale.

When we looked back on this past year, we came to the conclusion that our current model was not working out as well as it could. We weren’t achieving the level of quality that we would have liked, and continuing in this fashion would burn out Drew.

I came to the realization that we need to assume we’ll never be fully staffed on the QA side. Given this, it became obvious that we need a new approach.

This was apparent to the developers too. David Fowler noted to me that we needed to have features tested closer to when they were implemented. As we discussed this, I remember a radical notion that Drew told me about when he first joined our QA team. He told me that he wants to eliminate dedicated testers. Not actually kill them mind you, just get rid of the position.

An odd stance for someone who is joining the QA profession. But as he explained it to me in more detail over time, it started to make more sense. In the most effective place he worked, every developer was responsible for testing. After implementing a feature and unit testing it (both manually and via automated tests), the developer would call over another developer and fully test the feature end-to-end as a pair. So it wasn’t that there was no QA there, it was that QA was merely a role that every developer would pitch in to help out with. In other words, everyone on the team is responsible for QA.

So as we were discussing these concepts recently, something clicked in my thick skull. They echoed some of the concepts I learned attending a fantastic presentation back in 2009 at the Norwegian Developer’s Conference by Mary Poppendieck. Her set of talks focused on the concept of a problem solving organization and the principles of Lean. She gave a fascinating account of how the Empire State Building finished in around a year and under budget by employing principles that became known as Lean. I remember thinking to myself that I would love to learn more about this and how to apply it at work.

Well fast forward a year and I think the time is right. Earlier in the year, I had discussed much more conservative changes we could make. But in many ways, by being an external open source project with a team open to trying new ideas out, the NuGet team is well positioned to try out something different than we’ve done before as an experiment. We gave ourselves around two months starting in January with this new approach and we’ll evaluate it at the end of those two months to see how it’s working for us.

We decided to go with an approach where each feature was itself a micro-iteration. In other words, a feature was not considered “done” until it was fully done and able to be shipped.

So if I am a developer working on a feature, I don’t get to write a few tests, implement the feature, try it out a bit, check it in, and move on to the next feature. Instead, developers need to call over Drew or another available developer and pair test the feature end-to-end. Once the feature is fully tested, only then does it get checked into the main branch of our main fork.

Note that under this model, every developer also wears the QA hat throughout the development cycle. This allows us to scale out testing whether we have two dedicated QA, one dedicated QA, or even zero. You’ll notice we’re still planning to keep Drew as a dedicated QA person while we experiment with this new approach so that he can help guide the overall QA process and look at system level testing that might slip by the pair testing. Over time, we really want to get to a point where most of our QA effort is spent in preventing defects in the first place, not just finding them.

Once a feature has been pair tested, that feature should be in a state that it can be shipped publicly, if we so choose.

We’re also planning to have a team iteration meeting every two weeks where we demonstrate the features that we implemented in the past two weeks. This serves both to review the overall user experience of the features as well as to ensure that everyone on the team is aware of what we implemented.

You’ll note that I’m careful not to call what we’re doing “Lean” with a capital “L”. Drew cautioned me to use lower-case “lean” as opposed to capital “Lean” because he wasn’t familiar with Lean when he worked this model at his previous company. We wouldn’t want to tarnish the good name of Lean with our own misconceptions about what it is.

This is where I have to confess something to you. Honestly, I’m not really that interested in Lean. What I’m really interested in is getting better results. It just seems to me that the principles of Lean are a very good approach to achieving those results in our situation.

I’m not one to believe in one true method of software development that works for all situations. What works for the start-up doesn’t work for the Space Shuttle and vice versa. But from what I understand, NuGet seems to be a great candidate for gaining benefits from applying lean principles.

So when I said I’m not interested in Lean, yeah, that was a bit of a fib. I definitely am interested in learning more about Lean (and I imagine I’ll learn a lot from many of you). But I am so much more interested in the better results we hope to achieve by applying lean principles.

personal

At some point, everybody and every team makes a mistake they regret and wish they could take back. During our regular status meetings, I sometimes make the mistake of saying something like “if I could go back in time, I’d tell myself not to make that decision.”

flux-capacitor (Image from the greenhead.)

That tees it up for our lead developer who’s so smart even his ass is smart. You might say he’s a smart ass. His response is usually “Really? I can think of a lot better things I would do with a time machine.”

Which got me thinking. Hypothetically speaking of course, if I did have a time machine, how exactly would I maximize my profit?

Often, time travel questions fixate on boring topics such as if you could meet anyone in history, who would it be?

Lincoln? Snore. Jesus? Yah, maybe he can do something about that chronic rash you got going down there.

I think the question of how you’d become rich is much more interesting and potentially creative. Let’s put some constraints on the question to make it more interesting.

  1. The time machine is room-sized and located in your house. Thus you can’t travel with the time machine.
  2. The time machine can only transport you back in time and back again. It can’t transport you through space to another location. So if you live in Seattle, Washington, you can travel back in time to any year, say 1951, but you’ll still be in Seattle.
  3. You have one trip and one trip only and you can only bring yourself and the clothes on your back. I’d recommend a backpack.
  4. You have the resources available to you today. You can’t go back in time and buy a million shares of some stock unless you could actually afford to buy that stock.

Keep in mind the consequences of your action. It might seem like it’d be easy to go back in time around ten years, go to a public library, log in to your old E-Trade account, and buy a bunch of stock. But you, ten years ago, would probably notice and assume you’ve been hacked and perhaps sell immediately.

Also, if you travel to a time before you were born, consider that you probably won’t have proper identification and papers, unless you forge them. Could you buy stocks without identification?

So my friends, I ask you. Given these constraints, how would you maximize profit with a single trip back in time?

Again, this is purely hypothetical. I don’t have a time machine in the garage. Let’s just say, I like to be prepared just in case.

asp.net, asp.net mvc, nuget, code

Almost exactly one month ago, we released the Release Candidate for ASP.NET MVC 3. And today we learn why we use the term “Candidate”.

As Scott writes, Visual Studio 2010 SP1 Beta was released just this week, and as we were testing it we found a few incompatibilities between it and the ASP.NET MVC 3 RC that we had just released.

newdotnetlogo_2

That’s when we, in the parlance of the military, scrambled the jets to get another release candidate prepared.

You can install it directly using the Web Platform Installer (Web PI) or download the installer yourself from here.

Be sure to read the release notes for known issues and breaking changes in this release. I’m not saying whether I put an Easter egg in there or not, but you’ll have to read all the notes to find out.

In particular, there are two issues I want to call out.

Breaking Change Alert!

The first is a breaking change. Remember way back when I wrote about Dynamic Methods in ViewData? Near the end of that post I wrote an Addendum about the property name mismatch between ViewModel and View.

Well, we finally resolved that mismatch. The new property name, both in the controller and in the view, is ViewBag. This may break many of your existing ASP.NET MVC 3 pre-release applications.
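
For example, here’s a minimal sketch of the rename (the HomeController and the Message property are made up for illustration); code that previously set the dynamic ViewModel property in the controller now uses ViewBag:

using System.Web.Mvc;

public class HomeController : Controller {
  public ActionResult Index() {
    // Before RC 2 this line would have read: ViewModel.Message = "Welcome!";
    ViewBag.Message = "Welcome!";
    return View();
  }
}

And in the Razor view, what used to be @View.Message becomes:

<h2>@ViewBag.Message</h2>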

NuGet Upgrade Alert

The other issue I want to call out is that if you already have NuGet installed, running the ASP.NET MVC 3 RC 2 installer will not upgrade it. Instead, you need to go to the Visual Studio Extension Manager dialog (via the Tools | Extensions menu option) and click on the Updates tab. You should see NuGet listed there:

extension-manager

The NuGet.exe command line tool for creating packages is available on CodePlex.

Overall, this release consists mostly of bug fixes along with some fit and finish work for ASP.NET MVC 3. We’ve updated the version of jQuery and jQuery Validation that we include in the project templates and now also include jQuery UI, a library that builds on top of jQuery to provide animation, advanced effects, as well as themeable widgets.

In terms of NuGet, this release contains a significant amount of work. I’ll try and follow up soon with more details on the NuGet release along with release notes.

nuget, code

I don’t normally post lists of links as it’s really not my style. But there are a lot of great NuGet blog posts I want to call out, so I thought I’d try my hand at it.

Hey! Here’s a random picture of a goat.

089

I also tend to post links from my twitter account http://twitter.com/haacked.

Well that’s it for now. If you found this helpful, let me know and I’ll try to do it once in a while. Either once a quarter or once a month depending on interest. Smile