
Some computer scientist by the name of Donald Knuth once said,

Premature optimization is the root of all evil (or at least most of it) in programming.

Bah! What did he know?

Of course we all know what he meant, but when you take his statement at face value, the claim is a bit vague.  What exactly is it that is being optimized?


Well speed of course! At least that is the optimization that Knuth refers to and it is what developers typically mean when they use the term optimize.  But there are many factors in software that can be optimized, not all of which are evil to optimize prematurely.

The key positive optimization that comes to mind is optimizing developer productivity.  I hardly see anything evil about optimizing productivity early in a project.  It is most certainly a healthy thing to do, hence the misleading title of this post.

However as with all things, optimizations bring with them tradeoffs.  Optimizing for developer productivity often comes at the price of optimizing code execution speed.  Likewise optimizing for speed will come at the cost of developer productivity.

Security is another example of an optimization that bears with it various trade-offs.

The point of all this is to keep in mind that at all times within a software project, whether explicitly or implicitly, you are optimizing for something.  It is important to be intentional about what exactly you wish to optimize.

If you start optimizing for performance early, keep in mind Knuth’s forewarning.  If you are optimizing for productivity early, then you are on the right track.  This does not mean that you should never consider performance. On the contrary, a good developer should definitely design for performance and measure, measure, measure.

The danger to avoid is diverting too much optimization attention to areas that provide too little gain, as discussed in my last post on Ruby Performance.

tags: Performance, Optimization, Software, Programming


Of course that assumes that Joel wears a size 9 and a half.


Once again the Joel Cycle takes another turn. The cycle goes something like this:

  • Joel critiques something or other.
  • Bloggers counter Joel’s claims, many with thoughtful counter arguments.
  • Soon a flood of comments and posts start to turn a bit ugly and form around two camps: The Joel is an idiot why do you even read him? camp and the Joel is successful, what have you ever done that you can disagree with him? camp.
  • Rinse and Repeat

It really is an interesting phenomenon to watch and participate in. For example, I’ve had my blog post lumped in as part of the angry lynch mob out to get Joel.  All I said was that I found his argument unconvincing. Am I really a part of a mob conspiracy?

Now before anyone jumps down my throat, let me clarify something very important. I have a tremendous amount of respect for Joel Spolsky.

There, I said it. I can’t speak for Jeff Atwood, but I would venture to guess that he too has a lot of respect for Joel, despite the big red WTF on his forehead.

Anyone who has thousands of developers dissecting his every blog post and arguing the pros and cons of how he runs his company is doing something very right. I’d love to be in those shoes, running a very successful company with thousands of people invested in what I do and how I do it, whether positively or negatively.

If I were to blog something stupid (and I’m not saying he did, but just for the sake of argument, sheesh!), I’d be lucky to get a couple comments to the effect, “Dude, you’re an idiot.” Heck, I’d be happy if I even generated that level of passion. Rather I’d probably get a comment to the effect of, “I disagree. Nice Post! Buy Xanax”.

The other reaction Joel commonly gets is the I don’t know why y’all are reading him, I gave that shit up a long time ago reaction. I also don’t understand this reaction. For the most part, I think Joel’s signal to noise ratio is very high, and he’s written some really top notch articles on his blog. Just because he says a few things from time to time that you disagree with doesn’t mean you should throw the baby out with the bath water. Sure he comes across as a bit arrogant, but he’s that good.

The last question I often see is Why is everyone paying Joel so much attention? I addressed this very question before in my post, What Is It About Joel?.

Rock Star

In many respects, Joel is the closest thing the software community has to a bona fide rock star. We’re half expecting to open up our aggregators one day and read about him enrolling in a drug rehab program, but one of those trendy ones in Malibu (or in the Hamptons I suppose, given his fondness for New York). Like it or not, he’s opinionated, successful, and a thought leader in our field.

So when he says something controversial, it’s natural to want to provide a counter argument lest some young punk developer at your next team meeting argues vehemently for writing a custom programming language and uses an Appeal to Authority to make his/her case.

Obviously what works for FogBugz does not work for everyone else, but not everyone understands that distinction.

In any case, this will be my last post on the subject of Joel. At least until the cycle begins anew.



Joel Spolsky follows up on his earlier remarks about scaling out a Ruby on Rails site with this post on Ruby performance.  I’m afraid it is a thoroughly unconvincing and surprising argument.  He states…

I understand the philosophy that developer cycles are more important than cpu cycles, but frankly that’s just a bumper-sticker slogan and not fair to the people who are complaining about performance.

A bumper-sticker slogan?  That’s a surprising statement considering that FogBugz is not written entirely in C.  Is it because Wasabi compiled to PHP or VBScript is saving CPU cycles?  Hardly.

As one might expect from a well designed application, FogBugz is written in a productive high-level language for the very reason that Ruby advocates push Ruby: it saves developer cycles and thus money.

Also, as one would expect from a well written application, in the few cases where performance is a problem, those particular features were written in a lower-level, high-performance language.

So why wouldn’t this approach apply to Ruby?  From the tenor of his post, Joel seems to indicate that those who choose to implement their enterprise applications on Ruby are so religiously blinded by the benefits of Rails that they would never dare allow the impurity of non-Ruby code to enter the boundary of their architecture.

Really now?

To his credit, Joel states at the end…

In the meantime I stand by my claim that it’s not appropriate for every situation.

And this is true. It may not work well for that computation intensive Bayesian filter.  But is anyone making the claim that Ruby is appropriate for every situation?  The claim I’ve heard is that it is certainly appropriate for many more situations than Joel gives credit for.  I believe that.


tags: Ruby, Ruby On Rails, Joel Spolsky, Performance


Just a little shout out to my wife to wish us a happy anniversary.  We’ve been married for four years and each one has been better than the last.  I love you honey!

She’s got a rock solid sense of humor (have you seen her gravatar?) and a smile with a gleam so bright it makes you shout “Eureka!”

I would post a picture, but my wife’s sense of online privacy would make Bruce Schneier look like a MySpace exhibitionist.  In fact, I’ve already said too much.

Instead, I’ll post a picture of a ninja because ninjas have a lot in common with my wife.  They both kick ass, they are both Japanese (except for this one), they are both concerned about privacy, and like my wife, ninjas are so totally cool!

I mean who doesn’t love ninjas!?



Update: You can click on the images (except the vegas one) to see larger pics.

As I mentioned in a previous post, I am currently on a road trip with my younger brother, which explains the lack of blogging.

This crazy kid started off from Anchorage and drove down the Alaskan Highway (which is known for being gravelly and unpaved in parts) and made his way to Los Angeles.  And that is just the first leg of his drive.  His final destination is Luverne, Minnesota where he will live for a year interning with a pastor at the local church.

Despite the crazy conditions on the Yukon roads, he was still able to take this picture of the Aurora Borealis (aka Northern Lights).

Once in Los Angeles, Brian stayed with us a couple of days before we set out towards Minnesota.  My parents felt more relaxed that I was driving with him on this leg of his journey.

Leaving Los Angeles at around 9:30 PM on Wednesday night, we made very good time and drifted into Las Vegas around 1:30 AM.  My screams of “Vegas Baby! Vegas!” along the way did not make any sense to my brother, who had not seen Swingers.

Since it was my brother’s first time in Vegas, it was my older-brotherly duty to introduce him to one of the (many) sins of Sin City: gambling!

Usually such a duty ends in tragedy for everyone involved, but ours ended well. 

After a half hour or so at the BlackJack tables, we were up a bill ($100), easily enough to cover the room for the night and some drinks, and probably enough to turn my brother into a gambling addict for life.  I’m sure his church will be happy with me for corrupting him.

The next morning we headed out towards Denver.  This part of the drive was just amazing in terms of scenery, especially driving through the canyons of Utah.  There were some lightning storms along the way and my brother managed to snap a couple pictures of the lightning from the car.

We are now in the tiny town of Luverne where my brother will live for the next year or so, assuming he doesn’t go stir crazy.  This is one of those small towns that seemed to cherry pick the good parts of progress, and not the bad.

For example, people often don’t lock their cars, even leaving them running in the winter when they run into a store to pick up a few items.  Yet they are not so small town that I don’t have a high speed wireless connection at my disposal.

Also, Sioux Falls, South Dakota is only a half hour away, and it’s a reasonably large city.  We drove there last night and saw the falls, which are surrounded by red rock; that’s where we took this picture.

Tonight I will take an unfortunately long flight home with two stopovers (I wonder if I could just walk and be there sooner).  This road trip with my brother has been a wonderful way to catch up with him, especially now that he’s leaving the nest and becoming a man (although one still prone to fart jokes). 

Of course, during the trip, I realized he’s been grown up for a long time, which is hard to recognize when one is an older sibling.

As for me, the hours of driving left me free to reflect and get a bit philosophical. I’ll write about some of this in future posts, I hope.


Tonight at soccer practice, we scrimmaged for a while then ran through some drills.  We have an English guy and a Scottish guy (who hardly anyone can understand) on the team who are a laugh a minute.  You can imagine their surprise when we started a shooting drill and our team manager told them that we all have to shag our own balls.


Seems like all sorts of open source projects have been releasing lately.  Darren Neimke and Keyvan Nayyeri proudly announce the release of BlogML 2.0 on CodePlex.  Here’s a list of new features on Keyvan’s blog. With a bit of luck and lots of persistence, BlogML will hopefully be a key component in breaking vendor lock-in when it comes to blogging engines.

For example, if you decide to try Subtext as your blogging engine, and decide it’s not for you, I want you to have your data in a form that is easy to import to other engines.  Why should you have to write code to move from one platform to another?

I contributed a BlogML provider framework with the goal of making it really easy to implement BlogML on other platforms, such as a homegrown blog engine.  I don’t feel I fully accomplished the goal of making it easy, but I think it’s a step in the right direction and I’m sure it’s in good hands now.
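Since BlogML is just XML, getting your data back out of an export doesn’t take much code at all. Here’s a minimal sketch in Python; the namespace URI and element names are my assumptions about a typical BlogML 2.0 export, so check them against your actual file:

```python
import xml.etree.ElementTree as ET

# Assumed BlogML 2.0 namespace; verify against your export's root element.
NS = {"b": "http://www.blogml.com/2006/09/BlogML"}

def list_post_titles(blogml_xml):
    """Return the title of every post in a BlogML export string."""
    root = ET.fromstring(blogml_xml)
    return [title.text for title in root.findall(".//b:post/b:title", NS)]
```

With an export file in hand, something like `list_post_titles(open("export.xml").read())` gives you a quick sanity check that your posts survived the round trip.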

Of course I am now pissed (for you British, that’s angry, not drunk) that we have all this extra work to do for Subtext since we’re still running BlogML 1.0 (legacy!), and 2.0 has already been out a few hours!


Darren Neimke apparently is not one to shy away from a bit of trash talk.  He IM’d me via MSN recently to warn me about a new SUB, ready to take down Subtext. In this case, it is his newly open sourced blog engine, SingleUserBlog or SUB, which is now hosted on CodePlex.  Darren has been on a roll lately with the recent release of BlogML 2.0.  Now SUB enters the scene with torpedoes blazing!

But Darren must know he is not dealing with a complete novice in warfare.  I deftly guided him to choose the BSD license.  So should they implement something I must have, I can just take it, BWAHAHA! (with proper attribution of course, following all terms of the license and other legal mumbo jumbo)

Darren, I believe I sunk your battleship.


This past weekend my wife and I drove up to San Francisco to attend a friend’s wedding, which ended up being a lot of fun.  We always like visiting The City because of the many friends we have in the area, though being there reinforces the fact that it is not a place where we’d want to live (no offense to anybody who lives there, it’s just not our style).

While up there we were fortunate enough to have dinner with Mr. Coding Horror Jeff Atwood and his wonderful wife.

They gave us a tour of their new place and I was most impressed with his “boom boom” room (it’s his term and it’s not what you think).  They have a separated room tricked out with surround sound THX speakers, an LCD Projector, a Play Station 2, Dance Dance Revolution game pads, and of course, Guitar Hero.  Jeff gave us a quick demo of Guitar Hero and I was quite impressed, both by the fact that Jeff has some rhythm as well as the graphics and sound of the game.

In some ways, the boom boom room reminded me of Ricky Schroder’s room from Silver Spoons.

Meanwhile, my brother is moving from Alaska to Minnesota, but he’s doing it the hard way, driving.  He drove down to Los Angeles from Anchorage, Alaska at a relatively leisurely pace visiting with friends along the way.

He will stay with us for a couple of days and then he and I will set out to drive to Minnesota.  My parents are such worriers so it made them feel better that I was driving with him for this leg of the trip since he won’t be stopping much.

Hopefully we’ll get there with enough time to visit some relatives we have in the area.  In any case, if I am less than responsive to emails and the Subtext mailing list/forums, you’ll know why.


I mentioned several heuristic approaches to blocking spam in my recent post on blocking comment spam, but commenters note that I failed to mention CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart).  At the moment, CAPTCHA is quite effective, both at blocking spam and at annoying users.

But I don’t have any real beef with CAPTCHA, apart from the accessibility issues.  If I met CAPTCHA in a bar, I’d buy it a beer! No hard feelings despite the trash talking in the past, right?

There is successful code out there that can break CAPTCHA, but that is pretty much true of every other method of blocking spam I’ve mentioned.

The reason I didn’t mention CAPTCHA is that it would be ineffective for me.  Much of my spam comes in via automated means such as trackbacks and pingbacks.  The whole point of a trackback is to allow another computer to post a comment on my site.  So telling a computer apart from a human in that situation is pointless.

And at the moment, the Comment API has no support for CAPTCHA.  If comment spam isn’t coming in via the Comment API now, it is only a matter of time before it becomes the primary source.

So while I believe CAPTCHA is effective and may well be for a good while until comment spammers catch up, I would like to look one step ahead and focus on heuristics that can salvage the use of trackbacks and the Comment API. 
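One heuristic in that spirit, a common idea rather than something claimed above: when a trackback arrives, fetch the page that claims to link to you and verify that the link actually exists. Spam trackbacks usually point at pages with no such link. A rough Python sketch:

```python
import re
import urllib.request

def page_links_to(html, my_post_url):
    """Check whether the HTML contains a link back to my post."""
    pattern = r'href=["\']?' + re.escape(my_post_url)
    return re.search(pattern, html) is not None

def verify_trackback(source_url, my_post_url):
    """Fetch the page that sent the trackback and confirm it links here."""
    try:
        with urllib.request.urlopen(source_url, timeout=10) as response:
            html = response.read().decode("utf-8", errors="replace")
    except OSError:
        return False  # unreachable source: treat as spam
    return page_links_to(html, my_post_url)
```

It costs the spammer nothing to fake the trackback payload, but faking an actual page that links to your post is real work, which is the whole point.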


As a follow-up to the recent release of Subtext 1.9, I would also like to introduce a little something I put together at

Since we have yet to incorporate a skin preview or gallery in Subtext, I set up a site that features each of the default skins.  Additionally, the site also has a Skin Showcase where users can submit skins to share with the community.  User submissions are moderated.

Subtext Skins

Although we plan to overhaul skinning in the next version, it is still worthwhile to share skins here as my hope is to help port any user submitted skin (unless it gets too ridiculous in there).

I started off the sharing with the Marvin 3 Column skin which used to be included with Subtext.  If you have a skin you wish to share, please do share!  I recommend that your zip file include a Skins.User.config file rather than a Skins.config file.


Well my friends, it took a bit longer than expected to get Subtext 1.9 out the door, but we did it.  When we released Subtext 1.5 back in June I said,

The next version of Subtext is code named Daedelus. It will simply be a straight upgrade to ASP.NET 2.0. We hope for a quick turnaround as we don’t plan to add a lot of features in this iteration. We just want to get up and running in ASP.NET 2.0.

Well that was then and this is now and I was wrong about the quick turnaround.

We realized that a straight port to ASP.NET 2.0 wasn’t much fun if we couldn’t take advantage of some of the new goodies that ASP.NET 2.0 has.  So we spent a significant amount of time cleaning up code and refactoring some functionality to take advantage of what ASP.NET 2.0 offers.  A lot of the changes are under the hood, but there are still a few surface level treats for everybody.

Before you upgrade to 1.9, please check out my notes on upgrading.

So what is new in Subtext 1.9 besides that it is now an ASP.NET 2.0 application?

Under The Hood

Let’s not kid ourselves. 99.999% (Five nines baby!) of my readers are geeks and we want to pop open the hood and take a look around.

  1. Subtext Providers have been refactored to use the Microsoft base Provider class, System.Configuration.Provider.ProviderBase.
  2. Used Generics where appropriate.  As you know, there is a lot of temptation when given a new hammer to start looking at everything as a nail.  We tried to avoid that temptation and make judicious use of Generics.  I think we did a bang up job.  Most of our collection classes are now generic collections and there’s that CollectionBook class I wrote about recently.
  3. Improved our Continuous Integration and build process using CruiseControl.NET.  We now have a nice dashboard that provides a lot of visibility into our development progress.
  4. Improved our unit test code coverage to 36.4% and counting. (When I started it was pretty much 0)
  5. Subtext now runs under Medium Trust without problems except for the Trackback/Pingback issue.
  6. Converted the Subtext.Web project into a Web Application Project.
  7. Added a _System folder to the Skins directory. This contains some CSS files that any skin may reference which provide some common CSS layout and styles.  For example, by referencing commonstyle.css, you can use the pullout css class to pullout some text.  Custom skins can reference these files and override specific settings, putting the Cascading back in CSS.

Try out the new “pullout” or “pullout.alt” CSS classes.

New Features

Some new features we added.

  1. Sometimes removing code is as much a feature as adding code.  As I announced earlier, we removed some old skins and added some snazzy new ones.  We also implemented a way for custom skin definitions to not get overwritten when upgrading code.
  2. Improvements to the packaged skins.  We added the Previous/Next control to nearly every skin as well as Gravatar support among other minor improvements.
  3. Comment Moderation!  This high demand feature was fast-tracked when my company was hired to implement it for a client who wished to start a blog. The client agreed to contribute the source back to the project!
  4. Not exactly a new feature, but we changed the default Html Editor to use the FCKEditor.
  5. Implemented RSD (Really Simple Discovery) and the MetaWeblog API’s newMediaObject method so that Subtext works quite well with Windows Live Writer.

Bug Fixes

There are probably too many to list, but I’ll point out a few that people may have noticed.

  1. [1524172] The Username is being saved as the title.
  2. [1524371] Non-English comments do not appear correctly in mail message.
  3. [1521317] Installation check code fails in locked down scenarios.
  4. [1519764] Skin selection not retained.

Important Note

Subtext ships with ReverseDOS spam blocker enabled out of the box. Please check the ReverseDOS.config file to make sure that it is not filtering any terms that would be relevant to comments in your blog. You can also disable ReverseDOS by removing any reference to it from Web.config should you so desire.

Plans For The Future

We are now gearing up for Subtext 2.0 “Poseidon”, our next major release, which will feature a plugin framework.  Our hope is to foster a community of plugin contributions.  Other features in the works include a Membership provider which will allow multiple authors for a single blog and a new skinning framework. I will update the Roadmap soon to reflect our current plans for 2.0.

Also, with the recent deluge in comment spam, I am considering having an interim release (1.9.1) that would include Akismet as well as semi-moderation (1.9 does include full moderation now). Ideally we would save these features until we have a plugin framework, as they seem like great candidates for plugins. However the communal benefit of blocking spam may be too great to wait.

Many thanks to the growing numbers of Subtext contributors who helped shape and test this release.  All your efforts, whether it is coding, submitting patches, testing, reporting bugs, requesting features, or just giving us a piece of your mind are appreciated!

And before I forget, as I tend to do, the link to the latest release is here.


I am still continuing my experiment in running as a LUA (aka Non-Admin).  Let me tell you, it has been a total pain in the ass and now I totally understand why more developers do not do this, which feeds into the vicious cycle in which apps are developed that do not run well under least user privileges.  When I have some time, I will write up my experiences.

One tool that has been invaluable in this experiment is the MakeMeAdmin batch file, used to temporarily elevate your privileges in a command window.  This has worked nicely for me for a while.


Then Scott Hanselman points out Console, which wraps cmd.exe and adds transparency and tabs.  Just pure geek hotness that I gotta have.

However, the only command shell I normally keep open is my MakeMeAdmin shell.  It’d be a shame to install Console and never see its sleek hotness.  So I decided to play matchmaker and see if I could marry these two wonderful utilities.

I modified the MakeMeAdmin.bat file to use Console instead.  It was a one-line change (note: file paths should be changed to fit your setup, and the line break shown is only for formatting purposes; there shouldn’t be an actual line break).

set _Prog_="console.exe c:\console_admin.xml 
    -t """*** %* As Admin ***""""

I also created a new admin config file named console_admin.xml that specifies transparency and a red tint which lets me know that this console window is not like the others. It will run commands as an admin.

I’ve uploaded my modified MakeMeAdmin.bat file as well as the console configuration file to my company’s tool site here.  Hopefully all five of you out there also running as a non-admin will find this useful.

tags: LUA, Least User Account, NonAdmin


Let me start off by noting that Subtext 1.9 requires ASP.NET 2.0!  Thus the upgrade process from a prior version of Subtext (all of which run on ASP.NET 1.1) will not be quite as simple as before, but it should hopefully not be overly complicated, as is the spirit of Subtext.

For Users Who Have Control Of Their Hosting Server

Users who host on their own server, or have Remote Desktop access to their hosting server, will have an easier time with the upgrade.  My recommendation is to simply set up a new folder with the new version of Subtext, copy in your modifications, and then switch IIS over to ASP.NET 2.0, pointing it to the new directory.  The following is a detailed step-by-step procedure.

  1. Back up your database. This should go without saying.
  2. While you are in Enterprise Manager, make sure the database user that your blog uses to access the database has DBO permissions temporarily.  This is required so that the web-based upgrade procedure will work.
  3. Make sure you can login to the HostAdmin section.  On most blogs this would be the /HostAdmin/ directory of the site. For example, on my blog the HostAdmin is located at  If you forget your HostAdmin password, there is a query you can run in Query Analyzer to reset your password at the bottom of this page.
  4. Download and unzip the Subtext 1.9 binaries into a new directory parallel to your current installation.  For example, on my server I host my blog in the d:\Websites\\ directory.  When upgrading to Subtext 1.9, I unzipped the distribution to the following path d:\Websites\\.
  5. Merge any customizations from your old web.config file into the new web.config file.  Be sure to note that some settings have moved. For example, the connection string has been moved into the <connectionStrings> section.  Also take a look at any new settings you may be interested in.
  6. Copy all your images, videos, audio files and any other non-Subtext files and customizations into the appropriate place in the new directory. For example, I copied the images folder, as well as my own Demos folder which contained some demo code on my site, into the new folder.
  7. Now in IIS Manager, configure your existing site to use ASP.NET 2.0 and point it to your new directory.  For details, see the section at the bottom of this post.  You may need to change the Application Pool your site runs in if you are running Windows 2003.
  8. Visit your website and follow the instructions. At this point, the normal web-based upgrade wizard should kick in, asking you to login to the HostAdmin tool and click the upgrade button.  This will upgrade your database schema and stored procedures.
  9. Make sure to reverse the change you made in step 2!  Subtext does not require DBO permissions for normal operations. The user that Subtext uses to connect to your database should just be in the public group.

For Users With Hosted Solutions Such as WebHost4Life

Unfortunately I am not familiar with the procedure that the various hosting providers use to upgrade a site from ASP.NET 1.1 to ASP.NET 2.0.  If the upgrade happens on the same machine that your site is currently hosted in, the upgrade may bring down your site for a short bit.  You may have to coordinate the above steps with a technician at your hosting company, except for the following changes.

  • Step 4: Download and unzip the Subtext 1.9 binaries to your local machine.
  • Step 6: Have your hosting support technician upgrade your site to ASP.NET 2.0
  • Step 7:  Copy your local files over to your hosting provider.

Configuring IIS for ASP.NET 2.0

To configure a website in IIS for ASP.NET 2.0, right click on the website in the IIS Manager tool and select properties. Click on the ASP.NET tab in the dialog box. It should look something like this…

ASP.NET Version

Make sure to select 2.0.50727 in the ASP.NET version dropdown.

On Windows 2003, I created a separate Application Pool for my ASP.NET 2.0 websites.  To select the Application Pool for a website, click on the Virtual Directory tab and select the Application Pool in the dropdown at the bottom of the dialog, as in the following screenshot.

App Pool

Good luck and smooth sailing!


Scott Hanselman sets the geek-o-sphere abuzz with his latest (and apparently now annual) Ultimate Developer and Power Users Tool List for Windows.  The publishing of this list usually coincides with a productivity drop for me as I find many new toys to play with.  Unfortunately, many tools don’t work so well when running as a non-admin.

This year I was pleased to find my name and my humble little blog on his list.  Quite pleased in fact until it struck me. 

Wait one doggone minute!

Is Scott calling me a tool?  An ultimate tool no less.  We’ll see about that!


Scott writes about making DasBlog work on Mobile Devices.  The approach he takes is to programmatically detect that the device is a mobile device and then present an optimized TinyHTML (his term) theme.

Ideally though, wouldn’t it be nice to have mobile versions of every theme?  In fact, I thought this could be handled without any code at all via CSS media types.

Unfortunately (or is that fortunately) I don’t own a BlackBerry or any such mobile device with a web browser, so I can’t test this, but in theory, another approach would be to declare a CSS file specifically for mobile devices like so:

<link rel="stylesheet" href="mobile.css" type="text/css" 
    media="handheld" />

The mobile browser should use this CSS to render its screen while a regular browser would ignore this.  Should being the operative word here.  Unfortunately, at least for Scott’s Blackberry, it doesn’t.  He told me he does include a mobile stylesheet declaration and the BlackBerry doesn’t pick it up.  Does anyone know which devices, if any, do support this attribute?

For those devices that do, a skin in Subtext can be made mobile ready by specifying the media attribute in the Style element of Skin.config like so (note this feature is available in Subtext 1.5).

<Style href="mobile.css" media="handheld" />

Refer to my recent overview of Subtext skinning to see the media attribute in play for printable views, which does seem to work for IE and Firefox.


Update: Rob renamed his project to Subsonic.

Rob Conery just released ASP.NET ActionPack 1.0.1 on his blog today.  This project is definitely one to watch!  He is essentially taking some of the principles of developing web apps with Ruby on Rails and porting those ideas to ASP.NET.  Just watch this great screencast to get a taste of the progress he has made in a short time.

So far I am very impressed with this guy.  Yesterday evening I sent the link to the screencast to Jon Galloway, who then wondered why he was using strings for table names.  I told him to quit bothering me about it and post something in the CodePlex forum.  But Jon, being the simultaneous type of guy he is, had already posted a comment on Rob’s blog before I could finish my sentence.  This all happened last night.  This morning I noticed the sixth bullet point in Rob’s announcement states that he added structs in classes for column names.  Apparently he had received the comment, made the change, and sent a reply to Jon in two hours.

Now that is a quick turnaround and good customer service! ;)

Not only that, but this guy lives in Kauai, Hawaii! I don’t know how he gets anything done unless it’s the rainy season right now. Subtext would definitely languish if I lived in Kauai.

comments suggest edit

Lately my blog has been hit with a torrential downpour of comment spam.  I’ve been able to fight much of it off with some creative regular expressions in my ReverseDOS configuration file.  Of course keyword filtering, even Bayesian filtering, can only go so far.  We need to supplement these approaches with something else.

But first, in order to combat spam, we need to identify the enemy.  Are we fighting against automated bots relentlessly crawling the web and posting comments?  Or are these low-paid humans behind the keyboards?  Are they attacking via the Comment API or posting to an HTML form?

My assumption has been that these are bots, but I plan to add some diagnostics to my blog to test that assumption someday soon.  Let’s run with the assumption that the bulk of comment spam is generated by bots.  In that case, we need to examine the behavioral differences between bots and humans for clues on how to combat spam.

For example, an automated script can pretty much post a spam comment instantaneously.  What if your blog engine timed the interval between sending out the content and receiving a comment?  If the comment came back too quickly, then we have high confidence that it is spam.

Certainly this is easily defeated by a spammer by adding a delay, but an artificial delay is costly to an automated script trying to hit the most blogs possible in the shortest amount of time.  Anything to slow down the spammers is worthwhile.
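The timing check described above could be sketched like so (illustrative Python, not Subtext’s actual implementation; the three-second threshold is an arbitrary assumption, and a real deployment would also sign the timestamp so the bot can’t forge it):

```python
import time

# Minimum seconds a human plausibly needs to read a post and write a comment.
MIN_COMMENT_SECONDS = 3

def looks_like_bot(form_issued_at, comment_received_at,
                   min_seconds=MIN_COMMENT_SECONDS):
    """Flag a comment as likely spam if it arrived implausibly fast.

    form_issued_at: timestamp embedded in the comment form when it was served
    comment_received_at: timestamp when the comment was posted back
    """
    elapsed = comment_received_at - form_issued_at
    return elapsed < min_seconds

issued = time.time()
# A bot posting back half a second later is flagged...
print(looks_like_bot(issued, issued + 0.5))  # True
# ...while a human taking a minute is not.
print(looks_like_bot(issued, issued + 60))   # False
```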

Another potential approach is to require JavaScript to comment.  Perhaps your comment form doesn’t even exist until some JavaScript inserts it into the page.  The theory behind this approach is that most automated scripts won’t evaluate JavaScript; they simply want to post to some form fields.  Unfortunately this hinders the accessibility of your site for users who turn off JavaScript, but it may be worth the price.  Spammers will eventually figure this one out too, but it does add a nice computational cost to implement JavaScript handling in an automated spambot.
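One way to implement the require-JavaScript idea is for the server to embed a one-time token in a script block; a tiny piece of JavaScript copies it into a hidden form field, and bots that never execute the script submit the form without it. Here is a server-side sketch of issuing and verifying such a token (illustrative Python; the HMAC scheme and secret are my own assumptions, not anything Subtext does):

```python
import hashlib
import hmac
import secrets

SECRET_KEY = b"replace-with-a-real-secret"  # assumption: kept server-side only

def make_challenge():
    """Return a nonce to embed in the form and the token the page's
    JavaScript must copy into a hidden field before submitting."""
    nonce = secrets.token_hex(8)
    token = hmac.new(SECRET_KEY, nonce.encode(), hashlib.sha256).hexdigest()
    return nonce, token

def verify_challenge(nonce, submitted_token):
    """A comment is only accepted if the hidden field round-tripped intact."""
    expected = hmac.new(SECRET_KEY, nonce.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, submitted_token or "")

nonce, token = make_challenge()
print(verify_challenge(nonce, token))  # True: the script ran and copied the token
print(verify_challenge(nonce, ""))     # False: no JavaScript, no token
```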

Ultimately, these approaches are more about the behavior of the spammer than the content.  For example, when I first started working on Subtext, I added two features that at the time blocked a significant amount of spam for me.  The first was to not allow duplicate comments.  I found that a lot of comment spam simply posted the same thing over and over.

The second feature was to require a delay between comment spam originating from the same IP address.  Using a sliding timeout of only two minutes seemed to defuse spam bombs which would try to post a large number of comments in a short period of time.
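Those two features could be sketched together like this (illustrative Python, not the actual Subtext code; the two-minute window matches the sliding timeout described above):

```python
import time

class CommentGate:
    """Sketch of the two checks above: reject exact duplicate comments,
    and enforce a sliding delay between comments from the same IP."""

    def __init__(self, delay_seconds=120):
        self.delay_seconds = delay_seconds
        self.seen_bodies = set()
        self.last_post_by_ip = {}

    def allow(self, ip, body, now=None):
        now = time.time() if now is None else now
        # Block exact duplicates outright.
        if body in self.seen_bodies:
            return False
        # Sliding timeout: every attempt resets the clock for that IP,
        # which defuses rapid-fire spam bombs from a single address.
        last = self.last_post_by_ip.get(ip)
        self.last_post_by_ip[ip] = now
        if last is not None and now - last < self.delay_seconds:
            return False
        self.seen_bodies.add(body)
        return True

gate = CommentGate()
print(gate.allow("1.2.3.4", "Nice site!", now=0))    # True: first comment
print(gate.allow("1.2.3.4", "Another one", now=30))  # False: too soon, clock resets
print(gate.allow("5.6.7.8", "Nice site!", now=500))  # False: exact duplicate
```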

Later, I added ReverseDOS to help catch the spam that made it through these approaches.  Over time, I’ve noticed that comment spam starts to look more and more like legitimate messages, like the current crop of “Nice Site!” spam. 

The one thing that every comment spam has in common is a link.  Ultimately, the only way to stop comment spam via a content-based approach is to simply not allow any comment that contains a link in any way, shape, or form.  But how awful would that be for the many legitimate commenters who wish to share a link?
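A softer content-based compromise than banning links outright would be to count them and only hold link-heavy comments for moderation. A rough sketch (illustrative Python; the threshold of two links is an arbitrary assumption):

```python
import re

# Matches bare http/https URLs and the start of HTML anchor tags.
LINK_PATTERN = re.compile(r'https?://|<a\s', re.IGNORECASE)

def needs_moderation(comment, max_links=2):
    """Hold comments containing more than max_links links for review,
    instead of rejecting every comment that contains a link."""
    return len(LINK_PATTERN.findall(comment)) > max_links

print(needs_moderation("Nice post! See http://example.com for more."))      # False
print(needs_moderation("http://a.com http://b.com http://c.com buy now!"))  # True
```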

No, we must do something better. I currently don’t think we’ll ever win the battle, but we can work to stay one step ahead.

comments suggest edit

The other tactic I neglected to mention in my previous post on combating comment spam is more big-picture.  How do we remove the incentive for spammers to comment spam in the first place?

Apparently the rel=”nofollow” approach has done little to curb comment spammers despite many predictions (including my own).  I still think it is an important step in removing one incentive, but what else can be done to remove this incentive?

With the lack of results from the rel=”nofollow” approach, the lesson we learn is that either the incentive for comment spam isn’t necessarily Google rankings, or that there are enough unpatched blogs out in the wild that posting comments indiscriminately still helps Google rank.  Or both.

If a spam commenter can put a link in the comments of several thousand blogs, then certainly that translates to tens to hundreds of thousands of eyeballs on that link, and maybe a few hundred clickthroughs (yes, I’m pulling these numbers out of my rear).  When someone clicks through, the spammer gets paid a small amount from the owner of the site.

Warning, here is where I go off the deep end in brainstorming solutions.  Forgive my naivete.

What if the marketers who pay for these links to be spread around found out that comment spammers were creating negative feelings for their products by posting comments on sites that are vehemently against having these advertisements?  Would they care?  Would they be interested in not paying for click-throughs from sites that have specifically opted out of such links?

I’m probably dreaming here, but stay with me for a moment as I flesh out a quick thought experiment.  Suppose these sites did care.  One option is for them to not pay for links that originate from sites that specifically opt-out of comment advertising.  For example, by registering with some central opt-out site.

Another approach would be for sites that receive click-throughs to initiate a trackback-like mechanism in which they request a comment spam policy from the blog.  If the blog does not explicitly endorse their product, the link does not get paid.

Of course the big flaw in this experiment is that these sites probably do not care and wouldn’t go to the trouble to implement these approaches to being a good citizen.  They just want the links to come in.  Even negative publicity is good publicity.  So what can we do? Is there a way to make them care? Is there a way to make comment spam less lucrative?

comments suggest edit


Another blog linked to this post and mentioned watching the video up to the slow motion practice session.  Be sure to keep watching past it to see the woman juggling a soccer ball, while playing double dutch, with a flaming soccer ball and jump rope.  Ronaldinho never did that!