comments edit

The wife and I are headed up to Seattle for Presidents' Day weekend to visit some friends.

Yeah, I know. Why Seattle At This Time Of Year!?

Well with her being all preggers and all we didn’t want to go somewhere too distant (like Tahiti). Not only that, a little cold weather might be a nice change of pace. I mean you can only take so many consecutive days of blue skies and sunshine.

I'll also be swinging by the Microsoft Campus on Tuesday to have lunch with my compadres Adam Kinney and Erik Porter. My hope is to grovel and beg for them to sneak me through the back door of Mix. It probably won't work, but I gotta give it a try, no?

I plan to show up a little earlier to see another old friend from college who works at Microsoft and, believe it or not, is not a developer! She’s in Finance doing something or other.

Hopefully I’ll run into some other Microsoftie characters I know!

comments edit

Steve Harman just announced the release of Subtext version 1.9.4 Windward Edition. This one comes with a lot of bug fixes, so be sure to upgrade.

Just so you know how things work, I add bugs to Subtext, and Steve and the other developers fix the bugs. It's a rather efficient ecosystem and is working quite well for us. It keeps everyone on their toes.

Steve’s post has the full list of bug fixes and such. The most interesting addition is that we’ve implemented a Google Sitemap, which was submitted as a patch, if I remember correctly.

This release is notable because of the increased number of patch submissions. I greatly appreciate the contributions by all the new contributors along with the stalwarts. Working on Subtext is a joy because of these people.

As for Subtext 2.0, progress has been slow but steady. One challenge we're dealing with is how to cleanly handle the following two multi-blog scenarios, as summarized by Simo on our mailing list.

1. The multi-blog site is a community site: users registered with one blog are already registered for all blogs.

2. The multi-blog site is just a host for many different, unrelated blogs: here all users are different, and even if two blogs are on the same system, the user shouldn't know about that.

We can pretty much already handle case 1, which is nice because it means the same user can be the owner of more than one blog, rather than requiring a user account per blog. This is useful for me personally as I host three blogs on a single installation, and right now, the admin is technically three different accounts.

Strictly speaking, this implementation makes implementing the isolation requested in case 2 difficult, because usernames must be unique. So if there’s ever a name conflict between two users attempting to register with the same username, the two blogs will affect each other and not be completely isolated.

However, if we make the user’s email address their login, we can implement case 2 in spirit. First of all, there won’t be two different users using the same email address, so the naming conflict issue is resolved.

Secondly, if you wish to register for blog 2, why should we make you fill in your information again if we already know who you are? We should simply present a message that says: hey, we know who you are; if you want to register for this site, click here and we'll auto-register you.

That way, the user is in control over which blogs get to see their registration information, yet they only have one user account in the system. Certainly there are some improvements we can make later; for example, some blogs may want more information than others to register.

But for now, we only ask for the minimal amount of information and will keep things simple and consistent across the board.
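
Just to make the flow concrete, here's a rough sketch of what the auto-registration check might look like. The types and method names (User, userRepository, ConfirmAutoRegistration, and so on) are purely illustrative; this is not Subtext's actual API.

// Hypothetical sketch only; none of these types or methods are real Subtext APIs.
public void RegisterForBlog(string email, int blogId)
{
    User existing = userRepository.FindByEmail(email);

    if (existing == null)
    {
        // Brand new to the system: collect the minimal registration info.
        RedirectToNewUserForm(blogId);
        return;
    }

    // Known user, new blog: a single confirmation associates the existing account
    // with this blog, so the user stays in control of which blogs see their info.
    if (ConfirmAutoRegistration(existing, blogId))
    {
        userRepository.AssociateWithBlog(existing, blogId);
    }
}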

In any case, enough blabbing. If you want to download the latest release, you can find it on our SourceForge site or just use this [DOWNLOAD] link.

comments edit

UPDATE: How could I have overlooked the plugin Keyvan Nayyeri wrote two months ago? Sorry buddy!

You know what would go well with my XFN Highlighter Script like Kimchi goes with Bulkogi? How about a plugin for inserting proper XFN hyperlinks?

Scott Dornan (and Keyvan) delivers exactly that with his Insert XFN Hyperlink plugin for Windows Live Writer.

Apparently he was introduced to the concept of XFN after reading a post on my blog.

Yes, you have me to thank for this fine plugin (not to mention Scott who actually did all the work, while dealing with a-holes like me trying to take the credit).

It’s not yet available in the Windows Live Gallery, but will hopefully be there soon.

In the meantime, we'll have to beg for a direct download. Wink

comments edit

It looks like several items in my feed were reposted. This is due to a bug in Subtext when editing an existing post using Windows Live Writer. In any case, we'll have a fix available in the next release of Subtext.

comments edit

I received a call from a nice lady asking to speak to someone who could confirm employment at VelocIT.

Me: Sure, I can do that.

Lady: What’s your name and title?

Me: I’m Phil Haack, the CTO.

Lady: Is there someone else I can speak with?

Me: Uhhh, why? Is there a problem? I am an owner and manager of the company. Certainly I have authority to confirm an employment.

Lady: I can’t have you confirm your own employment.

Me: Oh! This is for me!?

Man, I’d be in big trouble if I were a one-man shop.

comments edit

IBM Second Life Offices (from http://www.sutor.com/newsite/blog-open/?p=1382)

Micah Dylan, co-founder and CEO of VelocIT (you know, the place I work for my day job), has recently discovered Second Crack Life. He's been hanging out with some big wig at IBM in charge of open source and open standards. And you know how I loves me that open source.

Apparently IBM is looking at ways to build internal community among a distributed workforce and exploring whether or not Second Life fits into that. Obviously, this holds appeal to VelocIT because we’re a distributed company.

It's hard not to find Second Life intriguing if you've ever read Snow Crash, but I'm tentatively skeptical of its use as a day-to-day communication tool. I'm just not ready to have a processor-intensive 3D environment running in a window while I'm trying to get my coding on.

However, I do see its appeal for having short fun meetings. Jon Galloway and I have hung out in Second Life for a short period before we realized what a time sink it could end up being.

I think Second Life could be a great tool for Open Source teams to hold meetings. In fact (and you heard it here first), when we release Subtext 2.0, we’re going to have our release party in Second Life.

comments edit

On Tuesday night I met Rob Conery, aka Mr. SubSonic, aka Mr. Commerce Starter Kit. The man is shifty and goes by many aliases. He was in town visiting family and was able to pull away for a night to kick it.

From time to time, I’ve met people in person at various conferences who I only knew via their blogs. Usually, because you have no idea what to expect, those meetings can be full of surprises.

You’re nothing like how I pictured you in my head. I thought you’d be taller, have hair, and be more female.

Since I've chatted with Rob on Skype before, there were no surprises. Except that he's ginormously tall. I literally almost blurted out, "You're a freakin' giant!" but was able to hold my tongue.

Jon Galloway joined us after a harrowing drive up the rain-soaked freeway from San Diego and we headed for Hurry Curry of Tokyo on Sawtelle for some delicious Japanese-style curry. After a couple of bottles of sake and Sapporo beer, we headed over to The Mór Bar for $2 Tuesdays with Moontribe.

The Mór Bar is a sweet spot to hang out, get a drink, talk, and listen to good music. They have a nice arrangement of plush couches and benches in a U shape at the front of the bar on a deck, which evokes a Lawrence of Arabia feel.

We settled in to geek out, and Rob's a fun guy to have a conversation with, just brimming with great ideas and interesting stories. Shortly into it, Rob remarked on my total ADD (Attention Deficit Disorder). What the heck do you expect when you buy me a Maker's Mark for the first round of drinks? Yeah, I get easily distracted after multiple shots of sake and beer chased by a strong whiskey. What can I say?

So there you have it: Rob's tall, Jon's bald, and I have ADD. In any case, Rob, if you read this, I hope you got your credit card back. Good times. Good times.

comments edit

Jeff Atwood writes a great post on how to become a better programmer by not programming. For the most part, I totally agree with his premise that to become a better programmer, you need to…

Learn about your users. Learn about the industry. Learn about your business.

Understanding the big picture is absolutely essential to being a good developer, especially when you're in the 99^th^ percentile. The effort to get to the 99.5^th^ is very great and provides diminishing returns.

However, most developers are not in the 99^th^ or even 90^th^ percentile (by definition), and thus still have room for improvement in programming ability, along with the important skills just mentioned.

I am not convinced by the idea that developers are either born with it or they are not. Where's the empirical evidence to support these types of claims? Can a programmer move from, say, the 50th to the 90th percentile?

Jeff points out that study after study shows there is no relationship between a programmer’s amount of experience and code quality or productivity. I don’t doubt that for a second. I’ve worked with developers who have 10, 15, 20 years in the industry and couldn’t code a virtual rat through a maze consisting of two parallel lines.

But recent research points out that the belief in innate talent is “lacking in hard evidence to substantiate it” as well. I wrote about this topic recently in my post, The Question Of Innate Talent.

So how do I reconcile these seemingly contradictory statements?

Well, going back to the Scientific American article, The Expert Mind, we get a clue.

Ericsson argues that what matters is not experience per se but “effortful study,” which entails continually tackling challenges that lie just beyond one’s competence. That is why it is possible for enthusiasts to spend tens of thousands of hours playing chess or golf or a musical instrument without ever advancing beyond the amateur level and why a properly trained student can overtake them in a relatively short time. It is interesting to note that time spent playing chess, even in tournaments, appears to contribute less than such study to a player’s progress; the main training value of such games is to point up weaknesses for future study.

So what we learn from this research is that experience per se does not determine a person's performance, which is exactly what the studies cited by Atwood support. However, what does have an impact is effortful study.

This makes a ton of sense to me. Experience in any field, especially software development, often means solving similar problems over and over again using the same techniques as before. There is no way that contributes to becoming a better programmer.

Sure, an experienced developer has to learn new technologies to stay relevant, but if that developer applies the new technologies in the same way over and over again, the developer has not improved.

I’m currently looking at some legacy C# code written in a procedural style. The developer can write C# code, but hasn’t taken the time to learn to recognize when object oriented patterns would help solve a problem more elegantly. Thus his experience has not made him a better programmer.

However, with focused effortful study and training, a programmer can lift him or herself out of mediocrity. It’s not by programming more one gets better, but by programming better. Even better is to program more better (as in more often and better).

Keep in mind, though, that this takes nothing away from Atwood's main point. My point is that most developers (by definition) are not in the 90th percentile, the point at which effortful study begins to yield diminishing returns. Most developers still have room for improvement in their coding skills as well as the ever-important tangential skills to which Atwood refers.

In fact, I believe that for many developers, the tangential skills will distinguish and increase the value of a developer faster than improving programming skills. But don’t toss out that book on Design Patterns just yet.

comments edit

Steve Harman digs in and solves a longtime mystery for me regarding VPN connections and default gateways on remote networks.

Most clients we have require that we connect to their network via VPN. Nothing new about that of course. But some clients require that we check the Use default gateway on remote network option when setting up the VPN.

That effectively shuts down my ability to access resources on my local network as all traffic gets routed through the remote network.

VPN Dialog. Shows the Use Default Gateway on Remote Network checkbox.

Fortunately, Steve didn’t give up like I did. He persisted and, following up on a tip by Jon Galloway, figured out how to configure Routing Tables to achieve what he needed.

This so beats my twine-and-wire MacGyver solution of simply setting up Remote Desktop to another machine that is then connected to the VPN.

With persistence and problem solving skills like that, a company would be lucky to hire Steve. So we did, this past month!

blogging comments edit

If you've read my blog for any length of time, you know I tend to go on and on about the virtues of blogging and participating in Open Source projects.

You might even start to suspect that I think we could end wars, poverty, and hunger and sit around singing Kumbaya together in harmony if only everyone would blog and participate in Open Source.

Really now. I’m not that naïve. I’m sure we could pick a better song to sing around the campfire.

All kidding aside, I really have put my money where my mouth is.

In the past, I've talked about the challenges of hiring, as well as my belief that blogs are a great means to connect with good developers.

That’s how I met and hired Jon Galloway who is a tremendous technical leader, developer, and business partner.

I also think that judging potential hires on open source contributions (as 37Signals suggests) is a great way to find good developers, though I’m not so inclined to be as extreme as they are and only hire Open Source developers.

But rather than just talking about hiring Open Source developers, we recently hired Steve Harman. Steve was a Java developer at a large financial institution when he started contributing to Subtext so that he could cut his teeth with C# and .NET.

Over time, he really took on a lot of responsibility and impressed me with both his good judgment and his work ethic. By actually working with him on a project and seeing the quality of his code, I got a really good sense of him as a developer and potential coworker that is impossible to get from a three-hour or even three-week interview.

I've been responsible for hiring as a development manager at three companies, with varying degrees of success. It turns out that my best hiring has been at VelocIT.

Hiring is full of landmines. I’ve hired people who were great in interviews, but ended up not being able to code their way out of a Hello World program.

That is why I’m such a firm believer in the power of blogs and open source contributions to filter out the true gems among the lumps of coal.

Of course, another great way to hire good people, though it draws upon a smaller pool of talent, is to hire the best people you've worked with in the past. A while back we hired Pat Gannon, who is a fantastic software developer. The only reason he doesn't get mentioned much here is that he doesn't have a blog, and you know how I feel about blogs.

Maybe if we didn't keep him so busy building systems, he'd have time to write a blog post or two.

comments edit

This is amusing. I tried to view a post on Slashdot and got the following error page.

Banned from Slashdot message

I have no idea why that would be the case. I think I have RSS Bandit configured to download the RSS feed every hour, which hardly qualifies as a DDoS attack. Interesting.

comments edit

Image of the sun from http://www.noaanews.noaa.gov/stories2005/s2372.htm

Today I turn 32 years old.

I remember when that sounded old to me. But with recent advances in health care and life expectancy, songs like Jay-Z’s new 30 Something (the chorus states “30 is the new 20”) remind me I’m still in my prime.

If only someone would tell my knees that.

My wife threw a nice party for me and some friends on Saturday. We had a huge Korean food spread. In true Korean style, it was way too much food, and I've been eating Korean food all week, not that I'm complaining or anything. We bought some marinated ribs (Kalbee) in Koreatown and grilled them up proper on my friend's grill.

Tonight will be low key. I’ll probably grab some sushi with the wife, and later hang out with Jeff Atwood who happens to be in town.

One of the new toys I received from my family is a Linksys CIT200 Skype phone. It’s not a wifi phone, but rather a cordless phone. It’ll allow me to roam around the house on those long Skype conference calls. So far, I think it’s pretty nice.

tech, code comments edit

Fingerprint via public domain clipart

How do you uniquely identify a person without divulging the identity of that person? For example, given a set of personal artifacts, how would I arrange the set of artifacts grouped by the person to whom they belonged?

The answer is quite easy, isn’t it (especially given the title of this blog post and the image to the right)? You can look at the fingerprints on the items.

Unless you happen to have a file that maps the fingerprints to individuals, you won't know who the comb and mirror belong to, for example, only that they belong to the same person and not to the person who owns the scissors.

The analogous structure in the world of information theory and computer science is the digital fingerprint, often created via a hash function. MD5 and SHA1 are two of the most commonly used hash functions.

A hash function is a one-way algorithm for converting data into a number which, for the most part, can be used to identify the data without revealing it. This is why it is common to hash passwords before storing them in a database.

In order to authenticate a user, I don't need to know the user's password; I just need to know that the hash I've stored corresponds to the hash of the password the user typed.
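
As a concrete (if simplified) sketch in C#, here's one way to compute a hex-encoded SHA1 hash of a string; a real password scheme would also add a salt, but that's beside the point here:

using System;
using System.Security.Cryptography;
using System.Text;

public static class HashDemo
{
    // Returns a lowercase hex-encoded SHA1 hash of the input string.
    public static string ComputeSha1(string input)
    {
        byte[] data = Encoding.UTF8.GetBytes(input);
        byte[] hash = SHA1.Create().ComputeHash(data);
        return BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant();
    }
}

At registration you store ComputeSha1(password); at login you compare that stored value against the hash of whatever the user just typed.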

Don Park recently invented a system for representing IP addresses without divulging the actual IP address via a system of glyphs, which he calls Identicons. This serves as a nice means of identifying commenters on a blog without divulging their actual IP addresses, which could be a privacy concern. The following image shows some examples of identicons.

Some Identicon glyphs

What’s interesting to me about identicons is that they have wider uses beyond representing IP addresses. As Don states,

I think identicons have many use cases. One use is embedding them in wiki pages to identify authors. Another is using them in CRM to identify customers. I can go on and on. It’s not just about IP addresses but information that tends to move in ’herds’.

One way to look at identicons is that they are a visual representation of a hash value. They sort of add even more weight to the fingerprint analogy by being visual like a real fingerprint.

For example, for security reasons, many free software providers provide an MD5 checksum of their software binaries. You can see an example of this from the PuTTY download page.

Putty Download Page. Shows links to MD5 checksums.

The next screenshot shows some actual hash values of exes.

Hash values

Looking at the first listing, we can see that for pageant.exe, the hash value is 01d89c3cbf5a9fd2606ba5fe3b59a982, kind of a mouthful, right? Another way that could be represented is via an Identicon, which would be more readily recognizable.

Of course, in this situation, the security-minded person would use an automated MD5 checksum checker rather than manually confirming the binary. But do you trust your MD5 checksum checker? A quick visual confirmation would be a nice additional vote of confidence in this scenario.
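
If you do want to check one by hand, computing the checksum yourself takes only a few lines of C#; this is a rough sketch and the file path is made up for the example:

using System;
using System.IO;
using System.Security.Cryptography;

public static class ChecksumDemo
{
    public static void Main()
    {
        // Hash the downloaded file and print the result so it can be compared
        // against the checksum published on the download page.
        using (FileStream stream = File.OpenRead(@"C:\Downloads\pageant.exe"))
        {
            byte[] hash = MD5.Create().ComputeHash(stream);
            Console.WriteLine(BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant());
        }
    }
}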

If you’re interested in playing with Identicons in .NET, I recommend taking a look at this .NET port of Don Park’s implementation written by Jeff Atwood and Jon Galloway.

I had the pleasure of reviewing the code with Jeff and he’s quite proud of his caching optimization. Rather than caching the Bitmap object, he caches the PNG output as a MemoryStream instance. That ends up saving a ton of memory.

code comments edit

Yesterday I mentioned that Jeff Atwood and Jon Galloway wrote an Identicon implementation for .NET. It works beautifully, but they distributed the code as a Web Site Project, which I cannot deploy to my blog as is.

For those of us who prefer Web Application Projects, I repackaged their code as a compiled DLL and a handler file. This distribution will work for Web Application Projects as well as Web Site Projects.

Just download the binaries, copy the DLL to your bin directory, copy the IdenticonHandler.ashx file to your website directory, and you are good to go. You can simply add an image tag that references your identicon handler.

<img src="IdenticonHandler.ashx?hash=hash-of-ip-address" />

[Download Binaries]

You can also grab the Solution I used to prepare the binaries.

[Download Source]

Gravatar Tip!

If you use Gravatar, consider using the identicon handler as the default image. That’s what I did for this website. That way, if the user does not have a gravatar, the identicon will show up instead. Better than nothing!
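
Here's a hedged sketch of what building such an image URL could look like in C#. The default-image parameter name comes from Gravatar's documented API, but double-check their docs; the handler URL is just an example, and ComputeMd5Hex is an assumed helper that returns a lowercase hex MD5 string:

using System.Web;

public static class AvatarUrlBuilder
{
    // Sketch only: show the commenter's gravatar when one exists, and fall back
    // to the identicon handler when it doesn't.
    public static string BuildAvatarUrl(string email, string ipHash)
    {
        string emailHash = ComputeMd5Hex(email.Trim().ToLowerInvariant());
        string fallback = HttpUtility.UrlEncode(
            "http://example.com/IdenticonHandler.ashx?hash=" + ipHash);

        return string.Format(
            "http://www.gravatar.com/avatar/{0}?default={1}", emailHash, fallback);
    }
}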

comments edit

Windows Developer Power Tools cover from Amazon.com

In order to promote the book Windows Developer Power Tools: Turbocharge Windows Development with More Than 140 Free and Open Source Tools, O'Reilly is declaring January 19^th^ (aka today) to be "Windows Developer Tools Day".

I think declaring days will become a bigger trend in 2007. I’ve tried my hand at declaring July 26^th^ to be “Contribute to Open Source Day”. Jeff Atwood declared December 1^st^ to be “Support Your Favorite Small Software Vendor Day”.

Now we can add Windows Developer Tools Day to that geek calendar.

As I mentioned before, I contributed a couple of sections to the book: the section on TortoiseSVN and TortoiseCVS, and the section on Subtext, of course.

I’m genuinely looking forward to receiving my copy of this book for the many useful tools that are presented inside. Not only that, some of our clients are starting to migrate towards using Subversion, for which we’re providing some rudimentary training.

It’ll be helpful to tell them to buy this book and read my section on TortoiseSVN.

I believe the book also covers CruiseControl.NET. CruiseControl.NET has been an extremely valuable tool for Subtext's continuous integration and build process. The next step for some of our clients is to help them devote a server and time to setting up a continuous integration process.

comments edit

We often hear that the current state of software development is still dysfunctional. Scott Rosenberg recently wrote a book to that effect called Dreaming In Code. He takes a look at the question: why is software so hard? According to Scott, software development's history is marred by poor quality, missed deadlines, and cost overruns, primarily due to a persistent dysfunctional culture.

And he's talking about software written by companies that are in the business of writing software. Well, if software written by software companies is so bad, how bad is the software written by hardware companies?

Very bad.

I’m sure there are a few exceptions, but companies that I think should know better write atrociously poor software. And frankly, I’m getting tired of it. Here are some basic principles for hardware makers to keep in mind when writing software for their products.

1. The computer belongs to me, not you!

Did you ever notice that my documents are in a folder named My Documents? Not Your Documents. (In Vista, it's just Documents, but there's still an implicit my in there.)

That means that folder belongs to me. Not you.

So I beg you, stop adding your folders in there. There is a proper location for your stuff: the Application Data directory. Since you're too lazy to understand how Windows works, I'll write some sample code for you in C# that demonstrates where to place your application's files. I'm sure you can figure out the C++ equivalent.

// Requires: using System; using System.IO;
// Resolve the current user's Application Data folder...
string appDataPath = Environment.GetFolderPath(
    Environment.SpecialFolder.ApplicationData);

// ...and create a subdirectory in it for your application's files.
string yourAppDataPath =
    Path.Combine(appDataPath, "YourApplicationName");

if (!Directory.Exists(yourAppDataPath))
{
    Directory.CreateDirectory(yourAppDataPath);
}

Brush up on the Environment.SpecialFolder enumeration. It’ll come in handy.

For user settings and application cache data, you might also consider using Isolated Storage. Just don’t put it in My Documents.
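
For what it's worth, here's a minimal sketch of the Isolated Storage approach for stashing per-user settings; the file name and content are just examples:

using System.IO;
using System.IO.IsolatedStorage;

public static class SettingsStore
{
    // Writes a small settings file into the current user's isolated storage
    // area instead of littering My Documents.
    public static void SaveSettings(string settingsXml)
    {
        using (IsolatedStorageFile store =
            IsolatedStorageFile.GetUserStoreForAssembly())
        using (IsolatedStorageFileStream stream =
            new IsolatedStorageFileStream("settings.xml", FileMode.Create, store))
        using (StreamWriter writer = new StreamWriter(stream))
        {
            writer.Write(settingsXml);
        }
    }
}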

2. Don’t Assume The User Is An Administrator

While I'm ranting on this topic, I should also mention that any data your application needs to write should not go in the Program Files directory. Not everybody runs their machine as an administrator, and better you learn that lesson now than when Vista is widely disseminated.

3. Learn About the Platform And The Services It Provides You.

I recently bought a USB-enabled Uninterruptible Power Supply (UPS) that was well reviewed. It appears to be a great hardware product, but the software is crap.

The point of getting a USB-enabled UPS is that the UPS can shut down Windows gracefully if the power goes out. But rather than integrating cleanly with Windows XP Power Management, they wrote their own ugly little system tray applet. Why not take advantage of what the OS provides?

To me, this example and the earlier one with the Application Data folder reveal a willful ignorance of, and disregard for, writing good software for Windows. It's due to an unwillingness to take the time to learn about the platform.

4. Honor Your Obligations and Promises.

When installing HP drivers and software for my scanner (which sadly doesn’t work with Vista at all), the installation process provides me with a checkbox “Add an HP Share-To-Web icon to the desktop?”.

I responded with my usual response, Hell no!, by unchecking the checkbox. But what happens when the installation is complete? There’s the icon on my desktop. Not only that, the icon is impossible to delete. I mean impossible!

I try to right-click it; there's no delete option. I click Properties, select remove icon from desktop, and click Apply. The dialog hangs, never returning. Fantastic!

Please, give users a choice. And when they make a choice, honor it!

5. We Really Don’t Want Your Cruft

Really. Seriously. We don’t.

It’s simple. If a piece of software is something we really want, we’ll take the time to find it and install it. We don’t need you to install a bunch of crapware alongside your drivers or main application.

If it needs to be pork barreled in order to get it on our machines, we probably don’t want it.

Trust me on this one. The extra endorsement money you get for bundling probably won’t make up for the loss in customer satisfaction.

6. If The Software Sucks, We Think The Hardware Sucks.

Again, quite simple. The majority of user interaction with a piece of hardware is really via the software. If the software is clunky and hard to use, or worse, just flat out fails to work, we associate the failure with the entire product, hardware and all. After all, if a company can't take the time to write quality software, why should we trust in the quality of the hardware?

Conclusion

So that’s it. All I ask is that hardware makers take as much care writing their software as they do building their hardware. Perhaps more care, given how flimsy hardware can be these days.

When the supporting software is good, customers will rave about your products.

comments edit

Looks like Beckham is moving to La La Land. Sure, he is no longer on the English national team, nor does he start for Real Madrid, but I bet he's still got some good playing days in those legs, and he will definitely be a huge lift for the level of play in the MLS, or at least for the struggling Galaxy.

I’m just thinking I should’ve bought season tickets, because the price probably just went way up.

comments edit

I'm currently working on an interesting project to develop a series of HTTP services used by games running on the RIM Blackberry. These services will enable players to compete against one another (though not in real time) in various games and see high scores, challenge friends, etc. It brings a social aspect to gaming on your Blackberry device.

The games are written in Java and I’m using a Blackberry emulator for testing the interaction between the game and the services. I’m running the service at localhost on my local machine to allow me to step through the debugger when necessary.

With all these web requests and responses shuttling back and forth between the game and the service, it'd be nice to be able to debug that HTTP traffic using a network analyzer like Fiddler.

What Is Fiddler?

If you're not familiar with Fiddler, it acts as a local HTTP proxy on port 8888, allowing you to inspect HTTP traffic between an application and a web application (even one running at localhost). WinINET-based applications (such as Internet Explorer) automatically use Fiddler when it's running. For other applications, you need to configure the application to use Fiddler as a proxy.
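
For example, a .NET client can be pointed at Fiddler's proxy explicitly. This is just a sketch with a made-up service URL:

using System.Net;

public static class FiddlerProxyDemo
{
    public static HttpWebRequest CreateRequest()
    {
        // Route the request through Fiddler's local proxy (127.0.0.1:8888) so
        // the traffic shows up in Fiddler's session list.
        HttpWebRequest request =
            (HttpWebRequest)WebRequest.Create("http://localhost/gameservice.ashx");
        request.Proxy = new WebProxy("127.0.0.1", 8888);
        return request;
    }
}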

It’s immensely useful when debugging web services and weird problems with web applications.

It Wasn’t Working For Me

Unfortunately, I ran into an annoying problem. The emulator is not a WinINET-based application, nor does it allow configuring a proxy, so Fiddler was not reporting any traffic.

Configuring Fiddler as a Reverse Proxy

Fortunately, I found instructions on the Fiddler site that show how to configure Fiddler as a reverse proxy.

A reverse proxy sits in front of your web server and forwards requests on to it, so the application doesn't need to be configured to use it. All I had to do was ask the developers to change the application to make requests on port 8888 (I'll explain later why I couldn't just set up a HOSTS file entry).

I then added a rule to forward requests for localhost port 8888 to localhost port 80 like so:

if (oSession.host.toLowerCase() == "localhost:8888")
    oSession.host = "localhost:80";

Unfortunately, this didn't work; it created some weird infinite loop when I would make a request to localhost:8888. To rectify this, I added an entry to the HOSTS file to map the hostname MOBILE to the IP address 127.0.0.1. Fiddler apparently doesn't work as a simple port forwarder (I've got a solution for that, keep reading).

Image depicting a hosts file. The last entry shows the IP address 127.0.0.1 mapped to the host name 'Mobile'.

I then updated the custom rule in Fiddler to route requests for mobile:8888 to port 80 of the localhost and again asked the developers to change the url encoded in the app (I don’t have the source for the client app).

if (oSession.host.toLowerCase() == "mobile:8888")
    oSession.host = "localhost:80";

Now I can monitor requests from the emulator to localhost using Fiddler. One benefit of using Fiddler is that I can replay requests tweaking form values and such.

Image of a Fiddler session.

Dealing With Hard-Coded URLs

In the most recent build of the game, the game developers accidentally changed the hard-coded URL to point back to the QA environment. For the sake of this example, suppose it is http://mobile.example.com/.

Rather than asking them once again to change it, I decided to try to work around this. I added the QA server hostname to the HOSTS file just like I did with MOBILE before, pointing it to localhost. I then had to change IIS on my machine to run on a different port, since I planned to configure TcpTrace to listen on port 80. I chose the perennial favorite alternate port for IIS, port 8080, and used TcpTrace to listen on port 80 and forward requests to port 8080.

Image of TcpTrace window forwarding requests for port 80 to 8080.

This allowed me to view the HTTP traffic back and forth between the emulator and the web service again using TcpTrace. Unfortunately, I could not get Fiddler to work in this setup, so I lost some of the ability to tweak and replay requests. This ended up being fine since the latest build is meant for final testing.

The following are some useful resources for HTTP debugging.

code comments edit

Jeremy Miller believes that among the various beneficial and important qualities a codebase can have, the single most important quality of code is maintainability. I totally agree, having spent many hours maintaining legacy code written years ago as well as code I wrote a week ago.

As soon as a line of code is typed on the screen, it becomes legacy. You are now maintaining that code.

Enterprise software systems change.  Business rules change, technology platforms change, third party dependencies are upgraded.  …

Enterprise systems typically aren’t replaced because they stop working.  The end of life cycle for an application or system is often brought about because the system has become too difficult, risky, or expensive to modify to keep up with evolving needs.

According to Robert L. Glass in Facts and Fallacies of Software Engineering, research shows that maintenance typically consumes from 40 to 80 percent of software costs, typically making it the dominant life cycle phase of a software project.

Fortunately, Jeremy has your back. He's written a multi-part treatise/manifesto on writing maintainable code. Some of the posts are a bit longer than your typical skim-it-while-reading-blogs-during-a-compile entry, but they are well worth reading.

Print them out, sit back with a fine glass of scotch, and savor the excellent knowledge he imparts.

I look forward to the next post in the series.

comments edit

While I really enjoyed the holidays, one part was really difficult for me. There were some great discussions happening about Subtext on the mailing list and in various blog posts, but I was too busy to really get involved.

I’m reading everything, but times are really busy for me right now as I’ve fallen a bit behind on the book and have to play catch up. Not only that, work is really picking up.

Unfortunately this means less time to work on Subtext and blog about it. Fortunately, others have picked up the slack over the holiday weekend. I wanted to highlight a few of those posts.

Adding Custom ASPX pages to Subtext

Now that Barry Dorans has finally migrated his blog to Subtext, he's writing about it. One post he wrote deals with how to work around the fact that Subtext intercepts all requests for *.aspx pages by default. Thus if you try to add an .aspx page for your own needs, it won't get rendered. Barry walks you through how to add your own .aspx pages to a Subtext installation.

Barry also provides a quick tip on how to recalculate view stats in Subtext.

Subtext and IIS 7

Sascha Sertel wrote a couple of interesting posts that cover how to get Subtext up and running in IIS 7 on Vista.

His first post covers Installing Subtext in an IIS 7 virtual directory with SQL Server 2005. His guide provides some great troubleshooting advice for getting Subtext up and running in this scenario. Hopefully the next version of Subtext will support this scenario much better via improved documentation and error messaging.

He has a short follow up post that covers installing Subtext as its own Website in IIS 7. In truth, this probably applies to any web application in IIS 7.

His latest post in this series covers debugging Subtext on Vista using IIS 7 and Visual Studio 2005. While I personally use the built in Webserver.WebDev for debugging, I do need to test the code using IIS before deployment. This is useful information to have.

Merging Blogs

Keith Elder writes about his experience merging two separate blogs into a single blog on Subtext. Once he had the data imported into a local database, he deployed the Subtext code, exported the local data to BlogML, and imported that same BlogML on the server. Another success story for BlogML.

So I may not be as heavily involved in current Subtext development at the moment as I would like, but development is still moving forward with or without me. That’s a good feeling.