
I can go a bit overboard with my virtual paths. I tend to prefer virtual paths over relative paths since they feel safer to use. For example, when applying a background image via CSS, I tend to do this:

```css
body
{
    background: url('/images/bg.gif');
}
```

My thinking was that since much of the code I write employs URL rewriting and master pages, I never know what the URL of the page referencing this CSS will be. However, my thinking was wrong.

One problem I ran into is that on my development box, I tended to run this code in a virtual directory. For example, I have Subtext running in the virtual directory /Subtext.Web. So I end up changing the CSS like so:

```css
body
{
    background: url('/Subtext.Web/images/bg.gif');
}
```

Thus, when I deploy to my web server, where the site is hosted at the root, I have to remove the /Subtext.Web part. Now, if I had read the CSS spec more closely, I would have noticed the following line:

Partial URLs are interpreted relative to the source of the style sheet, not relative to the document.

Thus, the correct CSS in my case (assuming my css file is in the root of the virtual application) is…

```css
body
{
    background: url('images/bg.gif');
}
```

Now I have true portability between my dev box and my production box.
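A quick way to convince yourself of the resolution rule is .NET's `Uri` class, which applies the same relative-URL semantics; the URLs below are made up for illustration:

```csharp
using System;

class CssUrlResolutionDemo
{
    static void Main()
    {
        // Partial URLs in CSS resolve against the stylesheet's own URL...
        var devStylesheet = new Uri("http://localhost/Subtext.Web/style.css");
        Console.WriteLine(new Uri(devStylesheet, "images/bg.gif"));

        // ...so the same relative path works when the stylesheet lives at the root.
        var prodStylesheet = new Uri("http://example.com/style.css");
        Console.WriteLine(new Uri(prodStylesheet, "images/bg.gif"));
    }
}
```

The same stylesheet resolves correctly in both the virtual directory and the root site, which is exactly the portability I was after.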

It turns out that Netscape Navigator 4.x incorrectly interprets partial URLs relative to the html document that references the css file and not the css file itself. Perhaps this was where I got the wrongheaded notion embedded in my head way back in the day.

personal

Pimped Desktop: Ryan Farley gives the lowdown on his tricked-out desktop.

In the past I’ve tried to get into tricking out the desktop, but every time I switched to a new computer, I felt less and less inclined to invest the time. Besides, I remember some of these programs would slow down the OS. I like my desktop to be lean and mean.

But after seeing Ryan’s screenshot, I may have to consider playing around with some of the customizations.

It’s funny to me how many geeks I know would dread spending time selecting drapery and customizing the small details of their house (“Pick any color, honey. I don’t care.”) yet will obsess over every pixel of their desktop.


Jon Galloway has an interesting write-up on the latest changes to Google’s search algorithm, code-named “Jagger”.

The short and sweet summary is that rather than letting websites “vote” on a page’s relevancy with a link, the trustworthiness of a page is taken into account. For example, a site that has been around longer is potentially considered more trustworthy (assuming it meets other criteria). A page that has incoming links from trustworthy sources is itself more trustworthy.

I had always thought that this was how the PageRank algorithm worked all along. After all, the Google boys’ original inspiration was the network of citations in academic papers and texts. A citation from a well-cited and trustworthy source boosted the respectability of the cited paper, whereas a citation from a nobody didn’t count for much.
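That citation-weighting intuition can be sketched with a toy power iteration over an invented three-page link graph; this is the textbook PageRank formulation, not Google's actual (and secret) ranking code:

```csharp
using System;
using System.Globalization;

class PageRankSketch
{
    static void Main()
    {
        // links[i] lists the pages that page i links to (a made-up graph).
        int[][] links = { new[] { 1, 2 }, new[] { 2 }, new[] { 0 } };
        int n = links.Length;
        const double damping = 0.85;

        double[] rank = new double[n];
        for (int i = 0; i < n; i++) rank[i] = 1.0 / n;

        // Power iteration: each page passes its rank to the pages it cites,
        // so a link from a high-rank ("trustworthy") page is worth more.
        for (int iteration = 0; iteration < 50; iteration++)
        {
            double[] next = new double[n];
            for (int i = 0; i < n; i++) next[i] = (1 - damping) / n;
            for (int i = 0; i < n; i++)
                foreach (int target in links[i])
                    next[target] += damping * rank[i] / links[i].Length;
            rank = next;
        }

        for (int i = 0; i < n; i++)
            Console.WriteLine("page {0}: {1}", i,
                rank[i].ToString("F3", CultureInfo.InvariantCulture));
    }
}
```

Page 2 ends up with the highest rank because it is cited by both other pages, one of which is itself well-cited.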

In the end, I am pretty happy about these changes, as my AdSense revenue has increased lately.


As I mentioned in my last post, my redesign was inspired by some of the lessons in the book “Bulletproof Web Design” by Dan Cederholm.

The main focus of this book is how to use CSS and semantic (X)HTML markup to create flexible websites. By flexible, the author is referring to a web site’s ability to deal with the different ways a user may choose to view a site. For the most part, he covers how to make your site more accessible.

For example, many sites do not deal well with a user resizing the text, totally breaking the design. If you specify font sizes in pixels, for example, IE won’t allow text resizing at all, which gives the designer control, but at the cost of accessibility for those with high-resolution monitors or poor eyesight.

Cederholm instructs the reader on several ways to make sites deal with text resizing in a more flexible manner while retaining control. For the most part though, the designer has to give up pixel perfect control in exchange for a better user experience.

The book also delves into accessibility tips, such as making sure the site is readable when images are turned off (for those on slow connections) and when CSS is turned off (for those using text-to-speech readers).

Each chapter presents a sample of a website design that is not flexible. Most of the samples come from real-world sites, though some were made up. He then walks through the steps to recreate the design element using clean, semantic XHTML and CSS. One key benefit of this approach, apart from the increased flexibility, is that the amount of markup is greatly reduced in most cases, as one-pixel images and empty table cells are no longer needed.

Lest one think Cederholm is an anti-table zealot, he points out that there are situations where using a table is correct and semantic: when displaying tabular data, of course. He then demonstrates how to use tables and CSS properly to get the proper layout without resorting to nested tables and empty table cells. The key is that the table should model the data, not the layout, and he succeeds in showing how.

In the end, Bulletproof is a quick and worthwhile read with clear diagrams and plenty of CSS examples. There were some examples I wish he had taken further. For example, he mentions several uses for the definition list element (<dl>) to semantically mark up code, but only presents one example of styling a definition list. That’s understandable, since this was not meant to be a complete compendium of CSS examples. Even so, I found plenty of good advice, which I ended up applying to this site. The site responds well to enlarging the text (to a limit).

If you are a fan of “CSS Zen Garden”, this book would serve as a nice complement. “CSS Zen Garden” inspires designers with what is possible to do with CSS. “Bulletproof Web Design” provides some of the tools to get there.


After completing two of the three books I said I would be reading in 2006, I decided to apply some of the lessons from the book Bulletproof Web Design by Dan Cederholm by slightly redesigning my site.

The change isn’t drastic on the surface, though I like to think it looks nicer and cleaner. Most of the changes are under the hood in the HTML and CSS. Most of you won’t notice since you read this via an RSS aggregator, but if you have a moment, take a look and let me know what you think.

A short book review is forthcoming.

csharp, code

While reviewing some code this weekend, I had the thought to do a search for the following string throughout the codebase: “catch(Exception” (using regular expression search, of course, so it looked more like “catch\s*\(\s*Exception”).
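Cleaned up, the pattern presumably allows optional whitespace around the parenthesis; here is a quick sketch of how such a pattern behaves (the sample strings are invented):

```csharp
using System;
using System.Text.RegularExpressions;

class CatchSearchDemo
{
    static void Main()
    {
        // Matches "catch", optional whitespace, "(", optional whitespace, "Exception".
        var pattern = new Regex(@"catch\s*\(\s*Exception");

        Console.WriteLine(pattern.IsMatch("catch(Exception ex)"));
        Console.WriteLine(pattern.IsMatch("catch (Exception)"));
        Console.WriteLine(pattern.IsMatch("catch (IOException ex)"));
    }
}
```

The last line does not match because the pattern targets `Exception` itself, not more specific exception types.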

My intent was to take a look to see how badly catch(Exception ...) was being abused or whether it was being used correctly. One interesting pattern I noticed frequently was the following snippet…

FileStream fs = null;
try
{
    fs = new FileStream(filename, FileMode.Create);

    //Do Something
}
catch(Exception ex)
{
    throw ex;
}
finally
{
    if(fs != null)
        fs.Close();
}

My guess is that the developer who wrote this didn’t realize that you don’t need a catch block in order to use a finally block. The finally block will ALWAYS fire, whether or not an exception is thrown. Also, this code resets the stack trace on the exception, as I’ve written about before.

This really should just be:

FileStream fs = null;
try
{
    fs = new FileStream(filename, FileMode.Create);

    //Do Something
}
finally
{
    if(fs != null)
        fs.Close();
}

Another common mistake I found is demonstrated by the following code snippet.

try
{
    //Do Something.
}
catch(Exception e)
{
    throw e;
}

This is another example where the author of the code is losing stack trace information. Even worse, there is no reason to even have a try/catch, since all the developer is doing is rethrowing the exact exception being caught. I ended up removing the try/catch blocks everywhere I found this pattern.
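When a catch block genuinely needs to do work (logging, say) before propagating, a bare `throw;` keeps the original stack trace, whereas `throw ex;` resets it. A minimal sketch (the `Fail` method is invented for illustration):

```csharp
using System;

class RethrowDemo
{
    // A hypothetical method that fails, invented for illustration.
    static void Fail()
    {
        throw new InvalidOperationException("boom");
    }

    static void Main()
    {
        try
        {
            try
            {
                Fail();
            }
            catch (Exception)
            {
                // Suppose we needed to log here before propagating.
                // A bare throw preserves the original stack trace;
                // `throw ex;` would make the trace begin at this line instead.
                throw;
            }
        }
        catch (Exception ex)
        {
            // The preserved trace still points back at Fail().
            Console.WriteLine(ex.StackTrace.Contains("Fail"));
        }
    }
}
```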

personal

When you hear the phrase, “use your head” you are typically being told to think. There are other uses of the head that are quite unwise. For example, trying to clear a soccer ball away from another player rushing in on the attack when you are a step too late. Unfortunately that’s exactly what I tried today.

My head just happened to get in the way of the shoulder of the onrushing soccer player when we both jumped to try and win the ball. It was really no contest as his shoulder won, leaving a nice inch long laceration on top of my scalp. Fortunately it wasn’t very deep and I was not knocked unconscious, though I bled a lot and had a nice Tom & Jerry bump on the head.

tom-jerry-bump

This earned me a trip to the ER which is NOTHING like the TV show. If it were, the show would have been cancelled after the first episode. I fail to see how interesting a show would be where the patients wait around for four hours before a doctor sees them to perform a grand total of five to ten minutes of actual work.

In any case, the extremely busy doctor made quick work of cleaning out the wound and stapling it shut with two painful squeezes of the stapler (no local anesthesia). I hadn’t realized how helpful office supplies could be when applied to the head.

The doctor said I show no signs of a concussion and should be ready to play again in a few days as soon as I feel comfortable. I’m glad I’ll be able to play next week, but I won’t be using my head as much.


I have a really old Kodak photography book lying around that delivers various tips for how to advance from your typical crapola™ snapshots to something worth boring your friends with on Flickr after your last vacation.

It is really too bad that I’ve forgotten everything the book had to say. Fortunately Robb Allen is starting a series of Photography lessons for our general photography improvement. Read lesson 1 and start taking better pics. Your friends and family will thank you for it.

My personal tip is to buy the biggest memory card you can afford, fill that sucker up when taking pictures, and delete vigorously before showing the pics off. Memory is getting cheaper and is way cheaper than paying for film and developing. Why settle for just one chance to get a great shot of your kid picking his nose when you can get three and keep the best? The odds are in your favor.

Just remember to delete vigorously because while memory is cheap, time isn’t.


Via Larkware News I noticed that Subversion 1.3 has been released. Looking at the release notes I noticed one thing in particular that caught my attention.

Official support for Windows ‘_svn’ directories (client and language bindings)

The “_svn” hack is now officially supported: since some versions of ASP.NET don’t allow directories beginning with dot (e.g., “.svn”, the standard Subversion working copy administrative directory), the svn command line client and svnversion now treat the environment variable SVN_ASP_DOT_NET_HACK specially on Windows. If this variable is set (to any value), they will use “_svn” instead of “.svn”. We recommend that all Subversion clients running on Windows take advantage of this behaviour. Note that once the environment variable is set, working copies with standard “.svn” directories will stop working, and will need to be re-checked-out to get “_svn” instead.

What this means for VS.NET developers using Subversion is that using Ankh to provide Source Code Control Integration (SCCI) becomes a more attractive option. One reason I held off on using Ankh is that it required using a separate build of Subversion. But now, I’m so comfortable using TortoiseSVN that I prefer it to using source control bindings, so I probably won’t switch just yet. The SCCI interface just doesn’t seem rich enough compared to the turtle and its shell extensions.


I happened to notice my dog Twiggy apparently admiring herself in the mirror. At one point she was right up against the mirror looking at herself. By the time I got my camera, she had decided on a more comfortable vantage point from which to check herself out.

Twiggy
Mirror

And here I am working my ass off while she just lounges around. I really need to get her a j-o-b.


Yesterday I received a call from my very exasperated father who recently has been helping my Korean mother learn how to use the web and web-based email.

  1. Dad

    Talk to your mother!

  2. Me

    Umm.. Ok. About what?

  3. Dad

    She got a nasty email and now she’s furious that I sent it to her. I tried to explain the concept of SPAM to her, but she doesn’t believe me. Maybe she’ll believe you.

  4. Me (groaning)

    Okay! Put her on.

My poor mother didn’t understand how anyone could get her email address and send her such filth, so, by deduction, it had to be my dad. She was shocked and appalled that he would send this to her.

I calmly explain to her how companies just love to sell your data to other less scrupulous companies who then send out emails to EVERYBODY. I get it, she gets it, dad gets it. Everyone gets it. I think she understands now, but I wonder if she’ll continue to bother with email now.


These days it seems that the 40-hour workweek is a pipe dream of a bygone past (if it ever was a reality). This seems especially true for the field of software development, which seems to glorify working excessively long hours like an old-fashioned pissing match.

It is pretty well documented that working long hours can end up being counter-productive. After working for a prolonged stretch, workers tend to get fatigued and the law of diminishing returns kicks in. I read of one study demonstrating how productivity declines steeply between the 45th and 50th hours. In my opinion, this is especially true for software development, as code written in the wee hours of a marathon session tends to produce more work in the long run due to bugs, work that gets budgeted elsewhere.

Software developers and management just don’t keep track of the real productivity numbers. They’ll remember that you got the code completed by the deadline via marathon sessions, but they won’t factor in the time spent debugging and fixing bugs found weeks later due to that session.

Not to mention the negative impact on employees’ relationships and physical health. It’s no wonder that a common New Year’s resolution among developers was to get in better shape.

So I find it fitting that my friend Kyle sends me this article written by Joe Robinson, author of “Work to Live”. The title of it is “Bring back the 40-hour workweek – and let us take a long vacation”.

I’ve written about work-life balance before, but I should make clear, before anyone gets the wrong idea, that desiring work-life balance does not make one a slacker. Unfortunately, I have had trouble personally applying this philosophy since I started a company, but as an owner, every extra hour benefits me. For many salaried employees, an environment that encourages the bravado of working long hours primarily benefits your corporate masters (unless you are paid overtime, etc.).

So for this year, make your resolution to work less and live more (unless you really are a slacker in which case you should get off your lazy butt). Spend more time getting into shape and other hobbies you enjoy. If all you want to do is code, spend that extra time contributing to an open source project. You might learn something that helps keep you competitive.


This is an eye-opening and interesting account of how Tom Owad was able to data mine Amazon’s wish list database to build a profile of “subversives” based on their requested reading lists. Makes you think twice before adding a book to your wish list.

Using a pair of 5-year-old computers, two home DSL connections, 42 hours of computer time, and 5 man hours, I now had documents describing the reading preferences of 260,000 U.S. citizens.

I downloaded all the files to an external 120 GB Firewire drive in UFS format. The raw data occupied little more than 5 GB. I initially wanted to move all the files into a single directory to facilitate searching, but as the directory contents exceeded 100,000 items, the speed became glacially slow, so I kept the data divided into chunks of 25,000 wishlists.

Next comes the fun part – what books are most dangerous? So many to choose from. Here’s a sample of the list I made. Feel free to make up your own list if you decide to try some data mining. Send it to the FBI. I’m sure they’ll appreciate your help in fighting terrorism.

Link

[Via Boing Boing]


I’ll try not to bore you with the typical reflections on the past year. Yesterday it pretty much rained all day, ruining our NYE plans. But when one door closes, another opens. We ended up having a grand time at our friends’ place playing board games and drinking White Russians. A toast to 2006!

By the way, I have to give a shout out to my soccer buddies. These are a diverse group including some amazing soccer players of a range of ages who showed up despite the rain to get a sweet game on. It rained the entire time except for the last 15 minutes or so. You gotta love that type of devotion (or is it insanity?).

My favorite play was one that involved a give-and-go between a teammate and his mother. A mother-son play steals the day.


So it looks like our New Year’s plans have been derailed by the rain and the Fire Marshal, as the event organizers reported. It stopped raining an hour ago, but the wet conditions and the sheer amount of electrical equipment needed for Giant Village have made it a safety hazard. Bummer.

personal

In a blatant ripoff of Scott H, I am going to post the best of “You’ve Been Haacked” 2005 edition. This year, you laughed, you cried and when you were done, you came over and briefly glanced at my blog. But I took it in stride and continued to write, rant and rave… and this was the best I could come up with in 2005^1^.

Technical

Personal

^1^ There are no objective criteria for choosing these posts. I simply deemed these to be the highlights.