Every day I look at my current code and go, “Damn, that’s some sweet shit!” But I also look at code I wrote a month ago and say, “What a freakin’ idiot I was to write that!” So in a month, the code I’m writing today will have been written by an idiot.
It looks like I am not the only one who feels that way.
It seems that no matter the date, if I look back at the code I wrote six months prior, I always wonder “what kind of crack was I smoking when I wrote that?” For the most part it’s not likely to end up on the daily wtf, but still, does this cycle ever end? Or at least get longer?
I suppose the optimistic way to look at it is that I am still learning pretty steadily, and not becoming stagnant. I’m also able to resist the temptation to go back and fiddle with what isn’t broken. I do kinda feel bad for anyone who has to maintain any of my older stuff (actually not really, suckers).
[Via Pragmatic Entropy]
By now, every developer and his mother knows that VS.NET 2005 and SQL Server 2005 have been released. Prepare for the generics explosion as legions of .NET developers retrofit code, happily slapping <T> wherever it fits.
I predict the number of angle brackets in C# code will initially increase by 250%, only to settle over time at around 75% above current numbers. If you don’t count the angle brackets in C# comments, the increase could be even higher.
But before you go too hog wild with generics, do consider that generics have an overhead associated with them, especially generics involving a value type. Their benefits do not come completely free.
As Rico Mariani pointed out in his PDC talk, generics involve a form of code generation by the run-time. His rule of thumb was that when your collection contains around 500 or so items, you’ll find the benefits outweigh the overhead. But as always: measure, measure, measure.
In general, the strong typing and improved code productivity outweigh any performance concerns I have with small collections.
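To make that trade-off concrete, here is a minimal C# sketch of what the strong typing buys you over the older non-generic collections; the values are arbitrary:

```csharp
using System;
using System.Collections;
using System.Collections.Generic;

// Non-generic collection: value types are boxed on the way in,
// and you need an unboxing cast on the way out.
ArrayList untyped = new ArrayList();
untyped.Add(42);                   // boxes the int
int fromUntyped = (int)untyped[0]; // unboxing cast required

// Generic collection: no boxing, no casts, and the compiler
// catches type errors like typed.Add("oops") at compile time.
List<int> typed = new List<int>();
typed.Add(42);
int fromTyped = typed[0];

Console.WriteLine(fromUntyped == fromTyped); // prints True
```

The compile-time checking is the productivity win; skipping the box/unbox cycle is where the value-type performance story comes from.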
UPDATE: Whoops, I mistyped the number of items Rico mentioned. He said 500, not 50. Thanks for the correction Rico.
Just found out that Harriet Miers withdrew from the running for a position as a Supreme Court justice via The Onion.
The great people at WebHost4Life moved my database and web server to new Windows 2003 servers. They put them in the same server block and I noticed a significant speed increase in the time it takes my blog to load. This explains why my site was down this morning.
Hopefully, this server has much less abusive tenants than my last one did.
Eric Lippert does a great job of defining the term Idempotent. I’ve used this term many times both to sound smart and because it is so succinct.
The one place I find idempotence really important is creating update scripts for a database such as the Subtext database. In an open source project geared towards other devs, you just have to assume that people are tweaking and applying various updates to the database. You really have no idea in what condition the database is going to be in. That’s where idempotence can help.
For example, if an update script is going to add a column to a table, I try to make sure the column isn’t already there before adding it. That way, if I run the script twice, three times, twenty times, the table is the same as if I ran it once. I don’t end up adding the column multiple times.
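As a rough sketch, an idempotent column addition in a T-SQL update script might look like this (the table and column names are invented for illustration, not taken from the actual Subtext schema):

```sql
-- Guard the ALTER so the script can run once or twenty times
-- and leave the table in the same state.
IF NOT EXISTS (
    SELECT * FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME = 'BlogPost'
      AND COLUMN_NAME = 'DateModified'
)
BEGIN
    ALTER TABLE BlogPost ADD DateModified datetime NULL
END
```

The same pattern applies to creating tables, indexes, and stored procedures: check first, then create.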
And Jon presents a very nice case. Nice enough that I take back the times I have condescendingly said that Windows Services are easy to write in .NET. I should probably look through some of the services I have written in the past. I know of one I could easily convert to a console app and gain functionality.
However, I think the decision isn’t always so easy. One service I wrote in the past is a socket server that takes in encrypted connections and communicates back with the client. That obviously needs to run all the time and is best served as a Windows Service. And since I had written the Windows Service code to be generalized, I was able to implement many other services very quickly, even ones with timers that ran on a schedule.
However, the most challenging ones to write happened to be the ones that ran on a schedule, since the scheduling requirements kept changing and I realized I was going down the path of implementing…well…the Windows Task Scheduler.
In general, I think Jon’s right. If all you are doing is running a scheduled task, use the Windows Task Scheduler until you reach the point where your system’s needs are no longer met by it. This follows the principle of doing only what is necessary and implementing the simplest solution that works.
In a conversation, Jon mentioned that a lot of developers perceive Windows Services to be a more “professional” solution than task scheduling a console app. But one way to think of a service is an application that responds to requests and system events, not necessarily a scheduled task. So to satisfy both camps you could consider creating a service that takes in requests, and a scheduled task to make the requests. For example, a service might have a file system watcher active and a scheduled task might write the file. I don’t suggest adding all this complexity to something that can be very simple.
For me, I also like writing Windows Services because I have a system for creating installation packages very quickly. What I need to do is spend some time creating an installer task for setting up a Windows Task Scheduler job. That way I can do the same for a scheduled console app.
Welcome to the new pissing contest.
Oh my, I couldn’t possibly link to that blog. It’s only worth half what my blog is worth.
According to this site, my blog is worth…
Ok, I’m ready to sell out! ;)
Of course, as Scott Reynolds points out,
A thing is only worth what someone will pay you for it.
Which is actually encouraging because I can probably find some fool willing to part with twice the figure listed above for my blog. But I’d better sell fast before the bubble bursts and blogs go through a huge market slump. Wouldn’t want my pedigreed blog having to resort to McDonald’s pitches to prop up its own worth.
I say to you that the VCR is to the American film producer and the American public as the Boston strangler is to the woman home alone. – Jack Valenti, former head of the MPAA
Yes, it is true that Google’s “Do No Evil” motto is pure marketing schtick, and now that they are a large corporation, they can be just as evil as any other corporation. But that still doesn’t take away from the benefits of the Google Print project. Not all profit-driven operations are evil. Those at Microsoft should know that.
Is It Going To Harm Publishers?\ Dare uses himself as an example in that he almost never buys technical books, choosing instead to search the web for the references he needs. However, if Google is to be believed, there is a big difference between web search and the search-within-a-book feature.
The difference is that with web search you get the full content of what you are searching for. With Google Print, you get a snippet of a page in the book. Perhaps it contains all that you need, perhaps not. If it doesn’t, you’ll spend a lot of time searching, trying to hit that exact page. Can you imagine trying to read through a volume of The Art of Computer Programming like that? You might as well just physically go to the bookstore.
If you only needed one little piece of information from the book, you probably wouldn’t have bought it anyways, right? For example, Dare already admits he never buys technical books. So how will Google Print take Dare’s money from publishers? They aren’t getting his money as it is.
Personally, I find reading a book to be a great way to get a focused education on a technical topic. However, I would want to be able to search within the book to see that it does cover the topic in the depth I expect, and I hate running to Borders Book Store to do so. It’s a great relief when a book I am considering is part of Amazon.com’s Search Within a Book program.
Technical References are a small part of the total market\ Another key point to make is that technical reference books are a very tiny part of the entire book market. I certainly don’t want to read The Invisible Man via search. The general public are not going to search their way through the latest Stephen King novel. I don’t see how searching within a book is going to hurt the huge majority of publishers. As many point out, it will be an enabler of the long tail, perhaps selling books long forgotten by their publishers.
Legality\ As for the legality of the program, you should read Lawrence Lessig’s take on it. In his opinion, it most definitely constitutes fair use. If that is the case, whether or not it hurts the publishers becomes a moot point. Much like the pain that the VCR caused the movie industry was a moot point. Oh wait, the movie industry made millions off of the VCR…
Ok, I may have misfired with the last video, but this is truly hilarious. It passed the “Wife Test” (the last one didn’t).
I hear this is an old one, but it is a clip from the humorous improv show “Whose Line Is It Anyway?” with Drew Carey, Wayne Brady, et al.
In this one, Richard Simmons is a guest.
Watch it! (this time, I did not forget to link to it). It takes a bit of time to download and get started, so be patient.
I’ve talked before about the various physical pains that software developers face on the job. For me, it seems that my pain likes to migrate around my body. If I have pain, it almost always seems to be one at a time.
For example, recently, my hands started hurting again, but my back felt much better. More recently, my hands and back felt good, but my eyes started bugging out due to eye strain. Now I am back to my back hurting, and everything else is feeling good. I hope to get back to where everything feels good, but I think that situation only occurs in the womb.
My Contact Me page works again. Sorry for the prior inconvenience.
UPDATE: For the most part, I think young Phil Haack is full of shit in these first two paragraphs. I definitely now think unit tests should NOT touch the database. Instead, I separate those tests into a separate integration test suite, as I had suggested in the last paragraph. So maybe Young Phil wasn’t so full of shit after all.
I know there are unit testing purists who say unit tests by definition should never touch the database. Instead you should use mock objects or some other contraption. And in part, I agree with them. Given unlimited time, I will gladly take that approach.
But I work on real projects with real clients and tight deadlines. So I will secretly admit that this is one area I am willing to sacrifice a bit of purity. Besides, at some point, you just have to test the full interaction of your objects with the database. You want to make sure your stored procedures are correct etc…
However, I do follow a few rules to make sure that this is as pure as possible.
First, I always try to test against a local database. Ideally, I will script the schema and lookup data so that my unit tests can create the database. MbUnit has an attribute that allows you to perform a setup operation when the tested assembly loads and a teardown when it unloads. That would be a good place to set up the database so you don’t have to do it for every test. Often, however, I set up the database by hand once and let my tests assume a clean database is already there.
Except for lookup data, my tests create all the data they need using whichever API and objects I am testing. Each test runs within a COM+ 1.5 transaction using a RollBack attribute so that no changes are stored after each test. This ensures that every test runs against the same exact database.
This is the reason I can be a bit lazy and set up the database by hand, since none of the tests will change the data in the database. Although I would prefer a no-touch approach where the unit tests set up the database themselves. For that, there is TestFu, which is now part of TestDriven.Net.
From my experience, I think this approach is a good middle ground for many projects. A more purist approach might separate the tests that touch the database into a separate assembly, but still use NUnit or MbUnit to run them. Perhaps that assembly would be called IntegrationTests.dll instead of UnitTests.dll. It’s your choice.
You know you’re a big geek when a sequence of numbers with an interesting property just pops in your head. No, I’m not talking about myself (this time). Jayson Knight is the big geek as he noticed a pattern in a sequence of numbers that popped in his head…
This just popped into my head the other day for no other reason than to bug me: Square all odd numbers starting with 1…subtract 1 from the result…then divide by 8. Now look for the pattern in the results.
He even provides a code sample to do the math for you, but you can easily do it by hand on paper. The pattern he noticed can be phrased another way: the square of any odd number, when divided by eight, leaves a remainder of 1.
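Jayson’s actual sample isn’t reproduced here, but a minimal C# sketch of the same check could look like this:

```csharp
using System;

// For the first few odd numbers: square, subtract one, divide by eight.
// The remainder mod 8 is always zero, and the quotients
// 0, 1, 3, 6, 10, ... form the pattern Jayson noticed.
for (int i = 0; i < 6; i++)
{
    int x = 2 * i + 1;          // the i-th odd number: 1, 3, 5, ...
    int numerator = x * x - 1;
    Console.WriteLine($"({x}^2 - 1) % 8 = {numerator % 8}, quotient = {numerator / 8}");
}
```

Running it prints a zero remainder on every line, with the quotients climbing 0, 1, 3, 6, 10, 15.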
This is actually a pattern noticed by John Horton Conway and Richard Guy in 1996. They stated that in general, the odd squares are congruent to 1 (mod 8).
I couldn’t find their proof, but it is easily proved by induction. I’ll walk you through the steps.
The Proposition\ We want to prove that
(x^2^ - 1) mod 8 = 0 for all x >= 1 where x is an odd integer.
Note that this is the same as proving that x^2^ mod 8 = 1. In other words, if we prove this, we prove the interesting property Jayson noticed.
Verify the Base Case\
Here’s where our heavy duty third grade math skills come into play. We try out the case where x = 1.

(1^2^ - 1) mod 8 = 0 mod 8
So yes, 0 mod 8 is zero, so we’re good with the base case.
Formulate the Inductive Hypothesis\ Ok, having demonstrated the fact for x = 1, let’s hypothesize that it is indeed true that
(x^2^ - 1) mod 8 = 0 for a given odd integer x >= 1
Now prove it\ Here we prove the next case. So assuming our above hypothesis is true, we want to show that it must be true for the next odd number. We want to show that
((x + 2)^2^ - 1) mod 8 = 0
Well that can be multiplied out to…
((x^2^ + 4x + 4) - 1) mod 8 = 0

Note I don’t subtract the one from the four.
So just re-arranging the numbers a bit we get…
((x^2^ - 1) + 4x + 4) mod 8 = 0
Now I factor the right side and get (you do remember factoring, right?)
((x^2^ - 1) + 4(x + 1)) mod 8 = 0
Ok, you should notice here that what’s on the left side is certainly divisible by 8 due to our hypothesis. So we just need to prove that 4(x + 1) is also divisible by 8. If two numbers are each divisible by another number, the sum of the two numbers is also divisible by that number.
Well it should be pretty clear that 4(x+1) is divisible by eight. How? Well since x is an odd number, x + 1 must be an EVEN number. We can rewrite (x + 1) as 2n where n is an integer (the very definition of an even number). So our equation becomes…
((x^2^ - 1) + 4(2n)) mod 8 = 0
Which is naturally…
((x^2^ - 1) + 8n) mod 8 = 0
And we’re pretty much done. We know that (x^2^ - 1) is divisible by eight due to our inductive hypothesis. We also know 8n is divisible by eight. Therefore the sum of the two numbers must be divisible by 8. And the proof is in the pudding.
Ok, some of you are probably thinking I am hand waving that last conclusion. So I will quickly prove the last step. Since we know that the left hand side is divisible by eight, we can substitute 8m where m is an integer (the very definition of a number divisible by eight).
That leaves us with…
(8m + 8n) mod 8 = 0
which factors to…
(8(m + n)) mod 8 = 0
Conclude the proof for formality’s sake\ And thus, the proposition is true for all odd integers x >= 1.
~~Sorry for such a long boring~~ No need to thank me for a long and scintillating math post, but it’s been a loooong time since I’ve stretched my math muscles. This was a fun exercise in inductive proofs.
So how does an inductive proof prove anything? At first glance, for those unfamiliar with inductive proofs, it hardly seems like we proved anything. Our proof rests on an assumption. We stated that if our assumption is true for one odd number, then the next odd number must exhibit the same behavior. We proved that implication, but on its own it still leaves open the possibility that the property isn’t true for any odd number at all.
That’s where our base case comes in. We showed that for x = 1, it is indeed true. So since it is true for x = 1, we’ve proved it is true for x = 3. Since it is true for x = 3, we know it is true for x = 5. Ad infinitum.
And that concludes today’s math lesson.
UPDATE: Fixed a couple of typos. Thanks Jeremy! Also, optionsScalper in my comments lists a lot of great links about number theory and congruences. I applied his correct suggestion to clarify the mod operations by putting parentheses around the left hand side.
In my last post, I didn’t explain the pattern to Jayson’s satisfaction and I had a typo in my proof that I have since corrected.
My proof demonstrated one pattern, namely that the square of an odd number minus one is divisible by eight. However, Jayson noticed that if you start with the first few odd numbers and go through those mathematical steps, the result of the operation leaves you with another series with interesting properties.
It turns out that series is the triangular series. I believe what Jayson wanted to know was why his function yielded this sequence. I shall dig into this here (notice I used the word shall? That’s a mathematician thang. You wouldn’t understand ;)) Here are the first few numbers in the sequence…
0, 1, 3, 6, 10,…
Another way to look at the series is…
f(0) = 0
f(1) = 0 + 1 = 1
f(2) = 0 + 1 + 2 = 3
f(3) = 0 + 1 + 2 + 3 = 6
f(4) = 0 + 1 + 2 + 3 + 4 = 10
. . .
f(n) = 0 + 1 + 2 + ... + (n - 1) + n = ???
The n^th^ number in the series is the sum of all the numbers up to and including n. There’s a simple formula for the n^th^ number in this series. Legend has it that Carl Friedrich Gauss discovered this as a very young student. He was told to sum the numbers from 1 to 100 as a means to keep him busy for a long time. In a very short while, he came up with the answer. He observed that you could simply pair the numbers up like so…
1 + 100 = 101
2 + 99 = 101
3 + 98 = 101
. . .
50 + 51 = 101

That gives 50 pairs, so the total is 50 * 101 = 5050.
It turns out that the sum of all numbers n and below can be described by the simple formula f(n) = n(n + 1)/2.
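Here is a quick C# sanity check of the pairing trick, comparing a brute-force sum of 1 through 100 against the closed form:

```csharp
using System;

// Sum 1..100 the slow way, then with Gauss's pairing formula n(n + 1)/2.
int n = 100;
int bruteForce = 0;
for (int k = 1; k <= n; k++)
{
    bruteForce += k;
}
int byFormula = n * (n + 1) / 2;
Console.WriteLine($"{bruteForce} == {byFormula}"); // prints 5050 == 5050
```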
So how does this equation relate to the one Jayson showed us? Well to refresh your memory, that equation could be described as such
f(x~i~) = (x~i~^2^ - 1)/8 = T~i~
In English, that means that applying his function to the i^th^ odd number yields the i^th^ triangular number.
So let’s start doing some simple algebraic substitutions. First, we need to define what we mean by the “i^th^” odd number. What is the odd number at i=0? Well that should clearly be the first odd number, one. So we state…
x~i~ = 2i + 1
That’ll make sure we are only dealing with odd numbers. Now let’s substitute for x~i~
f(x~i~) = f(2i + 1)
Ok, this next step is a little tricky. By definition, f(x) = (x^2^ - 1)/8. This is Jayson’s formula. So let’s expand f(2i + 1) using that definition.
f(x~i~) = ((2i + 1)^2^ - 1)/8
By now, I am really wishing HTML supported math symbols easily. Now doing some multiplying.
f(x~i~) = (4i^2^ + 4i + 1 - 1)/8
Doing a bit of arithmetic leads us to
f(x~i~) = (4i^2^ + 4i)/8
f(x~i~) = 4i(i + 1)/8
Doing some division (man this math stuff is hard)
f(x~i~) = i(i + 1)/2
Does that look familiar? I hope you are having an aha moment (if you didn’t have it a long time ago). That is the formula for the i^th^ triangular number! Thus with a bit of algebra, I have demonstrated that…
f(x~i~) = (x~i~^2^ - 1)/8 = i(i + 1)/2 = T~i~
So that is why his function reveals the triangular number series.
One interesting thing about triangular numbers are their connection to Pascal’s triangle as evident in this image I found at this site.
Trippy eh? You gotta love the various diversions mathematicians come up with to keep themselves busy.
I noticed a recent check-in has added a TimeOut property to the RollBack attribute in MbUnit. Woohoo!
A while ago I presented the source code for a RollBack attribute for NUnit based on Roy Osherove’s work in the area. Well, I found a little problem with using the RollBack attribute that affects the one I presented as well as the one that comes packaged with MbUnit.
I uncovered the problem while running a particularly long running unit test. Every time I ran the test, it failed at just about exactly 61 seconds into it (I know, a unit test taking that long is kind of useless for TDD, but I’ll get that time down to something manageable. I promise!).
I reran the test multiple times and the line of code it failed on would be different, but MbUnit was showing me that it was failing at 61 seconds every time. To prove it, I removed the RollBack attribute and ran the test and it succeeded after around 90 seconds (yeah, I have some heavy perf work to do, but it is a BIG test).
The error message I got each time was Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
Not a helpful message because I wasn’t attempting to complete the transaction. But the timing of the matter made it obvious to me I was running into a timeout issue.
The RollBack attribute works by enlisting a COM+ 1.5 transaction, which allows you to use Enterprise Services without inheriting from ServicedComponent, via a feature called Services Without Components, or SWC for short (gotta love them TLAs). To work around the issue in MbUnit, I simply removed the RollBack attribute and added the code to start a COM+ transaction directly to the test method. The one change I made was to set the TransactionTimeout property, which takes an integer timeout value in seconds.
[Test]
public void MyTest()
{
    ServiceConfig config = new ServiceConfig();
    config.TransactionTimeout = 120; // seconds
    config.Transaction = TransactionOption.RequiresNew;
    ServiceDomain.Enter(config);
    //Run my test code…
    //Abort the transaction.
    ContextUtil.SetAbort();
    ServiceDomain.Leave();
}
At the same time, I revisited the RollBack attribute I put together for NUnit and added a TransactionTimeout property to the attribute.
That way you can mark up a test like so…

[Test]
[RollBack(TransactionTimeout = 120)]
public void MyTest()
{
    //Run my test code…
}
As for MbUnit, I’ll mention this to the maintainers and we’ll hopefully see a fix soon.
The problem with extremists is that they inevitably color the mainstream’s perception of a thing, whether it be a race, a culture, or a software development practice.
In truth though, it is also important for the mainstream to use better judgement and stop falling for that trap. For example, I’ve read several articles and blog posts that attack unit testing (and by extension Test Driven Development) as a practice. What is interesting is that many of the points used to pillory unit testing are examples of taking the practice of unit testing to the extreme, and not necessarily a reasonable and mainstream usage of the practice.
So let’s make this very clear using a simple logical statement.
The fact that Unit Testing is a fundamental part of Extreme Programming does not imply that Extreme Programming is a fundamental part of Unit Testing.
For example, as I said many times, code coverage is not the end goal of unit testing. That’s extremist to say so. Your time is better spent focusing on automating tests for the most troublesome or important code.
Automated unit tests are NOT a replacement for system testing, beta testing, integration testing, nor any other kind of testing. Unit tests are only one small part of the testing equation, but they are an important part, just as all the other types of testing are important.
So in most cases, it pays to stop looking to the extremists to make a case against a practice (such as unit testing) and start talking to those using it in the real world and getting real results.
Last night the missus and I attended the launch party for American Idol Underground, the site I’ve been working on for a client.
The second best part of the party was arriving to find two huge lines right outside of the Cabana Club. We expected a relatively small party, but it ended up swelling into a major event: two lines extending in opposite directions, full of meticulously coiffed “industry” types.
Figuring this was off to a bad start, we walked to the center of the two lines to figure out which line we were supposed to wait in. Waiting in line. That’s what us little people do. We wait in lines.
So when we made it to the center, we ran into the administrative assistant for the client. I asked her which line we were supposed to wait in and I loved her reply. “Oooh nooo. There’s no line for you.” She motioned the bouncers to let us straight in and we were given a staff badge that gave us VIP access.
I have to admit we felt just a bit like rockstars, except without all the lines of coke, bad hair, and breakups and reunion tours. So this is the special treatment that celebrities get at clubs: cutting ahead of the masses of peons waiting in line. The little people.
Inside we had access to the supposed VIP rooms, saw Spinderella from Salt ’n Pepa as well as that guy who played the bus driver and band manager in the movie Ray, Clifton Powell.
The best part of the party was the open bar. The music was fine too, but you have to love an open bar staffed by talented professional bartenders. Each drink was a worthy concoction, pleasing to the tastebuds, and pleasant on the eyes.
My former SkillJam coworkers in attendance certainly were livened by the open bar. You know how mixing water and potassium can cause an explosion? It’s a bit like that when you mix alcohol and my former coworkers…a party breaks out. Always a good time with them folks.
[Listening to: Solitude (Duke Ellington) - Ella Fitzgerald - Love Songs: Best Of The Verve Song Books (2:09)]