Moneyball of Hiring


The shaman squatted next to the entrails on the ground and stared intently at the pattern formed by the splatter. There was something there, but confirmation was needed. Turning away from the decomposing remains, the shaman consulted the dregs of a cup of tea, searching the shifting patterns of the swirling tea leaves for corroboration. There it was. A decision could be made. “Yes, this person will be successful here. We should hire this person.”

*Spring Pouchong tea, CC BY-SA 3.0 by Richard Corner*

Such is the state of hiring developers today.

Our approach to hiring is misguided

At many, if not most, tech companies, the approach to hiring developers and managing their performance afterwards is based on age-old ritual and outdated ideas about what predicts how an employee will perform. Most of it turns out to be a poor predictor of success and rife with bias.

For example, remember when questions like “How would you move Mt. Fuji?” were all the rage at companies like Microsoft and Google? The hope was that in answering such questions, the interviewee would demonstrate clever problem solving skills and intelligence. Surely this would help hire the best and brightest?

Nope.

According to Google’s Senior VP of People Operations Laszlo Bock, Google long ago realized these questions were complete wastes of time.

> Years ago, we did a study to determine whether anyone at Google is particularly good at hiring. We looked at tens of thousands of interviews, and everyone who had done the interviews and what they scored the candidate, and how that person ultimately performed in their job. We found zero relationship.

We’re not the first to face this

Our industry is at a stage where we are rife with superstition about what factors predict putting together a great team. This sounds eerily familiar.

Moneyball, a book by Michael Lewis, documents how the Oakland Athletics baseball team decided to ignore tradition and use an evidence-based statistical approach to figuring out what makes a strong baseball team. This practice of applying empirical statistical analysis to baseball became known as sabermetrics. As Wikipedia summarizes it:

> The central premise of Moneyball is that the collected wisdom of baseball insiders (including players, managers, coaches, scouts, and the front office) over the past century is subjective and often flawed. Statistics such as stolen bases, runs batted in, and batting average, typically used to gauge players, are relics of a 19th-century view of the game and the statistics available at that time.

Prior to this approach, conventional wisdom looked at stolen bases, runs batted in, and batting average as indicators of success. Home run hitters were held in especially high esteem, even those with low batting averages. It was not unlike our industry's fascination with hiring Rock Stars. But the sabermetrics approach found otherwise.

Rigorous statistical analysis had demonstrated that on-base percentage and slugging percentage are better indicators of offensive success.
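For readers who don't follow baseball, both statistics are simple ratios. A quick sketch of the standard definitions (the sample numbers below are invented for illustration):

```python
def on_base_percentage(hits, walks, hit_by_pitch, at_bats, sac_flies):
    """OBP: how often a batter reaches base, per plate appearance."""
    return (hits + walks + hit_by_pitch) / (at_bats + walks + hit_by_pitch + sac_flies)

def slugging_percentage(singles, doubles, triples, home_runs, at_bats):
    """SLG: total bases per at-bat, so extra-base hits count for more."""
    total_bases = singles + 2 * doubles + 3 * triples + 4 * home_runs
    return total_bases / at_bats

# A hypothetical hitter with a modest batting average who walks a lot:
# unimpressive by the traditional numbers, valuable by these two.
print(round(on_base_percentage(50, 40, 2, 180, 3), 3))   # 0.409
print(round(slugging_percentage(30, 12, 2, 6, 180), 3))  # 0.467
```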

Did it work?

> By re-evaluating the strategies that produce wins on the field, the 2002 Athletics, with approximately US$44 million in salary, were competitive with larger market teams such as the New York Yankees, who spent over US$125 million in payroll that same season…This approach brought the A’s to the playoffs in 2002 and 2003.

Moneyball of Hiring

It makes me wonder, where is the Moneyball of software hiring and performance management?

Companies like Google, as evidenced by the study mentioned above, are applying a lot of data to the analysis of hiring and performance management. I bet that analysis is a competitive advantage in their ability to hire the best and form great teams. It gives them the ability to hire people overlooked by other companies still stuck in the superstition that making candidates write code on whiteboards or reverse linked lists will find the best people.
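The study Bock describes boils down to a simple question: do interview scores correlate with how hires actually perform later? A minimal sketch of that kind of check, with entirely made-up numbers (the data and the rating scales here are hypothetical, not Google's):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    std_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    std_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (std_x * std_y)

# Hypothetical data: each pair is one hire -- the average score their
# interviewers gave them, and a performance rating a year into the job.
interview_scores    = [3.1, 4.5, 2.8, 4.9, 3.6, 4.2, 2.5, 3.9]
performance_ratings = [3.3, 3.1, 2.9, 3.4, 4.2, 2.8, 3.9, 3.5]

# Google reported essentially zero relationship; a correlation near
# zero means the interview scores carried no signal about performance.
print(f"correlation: {pearson(interview_scores, performance_ratings):.2f}")
```

Real studies control for far more (survivorship among hires, rater bias, team effects), but this is the basic shape of the analysis.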

Even so, this area is ripe for more science and larger-scale study. I would love to see multiple large companies collect and share this data for the greater good of the industry and society at large. Studies like this are often a force for reducing unconscious bias and increasing diversity.

Having this data in the open might remove this one competitive advantage in hiring, but companies can still compete by offering interesting work, great compensation, and benefits.

The good news is that a lot of people and companies are thinking about this. The article “What’s Wrong with Job Interviews, and How to Fix Them” is a great example.

We’ll never get it right

Even with all this data, we’ll never perfect hiring. Studying human behavior is a tricky affair. If we could predict it well, the stock market would be completely predictable.

Companies should embrace the fact that they will often be wrong. They will make mistakes in hiring. However much time a company spends making its hiring process rock solid, it should spend a similar amount of time building humane systems for correcting hiring mistakes. This is a theme I’ve touched upon before: the inevitability of failure.


Comments

15 responses

  1. Arnab Roy Chowdhury, September 8th, 2015

    This is a nice topic I read today. Thanks for writing on this.

  2. Edwin, September 8th, 2015

    I failed both onsite interviews with Microsoft and Google. Indeed, I can't find the path between nodes in a binary tree on a whiteboard. I pretty much gave up my aspirations to work at one of those behemoths. It was very hard and I felt like a total failure, but now I acknowledge that those companies hire only the top of the top, and I am not on that tier (in all honesty, I feel like the only reason I made it that far in the interview process is because I am Hispanic, and they are desperate to pad their diversity numbers. Realistically I don't think I ever had a chance to receive an offer). I like what you wrote; it always felt to me like they are interested in hiring the dogs that know how to perform certain tricks in a certain way, because it is, after all, the status quo. They hit and miss a lot, but at least it is a framework that gives them something to work with, even if they still miss the occasional fat catcher that gets on base a lot, because he doesn't fit their vision of what a true Googler (or Softie) should say at those interviews.

  3. Jeff P., September 8th, 2015

    I think the problem is that we don't really know how to "score" employees, so I'm not sure if the data would be useful. Stack ranking obviously doesn't work (as you want to stick hot pokers in your ears when you hear about "visibility" and "high potential"). In my experience, all of the brain teaser nonsense and white boarding is in fact useless, but purely as a gut check, I like to sit someone in front of a computer with real tools, Google available, and just see how they solve a problem. I mean, that's how we work, right? Beyond that, it's more important to understand if they play well with others.

  4. haacked, September 8th, 2015

    > I think the problem is that we don't really know how to "score" employees, so I'm not sure if the data would be useful.

    This can never be an exact science, and I don't claim it to be. But we can do better than what we do today.

    We can make multiple hypotheses and then collect data, analyze the result, and see what we learn.

    For example, the statement that data can't improve the hiring process is itself a hypothesis. Ok, so prove it! Let's do a study.

    You say that brain teasers and whiteboarding are useless. The data agrees with you.

    So far, the data seems to indicate that having people do the actual work they would in the job they're interviewing for is effective. That's promising, but we need more studies.

    The point I'm advocating is that a lot of companies have a lot of data on this. We could take this shared data, categorize it, and look for any universal trends, and report on what we learn.

    Maybe we learn that you're better off throwing a dart at a wall full of resumes. Maybe we learn a few things that work, and a few things that don't. Until then, we're mostly working with conjecture.

  5. adamralph, September 8th, 2015

    I agree with the spirit, and I guess the ultimate aim, of what you are saying. The crystal ball/tea leaf gazing practices are damaging, or useless at best, but I'm having a hard time imagining the data which would solve this.

    Can you give some concrete examples of the data points you would collect?

    I'm not sure the analogy with baseball holds so well. Baseball is a highly constrained environment with very specific inputs (the actions you can take within the rules of the game) and desired outcomes (winning games). This doesn't describe software development very well.

  6. drdamour, September 8th, 2015

    saber works in the closed system of MLB because all the employers (MLB teams) are trying to achieve the same goals in roughly the same way with a very strict rule set (9 players, etc). Thus they can come up with analysis that makes sense for each candidate.

    Also baseball the sport, even though it's a team game, is largely made up of individual plays, especially on offense. Applying advanced statistical analysis to more team oriented sports (especially football & basketball) has proven rather difficult, though it's an evolving field.

    Companies are so drastically different from sports with their rather stringent ruleset, and employee success mirrors the complex team dynamics of the other (non-mlb) team sports where success and failure are VERY hard to measure on an individual level (outside of absolute edge cases good or bad). The same person at the same company solving the same problem the same way can (will?) have completely different outcomes just by altering the membership of the team they work with.

    in short, noble effort, but damn good luck! (also it's a famous saying that moneyball gets you to the playoffs, but not through them)

  7. Steve Fenton, September 9th, 2015

    I really enjoyed this article. I would, however, caution against using the logic in this quote to say that the questions were ineffective, because it doesn't prove so:

    "Years ago, we did a study to determine whether anyone at Google is
    particularly good at hiring. We looked at tens of thousands of
    interviews, and everyone who had done the interviews and what they
    scored the candidate, and how that person ultimately performed in their
    job. We found zero relationship."

    There is a huge survivorship bias in this statement. The people who got hired *may* have hugely outperformed those who were rejected, but we'll never know because they didn't get hired.

  8. EOkas, September 9th, 2015

    As far as picking people based on statistics goes, https://www.codingame.com, a coding gamification website gaining traction with French and other European coders, is offering companies the ability to lock job offers to candidates who have solved problems of a certain difficulty or earned enough coding points. Helps filter a bit.

  9. Minnesota Steve, September 9th, 2015

    Have you seen Elevated Careers? eHarmony's attempt to find your perfect job match. :-)

    I don't know if it'll work, but it is a clever way to look at the problem.

  10. Frank Black, September 9th, 2015

    One would almost imagine that's what an HR department is for. But alas, they're busy thinking up inane procedures to follow.

  11. RJay75, September 9th, 2015

    I think a bigger problem in the hiring process is just getting past HR to even get an interview. I've applied for positions and heard nothing. I've also reapplied for positions through a recruiter and at least got a phone interview. Resume writing has truly become an art form. And recruiters have a job because you need someone to sweet-talk you past HR, because you have no idea what buzzwords they are keying off in resumes and cover letters.

    But good post and I do think the interview hiring landscape is changing.

  12. some dude, September 13th, 2015

    Another problem is that it's also hard to score someone's performance, since we all often wonder how that underperforming, incompetent person was actually able to become a manager or team lead.

    It is easy to judge salespeople, since we can just see the sales they made; for programmers there is no number that determines how well they work. Which is a shame, since a lot of similar articles fail to see this and assume the performance data is 100% correct, and therefore that there is no correlation between interview and performance.

  13. William Noel, September 28th, 2015

    I owned a software firm in the 80's. Same problem. We solved it by screening what we thought was the best, then sending them to a full afternoon with an outside psychologist.

    He screened them for aptitude and personality fit. Turnover went to zero and productivity soared. Why? We ended up with a fairly homogeneous group of workers well suited to their jobs. People wanted to be there and were capable of performing.

    I remember one female candidate who he believed would be a poor fit because the support department she was applying for was all male and she had an underlying "hates men" personality trait. The doc was convinced the department would be at war within a week.

    Best money we ever spent.

  14. flukus, October 21st, 2015

    The problem is that our success or failure is almost always the result of a group, not an individual. So we're trying to compare an individual's hiring metrics to a group's.

  15. Todd Drayton, April 11th, 2016

    Very insightful, although I would say that the usefulness of interview tests and questions depends on the quality and predictive power of those same tests and questions. Obviously some are stupid and have no business in the interview process. But there are coding tests and challenges that are well crafted and ask candidates to demonstrate specific abilities they'd need to use on the job. Trial projects can do this as well.