Tuesday, January 07, 2014

Programmers without TDD will be unemployable by 2022 (a prediction)

New year is traditionally the time of predictions, and several of the blogs I read have been engaging in them (e.g. Ian Sommerville’s “Software Engineering looking forward 20 years”). This is not a tradition I usually engage in myself, but for once I’d like to make one. (I’ll get back to software economics next time; I need to draw some conclusions.)

Actually, this is not a new prediction, it is a prediction I’ve been making verbally for a couple of years but I’ve never put it on the record so here goes:
By 2022 it will not be possible to get a professional programming job if you do not practice TDD routinely.
I started making this prediction a couple of years ago, when I said: “In ten years’ time.” When I’ve repeated the prediction since, I’ve sometimes stuck to ten years; other times I’ve compensated and said nine or eight. I might be out slightly - if anything I think it will happen sooner rather than later, and 2022 might be conservative.

By TDD I mean Test Driven Development - also called Test First (or Test Driven Design) Development. This might be classic/Chicago-school TDD, London-school TDD or Dan North style Behaviour Driven Development. Broadly speaking the same skills and similar tools are involved, although there are significant differences: if you don’t have the ability to do TDD you can’t do BDD, but there is more to BDD than to TDD.

The characteristics I am concerned with are:
  • Developer-written automated unit tests, e.g. if you write Java code you write your unit tests in Java... or Ruby, or some other computer language
  • The automated unit tests are executed routinely, at least every day
This probably means refactoring, although as I’ve heard Jason Gorman point out: interest in refactoring training is far less than that in TDD training.
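A minimal sketch of what these two characteristics look like in practice might help. This is Python, and the names (`add`, `TestAdd`) are purely illustrative - the point is only that the developer writes the test, in a real programming language, before the production code, and that the whole suite runs automatically on every build:

```python
import unittest

def add(a, b):
    """Production code: in TDD this is written only after a failing test demanded it."""
    return a + b

class TestAdd(unittest.TestCase):
    # Red: write the test first and watch it fail.
    # Green: write just enough production code to make it pass.
    # Refactor: clean up, keeping the test green.
    def test_adds_two_numbers(self):
        self.assertEqual(add(2, 3), 5)

    def test_handles_negative_numbers(self):
        self.assertEqual(add(-2, 3), 1)

# Executed routinely - on every build, at least every day.
result = unittest.TextTestRunner().run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TestAdd))
```

In practice the runner call would live in a build script or CI job rather than at the bottom of the file, but the cycle is the same.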

I’d like to think that TDD as standard - especially London school - also implies more delayed design decisions, but I’m not sure this will follow through. In part that is because there is a cadre of “designers” (senior developers, older developers, often with the title “architect”) who are happy to talk about, and possibly do, “design” but would consider it beneath them to write code. Until we fix our career model, big up-front design is here to stay. (Another blog entry I must write one day...)

I’m not making any predictions about the quality of the TDD undertaken. Like programming in general I expect the best will be truly excellent, while the bulk will be at best mediocre.

What I am claiming is:
  • It will not be acceptable to question TDD in an interview. It will be so accepted that anyone who doesn’t know what TDD is, who can’t use TDD in an exercise, or who claims “I don’t do TDD because it’s a waste of time” or “TDD is unproven” will not get the job. (I already know companies where this is the case; I expect it to be universal by 2022.)
  • Programmers will once again be expected to write unit tests for their work. (Before the home computer revolution I believe most professional programmers actually did this. My generation didn’t.)
  • Unit testing will be overwhelmingly automated. Manual testing is a sin. Manual unit testing doubly so.
And I believe, in general, software will be better (fewer bugs, more maintainable) as a result of these changes, and as a result programmer productivity will be generally higher (even if they write less code they will have fewer bugs to fix.)

Why do I feel confident in making this prediction?

Exactly because of those last points: with any form of TDD in place the number of code bugs is reduced, maintainability is enhanced and productivity is increased. These are benefits both programmers and businesses want.

The timescale I suggest is pure intuition; this might happen before 2022 or it might happen after. I’m one of the worst people to ask because, through my work, I overwhelmingly see companies that don’t do this but would benefit from doing it - and if they listen to the advice they are paying me for, they start doing it.

However I believe we are rapidly approaching “the tipping point”. Once TDD as standard reaches a certain critical mass it will become the norm, even those companies that don’t actively choose to do it will find that their programmers start doing it as simple professionalism.

A more interesting question to ask is: What does this mean? What are the implications?

Right now I think the industry is undergoing a major skills overhaul as all the programmers out there who don’t know how to do TDD learn how to do it. As TDD is a testable skill it is very easy to tell who has done it and can do it, and who just decided to “sex up” their CV/Resume. (This is unlike Agile in general, where it is very difficult to tell who actually understands it and who has just read a book or two.)

In the next few years I think there will be plenty of work for those offering TDD training and coaching - I regularly get enquiries about C++ TDD, less so about other languages, where TDD and TDD training are already more widespread. The work won’t dry up, but it will change from “Introduction to TDD” to “Improving TDD” and “Advanced TDD” style courses.

A bigger hit is going to be on universities and other colleges which claim to teach programming. Almost all the recent graduates I meet have not been taught TDD at all; if TDD has even been mentioned then they are ahead of the game. I do meet a few who have been taught to program this way, but they are few and far between.

Simply: if colleges don’t teach TDD as part of their programming courses, their graduates aren’t going to be employable, and that will make the colleges less attractive to good students.

Unfortunately I also predict that colleges won’t sit up and take notice until they see their students failing to get jobs.

If you are a potential student looking to study Computer Science/Software Engineering at College I recommend you ignore any college that does not teach programming with TDD. If you are a college looking to produce employable programmers from your IT course I recommend you embrace TDD as fast as possible - it will give you an advantage in recruiting students now, and give your students an advantage finding work.

(If you are a university or college that claims to run an “Agile” module then make sure you teach TDD - yes, I’m thinking of one in particular; it’s kind of embarrassing, Ric.)

And if you are a university which still believes that your Computer Science students don’t really need to program - because they are scientists, logicians, mathematicians and shouldn’t be programming at all - then make sure you write this in big red letters on your prospectus.

In business, simply doing TDD - especially done well - will over time fix a lot of the day-to-day issues software companies and corporate IT have: the supply side will be improved. However, unless companies also address the demand side they won’t actually see much of this benefit; if anything things will get worse (read my software demand curve analysis, or wait for the next posts on software economics).

Finally, debuggers are going to be less important: good use of TDD removes most of the need for a debugger (that’s where the time savings come from), which means IDEs will be less important, which means the developer tools market is going to change.



  1. My guess is that you've been wrong for a couple of years.
    Why? I think TDD is a reasonably sound idea: most people realise they need to test stuff, most write the tests, and some even write the tests first. I have no argument with the concept. However, one thing I've noticed is that things in this industry change faster than that - this year's silver bullet is next year's old hat.
    I suspect that we will end up with more automated code-writing systems, where the job is to accurately specify what you want, probably in some graphical model, press the button and hey presto - an automatically generated application doing just what you asked.

    BTW 'captcha' or whatever it is called is NOT a good thing - the stuff you have to type is not even human-guessable

  2. I hope your prediction is right.
    However, I don't agree with your conclusion: "IDEs will be less important". IDEs shouldn't be reduced to debuggers. IDEs are great tools to navigate your project, launch your tests and, of course, perform powerful refactorings. TDD enables continuous refactoring, and this activity is the main one for all TDD programmers. A great IDE is, for me, a must-have.

  3. Fabien, yes IDEs are more than debuggers but I think debuggers are the main feature that many developers use them for. Also, the debugger is one of the most complex parts of the IDE. My hope is that with less need for an all-singing-all-dancing debugger in an IDE we would see more innovation. Which would mean the IDE - as you describe - would become even more useful!

  4. Allan - only a Sith deals in absolutes. When you say "will be unemployable" I hear an absolute. Just because something is a good idea does not mean it always happens! Look how many obese people there are. Or divorced marriage-guidance counsellors.
    And debuggers becoming less important? It depends on your point of view. No matter how good your development is, some faults will remain. Development is not manufacturing. The easiest faults to find are the ones that are found the easiest! As time goes by, the average time to find and fix a fault increases. So I think with a better development process you might get fewer faults, but their importance and "nastiness" will go up. So you could argue the importance of debuggers will go up.

  5. To say that debuggers are going to be less important is quite a naive view. Debuggers are a crucial tool for those of us who prefer to understand what is inside the black box; a learning tool for understanding code. I routinely use a debugger to step through code and understand the flow of a program; without a debugger good luck with that. Tests will certainly tell you the expected outcome, but certainly not _how_ it arrived there.

    Secondly, even when utilizing TDD, a debugger provides insight when tests fail and one can step through the offending code to determine the reason for the failure.

    I've been writing code since I was 8 on my grandfather's Tandy TRS-80. When I was first exposed to a monitor (debugger) it was like coming out of the stone ages – even though it was stepping through 6502 assembly on a C64 :)

  6. I am not a TDD developer, and at the risk of sounding uncool and heretical: in my ~10-year career as a Java programmer I haven't seen TDD code which would really convince me that the time spent writing the tests is won back later. Maybe I have only seen bad TDD, but it looked as if the tests checked almost only cases that really didn't matter, and - that was a fact - the bugs appeared in what really wasn't tested. While I agree that the above can be discussed, and maybe I just need to learn much more about TDD, I concur with the posts above: saying that IDEs and debugging will be less important sounds a bit silly. There will always be bugs, debugging to understand a mechanism is gold, and please try to write some code in Notepad without all the IDE tools.

    1. I was the same as you several years ago, until I joined a Java project that already had full regression testing that ran every night. If you wrote code, you had to write a test to prove that the code worked. No regression test meant that the feature or bug fix was not complete. If a bug was found in the field, that meant that we didn't have a test for that scenario - so we created a test to shine a light on the bug; fix the bug and the test passes... no more bug.

      I was initially pretty down on the whole thing because it added time to developing new features, until one night the regression test broke when I added a new feature and made an inadvertent change to some existing code to support the new feature. It turned out that what I changed broke a different part of the system that depended on it. It was a seemingly simple change, but had unforeseen consequences.

      That incident is what opened my eyes to how fearless I could be in refactoring code to be better. If I truly didn't change anything in the API that would affect the client programmer, then the test wouldn't break. When it did, that meant that I needed to rethink what I changed.

      It really is a good feeling. I highly recommend it!
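      The safety net described above can be sketched roughly like this (in Python; the `discount` function and its capping rule are invented purely for illustration). A test that pins existing behaviour turns an inadvertent change during refactoring into an overnight test failure rather than a bug found in the field:

```python
import unittest

def discount(price, rate):
    # Existing behaviour that other parts of the system depend on:
    # discounts are capped at 50%.
    return price * (1 - min(rate, 0.5))

class TestDiscount(unittest.TestCase):
    # This test pins the cap. A "simple" refactor that drops it
    # breaks the nightly regression run, not the customer.
    def test_rate_is_capped_at_fifty_percent(self):
        self.assertEqual(discount(100, 0.8), 50.0)

    def test_normal_rate_applies_in_full(self):
        self.assertEqual(discount(100, 0.2), 80.0)

result = unittest.TextTestRunner().run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TestDiscount))
```

      Conversely, a refactoring that changes nothing observable leaves both tests green - which is exactly the "fearless refactoring" feeling described.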

  7. People are not going back to stone-age programming. IDEs are tools to help you, not some evil. In future, developers will be more focused on developing less buggy code than on documenting unit test cases in Excel sheets. There will be tools to run tests which auto-create unit test cases based on tagging in the code.

  8. Are people familiar with Steve Freeman's presentation "Test-driven development (that's not what we meant)"? https://vimeo.com/83960706
    Before TDD has any chance of becoming ubiquitous it'll first have to overcome that hurdle. Given that ten years of agile has generated far, far more simulacra than true adoptions, our predominant organisational cultures show a disposition towards local optimisation rather than transformational change.

  9. In ten years, no one will remember what TDD was in the ancient times.

  10. So black & white...

    In 2022 I'd hope companies would expect interviewees capable of reflecting on how and when TDD is a good fit, as well as on how they use it in their development.

    TDD is great, but not a panacea. In every project, there's a competition for some limited resource: developers, budget, calendar time, environments, etc. Some are finite, others can be overcome through investment. By mandating one particular practice, you leave less room for other practices that might be more appropriate for the situation.

    Instead of throwing out predictions, one could:
    1. ask Kent Beck about when TDD is not appropriate, and the trade-off between TDD and getting features out quickly, even if of lower quality.
    2. ask Peter Norvig about when a little "Think-First Design" (my words) trumps a Test-Driven Design in his sudoku shoot-out against Uncle Bob
    3. Look at the trade-off between a little upfront design vs. a lot of heavy refactoring due to bad design decisions evolved through TDD. In a complex system, it frequently happens that tests are designed wrong when they are first written (implemented in the wrong module, ends up with an inappropriate api, works in isolation, but can't be hooked up with the rest of the system, etc.). The tests are then the largest hurdle to refactoring.
    4. Read up on relevant research, especially meta-studies. Most TDD fashionistas will quote you anecdotal evidence, or reference one or two small-group studies which prove their point. But I hope by 2022 there will be a lot more in-depth research available to provide the necessary nuances.

    Just for the record, I'm several years into a large project doing TDD. For the first 2 we mandated TDD. Now we just encourage it as a good habit, but are fine if people break the rules as long as they're justified. Of course - if their code breaks or anything else, we expect them to produce the justification.

    1. Thanks for a well thought out reply, Anonymous - I only wish you had given your name.

      I agree to some degree with just about everything you've written.


  11. Our analysis shows that TDD is a poor substitute for thinking. None of our top 10% of coders do anything resembling TDD. They think, code and deliver.

    1. Anonymous, I'd love to see your analysis. Can you provide a link? Or describe it?

      I'm sure you will understand my reservation in agreeing with you until I've seen the analysis myself

  12. Laughable!

    You don't seriously believe your prediction do you?

    I have never ever worked anywhere that used TDD. I haven't even heard anyone talk (seriously) about using it

    There's far, far, too much wrong with TDD (or ANY methodology) that would make it suitable to fit all business needs - kinda like OOP! Look how that flopped, and you still get people preaching it to the roof tops.

    What it boils down to is systems are more complex than the sum of their parts. I could go on but I'll just encourage you to research it yourself.

    -- Chris

    1. Thanks Chris,

      > You don't seriously believe your prediction do you?

      Yes I do. I'm not in the habit of writing blog entries I don't stand behind, and nothing in the last two months has led me to change my opinion.

      I think your observation on OOP flies in the face of reality: just about all modern languages and systems are object oriented - except for the functional languages, and even some of them support OOP.

      I do agree with you about systems being more complex than the sum of their parts, in fact I think that supports my case. These big systems are the sum of many parts, if the parts don't work then the system won't work. Simply wiring together defective parts will not fix the defects.

  13. I agree with you, Allan, on your conclusion here. This is going to help me a lot.

  14. Check out "TDD is Dead" (DHH) - just Google it. It is about time a prominent figure came out and said it. Even with the best intentions, good people and good developers become zealots due to radical or extreme methods. A few years ago there was a method called 'Extreme Programming' which saw programming as a pair activity. Unfortunately, it ended up with witch-hunting of non-social developers (sometimes the 'best' of them).

    So live and let live - the world is full of good developers from different cultures and styles of work.

    Lastly, development skills are also a result of hard earned experience. Telling a developer that he should change is like telling him to question reality - what he does actually works but he needs to change.

    The loss will be the industry's. Maybe it will need to 'test first' in order to wake up and let go.

