Why don't they teach these things in school?


Over the summer, I was fortunate enough to get into Google Summer of Code. I learned a lot (probably more than I've learned in all of my university coursework combined). I'm left wondering, though, why schools don't teach a few of the things I learned sooner. To name a few:

  • unit testing
  • version control
  • agile development

It seems to me that they spend a significant amount of time teaching other things like data structures and algorithms up front. While I still think those are very important to learn early on, why don't they teach more of these three before them? Or is it just my school that doesn't teach much of this stuff?

Don't get me wrong, I don't think it's desirable for universities to always teach the trendiest programming fads, but shouldn't my professors be teaching me something other than "draw a diagram before you start coding?"

12/9/2011 6:34:10 PM

Accepted Answer

The simplest answer to your question is that the fields of computer science and software development are both very new, and not very well understood. Although all scientific and engineering disciplines are advancing more rapidly in modern times, other fields have a lot more experience to draw on and there is a much broader shared understanding of how they work.

For example, despite recent advancements in materials science, civil engineers have known for about 2000 years how to build an arch that won't fall over, and this is something that can be taught and learned in university with relatively little controversy. Although I completely agree with you about the techniques that software developers should learn, this agreement is based on personal experience and informal reasoning. In order to be a socially accepted "best practice", we need quantitative data which can be very expensive to gather: how much does version control help? How does it help? Unit testing? We can reason about the effectiveness of various techniques, but actually proving that effectiveness conclusively would be very expensive. We'd need to run a complete, realistic software project from beginning to end, numerous times, with groups of programmers that have equivalent expertise, using different techniques. At the very least we'd need lots of data about existing projects which those projects would be unwilling to release.

Civil engineers have thousands of years of bridges to look at, with lots of information. Software developers, on the other hand, have only a few decades of information, most of which is kept secret, since there's little motivation for organizations to collate and publish information about their developers' effectiveness, even if they are collecting it (which most aren't).

There's also some confusion of fields. Software development, or software "engineering", is really a different thing from computer science. Software developers need a working knowledge of computer science, but working at the boundaries of algorithmic complexity or reasoning about parallelism isn't something that a working programmer will do every day; similarly, a real "computer scientist" will write tons of throw-away code that just doesn't work or doesn't do anything interesting, and won't benefit as much from the sort of rigor that an actual software product would.

The emergence of the internet and the open source community may provide enough data to start answering these questions conclusively, but even if the answers were available tomorrow, it would probably take 100 years for them to permeate international society to the point where everyone agrees on what should be taught in schools.

Finally there are some economic considerations. It has been a relatively short time since almost everyone involved in software development had cheap, easy access to dedicated machines to run whatever development tools they want. A few decades ago, completely dedicating a machine to just running your tests, or even housing an infinite history of source code, would have seemed frivolously expensive to a lot of people.

9/17/2008 2:44:39 AM

Because our teachers:

  1. Never tried unit testing,
  2. Don't know how to use version control and
  3. Haven't even heard of "agile development".

Students should take matters into their own hands. We did that, and turned out just fine, didn't we?


Leonardo da Vinci wrote,

Those who are enamored of practice without science are like a pilot who goes into a ship without rudder or compass and never has any certainty where he is going. Practice should always be based upon a sound knowledge of theory.

The good schools teach both theory (data structures, algorithms, etc.) as well as practice (unit testing, version control, etc.). This requires an appropriate mixture of faculty so that both sides of this coin can be properly taught. A faculty composed entirely of theoretical types with no real experience won't do. Similarly, a faculty composed entirely of practitioners will not do. You need a mix, and the good schools have that.
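For readers who have only seen the theory side, the "practice" half can be very small indeed: a unit test is just an automated check of one piece of code, runnable at any time. A minimal sketch using Python's built-in unittest module (the slope function here is an invented example, not from the original post):

```python
import unittest

def slope(x1, y1, x2, y2):
    """Slope of the line through two points (illustrative function under test)."""
    if x2 == x1:
        raise ValueError("vertical line has undefined slope")
    return (y2 - y1) / (x2 - x1)

class SlopeTest(unittest.TestCase):
    # Each test method checks one behavior, so a failure points
    # directly at the behavior that broke.
    def test_basic_slope(self):
        self.assertEqual(slope(0, 0, 2, 4), 2.0)

    def test_vertical_line_rejected(self):
        with self.assertRaises(ValueError):
            slope(1, 0, 1, 5)

if __name__ == "__main__":
    unittest.main(exit=False)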


Computer science has always been somewhat contradictory: the part that's about computers isn't a science, and the part that's a science isn't about computers.

Universities tend to lean toward the 'science' end (algorithms, data structures, compilers, etc.) because those things are much more 'timeless' than current industry best practices, which tend to evolve and change from year to year. Version control, for instance, has undergone amazing changes in the last 5 or 10 years, but big-O is still big-O, and hashing, B-trees, and recursion are still as useful as they were 40 years ago. The idea is generally to give you enough foundation that you can then pick up a tool like git and understand what it means when you're told that the underlying data structure is a directed acyclic graph of SHA-1 hashes, and that the developers have worked hard to optimize the number of syscalls so that it's I/O-bound.

Now, think about where you learned all the things you had to know to understand that last sentence - if the answer is 'university', they're doing an okay job.
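That SHA-1 detail really is that concrete: git names every object by hashing a short header plus the raw content, which is how the commit graph can refer to content immutably. A minimal sketch in Python of how a blob gets its ID (illustrative, not git's actual C implementation):

```python
import hashlib

def git_blob_id(content: bytes) -> str:
    """Compute the object ID git assigns to a file's contents.

    Git hashes a header ("blob <size>\\0") followed by the raw
    bytes; the SHA-1 digest becomes the object's name, so the
    same content always gets the same ID and any change produces
    a new node in the graph rather than mutating an old one.
    """
    header = b"blob %d\0" % len(content)
    return hashlib.sha1(header + content).hexdigest()

print(git_blob_id(b"hello world\n"))
```

Running `git hash-object` on the same bytes produces the same ID, which is easy to verify locally.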


Everything is a passing fad. You will learn more in your first year out of college than in all of your years in college. Computer science has nothing to do with computers.

College provides you with a toolbox full of tools. This is a screwdriver, that is a crescent wrench. You MIGHT get to use each tool once in college. It is when you enter the real world that you really find out what you have. You sort out the useful ones from the rest: which ones you want to leave at home on the workbench, just in case, and which ones you keep in your pocket every day.

TQM, ISO, CMM, Agile, etc. These are all fads; they will come and they will go, and none of the successful ones are more than common sense. All successful engineers and companies use some flavor of common sense; that is what made them successful, and few needed a name for it. The problem is you cannot sell common sense: a manager cannot prove their value to the company by training in and buying common sense without a catchy name. Put a name on it that their superiors have read in some news article or magazine, and the manager keeps their job and you keep yours.

Very few of the companies that claim to follow these practices actually do. Most write a check to a consultant and get their annual and/or lifetime certificate to some club, so that they can put a graphic on their website or a label on the box their product comes in. Many will argue that this is rare. Been there, seen it, it happens. This is all part of business; you have to cut corners sometimes to stay profitable and keep the doors open and the lights on. The hardcore followers of all of these practices have always argued that the last one was a fad and this one isn't, that the last one really was too expensive to follow and this one isn't, that the last one was fake because you just hired a consultant but this one is real. Like programming languages, these too will evolve.

Your ability to understand the realities of business, the university system, and your role in them is the key. Like anything in life, choose your battles. It's not the university's or the business's or the government's or anyone else's job to teach you what you need or want to know; it is your job to look out for number one. Likewise, you can't expect anyone else to give you the time to do this; you have to make it yourself. You will fall off the horse. You are not a victim; get up and get back on, no excuses. Life is not fair, deal with it. Do take advantage of handouts, and don't pretend to be independent. And certainly pay your dues: don't suck a company dry of handouts without giving them something (your best at the time?) in return.

Why do people think CMM or Agile or any of the others is a fad? Why do they think they are not? Why did the professor teach you to program that way: to avoid gotos, or to avoid constants, or to avoid this and that? Is it because it produces more reliable code? Better-performing code? Reduces human error? Or is it because it is easier to grade papers/programs, giving them more time to do research? Is it because they don't know how to program and are just following someone else's book on the subject? Did they teach you that you cannot have maintainable, reliable, high-performance code? That you cannot even "choose any two", because maintainability interferes with both reliability and performance? Sometimes you sacrifice reliability for performance. Sometimes you don't care about reliability or performance; you just want to get from version 117.34.2 of yet another accounting software program to version 118.0.0. Your business model is built on selling version upgrades and tech support, and as far as software developers go, any old robot will do that can write the same code in the same way. Replace the burnt-out one with the fresh-out-of-college one and keep selling upgrades.

There are no universal answers to these questions; you have to find out what your opinion is, live with it, and defend it. Change your mind, live with it, and defend it.

Question everything. Will I really get burned if I touch the hot pot on the stove? Will the psychological effects of being afraid cause more damage than just getting burned? Is there a safe way to test the answer without getting hurt?

When I could afford it, I would buy and eventually melt down transistors, caps, resistors, etc. in my dorm room, all of which have a distinctive bad odor. It is far cheaper and easier to just buy an amp for your stereo than to try to build one the day after your first transistor class. Likewise (Linus being the exception, of course) it's easier to just buy an operating system than to write one. You can get more done, although what you learn in that time is different from what Linus learned.

The world inside and outside the university will adopt these formulas (CMM, Agile, etc.) for solving problems, and when the next one comes out they will drop them just as fast. You don't have to use version control to be successful; there are just as many successes with it as without (actually, because of the age of the industry, there are many more successes without version control thus far). Likewise, you can be successful with minimal testing (look at the really big names in the computer industry as examples). You can be successful testing your own code, and successful following the rule that you should never test your own code. You can be successful using Emacs and successful using vi. You have to decide what mix works for you and, if you are lucky, find a place to work that agrees with you. With time, what works for you will change, from tools to languages to programming style to fears, version control, documentation, and the rest. You will get married and have children and decide you might want to hide in the corner of that big company with the big health-insurance package and the boring job and enjoy your kids, instead of being the hotshot programmer at the small startup.

When you get out of college and into the real world, listen to, work with, and argue with the "old timers". They have decades to centuries of combined experience: traps they have fallen into that you might avoid, or test on your own (maybe you realize you don't have to touch the hot pot to find out it will burn you). Most will have seen at least one or two of these fads come and go, and in particular how badly they were burned and what they did to recover. They know many different ways to test things, and the names of the testing styles that have come and gone as well: what works, what doesn't, where the risk is, and how to avoid wasting time on a tangent. As you mature and become the old timer yourself, pass it forward. Pay for what you learned by trying to teach those who follow you. Remember to teach them HOW to fish; don't just give them a fish. And sometimes you have to let them fail before they will succeed. Just keep them from getting burned too badly.

What I really wanted to say here is that right now we are in a rare situation where we can witness the evolution of a parallel universe (and perhaps influence it). Yes, computer science is a young science compared to, say, physics, but at the same time it has evolved many times over. Depending on where you work and who you work with, you may be able to observe hardware engineers. Programming languages in the hardware world are certainly not new, but they have not evolved as quickly as in the software world; software had a few decades' head start. Hardware has always thought of software engineers as second-class citizens: our job is easy, their job is hard. (Note that I am actually both a hardware and a software engineer.) What is interesting is that right now they are still dealing with what we would consider elementary or infantile problems. "Why would I need to use version control? I am the only one working on this chip." "Your experience with gcc or other cheap compilers or free IDEs can't possibly compare with the expensive tools I use; if the company thought you were worthy enough to use them, or even capable of using them, they would buy you a copy." And a long list of other excuses. I had the pleasure of learning both VHDL and Verilog, and becoming productive in both within a week, from what was almost a dare from such a hardware engineer (despite my diploma saying electrical engineer, my job title is software engineer). I wanted to learn these languages, so when the tools were available to me I stayed at the office into the night and taught myself. From that point on, that engineer in particular realized that what I was saying was true: languages are just syntax, programming fundamentals are the same, and the tools all do the same thing. It's apples and apples, not apples and oranges.

In general, though, it is still difficult to send the message that one of these two parallel industries has a lot more experience in languages, programming habits, source control, testing, tools, programming environments, etc. than the other. The problem I am trying to solve is taking the hardware designs as they are being developed and creating affordable functional simulators that we can tie into a simulation (virtual machine) of the processor, so that we can start testing the hardware and developing the test and deliverable software long before we go to silicon. No, there is nothing "new" about this, but we have no mechanism to get the latest code, no way to track changes in the code to see where we need to focus our time, and no mechanism for tracking the documentation defining the user (programming) interface to the hardware. The one golden copy is in someone's email inbox in binary form, and it only changes when... well, it doesn't; you have to read the Verilog to find out what is going on. Wait, that Verilog is how old? That bug I spent all week on, you figured out three weeks ago and fixed? So do we just fly to some vacation spot and party for six months waiting for the hardware folks to finish their task and throw it over the wall to us, or do we take this opportunity to be patient and optimistic and teach them that there are common-sense methods, not all that intrusive, that let them do their job, back up their work, and share their stuff for peer review?

Remember that hardware engineers left college with a box of shiny new tools just like you did. You learned 17 different programming languages, of which you may only use one; most of the languages you will use in your career will be invented after you leave college. When they left college, they could tell you what they knew about calculus and the theory of relativity, how many electrons are in each of the elements, and how to compute the charge around a Gaussian surface. But the bulk of their career is ones, zeros, AND, OR, and NOT (hey, we have those in common; that's all you really need to know about computers, hardware or software engineer: ones, zeros, AND, OR, and NOT). Granted, the fundamental laws of physics, calculus, and electrons are not going to change as fast as programming languages do. But the fundamentals of programming are the same across all languages and will continue to be into the future. Did you leave college knowing that, or did you leave thinking Java is different from and better than C++ because of this, that, and the other?

Like any other business, a university's job is to stay profitable. They have to hire the right academics to bring in both the right students and the right research dollars and the right kinds of research to make the university profitable. They have to offer the right classes to bring in the right students and produce the right graduates, so that as the decades pass, employers both near the university and hopefully far away will recognize that this university produces productive and profitable employees. (Yes, and sometimes you have to attract the right athletes in the right sport to get the right amount of TV time, name recognition, and sports revenue.) Some universities will teach C++ and Java; some never will. Some will invent CMM, some will teach Agile, and some will do neither. If the university has any value at all, there is something there for you to learn. They will not teach you everything there is to learn, but they will have something useful. Learn that something while you are there; collect a reasonable number of various tools in your toolbox. Leave the university and get a job. If your toolbox sucks, maybe find another university and never mention the first. If it is an OK toolbox, use those tools and build some new ones on your own time. If it's a pretty good toolbox, say good things about that university and the good academics you learned this and that from, and pay the school back for what they gave you. Even though you didn't get every possible tool in the universal catalogue of university tools, you will walk away with a certain subset. Even if you don't graduate...


I taught these things when I was an Adjunct at the Oregon Institute of Technology. They are taught, just sparsely.


Licensed under: CC-BY-SA with attribution
Not affiliated with: Stack Overflow