The following question was deleted from StackOverflow as “primarily opinion-based”:
Most CS programs these days do not teach skills such as:
- source control
- configuration management
- integration (and continuous integration)
- code readability (AKA how to comment correctly)
- programming methodologies
- bug tracking
These topics are considered “easy enough” to be taught on the job (OTJ), even though they can be very hard to master.
Should these skills be taught in universities? Can a real-world programmer really do without these? Is it sufficient to learn them OTJ, as part of a first-year programming experience?
Since some people have asked for my opinions, here they are, as of July 2009:
Universities should teach distributed source control starting with the first programming course. It is a fine way to distribute code to students; they can get bug fixes easily. In later courses they can use it to help support pair programming. We should treat it as easy, make it a habit, and not make a big fuss about it.
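To make this concrete, here is a minimal sketch of a first-course workflow, assuming git as the distributed system and a hypothetical course repository URL (both the tool choice and the URL are my illustrations, not prescriptions):

    # Get the starter code once (repository URL is hypothetical)
    git clone https://example.edu/cs101/assignments.git
    cd assignments

    # The staff fixed a bug in the starter code? Students just pull.
    git pull

    # Students commit their own work as they go, building the habit early.
    git add homework1.c
    git commit -m "homework 1: first working draft"

Nothing here needs more explanation than “clone once, pull for fixes, commit as you work” — which is exactly what treating it as easy looks like.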
Universities should not teach configuration management. There is no consensus on which tools to use or how to use them. The dominant tools are either proprietary (Visual Studio) or far too complex to teach to beginning students (GNU Make). I have recently stopped using make in my classes, and life is easier for everyone. Students compile all their source code, every time, using a shell script.
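Such a script can be tiny. A minimal sketch, assuming C sources in the current directory (the file names and compiler flags are illustrative):

    #!/bin/sh
    set -e                          # stop at the first error
    # Compile every source file, every time: slower than make,
    # but there is no dependency tracking for a beginner to get wrong.
    gcc -Wall -g -o assignment *.c

The script trades a little compile time for a lot of simplicity, which is a fine trade at student scale.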
The difficulty with configuration management is that the complexity of current tools obscures the underlying ideas of lasting value. This is true of GNU Make and even more true of autotools.
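The idea of lasting value underneath make is actually small: rebuild a target only when it is missing or older than one of its prerequisites. Here is that idea as a shell sketch (the function name and compile command are my own illustrations):

    #!/bin/sh
    # Rebuild a target only when it is missing or older than a prerequisite.
    rebuild_if_stale() {
        target=$1; shift
        for src in "$@"; do
            # -nt ("newer than") is a widespread test(1) extension
            if [ ! -e "$target" ] || [ "$src" -nt "$target" ]; then
                gcc -Wall -o "$target" "$@"
                return
            fi
        done
    }

    rebuild_if_stale assignment main.c util.c

Everything make and autotools add on top of this is tooling, and it is the tooling, not the idea, that swallows class time.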
It is nearly impossible to teach integration at university, let alone continuous integration, because everything has to be broken up into course-sized units of 10 to 14 weeks. Even worse, within a single course, programming work is typically broken into small pieces, each of which takes at most a few weeks to complete. These constraints militate strongly against learning how to integrate systems.
Code readability should be taught from the first programming course and should continue to play a role through graduation. In both my introductory courses and my upper-division undergraduate courses, about half of each student’s homework grade is for readability.
Programming methodology is another essential topic that should be taught at university, but perhaps not in the way you think. If university graduates really understand data abstraction, criteria for decomposing systems into modules, data-driven design, and maybe stepwise refinement, they are doing well. Yes, all this was understood by professionals as long ago as 1980, but these ideas are still fundamental, and I am confident of their lasting value.
As for some of the other stuff out there: UML does not have a foundation of lasting value; design patterns are a culture, not a methodology; and (this last backed up by experience) the design of class hierarchies and the use of inheritance are too advanced for most undergraduate students.
I don’t see that bug tracking has enough substance to be taught at university, and I don’t see any obvious situations where students would find it helpful to use bug-tracking tools. There’s a big contrast with source control, where the opportunities and benefits are obvious.