Creative Deprecation

I came across Zach Tellman's essay [1] and found it really lucid.

"There is a tension between knowledge and creation: by pursuing one, he [the technologist] retreats from the other."
—Zach Tellman

What follows is an essay (really, an attempt) to connect it with some things I have observed, and to draw a connection to what economists call Schumpeter's Creative Destruction [2].

Creative Deprecation

In large-scale software development a recurring pattern can be observed. One of its effects is that, while working on a development project, software engineers typically have to commit some of their time to learning new technologies outside that project. The pattern forms something like a cycle that relates knowledge about technology to the creation of new technology. An analogy from economics may help clarify its scope:

The opening up of new markets and the organizational development from the craft shop and factory to such concerns as US Steel illustrate the process of industrial mutation that incessantly revolutionizes the economic structure from within, incessantly destroying the old one, incessantly creating a new one ... [The process] must be seen in its role in the perennial gale of creative destruction; it cannot be understood on the hypothesis that there is a perennial lull.
—Joseph Schumpeter, The Process of Creative Destruction, 1942

What matters here is the systemic view of the economy, the purported economic structure: "the" economy is a productive chaos with too many factors to keep track of. So is software development; we can likewise see it as a "system" of people learning, creating, and applying technology.

The currency of this system is technological artifacts. Software (especially open-source software) is easily accessible and usable; what is not so easily transferable is in-depth knowledge of, and experience with, a particular piece of software technology. Every new technology (take something like Megastore [3], but also the Google Apps ecosystem), as well as every backward-incompatible change to an existing technology, takes time to be learned and taken up by software engineers. And integrating technologies often creates something synergistic that comes with its own characteristics.

Understanding the cycle

Let us look at the cycle. At the beginning, a team builds something new that is supposed to replace something old, or change it (radically) for the better. In the middle of the cycle, people widely acknowledge that the new thing is better and worth investing in. Towards the end, people have spent enormous effort putting the new technology to practical use, in ways not anticipated by its initiators. As more and more people embrace the new way and projects roll it out, shortcomings appear. In the end, the new technology has become the standard, and the old one is considered low-level or arcane.

This is very different from (and sometimes in conflict with) what is usually meant by the "software development cycle". The latter is concerned with only a single piece of technology; it is part of the larger system of creative deprecation and interacts with it.

Optimizing the cycle

The constantly changing world of available technologies, and the speed at which basic assumptions are replaced by new, different ones, creates friction for the software developer: knowledge that had to be built up becomes obsolete. Time and resources have to be devoted to learning the new stuff, even if the functional specification of the new technology is exactly the same as the old one's.

Software developers are human, and as humans we have limited cognitive capacity to learn: if a technology is sufficiently complex, we cannot simply start using it and learn it "along the way". Here, methods exist that can help optimize the learning effect. One example is prototyping: we build something limited, admitting that we cannot build the whole thing at once, with the explicit intention to learn.

The way forward?

It would go too far to claim that the software industry has ignored this aspect. Many processes and development practices can be understood as concessions to creative deprecation, from unit testing and Javadoc to IBM's original code inspections (where developers formed committees, printed the source code on paper, read it aloud, and discussed it).

The difficulty in taking advantage of creative deprecation is that our methods of "measuring" knowledge are not very developed. What are the indicators? The quality of past contributions would be a great one, but we do not seem to have accepted indicators for the quality of artifacts either (test coverage is probably a good start).
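As a toy illustration of test coverage as an artifact-quality indicator, here is a minimal sketch (my own construction, not any real tool's method; real tools such as coverage.py do far more) that records which lines of a Python function a given test input actually exercises, using the standard library's tracing hook:

```python
import dis
import sys

def line_coverage(func, *args, **kwargs):
    """Call func and report which of its lines were executed.

    Returns (hit_lines, all_lines) as sets of line numbers.
    A deliberately crude sketch: it traces a single call of a
    single function, nothing more.
    """
    # All line numbers of the function that have bytecode attached.
    all_lines = {ln for _, ln in dis.findlinestarts(func.__code__)
                 if ln is not None}
    hit = set()

    def tracer(frame, event, arg):
        # Record 'line' events, but only for the function under test.
        if frame.f_code is func.__code__ and event == "line":
            hit.add(frame.f_lineno)
        return tracer

    sys.settrace(tracer)
    try:
        func(*args, **kwargs)
    finally:
        sys.settrace(None)
    return hit & all_lines, all_lines


def classify(n):  # a tiny, hypothetical function under "test"
    if n > 0:
        return "positive"
    return "non-positive"

hit_pos, all_lines = line_coverage(classify, 5)
hit_neg, _ = line_coverage(classify, -5)
```

A single test input leaves one branch of `classify` unexecuted; adding the second input enlarges the covered set, which is the simplest possible coverage signal.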

More challenges come from the fact that individual knowledge does not add up. A team can have a wizard who wrote the book on X, and still be clueless about X when that wizard is on vacation or transfers to another project. Agile methods that try to increase the "bus factor" target exactly this aspect. Preferring "simple" APIs (meaning: APIs that do not come with a high learning "price tag") and trying to minimize the number of public methods are other examples.
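The "bus factor" can be made concrete. Here is a hedged sketch (the greedy heuristic, the 50% threshold, and the input format are my own illustrative choices, not a standard definition) that estimates it from a list of (author, filename) commit records:

```python
from collections import defaultdict

def bus_factor(commits, threshold=0.5):
    """Toy bus-factor estimate from (author, filename) commit records.

    Greedily removes the author who still "knows" the most files until
    more than `threshold` (must be < 1.0) of all files are left with no
    knowledgeable author, and returns how many removals that took.
    """
    if not commits:
        return 0
    files = defaultdict(set)  # filename -> authors who touched it
    for author, filename in commits:
        files[filename].add(author)
    total = len(files)
    authors = {a for owners in files.values() for a in owners}
    removed = 0
    while True:
        orphaned = sum(1 for owners in files.values() if not owners)
        if orphaned > threshold * total:
            return removed
        # Remove the author who covers the most still-known files.
        best = max(authors,
                   key=lambda a: sum(a in owners for owners in files.values()))
        authors.discard(best)
        for owners in files.values():
            owners.discard(best)
        removed += 1
```

In practice such records could be mined from version-control history (e.g. `git log --name-only`); a team whose estimate comes out as 1 depends entirely on a single wizard.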

Modest but effective proposals are to:

  • acknowledge that the knowledge cycle matters, and be aware that reducing complexity is, in the big picture, at least as important as adding features;
  • leave room for individuals to acquire knowledge, even if it is unrelated to the project;
  • control the cycle, by controlling the time and scope of the destruction and by mitigating the effects of obsoleted knowledge. For instance, when planning a project, alternate cycles of expansion (features, integration of new technologies) with cycles of contraction (optimizations, cut-overs, deprecations, refactorings, bug fixes, and other maintenance).

[1] http://ideolalia.com/becoming-eloi-becoming-morlock
[2] http://en.wikipedia.org/wiki/Creative_destruction#Schumpeterian_Creative_Destruction
[3] http://highscalability.com/blog/2011/1/11/google-megastore-3-billion-writes-and-20-billion-read-transa.html
