Across the worldwide scientific community, the yearly rate of innovation seems to be slowing. Let's look at a chart of important technological innovations per year, divided by world population:

How do we interpret this data? What could it mean? Does it indicate that we've gotten lazy and stopped innovating? That we've disincentivized innovation? Or have we ignored the human cost of innovation, and we're now feeling the effects of each new breakthrough simply requiring more humans to be involved in the process? All of these questions are tangential to this discussion, but they're fun to think about. No, I think it's enough to accept that this is happening, and to try to make some interesting points about computer software.
Mapping this onto computer technology works wonderfully. We saw an intense flurry of innovation around computers from 1940 to 1990, and then things kind of flattened out. Is the software on your computer materially different from something that existed in 1990? Not really. Moore's Law ran for a few more cycles and allowed us to render more HTML elements, more 3D polygons, and do some nice machine learning. Perhaps you draw the line somewhere else, but it'll be hard to find someone who thinks computer technology didn't stagnate at some point. Have computers changed since 2000? What about since 2010? Does the progress of the last five years match the progress from 1985 to 1990?
Software is a similar story. It's kind of hard to compare modern software to something from 1985 or thereabouts, because hardware has advanced so much. And people tend not to be aware of minicomputers or other business-focused machines, and only want to compare desktop PCs. This is unfortunate, because in 1985 desktop PCs were a primordial ooze while IBM and DEC were selling machines with capabilities much, much more advanced than that. However, once you know where to look, it becomes clear that software progress slowed a long time ago. A few discoveries have been made, especially in the areas of compression, machine learning, computer vision, distributed computing, and cryptography (including, of course, cryptocurrency). But for the most part, the average programmer's job has remained remarkably similar. Looking at GUI toolkits from 1990 is very much like looking at React or Ember in 2021. We use JavaScript (or TypeScript) instead of C++ now, but has the shape of the codebase really changed? We have the same concerns: z-index, layering, layout, callbacks, events… Meanwhile, if you talked to a programmer from 1990 and asked them to imagine the year 2021, they'd certainly say that an AI would be doing the GUI layout, and that we programmers would be free to think at a higher level. Yet here we are, pixel-pushing as always.
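To make that concrete, here's a minimal sketch of what I mean: a hypothetical React component in TypeScript (the names are mine, purely for illustration). Squint and it has the same shape as a widget from a 1990 toolkit: manual layout, a stacking order, and a handler wired to an event.

```tsx
import React, { useState } from "react";

// A hypothetical dialog component -- illustrative only.
export function Dialog({ onClose }: { onClose: () => void }) {
  const [clicks, setClicks] = useState(0);

  return (
    // Manual layout and stacking order: position, offsets, z-index.
    <div style={{ position: "absolute", top: 40, left: 40, zIndex: 10 }}>
      <p>Clicked {clicks} times</p>
      {/* An event wired to a callback, same as a 1990 button widget. */}
      <button onClick={() => setClicks(clicks + 1)}>Click me</button>
      <button onClick={onClose}>Close</button>
    </div>
  );
}
```

Swap the JSX for resource files and the callbacks for message handlers, and you've got roughly the same program a Windows 3.x or Motif developer was writing thirty years ago.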
So if software isn’t advancing, why does it keep changing? And why do we hate on anyone who doesn’t change?
I think this XKCD gets the analogy a bit wrong. MS-DOS is like gunpowder because it's a simple solution that works, yes. But more importantly, it's the solution you already have, not some hypothetical solution that a consultant could bill you $300/hour to build. Businesses run on money, you know. The cost of maintaining an MS-DOS machine is much smaller than… eh, enough of this tangent. You get the idea.
Anyway, back on the topic of the software world "changing". By that I obviously mean that the software tools – the languages, frameworks, and environments that we use – keep changing every few years. Why is that, anyway? The simple answer is "because we're crazy!". But I think a more logical explanation is that there's some basis in reality here. Something about the nature of software development drives us to rip out perfectly good infrastructure and code and replace it every few years. And I don't think it's boredom, or making work for oneself. It's hard work replacing a perfectly good app written in, say, C++, with one in the New Shiny Language du Jour (let's say, Go). And if you ask a programmer in the middle of one of those rewrites "is this fun?", the answer is usually "no, but it's better than leaving it in C++!". That hints at the true reason, I think. The true reason is that C++ (in this instance) has ceased to be a good substrate for this application. Perhaps for any application.
I believe that programming languages/environments/frameworks/tools decay over time. Not through lack of contributions, but through bad contributions. The common belief seems to be that software gets better over time. But people also believe that software somehow gets "old" and "crufty". Magically! It especially happens to the code in your tech stack, but never, apparently, to the languages or tools themselves. How did your C++ code get so bad? Why, previous employees at the company made bad decisions. Surely C++ itself didn't get worse. Surely the C++ ecosystem and community didn't regress, right? Didn't everyone once use Ruby?
But I think that’s exactly what happens. The quality of a given language, or framework, or tool, or ecosystem, or community, follows an arc. It starts off with intense innovation and high-quality code. Usually the people contributing are doing so our of a sense of duty, to make something better than the language/tool/framework/ecosystem that they are currently stuck using. Then, the software becomes popular. People start using it. Features become set in stone. Innovation suffers. The wild west “make something better” days are over. You now have responsibilities to your users. If you break your software, someone will complain. And you’ll get to fix the bug. It’s not fun anymore. It’s work. And this is the point where the original devs quietly leave, on to find some new fun thing to work on. And in come devs who are looking to make a name for themselves, by contributing to open-source. Because for better or worse, contributions to open-source are a good way to improve your resume. At some point, the project grows so large, corporations are built around it. Someone starts a company selling training around the software. It’s now a business. Contributions to the open-source project by large come from employees of that company. It’s become subsumed into the corporate stack. Open-source contributions coming in through GitHub aren’t accepted unless an employee of the company happens to look there in a moment of boredom.
While all of this is happening, quality goes sharply up at first, then sharply down. The software project is initially chaotic, then for a while it's a joy to use, then it becomes overburdened with features and TODOs and falls into disarray. The fact that it's grown a commercial entity, or has reached some level of fame, isn't really relevant.
So what do we do? As a developer, I've always thought of churn in the tech stack as an unavoidable thing. Stake your career on something that has already reached the several-year mark. The Lindy effect is relevant here, as is Steve Jobs talking about his "glide slope" theory of tech in 1988 (there's a recording on SoundCloud).
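For anyone unfamiliar: the Lindy effect says that for non-perishable things like ideas and technologies, the longer something has already survived, the longer you can expect it to keep surviving. A toy sketch of the heuristic (my own made-up numbers, nothing rigorous):

```ts
// Toy illustration of the Lindy heuristic: expected remaining lifetime
// grows in proportion to the age already survived.
function expectedRemainingYears(ageYears: number): number {
  // Simplest proportional form: expect roughly as many years again.
  return ageYears;
}

console.log(expectedRemainingYears(2));  // a 2-year-old framework: maybe ~2 more years
console.log(expectedRemainingYears(40)); // a 40-year-old language: decades more
```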
Just imagine you’re in the middle of a sea of hungry wolves. The trendier part of the software community is holding a torch that keeps the wolves away. If you go too far ahead of the trend, you’ll be eaten. Fall too far behind, you’ll be eaten. You (and your business) need to keep pace with the trends, no matter how silly it seems.