Niklaus Wirth was right and that is a problem

Wirth’s law is not really a law. In fact, none of these so-called laws are. They are adages:

a proverb or short statement expressing a general truth.

https://www.lexico.com/en/definition/adage

Here is another law that is not a real law: Moore’s law is the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years.

This means that we can expect the speed and capability of computers to increase while their cost comes down. Sadly, this is where Wirth’s law comes in:

Wirth’s law is an adage on computer performance which states that software is getting slower more rapidly than hardware is becoming faster.

https://en.wikipedia.org/wiki/Wirth%27s_law

And while Moore’s law has held up since 1975, Wirth’s law seems to hold as well. Niklaus Wirth, the designer of Pascal, wrote in a 1995 article:

About 25 years ago, an interactive text editor could be designed with as little as 8,000 bytes of storage. (Modern program editors request 100 times that much!) An operating system had to manage with 8,000 bytes, and a compiler had to fit into 32 Kbytes, whereas their modern descendants require megabytes. Has all this inflated software become any faster? On the contrary. Were it not for a thousand times faster hardware, modern software would be utterly unusable.

Niklaus Wirth – A Plea for Lean Software

The problems of modern software development are manifold. Wirth points out one crucial aspect: time.

Time pressure is probably the foremost reason behind the emergence of bulky software.

Niklaus Wirth – A Plea for Lean Software

And while that was true back in 1995, time pressure is no longer the most important factor. We now have to deal with a much bigger problem: abstraction. Developers have never built everything from scratch, and that has never been a problem, but now they have also become lazy.

It was Edsger W. Dijkstra who tried to improve the quality of code and coined the concept of structured programming. He tried to get programming out of the state of crisis it was in, and he found support in programmers like Harlan D. Mills, Richard C. Linger and Bernard I. Witt. For a short period of time, programming was seen as real craftsmanship. Programmers cared about the quality of their programs, and that included clarity and efficiency.

Those times have passed. With the introduction of higher-level languages such as Java, Ruby, PHP and Javascript all in 1995, the same year in which Wirth wrote his article, programming became more abstract.

Languages like these made programming a lot easier and took many things out of the programmer’s hands. They were object-oriented and came with things such as IDEs and garbage collection.

This meant that programmers had fewer things to worry about, which is of course great. Sadly, everything comes at a price. Having fewer things to worry about also means having fewer things to think about. 1995 was the year in which programmers stopped thinking about the quality of their programs.

It also marked the beginning of the widespread use of libraries, probably one of the bigger problems. Don’t get me wrong, I love libraries. They are the only reason I am able to get things done. However, a library never comes with the exact things that you need.

Because a library is not made for one specific project, it probably has a bit more functionality than you really need. No problem, you might say. However, things pile up pretty quickly. Even the people who write libraries don’t want to reinvent the wheel, so every library pulls in libraries of its own. This results in what we call dependency hell. Nikola Duza wrote a post about that issue in Javascript.

The problem does not seem that big, but try to grasp what is happening here. In another tutorial that Nikola wrote, he built a simple to-do list. It works in your browser with HTML and Javascript. How many dependencies do you think it ended up with? 13,000.
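
To get a sense of how that happens, here is a minimal sketch, assuming a Node.js project in which npm install has already been run, that counts how many packages actually land in node_modules, transitive dependencies included:

```ts
// Minimal sketch: count every package that ends up in node_modules,
// including the transitive dependencies pulled in by the handful you
// asked for directly. Assumes Node.js and a completed "npm install".
import { existsSync, readdirSync } from "fs";
import { join } from "path";

function countPackages(dir: string): number {
  if (!existsSync(dir)) return 0;
  let count = 0;
  for (const entry of readdirSync(dir, { withFileTypes: true })) {
    if (!entry.isDirectory() || entry.name.startsWith(".")) continue;
    if (entry.name.startsWith("@")) {
      // Scoped packages (@scope/name) live one directory level deeper.
      count += countPackages(join(dir, entry.name));
    } else {
      count += 1;
      // Older npm layouts nest dependencies inside each package.
      count += countPackages(join(dir, entry.name, "node_modules"));
    }
  }
  return count;
}

console.log(`${countPackages("node_modules")} packages installed`);
```

Run it against a freshly scaffolded front-end project and the count is typically in the hundreds or thousands, not the dozen or so entries you see in package.json.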

That number is insane, and this problem will only keep growing. As new, very useful libraries keep being built, the number of dependencies per project will keep growing as well.

That means that the problem Niklaus Wirth was warning us about in 1995 only gets bigger over time.

And no, you don’t have to learn assembly and start writing your web application in it. A good way to start would be to split up libraries. Instead of creating one big library that does everything you could ever possibly need, create many small libraries. Your god-like library could still exist, but solely as a wrapper around them.

This way a programmer only has to select the libraries they really need, while ignoring the functionality they are not going to use in their application. Not only are their direct dependencies smaller, but those dependencies will also pull in fewer dependencies of their own, because the dependencies of the unused functionality never have to be installed.
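
As a rough sketch of what that could look like in practice (every package and function name below is made up for illustration), the god-like library becomes nothing more than a set of re-exports, while applications depend directly on the small pieces they actually use:

```ts
// Hypothetical split of one "does everything" library into small packages.
// None of these packages exist; the names only illustrate the idea.

// --- @giantlib/strings (its own small package) ------------------------------
export function slugify(text: string): string {
  return text.toLowerCase().trim().replace(/[^a-z0-9]+/g, "-");
}

// --- @giantlib/dates (another small package) --------------------------------
export function daysBetween(a: Date, b: Date): number {
  return Math.round(Math.abs(a.getTime() - b.getTime()) / 86_400_000);
}

// --- giantlib (the old god-like library, now just a wrapper) -----------------
export * from "@giantlib/strings";
export * from "@giantlib/dates";

// --- An application that only needs slugify ----------------------------------
// It installs @giantlib/strings alone; the date code and whatever that
// package depends on never reach its node_modules.
import { slugify } from "@giantlib/strings";
console.log(slugify("A Plea for Lean Software"));
```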

Note: the proposed solution obviously is not the solution. It would, for example, require a good way of versioning software to avoid a new kind of dependency hell.
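
For illustration only, here is the kind of convention that note is pointing at: semantic-versioning ranges, checked below with the existing npm semver package, declare which future releases of a dependency are supposed to be safe drop-in upgrades.

```ts
// Sketch of semantic-versioning ranges, checked with the npm "semver" package.
// A range declares which future releases count as safe drop-in upgrades.
import { satisfies } from "semver";

// "^1.4.0": any 1.x release at or above 1.4.0, but never a 2.x release.
console.log(satisfies("1.7.2", "^1.4.0")); // true
console.log(satisfies("2.0.0", "^1.4.0")); // false

// "~1.4.0": only patch releases of the 1.4 line.
console.log(satisfies("1.4.9", "~1.4.0")); // true
console.log(satisfies("1.5.0", "~1.4.0")); // false
```

Whether ranges like these are expressive enough to keep thousands of small packages compatible with one another is exactly the open question the note hints at.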

6 thoughts on “Niklaus Wirth was right and that is a problem”

  1. I always imagined that as Moore’s law (adage?) “ends” and the growth of processor speed flattens we’d be forced to write more efficient programs and the problem would kinda fix itself. Perhaps naive.

    1. Yup, Andrew, I think you’re being naïve. Moore’s Law has a bottom, but systems are now built with multiple processors, and that has no top. Chuck Moore created a 1024-processor chip. Once we hit the bottom of Moore’s Law (and we have), we just add more processors per chip, or move into the Quantum Realm, where 10,000-year classical problems are solvable in minutes.

      Problems don’t fix themselves. We, like water, take the easy way out, unless we make the conscious decision to do and to be better.

  2. “With the introduction of higher-level languages such as Java, Ruby, PHP and Javascript all in 1995, the same year in which Wirth wrote his article, programming became more abstract.”
    Lisp was created in 1958. The APL, Smalltalk, Refal, Prolog, and Ada languages also predate this. Those four languages reduced the abstractions.

    “It also marked the beginning of the widespread use of libraries, probably one of the bigger problems.”
    This is another silly claim. Libraries have been in use since automatic computing was created, as it’s a natural progression.

    “However, a library never comes with the exact things that you need.”
    I disagree with this as well. I use my libraries as examples which, at least in many cases, fulfill precisely what would be expected of them. This is falsifiable so that most anyone should be able to think of a counterexample.

    “Having fewer things to worry about, also means having fewer things to think about.”
    This is the essence of what a high-level language provides, enabling greater programs to be written. I enjoy machine code programming, but there’s no virtue in using a poor high-level language and pretending that makes oneself superior. I’d rather write a line of APL than one hundred lines of a different language, and have done just that.

    I agree the WWW and JavaScript are insanity. They follow the same path as UNIX and C, with cults, gross inefficiency, and a lowering of expectations, amongst other similarities. Don’t blame the poor computing environments on languages. It’s very worthwhile to improve an implementation of a language such as APL, permitting more programs to be written in these languages. That UNIX is so disgusting and baroque it bore the WWW, which now plays host to its own nonsense in the form of Electron and whatnot, is the fault of the intelligent idiots who duped people into accepting their poorly-designed messes at opportune times.

    Don’t think me some silly fellow. I write my own libraries, across multiple languages. As of writing, I’m finishing up on SHA work across APL, Common Lisp, and Ada. The APL is an order of magnitude more concise, although the Ada’s most efficient, and the Common Lisp most versatile. Rarely do I ever deign to actually measure my programs on my machines in anything more than a casual way. The goal of good software is working in abstract terms. I write and design algorithms considering their time and memory complexities, not their constant factors, and I’ve even removed optimizations, making what was O(1) time and O(N) memory into O(N) time and O(1) memory, because N in the system was usually small, and it simplified the system.

    In sum, the entirety of computing is a mess. The hardware should do more, so the software could do less, more reliably. The software should be written to maximize reuse. A system which didn’t need a kernel micromanaging memory accesses would be much more efficient, but since UNIX makes this impossible, it doesn’t get done. True efficiency requires more than some minor adjustments; it requires abandoning the idiocy of foolish physicists from the 1970s, and I don’t mean Urbit either, which has successfully made a target poorer than Brainfuck. I also believe we suffer from “insidious optimizations”, those optimizations which cease to be viewed as such, with ASCII and octet-granular memory being two examples. I detail this in my articles, for those curious.

  3. I agree the reckless utilization of libraries is the root cause. Just take a look at any website today and you will see 6-7 different Javascript libraries and 4-5 CSS files, and none of them are small, even when minified. These bulky libraries themselves rely on other libraries, making the loading process awfully slow and debugging a nightmare.
    Often 80-90% of those libraries are not needed or utilized at all, but they still need to be loaded or lugged around. People should conduct research into how much of that library code is actually utilized, and then we would see the whole extent of this code cancer. Not only do these libraries slow computers down, take up valuable resources and swallow up energy, they also present a real danger to computer security, because less-utilized code is less vigorously tested and its behavior less stringently observed.
    Of all the libraries, today’s OOP libraries are the worst due to their tight coupling and heavy boilerplate code. Smart linkers can contribute to lean code to some extent, but unless we move on to functional programming and code composition instead of inheritance, this ailment will continue to plague mankind for a long time.

  4. Change a program from 64-bit (pointers) to 16-bit and magically the program will be smaller. Same language, same compiler, etc.

    The problem is with the compilers. Yes.
    Look at how big the files produced by LLVM and Rust are, and how small the .o files from GCC are. Why can’t I use a 16-bit (or smaller) program on a 64-bit machine today? I wrote a small program and DON’T need a profile for 128 GiB of memory or more. I need 16 KB. Why does my processor have an MMU for managing big memory but NOT a small instruction (and register) set for my small program?
