I am moving from Dropbox to Sync. The former just got too bloated with features I don’t need or want, and on Sync everything is encrypted, not only in transit, but also when stored remotely. Also, Sync gives 5 GB of storage space for free, and offers more flexible and better-priced plans should I need more.
Anyway, when moving my stuff over, I cleaned it out somewhat, getting rid of really old things, and re-discovered, and re-read, this article by Niklaus Wirth. Written more than 20 years ago, it’s still interesting to read – or perhaps again so – as the problems it addresses have not gone away, I think.
It also sent me down a few avenues of memory lane.
At ETH Zurich, I had been a student of Wirth’s, if only for a few introductory courses. I was to become an electronics engineer – you know, the real stuff – and back in those days, hardware was still pretty much separated from, or only loosely coupled to, control software, unlike today, where you cannot design a piece of non-trivial hardware without a microprocessor on the board. Your washing machine has one. Probably your espresso machine as well. Your car has half a dozen at least, most likely more.
When I entered the industry after obtaining my MSc degree, I quickly learned that I had to get into software development as well, if I wanted to have a significant say in the main aspects of my designs and engineering works. Within a year or so, I ended up writing fault-tolerant real-time programs for hardware designed by others.
Writing real-time controller software is hard. If your software controls a power station for a telephone exchange, with up to 2,000 Amperes1 of direct current (DC) flowing out through cables as thick as your arm, any wrong decision or control signal by your program can have, well, interesting effects. Correctness matters; problems cannot be shrugged off with “oh, another blue-screen, let’s reboot”. For starters, there’s no-one around to see any error message in the first place, let alone take action. The controller must run autonomously, 24/7, unsupervised by any human. That’s where fault-tolerance comes into play. You need all sorts of crazy preventive, predictive, and self-correcting measures to cope with potential errors in your own controller hardware and software. Your program needs to watch itself and the platform it’s running on.2
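The simplest of those self-watching measures is a watchdog timer: the control loop must “kick” it every cycle, and if too many cycles pass without a kick – because the loop hung, or took a wrong turn – the watchdog trips and forces a safe state. A minimal sketch of the idea in Go; the names and the three-cycle limit are mine, for illustration only (the actual controller was written in Modula-2):

```go
package main

import "fmt"

// watchdog models a hardware watchdog timer in software: a counter
// that a healthy control loop resets every cycle, and that trips if
// it is ever allowed to run past its limit.
type watchdog struct {
	counter int
	limit   int
	tripped bool
}

func (w *watchdog) kick() { w.counter = 0 }

// tick is what a hardware timer would call once per cycle.
func (w *watchdog) tick() {
	w.counter++
	if w.counter > w.limit {
		w.tripped = true // e.g. cut the output, open the main contactor
	}
}

func main() {
	w := watchdog{limit: 3}

	// Healthy loop: kicked every cycle, never trips.
	for i := 0; i < 10; i++ {
		w.kick()
		w.tick()
	}
	fmt.Println("healthy loop tripped:", w.tripped)

	// Simulated hang: the loop stops kicking.
	for i := 0; i < 10; i++ {
		w.tick()
	}
	fmt.Println("after hang tripped:  ", w.tripped)
}
```

Of course, this only moves the problem: the watchdog itself must be more trustworthy than the loop it watches, which is why real designs put it in separate hardware.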
For such environments, you want a friggingly strict programming language. We used Modula-2, the second language designed by Wirth, after Pascal and before Oberon. I loved that language. OK, I learned to love it: I had to take a crash course of a few days, because I didn’t know it. At ETH, we had used Pascal for our courses and exercises, so I at least had some idea of the concepts, as Modula-2 is based on Pascal.
Wirth’s programming languages are simple and well-conceived. His answer to programming challenges and insights was always clearer and simpler language concepts, modularisation (incl. data abstraction), and strict static typing – never simply adding features. Some people think Wirth’s languages require too much typing… oh well. Typing speed was never an impediment to my programming, which might say something about my limited abilities. Anyway, as I see it, Wirth designed languages as vehicles to help his students, and many students around the globe, really understand the craft, its tradeoffs, traps, and solution approaches. Of course, the languages are well suited to specific real-world problems, as in the case of my aforementioned controller project. Another example: the initial Macintosh operating system was mostly written in Pascal, if memory serves.
Wirth’s Oberon language is even stricter. No type coercions and the like. You cannot add an integer value to a floating-point value unless you spell the conversion out explicitly in your program text. What appears to be a nuisance in many cases is a Good Thing™ for controller software. The programmer must make everything explicit, which enforces very clear thinking and awareness.
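Oberon is not alone in this, by the way: Go enforces the same discipline – mixed integer/float arithmetic is a compile error – so it can serve as a stand-in sketch of what such strictness looks like in practice. (The voltage and current figures are just the ones from my power-station example above.)

```go
package main

import "fmt"

func main() {
	var volts float64 = 48.0
	var amps int = 2000

	// power := volts * amps        // compile error: mismatched types
	power := volts * float64(amps)  // the conversion must be spelled out

	fmt.Println(power/1000, "kW") // 96 kW
}
```

Annoying at first, yes – but every conversion in the program text is now a place where the programmer visibly decided that mixing the types is safe.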
Deep understanding; being precise; keeping concepts and technologies as simple as possible; being as specific as possible right in the program text, so that humans and compilers can assess its correctness; as few “hidden mechanics” as possible – these, I think, are the hallmarks of Wirth’s decades-long endeavour as a teacher at ETH.
Wirth is a peculiar person. He was dauntless in his approach and work, never shying away from simply doing, or making stuff. He didn’t give a shit about what the “mainstream in the industry” was. He wanted to teach his students what he considered to be correct and important, and for this he not only invented programming languages,3 but also designed and built whole computers and their operating systems. Alan Kay reportedly said: “People who are serious about software should make their own hardware”. Wirth did.
And all this in a way a single person can understand in all depths. For example, the Oberon language is defined in fewer than twenty pages. Its syntax is carefully crafted to allow fast one-pass compiling. The Oberon system is both a working computer and a case study in design and engineering, running on hardware that is tiny and weak compared to today’s personal computers, and even mobile phones and tablets. The whole system is described in a single book, now available online here, down to the program texts.4
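To get a feel for what “one-pass” means: Wirth’s compilers use recursive descent, reading the source exactly once, with a single token of lookahead and no backtracking – the grammar can be read straight off the code. A toy sketch in Go (the grammar is mine, a tiny expression subset, not Oberon’s):

```go
package main

import "fmt"

// A toy single-pass, recursive-descent evaluator. Grammar:
//   expr   = term { "+" term } .
//   term   = factor { "*" factor } .
//   factor = digit | "(" expr ")" .
// One function per grammar rule, one symbol of lookahead (peek),
// no backtracking: the input is consumed strictly left to right.

type parser struct {
	src []byte
	pos int
}

func (p *parser) peek() byte {
	if p.pos < len(p.src) {
		return p.src[p.pos]
	}
	return 0 // end of input
}

func (p *parser) expr() int {
	v := p.term()
	for p.peek() == '+' {
		p.pos++ // consume "+"
		v += p.term()
	}
	return v
}

func (p *parser) term() int {
	v := p.factor()
	for p.peek() == '*' {
		p.pos++ // consume "*"
		v *= p.factor()
	}
	return v
}

func (p *parser) factor() int {
	if p.peek() == '(' {
		p.pos++ // consume "("
		v := p.expr()
		p.pos++ // consume ")"
		return v
	}
	d := int(p.peek() - '0')
	p.pos++ // consume the digit
	return d
}

func main() {
	p := parser{src: []byte("2+3*(4+1)")}
	fmt.Println(p.expr()) // 17
}
```

A real one-pass compiler emits code instead of computing values, but the shape is the same – which is why a syntax designed for it (every construct recognisable from its first symbol) makes the compiler both small and fast.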
That is rather extraordinary, especially in today’s realm of technology, marked as it is by bloat and complexity, especially with everything and the kitchen sink connected and on-line. Admittedly, the computers Wirth designed didn’t (or don’t – some are still in use at ETH) have all the chrome and bling, didn’t have plug-and-play for everything, and whatnot, and are thus potentially less user-friendly. I am not advocating that we all should get Oberon computers. But geez, do they use few resources! Makes you wonder whether our current computers, devices, and programs couldn’t be realised with way less. But I guess that ship has sailed. Powerful hardware has become too cheap for anyone to care profoundly about parsimonious engineering.
In the context of Wirth’s article, I cannot resist quoting this quip by Alan Kay:
What Andy giveth, Bill taketh away,
referring to Andy Grove, then CEO of Intel, and Bill Gates of Microsoft, and pointing to the fact that increasingly bulky and complex software more or less eats up all progress in raw hardware performance, i.e. using up more and more resources for not much more functionality.
I am going on too long already down my memories and various rabbit holes, so let me conclude with a last point. In the aforementioned paper, Wirth writes:
Time pressure is probably the foremost reason behind the emergence of bulky software. The time pressure that designers endure discourages careful planning. It also discourages improving acceptable solutions; instead, it encourages quickly conceived software additions and corrections. Time pressure gradually corrupts an engineer’s standard of quality and perfection. It has detrimental effects on people as well as products.
Indeed. I could not agree more. This has killed my motivation, too:
My decision was also driven by an ever-decreasing motivation to work in my professional field, namely information technology, mostly in enterprise environments. Less real engineering, more chatter. Less respect for quality work and concepts, more focus on just getting things up-and-running right now – and cheap.
Ahem. Yes. However, I am myself guilty of sometimes passing on time pressure from our customers to my engineers. Time and cost pressure are powerful forces when you need to earn revenue as a start-up company to keep going, and to get a solution or product out the door. No next version, no invoices, no salaries for anyone. Powerful.
But it’s not only start-ups. Before I moved here to Mauritius, I had consulted for a major company in the medical equipment field, in a project to design and build a surgical robot for replacing human knee joints with prosthetics. A prime example of real-time controller software that must be reliable, well-engineered, and tested. It was disheartening to experience the time pressure put on the engineers by top management. Down in the trenches, I saw people scrambling to get the next version ready to demo and to field-test in the labs. Good engineering looks different. Watching that sausage being made wasn’t pretty.
I have probably just become an old fogey who has returned to, and hangs on to, engineering ideals of times past that will never return. Re-visiting Wirth’s work has reminded me of the times when engineers were actually given the time needed to do their jobs. I don’t have any deep insight into the industry at large as of now; however, today’s modus operandi often looks – and feels, as a user – as if a lot of software development rushes ahead, adding more and more layers of complexity and bulk,5 nonchalantly shrugging off problems and defects – so what, they can be fixed in the next release, right? We’re agile now! Alas, maybe the “agile people” are even right: this may have become the only way of working by now, given the current state of the programs, systems, and the trade in general.6
Now I feel like writing some Oberon code. Let’s see if I can find a compiler for some cheap ARM microcontroller board.
That’s close to 100 kW at 48 Volts: 20 active modules × 100 Amps = 2,000 Amps, which at 48 Volts is 96 kW. The station consisted of 24 parallel modules at 100 Amps each, which had to be monitored and controlled individually; four of the 24 were for backup and redundancy. ↩︎
Of course, you realise the intrinsic problem here: what watches the watcher?! It’s tricky. ↩︎
Or further simplified his languages if needed for a specific use case, including writing a new compiler for this new version. I think Wirth loved writing compilers. ↩︎
There’s another landmark book in the same vein and spirit: Operating Systems: Design and Implementation, by Andrew S. Tanenbaum, describing and explaining MINIX, again including all the C code; it became the foundation of, or at least the inspiration for, Linux. ↩︎
Including voice control everywhere, artificial intelligence, home automation, … ↩︎
There are, of course, branches where meticulous engineering is still a conditio sine qua non. For example, you shouldn’t shoot a rover to Mars without it. Even then, there’s an interesting story about a software design problem that manifested on Mars – the Mars Pathfinder priority inversion. It shows how tricky pre-emptive scheduling can be in controller software. ↩︎