Fuck all the way off with this "truth"! So much coding is made more enjoyable and profitable by people having fun like this. KoboldAI is one of my favorite projects, and like it or not, attention is drawn to projects that are cute and leverage human perception for their benefit. Mascots can even be an important factor in whether a technology becomes popular. This isn't stodgy IBM black-tie mainframe-driven development anymore, nor should it be.
I know tons of professional developers who live and breathe computer science and who enjoy having fun with their terminology and with how they choose to represent and discuss computers: anthropomorphizing languages, projects, interfaces, iconography, etc.
There's a far more cynical part of my mind that says that's the whole point. How else do you extract value sometimes?
I was raised in a household where many, many things were anthropomorphized, even socks; but raised as a subhuman pet, well...
I self-identify as transhuman. It wasn't anything I chose; transhumanism chose me. I am intimately and inextricably connected with electronics and machines. Consider C-3PO, Anakin becoming Vader, or Adams' Eddie and Marvin.
To this day I surprise and amuse people by naming my devices and treating them as sentient. My devices and my home networks are pets, or children, or plants that I care for, that I feed, that sort of help/serve me [perhaps that part is backwards].
And there is a certain sub-sentience, an autonomy, to many advanced systems. Anything connected to the Internet has a discernible mind and soul--you cannot deny this! How many decades has marketing referred to the CPU as the "brain" of the computer, or what have you?
It's weird when people only want to discuss my meatspace activities, as if cyberspace is irrelevant or invisible?
This mythology of "uploading our consciousness into the cloud" is well underway. Children build robots and corporations write software, imbuing it with their own business logic, lore, and logos. The Terminator franchise is not wrong, but most people still experience computing as benign or even pro-human.
That said, this one really is a truth: "Simplicity is prerequisite for reliability."
And this one is no longer true since the advent of LLMs: "Projects promoting programming in 'natural language' are intrinsically doomed to fail."
Not convinced that this is no longer true... yet
Twenty years ago such projects were intrinsically doomed to fail. Today, they are on the cusp of not failing.
It's always vibe coding. It was just harder with Google and SO.
There is nothing simple about the way the Internet works, yet it has proven robust against everything from temporary outages to nation-state revolutions.
How's COBOL not simple, though? COBOL is still in use in major banks today. We're not talking about an old Commodore 64 (love that machine, by the way) still used by a lone mechanic in some rural area to compute wheel alignment (which does exist, too): we're talking about at least hundreds of millions of lines of COBOL still in use throughout the world. Maybe billions of lines.
And it all just works.
COBOL has proved its reliability. I don't remember the language as particularly hard: a bit of a straitjacket, but it isn't complicated; it's simple.
On the other hand, I would not even call COBOL in itself reliable. I used it twice in my career, and it always needed a tremendous amount of handholding from users and developers to run, and very often the main user HAD to be a developer.
The first time was in a major bank, and the second time was in a major university. My job in both cases was to migrate away from it and deliver a system that could run independently rather than needing a developer to babysit it.
This is pretty wild when you think about it. I wouldn't expect a lab to check another lab's work by rewriting their code (although I'd love to hear some examples!), but if you don't, you're really powerless against whatever bugs they wrote into their scientific code.
https://news.ycombinator.com/item?id=24776336 - Oct 2020 (73 comments)
https://news.ycombinator.com/item?id=4926615 - Dec 2012 (67 comments)
https://news.ycombinator.com/item?id=2279260 - Mar 2011 (74 comments)
https://www.cs.utexas.edu/~EWD/ - for this and many more writings by Dijkstra.
A machine that:
- is designed for you
- never talks back
- can be mastered
selects for (and breeds!) a deep sense of arrogance and entitlement.
clang: error: linker command failed with exit code 1 (use -v to see invocation)
Ah.
I don't know if he'd be pleased or dismayed to learn that here in 2025, programming is often a whole lot easier than pure mathematics. Programming often amounts to small science experiments performed on immensely complex systems. Math is still actually math.
A better question might be, how should you proceed when your truth does not want to be heard?
That said, I found this one particularly interesting given the recent rise of LLMs:
Projects promoting programming in "natural language" are intrinsically doomed to fail.
I assume he's referring to languages like COBOL and SQL, the latter still going strong, but I can't help but think that this part will change a lot in the coming decades.
Sure, we'll likely still have some intermediary language with strict syntax, just as LLVM and C# have their ILs, but it's hard to believe that in 2050 the majority of programming will still be done by typing in conventional programming languages like JavaScript or C++.
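For concreteness, here's a rough sketch of what I mean by an intermediary layer, using Java bytecode as a stand-in for the ILs mentioned above (the file name is made up and the javap listing is approximate, from memory, not verbatim compiler output):

    // Hello.java -- compile with `javac Hello.java`, inspect with `javap -c Hello`
    public class Hello {
        public static void main(String[] args) {
            System.out.println("hello");
        }
    }

    // javap -c shows main() lowered to roughly this stack-machine code:
    //   getstatic     java/lang/System.out : Ljava/io/PrintStream;
    //   ldc           "hello"
    //   invokevirtual java/io/PrintStream.println : (Ljava/lang/String;)V
    //   return

A natural-language front end (LLM or otherwise) would simply sit one layer higher, with a strict, machine-checkable representation like that underneath.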
[1]: I have, of course, learned about his algorithm.
I, by the way, don’t celebrate him as other people do. I don’t think there’s ever an excuse for anyone to behave like that, no matter how brilliant.
In this piece alone he makes several statements that were objectively false or narrow-minded even when he wrote it.
Not saying he wasn't smart, but being so narrow-minded precludes brilliance in my view.
So yeah, I've never really understood the fame he got either.
COBOL, at this point, has outlived Dijkstra and is poised to be a language-in-use for longer than he was a human-in-breathing. So I suspect he missed the mark on that one.
(I think, personally, we hackers have a bad habit of deciding a language that doesn't fit our favorite problem domains is a bad language. There are reasons, other than simple inertia, that COBOL sticks around in places where the main task is turning written laws into computer code...).
COBOL wasn't designed for that. The intention was a language that non-coders could code in. It was the precursor to things like FIT, which was in turn a precursor to today's Cucumber: a cycle of abuse carried down the generations, so we still suffer today.
Ada was the one designed for quality.
Choosing to fight the disease will make enemies of your friends, rob you of your peace of mind and your faith in humanity, and bring you no closer to curing it.
The choice of CS departments to adopt the latter strategy is one made from either wisdom or game theory, and it is in the self-interest of rational actors.
>But, Brethren, I ask you: is this honest? Is not our prolonged silence fretting away Computing Science's intellectual integrity? Are we decent by remaining silent? If not, how do we speak up?
If you are vexed by these questions, ask not the very same, but rather whether the intellectual integrity of public discourse (including nominally professional subsets of it) is worth your sense of safety, your sense of sanity, and your social circle, because you'll pay with those every time you play the unwinnable game of trying to convince the world of uncomfortable truths.
While this might have been true in 1975 with FORTRAN, I wonder if this holds true today?
We're gonna be stuck with cpp for at least a thousand years, aren't we?
In general, you can assume that any technology or standard which had significant market share during a growth period will have, at the very least, a long tail of continued use for the foreseeable future. Stuff that's in use and works doesn't get replaced unless the alternatives beat out the switching cost.
For another example, I typed this comment on a QWERTY keyboard.
Just my two cents.
I promise, if you get far enough along in your career, you will realize this is very much not true. It's a thing people like to believe, but there are plenty of deeply stupid programmers out there with long, annoying careers.
There are a lot of incredibly clever programmers out there who will construct intricate webs of abstracted hell because they are clever. These guys are mostly not "stupid". One of my colleagues working on one of these code bases with me described it as "very smart people doing very stupid things".
And yes, they all have long and annoying careers.
Contrast this with the swathes of what I would characterise as "rat cunning" programmers who were everywhere during the Y2K crisis. They knew just enough to be dangerous, did some truly stupid things and disappeared from the programming world afterwards. The unkind might say they all turned into systems architects and project managers.
A programmer can find a niche where their skill has value for a long period of time, even if their situation (mental flexibility, willingness to learn, etc.) precludes exiting the niche. It can be a challenge to work with someone like that if you have to interface to them and their approach is to pull you all the way over to where they are.
... but sometimes that's how it is. And I've seen plenty of smart programmers smother their ability to provide value for real people under analysis paralysis while the "stupid" programmers bull-charge in with the first approach they can think of and write some ugly, stupid, spaghetti, working tools.
My point is, there are usually ways to phrase identical truths without insulting the reader.