245 points by t14n 23 hours ago | 31 comments
Arch-TK 3 hours ago
I have a theory that the worse is better approach begets an environment where the worse is better approach is better.

At least hypothetically, I think there's an approach which is not "the right thing" or "worse is better" but rather more like "the right foundations".

Most interface complexity in my experience seems to be inherited from underlying interface complexity, and it takes a lot of work to fix that underlying interface complexity. This, I think, is where "worse is better" shines. If you try to apply a "the right thing" approach to a system where you're dealing with shitty underlying interfaces (i.e. every popular operating system out there, including every Unix and NT system) you end up with endless complexity and performance loss. So obviously nobody will want to do "the right thing", and everyone who takes the "worse is better" approach will end up way ahead of you in terms of delivering something. People will be happy (because people are almost always happy regardless of how crap your product is).

On the other hand, designing something with "the right foundations" means that "the right thing" no longer needs to involve "sacrifice implementation simplicity in favour of interface simplicity" to anywhere near the same extent because your implementation can focus on implementing whatever interface you want rather than first paving over a crappy underlying interface.

But the difficulty of "the right foundations" is that nobody knows what the right foundations are the first 10 times they implement them. This approach requires being able to rip the foundations up a few times. And nobody wants that, so "worse is better" wins again.

wismi 1 hour ago
I think there's a lot of truth to this. It reminds me of an idea in economics about the "second-best". From the wikipedia page:

"In welfare economics, the theory of the second best concerns the situation when one or more optimality conditions cannot be satisfied. The economists Richard Lipsey and Kelvin Lancaster showed in 1956 that if one optimality condition in an economic model cannot be satisfied, it is possible that the next-best solution involves changing other variables away from the values that would otherwise be optimal. Politically, the theory implies that if it is infeasible to remove a particular market distortion, introducing one or more additional market distortions in an interdependent market may partially counteract the first, and lead to a more efficient outcome."

https://en.wikipedia.org/wiki/Theory_of_the_second_best

jpc0 1 hour ago
I have a question for this premise.

How would you design a network interface using your right foundations model? I'm not talking about HTML or whatnot.

I have some sort of medium, copper, fiber, whatever, and I would like to send 10 bytes to the other side of it. What are the right foundations that would lead to an implementation which isn't overly complex?

me_again 31 minutes ago
Not a direct answer, but Ethernet is sometimes brought up as a successful example of Worse is Better. At one point Token Ring was a serious competitor - it had careful designs to avoid collisions when the network was busy, prioritize traffic, etc. But it was comparatively slow and expensive. Ethernet just says "eh, retry on collision." And that simplistic foundation has carried on to where we have a standard for 800 Gigabit Ethernet.
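And "retry on collision" really is small enough to sketch. Here's a rough illustration (illustrative C, not driver code) of the truncated binary exponential backoff that classic shared-medium Ethernet uses: after the n-th collision, wait a random number of slot times drawn from an exponentially growing window.

    #include <stdio.h>
    #include <stdlib.h>

    /* After the n-th collision, classic Ethernet waits a random number of
       slot times chosen uniformly from [0, 2^min(n,10) - 1] before retrying
       (truncated binary exponential backoff). */
    static unsigned backoff_slots(unsigned collisions) {
        unsigned exp = collisions < 10 ? collisions : 10;
        unsigned window = 1u << exp;            /* 2^exp possible slot counts */
        return (unsigned)(rand() % window);
    }

    int main(void) {
        for (unsigned n = 1; n <= 16; n++)      /* stations give up after 16 attempts */
            printf("collision %2u -> wait %4u slot times\n", n, backoff_slots(n));
        return 0;
    }
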
tightbookkeeper 53 minutes ago
good comment. But I question how much you can package up inherent complexity in a simple interface, due to leaky abstraction.

The biggest benefit of simplicity in design is when the whole system is simple, so it’s easy to hack on and reason about.

bccdee 16 hours ago
> Both early Unix and C compilers had simple structures, are easy to port, require few machine resources to run, and provide about 50%-80% of what you want from an operating system and programming language.

> Unix and C are the ultimate computer viruses.

The key argument behind worse-is-better is that an OS which is easy to implement will dominate the market in an ecosystem with many competing hardware standards. Operating systems, programming languages, and software in general have not worked this way in a long time.

Rust is not worse-is-better, but it's become very popular anyway, because LLVM can cross-compile for anything. Kubernetes is not worse-is-better, but nobody needs to reimplement the k8s control plane. React is not worse-is-better, but it only needs to run on the one web platform, so it's fine.

Worse-is-better only applies to things that require an ecosystem of independent implementers providing compatible front-ends for diverging back-ends, and we've mostly standardized beyond that now.

xiphias2 6 hours ago
There are some differences between your examples in my opinion:

Rust started as an experiment by the Mozilla team to replace C++ with something that would help them compete with Chrome by developing safe multi-threaded code more efficiently. It took a lot of experiments to get to the current type system, most of which gives real advantages through affine types, but the compiler is at this point clearly over-engineered for the desired type system (and there are already ideas on how to improve on it). Still, it's too late to restart, as it looks like it takes 20 years to productionize something like Rust.

As for React, I believe it's an over-engineered architecture from the start for most web programming tasks (and for companies / programmers that don't have separate frontend and backend teams), but low interest rates + AWS/Vercel pushed it on all newcomers (and most programmers are new programmers, as the number of programmers grew exponentially).

HTMX and Rails 8 are experiments in the opposite direction (moving back to the servers, nobuild, noSAAS), but I believe there's a lot of space to further simplify the web programming stack.

psychoslave 4 hours ago
Unix and C are still there, and while on a shallow level this can be more or less ignored, all abstractions end up leaking sooner or later.

Could the industry get rid of C and the ridiculous esoteric abbreviations in identifiers, it could almost be a sane world to wander.

tightbookkeeper 57 minutes ago
Rust is not very popular in terms of number of users. It’s just over represented in online discussion.
ezekiel68 19 hours ago
I'm always happy whenever this old article goes viral. For two reasons: First, learning to accept the fact that the better solution doesn't always win has helped me keep my sanity over more than two decades in the tech industry. And second, I'm old enough to have a pretty good idea what the guy meant when he replied, "It takes a tough man to make a tender chicken."
bbor 18 hours ago
I’m glad to know a new article that “everyone knows”! Thanks for pointing out the age.

And, at the risk of intentionally missing the metaphor: they do in fact make automated tenderizers, now ;) https://a.co/d/hybzu2U

hyggetrold 15 hours ago
It's a funny expression and it is rooted in advertising: https://en.wikipedia.org/wiki/Frank_Perdue
karel-3d 23 hours ago
I remember when I had a lesson about OSI layers, where the teacher carefully described all the layers in detail

and then said something like "most of this is not important, these layers don't really exist, TCP/IP got popular first because it's just much simpler than OSI was"

pjc50 22 hours ago
Oh, there's an entirely different feature-length article to be written/found about how packet switching beat circuit switching and the "Internet approach" beat the telco approach. The great innovation of being able to deploy devices at the edges without needing clearance from the center.

I don't think very many people even remember X.25. The one survivor from all the X standards seems to be X.509?

kragen 22 hours ago
The OSI stack was also designed using the packet-switching approach. Rob Graham's "OSI Deprogrammer" is a book-length article about how TCP/IP beat OSI, and how the OSI model is entirely worthless: https://docs.google.com/document/d/1iL0fYmMmariFoSvLd9U5nPVH...

I'm not sure he's right, but I do think his point of view is important to understand.

lizknope 18 hours ago
Wow. That is awesome, I skimmed through it and it looks like something I will enjoy. I've still got my Tanenbaum Computer Networks book from 1996 and the first chapter starts with OSI, then TCP/IP, and explains the differences and why OSI failed.
kragen 17 hours ago
I'm interested to hear what you think! I haven't finished reading it yet. It's a bit repetitive.
OhMeadhbh 14 hours ago
It doesn't seem like a "take down" as much as a re-iteration of all the things IBM, DEC, GE and various Telcos did wrong when implementing OSI. I could reduce it to one sentence: "Everyone was so intent on monetizing their own networking implementation they never thought enough about interoperability."
kragen 12 hours ago
Not only isn't that an accurate summary of Graham's book, it isn't even a topic discussed in the first half of the book, which is all I've read so far. I suspect it isn't a topic discussed in the book at all; can you back up your assertion with some quotes?
OhMeadhbh 7 hours ago
Yes, but the book also has enough inaccuracies as to make it... I don't know what. For example, in the first chapter the author says "no one knows what a session is," which is patently false. I myself implemented control logic in telco equipment to respond to X.225 compliant messages to change the state of an abstract state machine used on either side of the connection. And while I'm sure it's possible to use CONS or CLNP to communicate with a CEEFAX terminal, that is far from the only use to which the various OSI compliant protocols were put.

Just because you don't understand something, that doesn't mean it's bad.

kragen 3 hours ago
I think he means to be saying that, in the context of TCP/IP, nobody knows what a “session” in the OSI sense would correspond to—not that nobody has ever implemented X.225. Presumably Graham knows that people have written X.225 implementations and isn't trying to convince his readers otherwise?

I don't know enough to judge his assertion that the session layer exists to solve problems created by half-duplex terminals. (He doesn't seem to specifically call out Ceefax.)

chuckadams 18 hours ago
Wow, you're not kidding about book-length: at 246 pages, that's an epic takedown. Learning all kinds of other things about networking along the way, though.

I do remember all nine layers of the OSI stack though: physical, data link, network, transport, session, presentation, application, financial, political.

fanf2 18 hours ago
chuckadams 17 hours ago
I can't claim to have BTDT, but I did get the T-shirt. That's how I learned them :)
fanf2 19 hours ago
OSI was two network stacks fighting with each other, the circuit-switched telco X.25 successor and the packet-switched DEC / Xerox anti-Internet.

See also https://computer.rip/2021-03-27-the-actual-osi-model.html and https://dotat.at/@/2024-03-26-iso-osi-usw.html

kragen 17 hours ago
This is great, thanks! crawford's post pulls no punches:

> Teaching students about TCP/IP using the OSI model is like teaching students about small engine repair using a chart of the Wankel cycle. It's nonsensical to the point of farce. The OSI model is not some "ideal" model of networking, it is not a "gold standard" or even a "useful reference." It's the architecture of a specific network stack that failed to gain significant real-world adoption.

OhMeadhbh 13 hours ago
That's sort of like saying "TCP" is a failure because everyone now uses SSH instead of TELNET. TCP/IP still has to do all of the things the OSI stack does, it just does it in a different manner and there are (thankfully) plenty of well defined wire-formats and processing expectations so interoperability is pretty straight-forward. But I still think IPSec would have been MUCH easier to deploy had TCP/IP maintained a rational distinction between presentation and session layers. I guess what we learned is that SSL/TLS and FreeSWAN's assumptions about routing encrypted payloads were "good enough."

Also, if you're going to compare TCP/IP to various OSI implementations, you should compare the full stack including PEM, MOSS, SMIME, SSL/TLS, SSH. Each muddies the difference between presentation and application layers, but as in the previous paragraph, no one seems to care. Talking SMTP over SSH (or SSL/TLS) is totally fine; you don't need to have a sub-protocol to define how a presentation layer on top of a secure session layer works if you can make certain assumptions about the behaviour of the code on the other side of the network connection.

kragen 12 hours ago
Each of the articles linked upthread separately explains why everything you said in your comment is incorrect.
scroot 1 hour ago
This might be a fiery take, but I think the X.400 standards for naming and messaging would have been a lot better than the chaotic email situation, and probably would have made more sense from a commercial/legal perspective than making DNS "the" global naming system.
fanf2 19 hours ago
LDAP is “lightweight” compared to the X.500 directory access protocol. LDAP DNs are basically the same as X.500 DNs.

SNMP is “simple” compared to X.711 CMIP. But SNMP also uses ASN.1 and X.660 OIDs.

foobarian 18 hours ago
I finally understand why OSI failed, it's the naming! Dear lord.
OhMeadhbh 13 hours ago
Oh no. The naming is simple compared to ASN.1/BER parsing.
goalieca 12 hours ago
I’m always confused about when to use DER and when to use BER. Pretty much have to study the history to get it.
jonmon6691 10 hours ago
I found this article helpful when I had that same question. Basically, BER has some less rigid specifications for how to encode the data, which can be convenient for the implementation - such as symbol-terminated sequences instead of having to know their length ahead of time. But this means that there are many equivalent serializations of the same underlying object, which is problematic for cryptography, so DER is an unambiguous subset of BER which has only one correct possible serialization for a given object.

https://luca.ntop.org/Teaching/Appunti/asn1.html
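To make the ambiguity concrete, here's a tiny sketch (hand-written byte strings, not the output of any particular ASN.1 library) of one abstract value - a SEQUENCE holding BOOLEAN TRUE - that has several legal BER encodings but exactly one legal DER encoding:

    #include <stdio.h>
    #include <stddef.h>

    /* Print a labeled hex dump of an encoding. */
    static void dump(const char *label, const unsigned char *buf, size_t len) {
        printf("%-26s", label);
        for (size_t i = 0; i < len; i++)
            printf(" %02X", buf[i]);
        printf("\n");
    }

    int main(void) {
        /* BER, definite length; TRUE may be any non-zero byte */
        const unsigned char ber_a[] = { 0x30, 0x03, 0x01, 0x01, 0x01 };
        /* BER, indefinite length: 0x80 length octet, 0x00 0x00 end-of-contents */
        const unsigned char ber_b[] = { 0x30, 0x80, 0x01, 0x01, 0xFF, 0x00, 0x00 };
        /* DER: minimal definite length and TRUE must be exactly 0xFF --
           the single canonical serialization of this value */
        const unsigned char der[]   = { 0x30, 0x03, 0x01, 0x01, 0xFF };

        dump("BER (definite, TRUE=01):", ber_a, sizeof ber_a);
        dump("BER (indefinite length):", ber_b, sizeof ber_b);
        dump("DER (canonical):", der, sizeof der);
        return 0;
    }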

rwmj 18 hours ago
chuckadams 18 hours ago
> "How do you scare a Bellhead?" he begins. "First, show them something like RealAudio or IPhone. Then tell them that right now performance is bandwidth-limited, but that additional infrastructure is being deployed."

You'd scare anyone like an amazed rural tribesman if you showed them an iPhone in 1996.

I know, IPhone was a Cisco thing, but my mind went there for a beat ;)

bee_rider 18 hours ago
Haha, I didn’t get it until your last sentence
hiatus 17 hours ago
X.25 is still used in ham radio with AX.25
OhMeadhbh 13 hours ago
I hate to tell you this, X.25 is still used ALL OVER THE PLACE. But thankfully hardly anywhere near data networking customers.
PhilipRoman 18 hours ago
IMO the OSI layer system (even when teaching the TCP/IP suite) has some merit in education. To most of us, the concept of layering protocols may seem obvious, but I've talked to people who are just learning this stuff, and they have a lot of trouble understanding it. Emphasizing that each layer is (in theory) cleanly separated and doesn't know about the layers above and below it is a very useful step towards understanding abstractions.
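A toy sketch of that separation (entirely made-up header strings, nothing to do with real Ethernet/IP/TCP formats): each layer prepends its own header and treats whatever the layer above handed it as opaque bytes.

    #include <stdio.h>
    #include <string.h>

    /* One layer's job: prepend its own header, never look inside the payload
       it was handed from the layer above. */
    static size_t wrap(const char *hdr, const unsigned char *payload,
                       size_t payload_len, unsigned char *out) {
        size_t h = strlen(hdr);
        memcpy(out, hdr, h);
        memcpy(out + h, payload, payload_len);
        return h + payload_len;
    }

    int main(void) {
        unsigned char l4[64], l3[96], l2[128];
        const unsigned char app[] = "hello";                  /* application data */
        size_t n = wrap("[transport]", app, sizeof app - 1, l4);
        n = wrap("[network]", l4, n, l3);
        n = wrap("[link]", l3, n, l2);
        fwrite(l2, 1, n, stdout);    /* prints: [link][network][transport]hello */
        putchar('\n');
        return 0;
    }
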
ahoka 15 hours ago
The problem is that this is not true. There are no such clean strictly hierarchical layers in most of the protocols that make up the internet.
supportengineer 17 hours ago
I've never done any kernel programming but I assumed the OSI model corresponded to Linux kernel modules or a similar division internally.
gpderetta 23 hours ago
> The good news is that in 1995 we will have a good operating system and programming language; the bad news is that they will be Unix and C++.

And 30 years later they show few signs of letting go.

ezekiel68 18 hours ago
Yep. And nary a tear is shed these days over the death of the so-called superior Lisp machines.
gpderetta 18 hours ago
The Spirit of the Machine still lives in some form in emacs.
rwmj 18 hours ago
Maybe not in any position to do anything about it, but I'm quite sad :-/
OhMeadhbh 14 hours ago
Meh. Lisp machines still exist. They're just simulated in various Lisps' runtime environments. It turns out that a RISC machine running a Lisp interpreter or an executable compiled from Lisp source tends to perform better than a tagged/cdr-coded Lisp Machine w/ hardware GC.

That being said... I've wanted to implement an old Explorer using an FPGA for a while. Maybe if I just mention it here, someone will get inspired and do it before I can get to it.

hayley-patton 9 hours ago
Lisp machines didn't have hardware GC, though they had hardware support for read/write barriers.
lispm 5 hours ago
Kind of. A lot of the GC support in the Symbolics 3600 architecture is on the Microcode level (current CPUs usually don't have such operations in Microcode). The word size of the CPU is 36 bits. The CPU operations don't deal with the type and GC tags on the instruction level; this is done on the Microcode level. Things like "invisible pointers" are also dealt with on the Microcode level.

The ephemeral GC concentrates on garbage collecting objects in RAM: for example, every memory page has a page tag which marks it as modified or not. The ephemeral GC uses this to scan only over changed pages in memory. The virtual memory subsystem keeps a table of swapped-out pages pointing to ephemeral pages. The EGC can then use this information...

Jach 17 hours ago
At a certain level, sure, but C++ at least has definitely lost out. In the 90s it seemed like it might really take over all sorts of application domains; it was incredibly popular. Now, and for probably the last couple of decades, it and C have only kept around 10% of the global job market.
OhMeadhbh 14 hours ago
My gut feeling is there are still the same number of jobs for C++ today as there were in the 90s. It's just that they're hard to find because the total number of programming jobs has exploded. The reason you can't see the C++ jobs is because the newer, non-C++ jobs are crowding them out on job boards. [This is a hypothesis, one I haven't (dis)proven.]

For fun a few weeks ago I went looking for COBOL on VMS jobs. They're definitely still out there, but you do have to look for them. No one's going to send you an email asking if you're interested and if you don't hang out with COBOL/VMS people, you may not know they exist.

I think my point is that the total number of C/C++ jobs today is probably the same or slightly higher than in 1994. But the total number of Java and C# jobs (or Ruby or Elixir or JavaScript jobs) is dramatically higher than in 1994, if for no other reason than the fact these languages didn't exist in 1994.

[As an aside... if you're looking for a COBOL / VMS programmer/analyst... I spent much of the 80s as a VMS System Manager, coding custom dev tools in Bliss and some of the 90s working on the MicroFocus COBOL compiler for AIX. And while you would be crazy to ignore my 30+ years of POSIX/Unix(tm) experience, I think it would be fun to sling COBOL on VMS.]

jlarocco 13 hours ago
> My gut feeling is there are still the same number of jobs for C++ today as there were in the 90s. It's just that they're hard to find because the total number of programming jobs has exploded. The reason you can't see the C++ jobs is because the newer, non-C++ jobs are crowding them out on job boards. [This is a hypothesis, one I haven't (dis)proven.]

I don't know anything about the total number of C++ jobs, but there's a huge filter bubble effect for job searching. If you don't mention a language on your resume or list it under your skills then you're very unlikely to see any jobs for it or have anybody contact you for a job using it, whether we're talking about C++, Python, Typescript, or even technologies like Docker.

pjmlp 5 hours ago
Depends on the market, even the C++ wannabe replacements are implemented in compiler toolchains written in C++.

It gets a bit hard to replace something that your compiler depends on to exist in the first place.

worstspotgain 15 hours ago
It's not C++ that has been replaced, it's VB.
stonemetal12 22 hours ago
Isn't "Worse is better" just a restatement of "Perfect is the enemy of Good", only slanted to make better\Perfect sound more enticing?

>The right thing takes forever to design, but it is quite small at every point along the way. To implement it to run fast is either impossible or beyond the capabilities of most implementors.

A deer is only 80% of a unicorn, but waiting for unicorns to exist is folly.

OhMeadhbh 13 hours ago
Yes and no. "Worse is Better" also implies you allow someone outside your problem domain to define abstractions you use to decompose the problem domain (and construct the solution domain.) So... I mean... that's probably not TOO bad if they're well-understood and well-supported. Until it isn't and you have to waste a lot of time emulating a system that allows you to model abstractions you want/need to use.

But at the end of the day everyone knows never to assume STD I/O will write an entire buffer to disk and YOU need to check for EINTR and C++ allows you to wrap arbitrary code in try...catch blocks so if you're using a poorly designed 3rd party library you can limit the blast radius. And it's common now to disclaim responsibility for damages from using a particular piece of software so there's no reason to spend extra time trying to get the design right (just ship it and when it kills someone you'll know it's time to revisit the bug list. (Looking at YOU, Boeing.))
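For anyone who hasn't hit it yet, the folklore in that first sentence looks roughly like this in plain POSIX C (a minimal sketch, no real error reporting): write(2) may return a short count or fail with EINTR, and the caller is the one who has to loop.

    #include <errno.h>
    #include <unistd.h>

    /* Keep calling write(2) until the whole buffer is out, retrying on EINTR
       and continuing after short writes. Returns -1 on any other error. */
    static ssize_t write_all(int fd, const void *buf, size_t len) {
        const char *p = buf;
        size_t done = 0;
        while (done < len) {
            ssize_t n = write(fd, p + done, len - done);
            if (n < 0) {
                if (errno == EINTR)
                    continue;      /* interrupted by a signal: just retry */
                return -1;         /* real error: let the caller inspect errno */
            }
            done += (size_t)n;     /* short write: pick up where it stopped */
        }
        return (ssize_t)done;
    }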

I do sort of wonder what happens when someone successfully makes the argument that C++ Exceptions are a solution often mis-applied to the problem at hand and someone convinces a judge that Erlang-like supervisory trees constitute the "right" way to do things and using legacy language features is considered "negligence" by the courts. We're a long way off from that and the punch line here is a decent lawyer can nail you on gross negligence even if you convinced your customer to sign a liability waiver (at least in most (all?) of the US.)

Which is to say... I've always thought there is an interplay between the "worse is better" concept and the evolution of tech law in the US. Tort is the water in which we swim; it defines the context for the code we write.

th43o2i4234234 12 hours ago
The critical point of the article holds true of everything in human social networks (be it religion/culture/philosophy/apps/industry...).

If you don't achieve virality, you're as good as dead. Once an episteme/meme spreads like wildfire, there's very little chance for a reassessment based on value/function - because scope is now the big axis of valuation.

It's actually worse because humanity is now a single big borg. Even 30-40 years back, there were sparsely connected pools where different species of fish could exist - not any more. The elites of every single country are part of the Anglosphere, and their populations mimic them (eventually).

This tumbling towards widespread mono-memetism in every single sphere of life is a deeply dissatisfying aspect of modern human life, not just for PL/OS/... but also for culture etc.

Anthropocene of humanity itself.

esafak 11 hours ago
> If you don't achieve virality, you're as good as dead.

Are you? Maybe the worse solution peaks faster, but can be supplanted by a better solution in the future, like how Rust is displacing C/C++ in new projects. The better solution may never be popular yet persist.

pjmlp 5 hours ago
For Rust to fully displace C++, it needs to eventually bootstrap itself; until then, C++ will be around.

Additionally there are no significant new projects being done in Rust for the games industry, AI/ML, HPC, HFT, compiler backends, hardware design,....

th43o2i4234234 11 hours ago
Rust is nowhere near displacing C++.

There's typically a "exhaustion" phase with mono-memetism/theories where everyone gets sick and tired of the "one and only way" and it becomes fashionable to try out new things (eg. Xtianity in Europe). We're not at this point where the olds can be toppled.

i_s 23 hours ago
I've read this a few times over the years and I think the argument is sound. But I wonder if it is sound in the same way this statement is:

"It is better to go picking blueberries before they are fully ripe, that way you won't have much competition."

JohnFen 22 hours ago
"Worse is better" has become like "move fast and break things". They're both sayings that reveal an often-overlooked truth, but they have both been taken far too far and result in worse things for everybody.
ezekiel68 18 hours ago
I see what you mean. Yet I feel like the first one (at least, as outlined in the article) is more about accepting an inevitability that you probably have little control over, while the second is more often adopted as a cultural process guideline for things you can control. But that's just my impression.
agumonkey 17 hours ago
I assume that 'worse' often means finding a fit with the average and the mass. This ensures a longer existence; later you may absorb the "better" you didn't have early on. Look how dynamic languages started to have better data types, various traits (generators, closures..), JIT .. all things they could pluck out of old "glorious" languages that were .. somehow too advanced for the mainstream. It's a strange schizophrenic situation.
enugu 6 hours ago
Ironically, the main feature that separates LISP from other modern languages is homoiconicity/macros (now that features like garbage collection are mainstream).

And this leads to an easier implementation - parsing is easier (which is why code transformation via macros becomes easy).

clarkevans 23 hours ago
I think the worse-is-better philosophy is not well encapsulated with the 4 priorities given. Perhaps it is 4 completely different priorities. Here's a strawman.

1. Minimal -- the design and implementation must be as small as possible, especially the scope (which should be deliberately "incomplete")

2. Timely -- the implementation must be delivered as soon as feasible, even if it comes before the design (get it working first, then figure out why)

3. Relevant -- the design and implementation must address an important, unmet need, eschewing needs that are not urgent at the time (you can iterate or supplement)

4. Usable -- the implementation must be integrated with the existing, working and stable infrastructure (even if that integration causes design compromises)

The other dimensions, simplicity, correctness, consistency, and completeness are very nice to have, but they are not the primary drivers of this philosophy.

AnimalMuppet 22 hours ago
That seems like a fairly solid strawman.

I would say that Timely and Relevant drive Minimal. I would also say that Minimal and Usable are in tension with each other.

worstspotgain 14 hours ago
EINTR's design is one of computing's absolute classics. To MIT and New Jersey, we should add the McDougals approach: "I cannot work under these conditions." When faced with the PC loser-ing issue, just don't implement the code in question.

McDougals resolves the apparent conflict between the other two. It blames the interrupt hardware as the root cause. It produces non-working, incomplete software. It's kind of a modest proposal.

However, it also produces no ripples in the design fabric. With MIT, the OS source is a maintenance nightmare. With NJ, modern software still has to deal with archaic idiosyncrasies like EINTR. With McDougals, all the "conflict-free" portions of the software advance, those that write themselves.

The result is likely immediately shelved, perhaps as an open source PoC. Over time, someone might write some inelegant glue that makes interrupts appear to behave nicely. Alternatively, the world might become perfect to match the software.

If nothing else, the software will have mimicked the way we learn. We use imperfect examples to draw the idealized conclusion. Even if it never gets to run, it will be more readable and more inspiring than either MIT or NJ.

mseepgood 22 hours ago
Maybe don't call it "worse", maybe it's just you who has a skewed perception of "good"/"right".
hammock 22 hours ago
The distinction between the set of five "right" principles and the five "worse is better" principles is known as compromise in design.

It's the opposite of what marketers want you to think of when they say "uncompromising design."

Dansvidania 23 hours ago
looking at how it played out with javascript one can't but agree.

(edit: I mean it unironically)

api 23 hours ago
JavaScript really illustrates the ultimate path-dependence of evolution. It got widely deployed during a boom time and therefore we are stuck with it forever.
GiorgioG 23 hours ago
We don’t have to be stuck with it forever. Start pushing WASM instead of React, etc. We can get there, but as technologists we have to make a concerted effort to do so.
pjmlp 5 hours ago
Only if the tooling was half as good as Flash.
auggierose 7 hours ago
WASM instead of React? That does not even make sense.
adastra22 17 hours ago
We will still be stuck with it forever for compatibility reasons
Der_Einzige 23 hours ago
But I feel like other languages that we were de-facto "stuck with" in certain domains boomed and then busted - i.e. Lua, Perl, etc
actionfromafar 23 hours ago
Javascript is the only language which straddled the Client Server Gap. If it weren't for Node, Javascript would not have been as popular.
bigstrat2003 16 hours ago
It's still absolutely baffling to me that anyone is willing to run JS server side. There are so many options which are much better suited, why are people willing to jam that square peg into the round hole?
sjamaan 55 minutes ago
Perhaps the siren song of "isomorphic JS", where you can run the same code on the server and on the client? I can see the use case for having complex model and validation code running on the client for speedy feedback and on the server for security, and perhaps also the idea you could render something entirely on the server when needed (eg for indexing and non-JS clients) and on the client when it's capable.

Personally, I wouldn't want to touch server-side JS with a 10 foot barge pole.

homebrewer 16 hours ago
Because they don't know and don't want to know anything else. Not a single polyglot developer I personally know has ever chosen JS for server side, not once.
api 19 hours ago
There was something long ago called GWT -- Google Web Toolkit -- that tried to make Java into that language by having it compile to JavaScript.

It actually worked decently well, but was needlessly verbose due to Java.

WASM lets us run other languages efficiently in the browser but that just opens the field to a lot of languages, not one language to rule them all.

actionfromafar 6 hours ago
Also GWT apps were pretty slow to load and start, and were very "app"-like as opposed to web-page like at a time when that was not as familiar as it is today. That's how I remember it anyway. And pretty heavy, developer wise, at a time when "update a file on the FTP" was still normal.
TacticalCoder 23 hours ago
Yup first thing I thought. That pathetic piece of crap conceived in basically 15 days (not kidding)... BUT it is what we have on the front-end for web apps so there's that. JavaScript is the mediocre turd I love to hate.
karel-3d 23 hours ago
And it keeps being polished and improved to the point it's almost not a turd, and now has types sort of, and much better engines, and now there are ARM machines that are literally designed to run it faster, and now most of your actual PC applications are written in it.

But honestly it's kind of refreshing to see the original node.js presentation, where using javascript is sort of a side-note. He wanted to use a callback-heavy language and JS fit the bill

https://youtu.be/EeYvFl7li9E

GiorgioG 23 hours ago
It will always be a turd. Typescript is a nice illusion, but underneath is still the same turd.
actionfromafar 23 hours ago
To me WASM is the wildcard. It lets other languages infect the Javascript host organism.
Johanx64 12 hours ago
The problem with Javascript is that it is not confined to web browsers and webapps; it and its associated business models (SaaS) find their way everywhere. At first desktop apps got enshittified by it, then all sorts of smart devices, and all the way to embedded systems in cars with their laggy, sloppy UIs everywhere.

It probably is the most severe case of "worse is better" I've experienced so far.

Taniwha 8 hours ago
Of course the opposite is often stated as "Perfect is the enemy of Good"
jes5199 22 hours ago
what's the old saw about "unix design prioritizes simplicity over correctness, and on modern hardware simplicity is also no longer considered necessary"
marcosdumay 22 hours ago
There's a really important detail there: simplicity tends to lead to correctness.

Anyway, worse is better is about simplicity of implementation versus conceptual simplicity. By principle, that's a much harder choice.

shagie 23 hours ago
For another discussion on this https://wiki.c2.com/?WorseIsBetter which starts out with:

    RichardGabriel makes this observation on the survival value of software in the paper Lisp: Good News, Bad News, How to Win Big. See http://www.jwz.org/doc/worse-is-better.html for the section on WorseIsBetter.

    For those seeking the first node, see http://web.archive.org/web/19990210084721/http://www.ai.mit.edu/docs/articles/good-news/good-news.html.

    For even more context on WorseIsBetter see http://www.dreamsongs.com/WorseIsBetter.html. My favorite part is RichardGabriel arguing with himself.
dang 19 hours ago
These look to be the interesting threads on Gabriel's worse-is-better essays:

Lisp: Good News, Bad News, How to Win Big (1990) [pdf] - https://news.ycombinator.com/item?id=30045836 - Jan 2022 (32 comments)

Worse Is Better (2001) - https://news.ycombinator.com/item?id=27916370 - July 2021 (43 comments)

Lisp: Good News, Bad News, How to Win Big (1991) - https://news.ycombinator.com/item?id=22585733 - March 2020 (21 comments)

The Rise of Worse Is Better (1991) - https://news.ycombinator.com/item?id=21405780 - Oct 2019 (37 comments)

The Rise of Worse Is Better (1991) - https://news.ycombinator.com/item?id=16716275 - March 2018 (44 comments)

Worse is Better - https://news.ycombinator.com/item?id=16339932 - Feb 2018 (1 comment)

The Rise of Worse is Better - https://news.ycombinator.com/item?id=7202728 - Feb 2014 (21 comments)

The Rise of "Worse is Better" - https://news.ycombinator.com/item?id=2725100 - July 2011 (32 comments)

Lisp: Good News, Bad News, How to Win Big [1991] - https://news.ycombinator.com/item?id=2628170 - June 2011 (2 comments)

Worse is Better - https://news.ycombinator.com/item?id=2019328 - Dec 2010 (3 comments)

Worse Is Better - https://news.ycombinator.com/item?id=1905081 - Nov 2010 (1 comment)

Worse is better - https://news.ycombinator.com/item?id=1265510 - April 2010 (3 comments)

Worse Is Better - https://news.ycombinator.com/item?id=1112379 - Feb 2010 (5 comments)

Lisp: Worse is Better, Originally published in 1991 - https://news.ycombinator.com/item?id=1110539 - Feb 2010 (1 comment)

Lisp: Good News, Bad News, How to Win Big - https://news.ycombinator.com/item?id=552497 - April 2009 (2 comments)

dang 19 hours ago
... and these are the some of the threads discussing it or aspects of it. Others welcome!

Worse Is Better - https://news.ycombinator.com/item?id=36024819 - May 2023 (1 comment)

My story on “worse is better” (2018) - https://news.ycombinator.com/item?id=31339826 - May 2022 (100 comments)

When Worse Is Better (2011) - https://news.ycombinator.com/item?id=20606065 - Aug 2019 (13 comments)

EINTR and PC Loser-Ing: The “Worse Is Better” Case Study (2011) - https://news.ycombinator.com/item?id=20218924 - June 2019 (72 comments)

Worse is worse - https://news.ycombinator.com/item?id=17491066 - July 2018 (1 comment)

“Worse is Better” philosophy - https://news.ycombinator.com/item?id=17307940 - June 2018 (1 comment)

What “Worse is Better vs. The Right Thing” is really about (2012) - https://news.ycombinator.com/item?id=11097710 - Feb 2016 (35 comments)

The problematic culture of “Worse is Better” - https://news.ycombinator.com/item?id=8449680 - Oct 2014 (116 comments)

"Worse is Better" in the Google Play Store - https://news.ycombinator.com/item?id=6922127 - Dec 2013 (10 comments)

What “Worse is Better vs The Right Thing” is really about - https://news.ycombinator.com/item?id=4372301 - Aug 2012 (46 comments)

Worse is worse - https://news.ycombinator.com/item?id=437966 - Jan 2009 (3 comments)

dkasper 23 hours ago
1991 tag. An all time classic.
api 23 hours ago
I feel like we're seeing a bit of push-back today against worse-is-better in the area of languages. Rust in particular feels more like the MIT approach, albeit with an escape hatch via the explicit keyword "unsafe." Its type system is very thoroughly specified and correct as opposed to C's YOLO typing.
dokyun 18 hours ago
Rust is not the MIT approach, because an important aspect of that approach is that it's conceptually simple. Rust is a leviathan of complexity both in interface and implementation. Common Lisp is an MIT approach language, and approaches the same problems like memory and type safety by doing "the right thing" by default and offering more advanced options like type annotations and optimization levels in a "take it or leave it" manner. Rust will force you to program the way the compiler wants in the name of safety, while Common Lisp will allow you to program safely and freely, and decide which parts are important. An observation of this idea is that Rust compilers are terribly huge and slow because they use static type-tetris for everything, while Common Lisp compilers are very fast because they do most type-checking at runtime.
steveklabnik 18 hours ago
> An observation of this idea is that Rust compilers are terribly huge and slow because they use static type-tetris for everything,

Rust's typechecking passes are not the reason why the compiler is slow. Code generation dominates compile times. Type checking is pretty quick, and Rust makes some decisions that enable it to do so, like no global inference.

dokyun 18 hours ago
....so why is it so slow to generate code?
steveklabnik 17 hours ago
The compiler doesn't do a whole lot to try and minimize LLVM-IR, and monomorphization produces quite a lot of it. This makes LLVM do a lot of work. (EDIT: maybe that's being too harsh, but what I mean is, there's been some work towards doing this but more that could possibly be done, but it's not a trivial problem.)

On my current project, "cargo check" takes ten seconds, and "cargo build" takes 16. That's 62.5% of the total compilation time taken by code generation, roughly (and linking, if you consider those two to be separate).

In my understanding, there can sometimes be problems with -Z time-passes, but when checking my main crate, type_check_crate takes 0.003 seconds, and llvm_passes + codegen_crate take 0.056 seconds. Out of a 0.269 second total compilation time, most things take less than 0.010 seconds, but other than the previous codegen mentioned, monomorphization_collector_graph_walk takes 0.157s, generate_crate_metadata takes 0.171 seconds, and linking takes 0.700 seconds total.

This general shape of what takes longest is consistent every time I've looked at it, and is roughly in line with what I've seen folks who work on compiler performance talk about.

api 15 hours ago
That's an interesting counter-take and it makes me wonder if Rust isn't a third thing... not the clean sparse MIT approach but also not the wild YOLO slop of C and Unix. Something correct but also complex and pragmatic. Maybe it's an attempt at a synthesis of the two -- Unix's gnarly pragmatism but with correctness.

My heart loves the clean sparse MIT approach, but I'm kinda forced to work in the other world because that approach has so far decisively failed.

I have my own thoughts as to why, and they're different from the typical ones:

(1) Worse is free -- the "worse" stuff tended to be less encumbered by patents and closed source / copyrights. This is particularly true after GNU, BSD, and Linux. Free stuff is viral. It spreads fast and overtakes everything else.

(2) Programmers like to show off, and "worse" provides more opportunities to revel in complexity and cleverness.

dokyun 14 hours ago
> That's an interesting counter-take and it makes me wonder if Rust isn't a third thing... not the clean sparse MIT approach but also not the wild YOLO slop of C and Unix. Something correct but also complex and pragmatic.

I had supposed this in a previous thread, and I agree it is another thing entirely, however whether Rust is 'correct' or 'pragmatic', I think is a matter of contention.

> (1) Worse is free -- the "worse" stuff tended to be less encumbered by patents and closed source / copyrights. This is particularly true after GNU, BSD, and Linux. Free stuff is viral. It spreads fast and overtakes everything else.

At the time it seems that the "worse" stuff was actually more encumbered--GNU started to exist as a consequence of Unix both becoming an industry standard and solidifying the position of nonfree software in industry, and it didn't get that way by being free--they simply licensed it away en masse to universities. The free software movement was born out of the MIT hacker ethic of sharing software freely (GNU brought the design philosophy of MIT to Unix systems, and largely stands opposed to the "worse is better" approach. It originally sought to replace parts of Unix with superior alternatives, such as Info over man pages). BSD didn't become free until much, much later, at which point Linux had already become relevant.

> (2) Programmers like to show off, and "worse" provides more opportunities to revel in complexity and cleverness.

I think maybe C or C++ lets you show off by hacking around the compiler to do certain things that look impressive like preprocessor hacks or obfuscated code, but I think many would agree that this kind of style isn't "correct": languages like Go were developed as a consequence of this in order to suck any and all fun you might get out of hacking C, in order to force you to write more "correct" programs. Lisp, on the contrary doesn't tell you what a correct or incorrect program is, and gives you every facility to write programs that are infinitely complex and clever.

To me, Rust looks like the result of trying to break the rules of industry languages, by trying to incorporate things like real macros and type systems into something that resembles a real-world language. But its biggest flaw to me is that it doesn't break enough of them; rather, it makes more in the process.

djha-skin 22 hours ago
I actually take this as evidence that Rust will always remain niche. It's just a very complicated language. Go or Zig is much easier to learn and reason about.

In Go you can immediately tell what the fields are in a config yaml file just by looking at struct annotations. Try doing that with Rust's Serde. Super opaque in my opinion.

busterarm 22 hours ago
Exactly!

Rust will only protect me from things my customers don't care about and don't understand.

By not using Rust and just dealing with it, I'm making more money faster than if I started with Rust.

Rust only matters in environments where that calculus comes out the other way.

crabmusket 19 hours ago
I'm pretty sure your customers care that your software doesn't segfault!
busterarm 18 hours ago
My customers aren't running my software, I am. They don't know if it segfaulted or not.

If your customers are running your software you might have a business model problem instead of a software quality one.

morning-coffee 18 hours ago
> Rust will only protect me from things my customers don't care about and don't understand.

Are you suggesting your customers don't care about CVE's, even indirectly when it affects them?

warkdarrior 20 hours ago
> Rust will only protect me from things my customers don't care about and don't understand.

That's the wrong kind of protection. Rust should protect you from things people other than your customers (who presumably are well behaved) care about.

busterarm 22 hours ago
Disagree.

I think this is a self-delusion experienced by Rustaceans because they overvalue a certain type of software correctness and because Rust gets that right, the rest of its warts are invisible.

orwin 23 hours ago
This... Honestly, from the first lines I thought this was an arrogant take, but he made _really_ good points and now I tend to agree with him.

Still, I am a bit bothered: does a counterargument exist?

ripap 23 hours ago
Worse is Better is Worse: https://www.dreamsongs.com/Files/worse-is-worse.pdf

Same author (name is an anagram).

jes5199 22 hours ago
the name also seems to be a reference to another pseudonym, https://en.wikipedia.org/wiki/Nicolas_Bourbaki
orwin 21 hours ago
I wish I could upvote you more than once. This is a very good counterargument.
samatman 21 hours ago
May as well complete the set: Is Worse Really Better? https://dreamsongs.com/Files/IsWorseReallyBetter.pdf

Also: Nickieben Bourbaki might be an anagram of something, but it is definitely not an anagram of Richard Gabriel, with or without the P. There's no G, there's no h, there's an N and a k, it isn't even particularly close.

That claim is my best interpretation of this sentence:

> Same author (name is an anagram).

Although perhaps it was not your intention to connect the clauses in that way.

gpderetta 23 hours ago
A pseudonym you mean?
jes5199 22 hours ago
both!
AnimalMuppet 23 hours ago
An anagram. Same letters, different order.
gpderetta 21 hours ago
I might be dense today, but how's Nickieben Bourbaki an anagram of Richard P. Gabriel?
kqr 22 hours ago
It is going to come down to context. For the most part, you never know quite what it is you are designing, so an iterative approach to design with fast feedback cycles will get you there quicker. Give people something to play with, see how they use it, and continue from there.

But sometimes you need to know what is you are designing before giving it to people, because there are large risks associated with improvising. In those cases, making The Right Thing is still expensive, but it may reduce the risk of catastrophe.

I think, however, that the latter cases are rarer than most people think. There are ways of safely experimenting even in high-risk domains, and I believe doing so ultimately lowers the risk even more than doing The Right Thing from the start. Because even if we think we can spend years nailing the requirements for something down, there are always things we didn't think of but which operational experience can tell us quickly.

gpderetta 23 hours ago
Empirically, Worse is Better appears to have been correct many, many times.
kayo_20211030 23 hours ago
The New Jersey approach has the benefit of dealing with time in a sensible way.

The time to market for the MIT approach is just too long if your revenue relies on actually shipping a product that covers the cost of the next iteration that will move you from 80% to 90%, or even 60% to 70%. It's an old joke, but in the long run we're all dead; and waiting for the production of an ivory tower implementation won't work out well. If it's all academic, and there's no commercial pressure, well, have at it. There's not much at stake except reputations.

Furthermore, in the real world, your users' requirements and your internal goals, theoretically covered by "the" design, will change. Not everything can be reasonably anticipated. The original design is now deficient, and its implementation, which is taking too long anyway, will be a perfect reflection of its deficiency; and, not fit for its new purpose.

AnimalMuppet 23 hours ago
I'll go further. Even if users' requirements and your goals don't change, you don't adequately understand them. You don't know what your users need perfectly. It's better to fire rapidly and adjust your aim than it is to try to have your first attempt be perfect. (Yes, people take this too far the other way...)

Get something out there and start getting feedback. You won't actually know until you do.

ezekiel68 18 hours ago
Yes and it almost serves as more of a coping mechanism (a thought framework that helps us accept what may not seem to be 'the better') than an airtight philosophical position.
NAHWheatCracker 23 hours ago
One counterargument is that people who claim "worse is better" are often making excuses for why their preferred technology didn't win.

Often in these arguments, worse means "shortcut" and better means "won". The difficulty is proving that not taking the shortcut had some other advantages that are assumed, like in the article.

tialaramex 22 hours ago
Winning is temporary. In 1900 Britain had an empire, for example. Colonialism won, right? The battleship, answering machines, VHS, the compact disc, steam engines.

This "victory" is fleeting. When people people tell you C++, a language which didn't even exist when I was born, is "forever" they are merely betraying the same lack of perspective as when Britain thought its Empire would last forever.

Feathercrown 22 hours ago
I think the counterargument is that something being easy to proliferate doesn't mean that it's good.
bee_rider 23 hours ago
I don’t get the name New Jersey approach, is it just the general association of New Jersey and poor quality? When I think of New Jersey and CS, I think of Princeton, which has a pretty good program IIRC.

Anyway, I wouldn’t put simplicity on the same level as the other things. Simplicity isn’t a virtue in and of itself, simplicity is valuable because it helps all of the other things.

Simplicity helps a bit with consistency, in the sense that you have more trouble doing really bizarre and inconsistent things in a simple design.

Simplicity helps massively with correctness. You can't check things that you don't understand. Personally, that means there's a complexity threshold after which I can't guarantee correctness. This is the main one I object to. Simplicity and correctness simply don't belong in different bullet-points.

Simplicity could be seen as providing completeness. The two ways to produce completeness are to either work for a really long time and make something huge, or reduce scope and make a little complete thing.

It’s all simplicity.

dmansen 23 hours ago
bee_rider 23 hours ago
Oh. Ok, that makes sense.
ezekiel68 18 hours ago
I made the same error when I first read it years ago. It certainly felt like an academic reference.
sebastianconcpt 18 hours ago
To Worse is Better I'd say: careful what you wish for.
Der_Einzige 23 hours ago
I know this is an article about Lisp and the specific usage of this term in the context of software acceptance, but when you use a title that provocative I want to speak specifically about the idea of "Worse is Better" with respect to a more literal idea of "sometimes things get worse over time but you are told they have improved"

For example, why is it that central vacuums are more rare in 2024 than they were in the 1980s, despite them being superior in every way compared to regular ones?

"Worse" vacuums are "better" for the economy? (because Dyson makes jobs and consumes resources?)

AnimalMuppet 23 hours ago
Central vacuums are worse in at least one specific way: Cost of fixing or replacing them when they break.
ezekiel68 18 hours ago
True; but my family and neighbors had them and they seemed to last forever. Maybe they were made so well that the company couldn't gain repeat sales often enough.
floren 16 hours ago
Also, I can take my regular vacuum out into the garage and clean the car.
pjc50 22 hours ago
What the heck is a central vacuum? One plumbed into the house? Isn't that spectacularly expensive?
stonemetal12 22 hours ago
>A main disadvantage of central vacuums is the higher initial cost. In the United States, the average central vacuum system has an installed cost of around $1,000.

https://en.wikipedia.org/wiki/Central_vacuum_cleaner

Considering the price of a house it isn't "spectacularly expensive". On the other hand vs the price of a hoover yeah a bit. Since it sits in a closet or garage and doesn't move, weight becomes a non issue so it can be a real behemoth of a vacuum.

pm215 21 hours ago
Seems like it could be rather expensive to retrofit, compared to cost of putting it in the house to start with (certainly I couldn't imagine getting it retrofitted into my UK house -- the labour alone to dig channels into walls and then make good again and redecorate would dwarf the equipment cost). But the thing about features put in a house when it's built is that they have to meet the economic bar of "do potential buyers value them enough to pay XYZ more than they otherwise would", which can be quite hard. If the average buyer wouldn't value the house at an extra $1000 then it's not worth the builder installing one...
jen20 17 hours ago
I have one (in a ~2yo house). It's a nice idea, but practically the hoses are a pain in the ass, and I still use my Dyson almost exclusively.
kragen 22 hours ago
Yes, one plumbed into the house.
sophacles 20 hours ago
> Isn't that spectacularly expensive?

Not spectacularly expensive when installed as the house is built. It's another run of pvc pipe in the walls before you close them up. Approx a day of labor and some pipe are the added expense - not much in terms of house building cost at all. Hardwood floors are much more expensive - and still need to be swept or vacuumed.

karel-3d 23 hours ago
(1991)
aredox 23 hours ago
And then we wonder why everything gets worse and worse.
st_goliath 23 hours ago
> ... everything gets worse and worse ...

    They're coming to take me away haha
    they're coming to take me away hoho hihi haha
    to the funny farm where code is beautiful all the time ...
-- Napoleon XIV, more or less...

Via: https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/lin...

ta988 23 hours ago
what is getting worse and how?
hammock 23 hours ago
There was an article posted on here[1] a while back that I only just found again, introducing the term "expedience." The idea was that we think we live in a world where people have to have "the best" sweater, be on "the best" social network, drive "the best" car, etc. But when you look at what really WINS, it's not the best, it's the most "expedient" - i.e. sufficiently good, with built-in social proof, inoculated against buyer's remorse, etc.

Is Amazon "the best" place to go shopping? No, you might find better prices on individual items if you put a little more work into it, but it's the most expedient. Is Facebook/Instagram/Tiktok/insert here "the best" social network? No, but it is the most accessible, easy-to-use, useful one. Is a Tesla (perhaps outdated example since X) "the best" car - no, but it is the most expedient.

There is a tangent here that intersects with refinement culture as well. Among the group of society that (subconsciously) cares about these "expedient" choices, you see everyone and everything start to look the same.

[1]https://tinaja.computer/2017/10/13/expedience.html

dkarl 22 hours ago
"Expedient" is a common (or at least not rare) English word that means something like "practical and effective even if not directly attending to higher or deeper considerations."

For example, if two students in a class are having frequent confrontations that bring learning in the class to a halt, and attempts by teachers and counselors to address their conflict directly haven't been effective, the expedient solution might be to place them in separate classes. The "right thing" would be to address the problem on the social and emotional level, but if continued efforts to do so is likely to result in continued disruption to the students' education, it might be better to separate them. "Expedient" acknowledges the trade-off, while emphasizing the positive outcome.

Often a course of action is described as "expedient" when it seems to dodge an issue of morality or virtue. For example, if we solve climate change with geoengineering instead of by addressing thoughtless consumerism, corporate impunity, and lack of international accountability, many people would feel frustrated or let down by the solution because it would solve the problem without addressing the moral shortcomings that led to the problem. The word expedient stresses the positive side of this, the effectiveness and practicality of the solution, while acknowledging that it leaves other, perhaps deeper issues unaddressed.

circlefavshape 20 hours ago
> For example, if we solve climate change with geoengineering instead of by addressing thoughtless consumerism, corporate impunity, and lack of international accountability, many people would feel frustrated or let down by the solution because it would solve the problem without addressing the moral shortcomings that led to the problem.

Oof. Now I understand something I didn't before

MikeTheGreat 7 hours ago
Are you sure? No offence, but I don't think there's anything to understand here.

If we could solve climate change without "addressing thoughtless consumerism, corporate impunity, and lack of international accountability" we would all be f'ing _thrilled_.

As I type this, Hurricane Helene just destroyed a good chunk of inland North Carolina (!!!) and Hurricane Milton was just upgraded to a "category 5" storm.

If we could solve climate change the easy way we'd all be _thrilled_, because then we'd actually solve climate change.

inglor_cz 6 hours ago
"we would all be f'ing _thrilled_."

I think the argument is pretty much the opposite: not everyone would be thrilled. The wannabe priest class (think of Greta and her How dare you!), which is always with us, would be frustrated by the lack of something to preach about.

Of course there is always Israel vs. Palestine.

dmbche 2 hours ago
I highly recommend the feeling of living grass on the palm of your hand, my friend. Or the kiss of the sun's rays.
tovej 5 hours ago
I myself am convinced that man-made global warming can't be stopped without changing the way our political economies work, and I happen to believe that Palestinians are people and therefore deserve basic human rights (which they do not currently have). I'm also a researcher.

I _would_ be thrilled if there were an expedient way to solve climate change. But the best systems models for the earth all tell us that there's one way to solve the issue: just leave the carbon in the ground. That's it. Stop extracting it. Nothing else will solve the problem; it's a really simple, really bad feedback loop.

This characterization of environmental or Palestinian activists as wanting to have the moral high ground is, imo, a knee-jerk reaction. The people on the street aren't in it for clout; they're doing it because it is the right thing and they feel compelled to act. What gets me moving is not a desire to feel morally superior (a religious aspect, more at home in right-wing politics), but anxiety for the future, which is projected to include horrible death and suffering due to obvious problems that we could all fix if we just decided to recognize them.

inglor_cz 5 hours ago
The motivations of regular participants vs. leaders may be rather different. Only a specific type of person is attracted to leading crowds.
foldr 2 hours ago
You can apply the same cynicism to any political or protest movement. Not sure that it tells us very much.
inglor_cz 2 hours ago
I agree with your first sentence, only I would replace the word "cynicism" with "skepticism".

And I think it is actually useful. People will try to manipulate other people through emotions, and mobs are easy to manipulate. One should have fairly high barriers before joining a street mob, because its potential destructive power is enormous, and it also tends to elevate unsavory characters to positions of power.

I am not saying that those barriers should be infinitely high, but fairly high.

For us humans, it is easy to succumb to "righteousness in numbers".

dmbche 2 hours ago
So free association is less than ideal is what you're getting at?
inglor_cz 1 hour ago
If you abstract away enough, you will always get to "X is less than ideal".

Food is less than ideal, war is less than ideal, death is less than ideal, HN is less than ideal.

Are you satisfied with this sort of Twitter-like posting and thinking? I am not.

Pixels are basically free and we should strive to post more than one-sentence snarks. For one-sentence snarks and drive-by dismissals, Reddit is the ideal territory.

dmbche 1 hour ago
Indeed, you seem to enjoy long rambly paragraphs, which you are entitled to.

The time and attention of your fellows is valuable and merits some thought before writing. Conciseness and clarity are more valuable than the number of pixels used to type a sentence.

Good luck going forward

Edit0: And no, most would agree that free association is an ideal of the human condition - you're welcome to disagree. Feel free to chat with a lawyer.

foldr 1 hour ago
If innocent people being murdered or the threat of an imminent environmental catastrophe don't meet your 'high barriers', then nothing will. So though you claim in principle to approve of some protests, what you're saying in practice is that no-one should protest against anything because they'll probably just make things worse – because people in general are fairly awful and people who take charge of things are even worse. It's impossible to argue against this kind of cynicism as it's self-reinforcing, but it doesn't strike me as an interesting or insightful position to take. Especially when painting in broad brushstrokes rather than addressing issues with particular political or protest movements (which no doubt are not beyond criticism).

It's also important to weigh the harmful effects of apathy in the balance. These are easily forgotten but almost inestimably enormous. Just think of all the damage done in the decades (centuries) where hardly anyone could be bothered to protest against slavery, women's oppression, racial segregation, pollution, etc. etc.

inglor_cz 1 hour ago
War is often more complicated than "innocent people being murdered" and we both know it. The Israeli-Palestinian conflict isn't morally black and white, and the current Israeli-Hezbollah conflict is something else entirely.

I think you may be proving my point. Taking one side of a complicated situation because of a black-and-white moralistic thinking is potentially destructive, and organizations like Hamas benefit from that.

As for your slavery example, did slavery disappear because humanity awakened morally and started demonstrating in the streets, or because we gained a new non-human resource of raw power? Previous civilizations didn't engage in slavery because they were profoundly immoral, but because human and animal muscle was the only practical source of power. The specifics varied across the globe, but unfree labor was ubiquitous in pre-modern societies.

For a contemporary situation, imagine a 22nd century activist judging people of 2024 for eating meat from dead animals, when he can get a good steak by pressing a button on a steak-making machine. It wouldn't be demonstrations which made the difference between 2024 and 2124.

foldr 48 minutes ago
Protestors are protesting against things that they think are seriously wrong. What you think about the Israel-Palestine conflict or the history of the abolition of slavery is completely irrelevant to their decision whether or not to protest about something. (But err, yes, popular anti-slavery movements played an important role in the abolition of slavery. The Haitian revolution didn't happen because we 'gained a new non-human resource of raw power'.)
inglor_cz 28 minutes ago
"Protestors are protesting against things that they think are seriously wrong."

OK, but that was sort-of my point. The more outrage, the less you need to really think about things.

"err, yes, popular anti-slavery movements played an important role in the abolition of slavery"

That is a chicken-and-egg question. Why did those mass movements only emerge at the time of the Industrial Revolution, and why did they emerge first in places that were influenced by the Industrial Revolution the earliest, while other places (Russia, the Ottoman Empire, the Qing Empire) only followed suit after their own industrialization began?

I don't think the arrow of causality is so simple here. A hypothetical society that abolished slavery, serfdom etc. in the 15th century could easily prove non-viable against its slavery-powered foes, which had more brute force at their disposal. By 1820, the situation was very much turning around and it was the modern, personally freer societies that were more effective in commerce and at war.

Notably, even though Victorian Britain was very anti-slavery, starting with the monarch herself, it had no moral qualms against subjugating a quarter of humanity in another form of submission. Which tells me that it was less about morality (equality) and more about practicality of the situation.

foldr 0 minutes ago
Everyone agrees that innocent people are being killed in the Israel-Palestine conflict and that this is an outrage. The disagreement is over exactly which people fall into this category and who is to blame. Acknowledging the horror and being outraged by it does not preclude thought, and it is ungenerous and inaccurate (and, indeed, cynical!) to characterize all protests about the conflict as thoughtless.

Your take on slavery is pretty wild. The Industrial Revolution did not replace Haitian slaves with machines for harvesting sugar cane. Nor did Spartacus invent the steam engine.

bsenftner 3 hours ago
And I'm of the opinion that we'll not change our political economies without a material societal adjustment: a threshold recognition separating emotional reasoning from a more controlled, rational level of reasoning. Basically maturity; we've got a material volume of immature adults who derail any and all public and many private conversations with immature observations, short-sighted reasoning, and the belief that their unprofessional opinion carries weight with those whose careers are the issue at hand. Until we do something about these, frankly, idiots, we're left adrift in a culture of unintended chaos.
clarity20 6 hours ago
I don't see why climate change needs "solving" per se, or how it can be "solved." To take your example, there have always been hurricanes. It's not correct to infer there's a human-induced tendency toward destruction that can be reversed by humans, or that yet another hurricane is actually a change to the climate in the first place.
moomin 6 hours ago
It isn’t correct to infer that from the fact that hurricanes exist, no.

Nor is it correct to ignore the decades of peer-reviewed research that concludes that we really are causing more hurricanes on the basis that hurricanes have always existed.

hoseja 4 hours ago
It's the meme:

A:"Only Global Communism can solve Climate Change."

B:"Nuclear power also solves climate change."

A:"I don't want to solve Climate Change, I want Global Communism."

downWidOutaFite 17 hours ago
No you don't because there is no expedient solution.
hnben 2 hours ago
non-native speaker here

> "Expedient" is a common (or at least not rare) English word that means something like "practical and effective even if not directly attending to higher or deeper considerations."

I was not aware of the word "expedient" before. From your example I conclude that it has the same meaning as "pragmatic", i.e. if I sed s/expedient/pragmatic/g then your comment still makes perfect sense to me. A quick Google search also seems to support this conclusion.

--> is there a nuance of the word "expedient", that I am missing here?

dkarl 52 minutes ago
There's a lot of overlap, and you could certainly use "pragmatic" in the two example contexts I gave. The differences as best as I can sum them up are:

- "Pragmatic" can also be used to describe a person (a pragmatic person) or a mindset (a pragmatic approach to a problem.) "Expedient" isn't used to describe people.

- "Expedient" usually acknowledges the existence of a higher or more demanding standard that the solution does not meet, admitting that the solution is not perfect. You might choose a word like "pragmatic" to praise a solution with known shortcomings, but it doesn't imply known shortcomings as strongly as "expedient" does.

- "Expedient" can be used euphemistically. ("Pragmatic" can, too, but not nearly as often, and not as harshly.) "They took the expedient route" might, depending on the context, mean that they did something lazy or unethical because it was easier. The euphemistic usage is common enough that for some people it has an overall unsavory flavor, but I don't think it's tipped over into the euphemistic usage being the assumed one.

raincole 13 hours ago
> if we solve climate change with geoengineering instead of by addressing thoughtless consumerism, corporate impunity, and lack of international accountability, many people would feel frustrated or let down

If some people feel frustrated or let down because we achieve a literal miracle (by today's technology standards) that saves millions of lives I'm willing to call them mentally unhinged.

computably 12 hours ago
Presuming said miracle is possible, geoengineering at a scale capable of "solving" climate change would be a massive gamble. Humanity's ability to model the climate simply isn't at a point where we could say with any confidence what the long-term effects of any particular geoengineering "solution" would be, and short of an abrupt technological singularity, won't be for centuries. Without any appeal to moral judgments whatsoever, it's safe to say geoengineering would only be seriously attempted out of total desperation and signal persistent unresolved issues.
reshlo 4 hours ago
If we could solve climate change purely technologically, I wouldn’t feel let down by the fact that we solved climate change. I would absolutely support solving it in that way.

I would also feel frustrated by the knowledge that there were many people who were willing to sacrifice unimaginable numbers of humans and animals for the sake of making more profit for themselves, who were not held to account for their actions. If a person acts in a way that they should know will lead to future suffering, the development of an unforeseen technological solution to that suffering should not wipe their moral slate clean.

Trying to kill someone using a non-functional weapon, that you believe is functional, is not morally equivalent to taking no action just because it didn’t have an effect.

KerrAvon 18 hours ago
I think this is misleading -- geoengineering + social change will be necessary. You're not going to Scotty your way out of climate change.
ambicapter 18 hours ago
What's misleading here? He was using that as an example of usage of the word 'expedient', not actually suggesting climate change solutions.
samplatt 11 hours ago
A lot of people in this thread seem to think OP was suggesting that geoengineering is a possible way forward right now, when they were just positing it as an example.
groestl 9 hours ago
Since OP provided a different example, an expedient solution would be to edit OPs post and remove that paragraph about geoengineering, and delete all comments referring to it.
CM30 16 hours ago
The technical solution is the only practical solution. People aren't gonna give up a large percentage of their lifestyle for the sake of some greater 'good', especially not if their leaders and influencers seem to have zero interest in doing the same.

And anyone trying otherwise will struggle significantly at the polls. Mass carbon removal, renewable energy, recycling and maybe some technological solutions to limit the effects of atmospheric carbon seem like the more practical way to go.

adastra22 17 hours ago
Why not? Really, why not? If we had a profitable way to extract CO2 from the atmosphere at scale, to deacidify the oceans, clean up toxic waste, etc., what would be left? How would that not solve the problem?
Hasu 15 hours ago
No technology has been invented that doesn't have costs and tradeoffs. Technology that deacidifies the oceans will have other costs, other externalities that we cannot predict now. Determining how we want to deal with those costs/tradeoffs is a social problem, not a technical problem. Technical know-how can only inform us about what tradeoffs are available; it can't tell us what we prefer.
adastra22 14 hours ago
The ion pumps enabled by the technology we are working on won’t have external effects. They basically just filter out certain small molecules from the ocean into crystal storage.
thfuran 14 hours ago
What kind of flow rate would be necessary to pull CO2 out of the ocean faster than it dissolves from the air, and is that achievable without affecting the surroundings?
adastra22 13 hours ago
CO2 wouldn’t be pulled out of the ocean. You’d have to liquify it from air in a different process. The molecular pumps are used to extract dissolved ions or solutes from sea water and only act as fast as diffusion.
thfuran 9 hours ago
So this is desalination? That seems unrelated.
adastra22 3 hours ago
I listed cleaning up toxic waste and co2 sequestration as separate things, yes.
einpoklum 5 hours ago
Because it's a distraction.

Global warming and other environmental crises are unfolding right now. While exploring possible future technological advances which would make coping with it easier is certainly a positive and useful pursuit - it cannot be the _main_ pursuit when facing those crises and challenges. That is:

1. We should not divert the discussion from present to possible fortuitous futures.

2. We must not confuse action with prospects.

3. We must not think of the two as "either-or". We can reduce emissions _and_ do R&D for new tech possibilities.

adastra22 3 hours ago
There are plenty of examples of technological advances that in one fell swoop entirely eliminate a class of societal problems. Haber-Bosch, for example, completely eliminated famine as a barrier to world populations growth. Penicillin and vaccines eliminated entire categories of terminal disease.

We don’t NEED to reduce emissions. So long as we clean up as much or more than we pollute, what’s the problem?

downWidOutaFite 17 hours ago
why are you fantasizing? none of that is going exist (caveat: unless we figure out infinite clean energy).
adastra22 17 hours ago
It’s not fantasy. I’m working on a startup to enable this technology. It’s a hard problem, but not impossible. And the energy needs are not as great as you think. A relatively small patch of the Sahara or Gobi desert or open ocean would be sufficient.
Log_out_ 9 hours ago
It's indistinguishable from fantasy, extrapolating from the given track record. Previous fantasy technology breakthroughs like Haber-Bosch were viable only "after" major disasters and world wars, and brought with them ever more destructive hidden costs.
adastra22 3 hours ago
Haber-Bosch predates the world wars.
downWidOutaFite 9 hours ago
You might need to panel over the whole Sahara: https://www.wired.com/story/the-stupendous-energy-cost-of-di...
adastra22 3 hours ago
Much less than the whole Sahara.
capitainenemo 13 hours ago
sun shields? There's some testing of those going on right now. You wouldn't need a huge reduction in sunlight either. Not enough to impact plants.
hinkley 18 hours ago
See also: war crimes
lisper 20 hours ago
> Is Amazon "the best" place to go shopping? No, you might find better prices on individual items if you put a little more work into it, but it's the most expedient.

It's not just that. Every time you do business with a new web site you assume additional risk. Amazon is a known quantity. You can be pretty sure that they are not going to outright scam you, and they aren't going to be hacked by script kiddies. There is a significant risk of getting a counterfeit item, but they have a very liberal return policy, so the real cost to you in this case is a minute or two to get a return code and possibly a trip to the nearest Whole Foods to drop it off.

Amazon sucks in many ways, but at least their suckage is a known quantity. Predictability has significant value.

katbyte 17 hours ago
Yep it’s the return policy that allowed me to gamble on items that may or may not be real/functional vs spending a ton of time to find one elsewhere that maybe is, but if it’s not it’ll be hard to return
d0mine 17 hours ago
There is also "satisficing" (vs. maximizing).

Your model of the world is not perfect, so instead of trying to find a globally optimal solution, you are satisfied with a local optimum that exceeds some threshold, i.e. one that suffices. https://en.wikipedia.org/wiki/Satisficing

hammock 16 hours ago
Love that. Well-done marketing* can orient a consumer into a preferred "local optimum territory" , leading to satisfiction(?) and sales

* For example, the limited selection of candy at the checkout aisle. All you have to do is get your brand in there. (Placement on the 4P's)

* Or, "best credit card for travelers." By offering travel rewards, you can acquire a group of cardmembers even if, e.g. a more valuable cashback card could have gotten them even greater benefits (Promotion on the 4P's)

rpastuszak 2 hours ago
WalterBright 19 hours ago
> you might find better prices on individual items if you put a little more work into it

That extra work costs you money, too. Calculate how much your job pays you per hour, then you can deduce the $cost of spending more time to get a better deal.

Gigachad 12 hours ago
This is a fairly crappy methodology though because the vast majority of people are not substituting paid working time with researching buying things online. So it hasn't "cost" them anything other than their free time which is far more complex than an hourly rate. Maybe they enjoy researching products and in that situation, it wasn't a waste of time at all.
ryandrake 11 hours ago
Exactly. I always hate these "your time is worth your hourly wage" arguments. They're often used to argue against things like changing the oil in your car, fixing a clogged drain, or DIYing anything.

Your time is only worth money if you'd otherwise be working at that rate, which is not the case for the vast majority of humans.

WalterBright 9 hours ago
Saving money is the same thing as earning it. Are you better off spending an hour to save $100 or an hour to save $10?

I once spent 2 hours negotiating the purchase of a car. It saved $5000. That works out to $2500 an hour. Was it worth it? Hell ya!

I've also worked hourly jobs in the past. There were often opportunities to work more hours or overtime. People often have side hustles, too.

contagiousflow 19 hours ago
Are you removing paid working time in doing the extra work? If not, it is just an opportunity cost.
WalterBright 18 hours ago
In my line of work, yes.
onlyrealcuzzo 22 hours ago
> Is a Tesla (perhaps outdated example since X) "the best" car - no, but it is the most expedient.

The most expedient car? Or BEV in the US?

hammock 22 hours ago
Fair point... it was a coastal California-centric point but there is plenty of nuance or adjustment to be made. At some point in the 90's we'd probably have said "Silver E class Mercedes" is the most expedient luxury sedan, if you wanted a different example.
UniverseHacker 18 hours ago
The early to mid 90s E class aka W124 definitely went down in history as one of the best quality and best designed cars ever made. Other luxury cars may be faster, more fun, or have more features… but the W124 probably is “the best” if you’re looking just at build quality and well thought out design details.

As a car nerd though, I never felt the need to buy one because they just seem fairly boring- other than a few rare models most were 4 door sedans with automatics, fairly small engines, and soft non sporty suspension.

adastra22 17 hours ago
I think Toyota would have gotten your point across better. Tesla is most certainly not expedient. It is a luxury purchase.
foobazgt 17 hours ago
It's definitely in a class above Toyota, but once you account for gas savings, the LR RWD costs about as much as the cheapest Corolla you can buy. People bagged on Tesla being too expensive when their $35K car was only available over the phone. Now, adjusted for inflation, the LR RWD is $28K and comes with 100mi additional range to boot. On top of that, it's $13K below the average car purchase price.

IMO, it destroys its competitors in the value market, and the media is being awfully silent about it. I guess it's far too easy to focus on Elon instead.

adastra22 14 hours ago
Really depends on where you live and what your electric rates are. This is my breakdown (the Corolla comes out ahead as cheaper to operate): https://www.reddit.com/r/TeslaModel3/comments/14rj3fp/tesla_...
7 hours ago
marxisttemp 16 hours ago
“In a class above Toyota” in what sense? Certainly not in reliability or interior quality or CarPlay compatibility…
KerrAvon 18 hours ago
Neither. Going by the parent poster's gauge, the most "expedient" is probably Lucid; better engineering, better range, and better service.
KerrAvon 18 hours ago
but it's really not what "expedient" means
b3ing 14 hours ago
I might argue it's the one most known by the most people; the "best" takes time to get there. Google was better than Yahoo, but it took years to become #1 in terms of hits.
jprete 22 hours ago
Related - thinking takes a lot of energy, so people prefer options that are cheap to evaluate. This definitely contributes to the preference for expedient options.
21 hours ago
RcouF1uZ4gsC 17 hours ago
>Is Amazon "the best" place to go shopping?

The number one reason I use Amazon, is not for the best prices, but because of their return policy. Amazon returns are actually often more painless than physical store returns.

Being able to return something predictably and easily outweighs a small difference in price.

AnimalMuppet 22 hours ago
If you include the cost of gathering information, the expedient solution may in fact be the best.
aulin 10 hours ago
> inoculated of buyer's remorse

Non native here. What's the meaning of inoculated here?

It's not the first time that I've struggled to parse this word. In Italian it keeps the original Latin meaning and can be translated as "injected with". You could inoculate a vaccine but you could also inoculate a poison, so it does not carry the immunity meaning by default. English (US?), as far as I can tell, uses it as a synonym of "immune"; is that so?

pmg101 7 hours ago
It should be "inoculated against" but the meaning is clear.

A vaccine inoculates you against a disease by a physical mechanism: that is, it prevents you from getting that disease (to a greater or lesser degree).

Metaphorically being inoculated against something means it can no longer hurt you. For instance, maybe by not owning a car you're inoculated against vehicle depreciation. Or by wearing the same simple but quality outfit every day you're inoculated against the vagaries of fashion.

avidiax 7 hours ago
I think the author means "vaccinated". They mean that they've been made resistant to buyer's remorse.
2 hours ago
NAHWheatCracker 23 hours ago
I'll never understand the obsession with LISP. My guess is it just appeals to a certain type of person, sort of academic in my view. I'm not convinced that LISP was ever the-right-thing. The author didn't express anything about LISP vs C except to assert that C was a 50% solution and LISP was better.

I agree though that for practical purposes, practical solutions are just going to be more successful.

pjc50 23 hours ago
Over the years I've developed what I call the "lefthanded scissors" analogy: people assume that everyone's mind is wired the same way and that all good programmers are good in the same way and think in the same way, but what if that's not true? What if different people have a predisposition (like lefthandedness) to prefer different tools?

Then a righthanded person picks up the lefthanded scissors and deems them weird and uncomfortable. Which they are .. for the right hand of a right-handed person.

Other popular examples of such taste controversy are Python's semantic whitespace, the idiosyncrasies of Perl, the very unusual shape of J/APL, and anyone using FORTH for non-trivial purposes.

edit: https://news.ycombinator.com/item?id=41766753 comment about "other people's Lisp" reminds me, that working as a solo genius dev on your own from-scratch code and working in a team inside a large organization on legacy code are very different experiences, and the "inflexibility" of some languages can be a benefit to the latter.

eikenberry 18 hours ago
I've always preferred to use classical "fine" artists as a metaphor for development language preferences. Artists don't just pick up any medium at a whim; most have specific mediums that they excel in and ignore the others. For example, Van Gogh produced his most famous works in oil paints. He had worked in other mediums, but his talent definitely seemed to shine when working in oils, and when he first found oils he took to them quickly and adopted them as his primary medium.

Most people definitely seem to have specific preferences built into their talents. I don't see why programming or programming languages would be any different from any other medium or art form.

Jach 16 hours ago
In baseball and other sports, pros and cons of different styles are readily and honestly talked about, even when coaches have a bias. See e.g. https://blog.paddlepalace.com/2014/01/coaching-tip-playing-t... for table tennis. Few comparable articles in programming exist; either people can't conceive of other styles or contexts, or just want to talk about the superiority of their bias regardless of context. If downsides are mentioned at all it's often not about the preferred thing, but some deficiency of something else.

A commonly made up deficiency attributed to Lisp is that it's particularly bad at large scale, either in teams or program size. That would surely be surprising news to the teams doing stuff today or in the past, some responsible for multi-million lines of code systems (some still in operation). Or to use an old example, the documentation for Symbolics Computers, pictured here in book form: https://www.thejach.com/public/symbolics-books-EugyAAEXUAUG_... Such a set of books doesn't come from a "lone wolves only" ecosystem and heritage. Not to mention doc and so on not shown for applications they made for 3D graphics or document editing (https://youtube.com/watch?v=ud0HhzAK30w)

arethuza 22 hours ago
I've seen Common Lisp source code that I didn't even recognise as Common Lisp because of over-enthusiastic use of reader macros...

Edit: I should also mention that once I worked out how reader macros worked I went on to enthusiastically (ab)use them for my own ends...

blenderob 21 hours ago
If there has been over-enthusiastic use of reader macros, I think we will have to admit that the over-enthusiastic developer is no longer writing Common Lisp code.

Those reader macros have morphed the language into a new bespoke language. So it is then natural for a new developer to face a steep learning curve to learn that new bespoke language before they can make sense of the code.

I'm not condoning overuse of macros. I'd hate to be working with such code too. I'm only stating that Common Lisp is that language that can be programmed to become a different language. They call it a "programmable programming language" for a reason.
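
For concreteness, here is a minimal Common Lisp sketch (a hypothetical reader macro, not taken from any real codebase) of how little it takes to change the surface syntax itself: once it is installed, [1 2 (+ 1 2)] reads as (list 1 2 (+ 1 2)), so the code a newcomer sees is no longer plain Common Lisp.

    ;; Hypothetical reader macro: make [a b c] read as (list a b c).
    (defun read-bracket-list (stream char)
      (declare (ignore char))
      (cons 'list (read-delimited-list #\] stream t)))

    (set-macro-character #\[ #'read-bracket-list)
    (set-macro-character #\] (get-macro-character #\)))

    ;; Now [1 2 (+ 1 2)] reads as (list 1 2 (+ 1 2)) and evaluates to (1 2 3).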

sceptic123 21 hours ago
I would say the analogy fails because it's not about discomfort or weirdness (or taste); left-handed scissors just don't work if you try to use them right-handed. But right-handed scissors don't work if you try to use them left-handed either.

So it's not about taste or preference: left-handed people learn how to use right-handed scissors, but they can also use left-handed scissors in a way that a right-handed person would struggle to.

All that said, the analogy still works because most people don't understand why it doesn't work.

kayodelycaon 21 hours ago
> What if that's not true?

It is definitely not true. The existence of neurodivergence is proof enough.

I can visualize an entire application in my head like it is a forest. I can fly around and see various parts and how they fit together. I am unable to juggle the tokens necessary to do logic and basic math in my head, but I have no trouble reading a graph and understanding the relationships between numbers.

No matter how many times I try to read it, Lisp is completely inscrutable to me. It requires the same token juggling math does.

kqr 23 hours ago
> I'm not convinced that LISP was ever the-right-thing.

Remember that this has to be read in historical context. At the time C was invented, things like garbage collection, message-passing object-orientation, generics, rich sets of conditionals, first-class functions, etc. were brand spanking new. They were The Right Thing to do (even judged in the harsh light of hindsight), but also quite complicated to implement – so much so that the New Jersey people skipped right past most of it.

Today these things are par for the course. At the time, they were the The Right Thing that made the system correct but complex, and had adoption penalties. As time passes, the bar for The Right Thing shifts, and today, it would probably not be embodied by Lisp, but maybe by something like Haskell or Rust?

kragen 22 hours ago
This paper was written in 01991. Garbage collection is from 01959. Message-passing (aka object-orientation) is from 01972. C is from 01973. If by "generics" you mean parametric polymorphism, those were added to Ada and C++ in the mid-80s, after having been invented in ML in 01973; in a sense they're an attempt to bring the virtues of dynamically-typed languages like Lisp to statically-typed languages. Even today, I don't think there's a Common Lisp implementation with any kind of parametrically-polymorphic static type system, not even as an optimization. First-class functions are also from 01959, or, arguably from 01936, in the untyped λ-calculus, or 01920, in combinatory logic.

Some of these things were brand spanking new in 01973, but none were in 01991.

There were Lisp systems for minicomputers like the PDP-11; BSD included one (which I think was ancestral to Franz Lisp) and XLISP ran on CP/M. And of course Smalltalk was developed almost entirely on PDP-11-sized minicomputers. But to my recollection virtually all "serious" software for microcomputers and minicomputers was written in low-level languages like assembly or C into the late 80s, not even PL/M—though Pascal did start to win in the late 80s, in significant part by adopting C's low-level features. Nowadays, microcomputers are big enough and fast enough that Lisp, Haskell, or even Rust is viable.

I don't think "the right thing" is mostly about what features your system has. I think it has more to do with designing those features to work predictably and compose effectively.

Jach 15 hours ago
What do you think of Coalton?
kragen 14 hours ago
I've never tried it! It does look like it might be the thing I was saying doesn't exist, though :-)
anthk 21 hours ago
Ever used SBCL?
kragen 20 hours ago
I love SBCL. Does Python optimize with parametric polymorphism now?
samatman 21 hours ago
> > At the time C was invented

> C is from 01973.

> Some of these things were brand spanking new in 01973

...right. The Rise of Worse is Better is a memoir, it's set in the past.

pjc50 23 hours ago
> things like garbage collection, message-passing object-orientation, generics, rich sets of conditionals, first-class functions, etc. were brand spanking new. They were The Right Thing to do

I quite like this view, because these things have clearly been copied everywhere such as my language of choice C#, but the one thing that nobody copied is the one that Lisp programmers rave about most: homoiconicity (brackets everywhere).

pjmlp 22 hours ago
Lisp-2, Prolog, Dylan, Erlang, Julia, R....

Are all homoiconic without being full of parentheses all over the place; the actual meaning is code and data being interchangeable.

BoingBoomTschak 20 hours ago
I think following what's written in Wikipedia (https://en.wikipedia.org/wiki/Homoiconicity#Implementation_m...) and only using the adjective for stuff like Lisp, Tcl, Rebol is better than diluting its meaning to the point where it applies to any language with tree-sitter bindings and a parser.
kragen 20 hours ago
Prolog, Dylan, Julia, and R are stuff like Lisp, Tcl, and Rebol. I don't know about Erlang, and I don't know what pjmlp means by "Lisp-2", which I normally interpret as meaning a Lisp with different namespaces for functions and variables, following Steele's terminology.
susam 19 hours ago
> ... I don't know what pjmlp means by "Lisp-2", which I normally interpret as meaning a Lisp with different namespaces for functions and variables ...

There indeed was a language named LISP 2: https://dl.acm.org/doi/pdf/10.1145/1464291.1464362

I posted it on HN two weeks ago but it didn't get much traction: https://news.ycombinator.com/item?id=41640147

kragen 18 hours ago
Aha, thanks!
19 hours ago
BoingBoomTschak 19 hours ago
Looking at https://docs.julialang.org/en/v1/manual/metaprogramming/, I have a hard time considering it "fully" homoiconic, as the parsed data structure Expr(:call, :+, :a, Expr(:call, :*, :b, :c), 1) isn't the same representation as the code itself, even having some new elements like :call.

How is this different from tree-sitter, except that you can feed the modified code back to the language to evaluate?

I mean, don't get me wrong, Julia metaprogramming seems lovely, but it just seems to me that the word loses meaning if it can be applied to all languages with AST macros, no matter how gnarly the AST data structure is (is Rust homoiconic because of proc_macro?).

kragen 17 hours ago
First, I want to point out that I've never written a single line of Julia code, so I'm no expert on this. I could easily be wrong about some of what I just found out by reading a small amount of Julia documentation.

However, that's not a parsed data structure; it's a Julia expression to construct one. Though I don't have Julia installed, apparently you can just as well write that as :(a + b*c + 1), as explained a few paragraphs further down the page than I guess you read: https://docs.julialang.org/en/v1/manual/metaprogramming/#Quo.... That's also how Julia represents it for output, by default. What you've written there is the equivalent of Lisp (list '+ 'a (list '* 'b 'c) 1), or perhaps (cons '+ (cons 'a (cons (cons '* (cons 'b (cons 'c '()))) (cons 1 '())))). The data structure is as simple as Prolog's, consisting of a .head such as :call and a list of .args. Prolog's, in turn, is only slightly more complex than Lisp's. From a certain point of view, Prolog's structure is actually simpler than Lisp's, but it's arguable.

How this is different from having a tree-sitter parser is that it's trivially easy to construct and analyze such structures, and not just at compile time.

Possibly the source of your confusion is the syntactic sugar which converts x + y + z into what we'd write in Lisp as (+ x y z), and also converts back? I would argue that such syntactic sugar is precisely what you want for constructing and analyzing expressions. That is, it's part of what makes Julia homoiconic in a more useful sense than J. Random Language equipped with a parser for its own syntax.
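
To make the Lisp side of the comparison concrete, here is a minimal Common Lisp sketch (hypothetical form and values, mirroring the Julia example above): the quoted code is just nested lists, so you can pick it apart and rebuild it with ordinary list functions and then hand the result to EVAL.

    ;; The quoted form has the same structure as (list '+ 'a (list '* 'b 'c) 1).
    (defparameter *expr* '(+ a (* b c) 1))

    (first *expr*)   ; => +        (the operator, as a symbol)
    (third *expr*)   ; => (* B C)  (an ordinary sublist)

    ;; Substitute values for the variables, then evaluate the rebuilt form.
    (eval (sublis '((a . 10) (b . 2) (c . 3)) *expr*))   ; => 17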

pjmlp 22 hours ago
Also, Lisp came out in 1958, ESPOL/NEWP in 1961, and PL/I and its derivatives, all predating C by a decade.

Had AT&T been allowed to charge real money for UNIX, Worse is Better would never have happened.

linguae 22 hours ago
Counterpoint: Some other “worse” would have probably taken over. We’ve seen this in the personal computing market, where MS-DOS (later Windows) and the x86 won out against competing operating systems and ISAs not because of technical superiority, but because of market forces and cost. Look at the evolution of the Web, especially JavaScript....

It's likely that Unix might not have spread if AT&T hadn't given it relatively liberal licensing in the pre-divestiture era. But it's also likely that something else that was cheap and readily adaptable would've taken over instead, even if it wasn't technically superior to its competitors.

pjmlp 22 hours ago
Or maybe we would be enjoying VMS or something like that, not written in a language that, 60 years after its inception, still doesn't do bounds checking and decays arrays into pointers to save typing four characters.
kragen 21 hours ago
BLISS-11 didn't do bounds-checking either. VMS was my favorite operating system until I got access to a Unix. People chose Unix because it was better—and not just by a little. Handicapping Unix wouldn't have improved the alternative systems.
pjmlp 21 hours ago
VMS also supported Pascal and BASIC dialects for systems programming, on an equal footing with BLISS.

People choose free beer; it always goes down, even if warm.

kragen 20 hours ago
VAX Pascal is viable for systems programming (though standard Pascal wasn't). I only ever used VAX BASIC a tiny bit; was it interpreted like microcomputer BASICs? That made BASIC a nonstarter for "serious software" in the 80s. Not having records, local variables, or subroutine parameters was also a pretty big problem, but maybe VAX BASIC fixed that.

I wasn't paying for access to VMS either; it just didn't hold a candle to Unix.

pjmlp 19 hours ago
No one really uses standard C for systems programming, yet that yardstick is always applied to other languages.

There is always Assembly, compiler extensions, or OS specific APIs, as part of the deliverable.

Funny how UNIX folks always have this two-weights-two-measures approach.

I beg to differ in wax quality for candles, but if we get to do juice with bitter lemons, so be it.

At least it is refreshing during summer.

kragen 18 hours ago
Do you know if VAX BASIC fixed the problems I mentioned in BASIC? Was it interpreted?

K&R C has separate compilation, pointer casting, a usable but janky and bug-prone variable-length string type, variadic functions, static variables, literal data of array and record types, bitwise operations, a filesystem access API, and an array iterator type. None of those require "assembly, compiler extensions, or OS specific APIs." (Well, I'm not sure if they specified variadic functions in the book. The first ANSI C did.)

Jensen & Wirth Pascal has none of those, making it completely inadequate for anything beyond classroom use, which was what it was designed for.

Each Pascal implementation that was used for systems programming did of course add all of these features, but many of them added them in incompatible ways, with the result that, for example, TeX had to implement its own string type (in WEB).

pjmlp 7 hours ago
VAX BASIC was always compiled, it was its predecessors that were not, bitsavers or wikipedia have the background info.

Take all the Assembly routines from libc, and K&R C turns into a macro assembler with nicer syntax. And not a good one, given that real macro assemblers actually have better macro capabilities, alongside their high level constructs.

Quite visible in the C dialects that were actually available outside UNIX, on computers people could afford, like RatC (made available via "A Book on C") and Small-C (DDJ article series).

Well, even the dumbest standard Pascal compilers, like GPC, allow calling into Assembly. So it should count for Pascal as well, if that is the measure.

Then we have this thing of sticking with ISO Pascal and its dialects, always ignoring that this was seen as an issue, and that this is why Modula-2 has existed since 1978, and Extended Pascal since 1991, one year after ANSI C (C89 got a short C90 revision fix).

Also following the K&R C alongside Assembly line, several companies were quite successful with Pascal dialect alongside Assembly, including a famous fruit company.

Back in 2024, C extensions keep being celebrated to the point that the most famous UNIX clone can only be compiled with a specific compiler, and a second alternative only exists because a search company burned lots of money making it possible.

But hey, let's stick to Pascal and its dialects.

anthk 20 hours ago
Imagine if RMS had brought a Lisp OS with Emacs, not bound to GNU/Unix. JITted Emacs since 2000 and Eshell as the default CLI, with a floating WM as an easy desktop. Multithreaded, with its own modules for images/PDF and whatever. No need for security; daily backups and rollbacks would be a given. Today you would be 'rewinding in time' from Emacs if any attack happened.
kragen 18 hours ago
Vaporware can always be better than actually existing software because vaporware doesn't have bugs or schedule/scope tradeoffs.
yoyohello13 23 hours ago
I like LISP, but I wouldn't say I'm an evangelist. Here's my 2 cents for why LISP has such a following.

1. LISP is easy to start with if you're not a programmer. There is very little syntax to get to grips with, and once you understand "everything is a list" it's super easy to expand out from there.

2. LISP really makes it easy to hack your way to a solution. With the REPL and the transparency of "code is data" model you can just start writing code and eventually get to a solution. You don't need to plan, or think about types, or deal with syntax errors. You just write your code and see it executed right there in the REPL.

For my part, I love LISP when it's just me doing the coding, but once you start adding other people's custom DSL macros or whatever the heck, it becomes unwieldy. Basically, I love my LISP and hate other people's LISP.
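
As a rough illustration of point 2, here is a hypothetical REPL session (names and values made up): try a form, see the result immediately, then add or redefine functions in the live image without restarting anything.

    (defun area (r) (* pi r r))
    (area 3)                              ; => ~28.27
    (defun circumference (r) (* 2 pi r))  ; add another function on the fly
    (circumference 3)                     ; => ~18.85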

turtleyacht 22 hours ago
Once you get far enough to need external dependencies, do you use your work as a library in another language or rewrite it?

http://www.winestockwebdesign.com/Essays/Lisp_Curse.html

anthk 15 hours ago
QuickLisp solves most of the issues. Quiop and closer-mop if you want a universal CLOS.

Scheme is not Common Lisp; it's kinda the opposite. Common Lisp comes with batteries included, and McCLIM is the de facto GUI for it.

>Stuck with Emacs [from the URL]

Well, Lem tries to be Emacs for Common Lisp, but without having to think in two Lisp languages (albeit closely related ones) at once.

Once you have a REPL, autocomplete and some docstring looking up tool, you are god.

turtleyacht 1 hour ago
Good point. I neglected to consider standard libraries (stdlib) at the time. If those are allowed, then a lot more can be done with one's code without reaching much farther. Thank-you.
taeric 22 hours ago
LISP remains one of the only languages where manipulating the code looks exactly the same as executing it. This is often illustrated by pointing out that "eval" in lisp doesn't take in a string of characters. (https://taeric.github.io/CodeAsData.html is a blog I wrote on the idea a bit more.)

What this often meant was that getting a feature into your LISP program was something you could do without having to hack at the compiler.

People used to balk at how macros and such would break their ability to step-debug code. Which is still largely true, but step debugging is also sadly dead in a lot of other popular languages already.
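
To illustrate both points, here is a minimal Common Lisp sketch (a hypothetical example, not taken from the linked post): a new control construct added entirely in user code, and EVAL operating on list structure rather than on a string of characters.

    ;; Standard CL has no WHILE; adding one needs no compiler changes,
    ;; because the macro just builds ordinary list structure.
    (defmacro while (test &body body)
      `(loop (unless ,test (return))
             ,@body))

    ;; EVAL takes that same list structure, not a string.
    (eval '(let ((n 0))
             (while (< n 3) (incf n))
             n))
    ; => 3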

linguae 22 hours ago
Keep in mind that this essay was written in the early 1990s. Today there are many programming languages available that offer features that are unavailable in C but have long been available in Lisp. This was not the case in the 1980s during the AI boom of that era. There was a large chasm between classic procedural languages (C, Fortran, Pascal, Simula) and dynamic languages (Smalltalk and Lisp). Prolog was a popular alternative to Lisp in AI circles, especially in Japan back when it was pursuing the Fifth Generation Computing Project. When looking at the language landscape in the 1980s in the context of AI, it makes sense that practitioners would gravitate toward Lisp and Prolog.

Today we benefit from having a variety of languages, each with tradeoffs regarding how their expressiveness matches the problem at hand and also the strength of its ecosystem (e.g., tools, libraries, community resources, etc.). I still think Lisp has advantages, particularly when it comes to its malleability through its syntax, its macro support, and the metaobject protocol.

As a Lisp fan who codes occasionally in Scheme and Common Lisp, I don't always grab a Lisp when it's time to code. Sometimes my language choices are predetermined by the ecosystem I'm using or by my team. I also think strongly-typed functional programming languages like Standard ML and Haskell are quite useful in some situations. I think the strength of Lisp is best seen in situations where flexibility and malleable infrastructure are highly desirable.

mattgreenrocks 22 hours ago
Such a weird take on HN. Lisp should be experimented with if only to appreciate the profound beauty of a small, powerful, cohesive design. It is a wholly different feeling from industry standard languages which are constantly changing.

In Lisp, almost all of the language’s power is in “user space.”

The ramifications for that are deep and your beliefs as to whether that is good are largely shaped by whether you believe computation is better handled by large groups of people (thus, languages should restrict users) or smaller groups of people (thus, languages should empower users).

See this for more discussion: https://softwareengineering.stackexchange.com/a/237523

Blackthorn 12 hours ago
Nothing about Common Lisp is small or cohesive. Some schemes, maybe.
anthk 5 hours ago
CL is much more cohesive than Scheme. The Schemes are barely compatible among themselves, with ice-9 on Guile, SRFIs, and whatnot.
pif 21 hours ago
> languages which are constantly changing

They change because they are used.

skribanto 17 hours ago
Well, the main point is that there are changes that for most languages would need a change in the compiler/interpreter itself. However, in Lisp those kinds of things can be done in userspace.
djha-skin 22 hours ago
In the words of "Programmers Are Also Human" YouTube channel, it's just more comfortable. REPL development, ease of refactoring, dynamic typing, good CFFI, all just adds up to a developer experience that I find to be, in a word, chill.
hcarvalhoalves 22 hours ago
See these demos for the kinds of interactive systems LISP enabled back in the day:

https://www.youtube.com/watch?v=o4-YnLpLgtk

https://www.youtube.com/watch?v=gV5obrYaogU

It makes working on VSCode today look like banging rocks together, let alone 30 years ago.

anthk 21 hours ago
And today with Lem/Emacs+slime+sbcl
Jach 16 hours ago
It's not a perfectly reliable tell, but people who write LISP instead of Lisp generally give themselves away as knowing nothing about the language. Why not kick the tires with Common Lisp, or even Clojure, and see if you can then understand for yourself why it sparks joy in people? I'm not saying it'll spark joy in you, just that you might understand. (Do you understand Haskell's draw to certain people? Rust's?) At the very least, perhaps you'll lose your notion that it primarily appeals to academic types. Common Lisp is and always has been an industrial language.

"Please don't assume Lisp is only useful for Animation and Graphics, AI, Bioinformatics, B2B and E-Commerce, Data Mining, EDA/Semiconductor applications, Expert Systems, Finance, Intelligent Agents, Knowledge Management, Mechanical CAD, Modeling and Simulation, Natural Language, Optimization, Research, Risk Analysis, Scheduling, Telecom, and Web Authoring just because these are the only things they happened to list." --Kent Pitman

NAHWheatCracker 14 hours ago
My only personal experience writing with Lisp was Scheme, which I wrote for two weeks back in college 18 years ago, so I truly know nothing about the language.

In retrospect, saying I don't understand was hyperbolic. Of course I understand that people have their preferred languages. The handful of languages I've used in my career each have their draw.

My comment was meant more to question the assertion that Lisp is the-right-thing, which sounds like asserting the-right-religion.

23 hours ago
buescher 19 hours ago
It's not really about lisp or even about vague reverse-snobbish ideas of "practicality" but about very specific categories of design compromises.
bbor 22 hours ago
It’s for the dreamers. The crazy ones among us that do not think of themselves as experts in programming machines to solve business problems, but rather novices in cajoling machines to think like humans do.
kragen 22 hours ago
The paper was written for an audience of Lisp programmers, so the things you're talking about were sort of out of scope. I'm not going to try to convince you that Lisp and ITS really did aim at "the right thing" in a way that C and Unix didn't; you'll see that it's true if you investigate.

Lisp definitely does depend on personality type. Quoting Steve Yegge's "Notes from the Mystery Machine Bus" (https://gist.github.com/cornchz/3313150):

> Software engineering has its own political axis, ranging from conservative to liberal. (...) We regard political conservatism as an ideological belief system that is significantly (but not completely) related to motivational concerns having to do with the psychological management of uncertainty and fear. (...) Liberalism doesn't lend itself quite as conveniently to a primary root motivation. But for our purposes we can think of it as a belief system that is motivated by the desire above all else to effect change. In corporate terms, as we observed, it's about changing the world. In software terms, liberalism aims to maximize the speed of feature development, while simultaneously maximizing the flexibility of the systems being built, so that feature development never needs to slow down or be compromised.

Lisp, like Perl and Forth, is an extremist "liberal" language, or family of languages. Its value system is centered on making it possible to write programs you couldn't write otherwise, rather than reducing the risk you'll screw it up. It aims at expressiveness and malleability, not safety.

The "right thing" design philosophy is somewhat orthogonal to that, but it also does pervade Lisp (especially Scheme) and, for example, Haskell. As you'd expect, the New Jersey philosophy pervades C, Unix shells, and Golang. Those are also fairly liberal languages, Golang less so. But a C compiler had to fit within the confines of the PDP-11 and produce fast enough code that Ken would be willing to use it for the Unix kernel, and it was being funded as part of a word processing project, so things had to work; debuggability and performance were priorities. (And both C and Unix were guided by bad experiences with Multics and, I infer, M6 and QED.) MACLISP and Interlisp were running on much more generous hardware and expected to produce novel research, not reliable production systems. So they had strong incentives to both be "liberal" and to seek after the "right thing" instead of preferring expediency.

adamnemecek 23 hours ago
It’s a signaling mechanism to say “I went to MIT” or at the very least to say “I could have gone to MIT”.
djha-skin 22 hours ago
I learned Racket at Brigham Young University in Utah, and Clojure, then Common Lisp on my own. I don't think it's just an MIT thing. It's in the ethos now. Lots of people know about SICP, for example.
23 hours ago
Blackthorn 23 hours ago
How do you know someone knows lisp? They'll tell you. And tell you. And tell you...

I like lisp for the most part, but holy shit is the enduring dialog surrounding it the absolute worst part of the whole family of languages by far. No, it doesn't have or give you superpowers. Please grow up.

anthk 15 hours ago
Do either the book on Symbolic Computation + PAIP and/or SICP, and you'll understand.
Blackthorn 12 hours ago
Yes, I've done SICP. Lisp does not give you superpowers.
22 hours ago