265 points by t14n 272 days ago | 33 comments
Arch-TK 271 days ago
I have a theory that the worse is better approach begets an environment where the worse is better approach is better.

At least hypothetically, I think there's an approach which is not "the right thing" or "worse is better" but rather more like "the right foundations".

Most interface complexity in my experience seems to be inherited from underlying interface complexity, and it takes a lot of work to fix that underlying interface complexity. This, I think, is where "worse is better" shines. If you try to apply a "the right thing" approach to a system where you're dealing with shitty underlying interfaces (i.e. every popular operating system out there, including every Unix and NT system) you end up with endless complexity and performance loss. So obviously nobody will want to do "the right thing", and everyone who takes the "worse is better" approach will end up way ahead of you in terms of delivering something. People will be happy (because people are almost always happy regardless of how crap your product is).

On the other hand, designing something with "the right foundations" means that "the right thing" no longer needs to involve "sacrifice implementation simplicity in favour of interface simplicity" to anywhere near the same extent because your implementation can focus on implementing whatever interface you want rather than first paving over a crappy underlying interface.

But the difficulty of "the right foundations" is that nobody knows what the right foundations are the first 10 times they implement them. This approach requires being able to rip the foundations up a few times. And nobody wants that, so "worse is better" wins again.

wismi 271 days ago
I think there's a lot of truth to this. It reminds me of an idea in economics about the "second-best". From the wikipedia page:

"In welfare economics, the theory of the second best concerns the situation when one or more optimality conditions cannot be satisfied. The economists Richard Lipsey and Kelvin Lancaster showed in 1956 that if one optimality condition in an economic model cannot be satisfied, it is possible that the next-best solution involves changing other variables away from the values that would otherwise be optimal. Politically, the theory implies that if it is infeasible to remove a particular market distortion, introducing one or more additional market distortions in an interdependent market may partially counteract the first, and lead to a more efficient outcome."

https://en.wikipedia.org/wiki/Theory_of_the_second_best

germandiago 271 days ago
I hate intervention but I like analysis. Good insights there, I did not know about this theory.
pron 270 days ago
That worse-is-better is self-reinforcing and that it's the only stable strategy in an environment with less-than-perfect cooperation (i.e. it's the only Nash equilibrium) may both be true at the same time. In fact, if the latter is true then the former is almost certainly true.

The real question is, then, whether doing "the right thing" is a stable and winning strategy at all, i.e. viable and affordable. As you yourself suspect, the answer may well be no. Not only because it takes a few tries to figure out the right foundations, but also because what foundation is right is likely to change over time as conditions change (e.g. hardware architecture changes, programming practices -- such as the use of AI assistants -- change etc.).

kagevf 271 days ago
> the worse is better approach is better.

I think this ties back to the idea of "get it working, then once it's working go back and make it fast | performant | better for whatever meaning of better".

I think much of the consternation towards "worse is better" comes from re-inventing things to achieve the "make it better" improvements from scratch instead of leveraging existing knowledge. Re-inventing might be fine, but we shouldn't throw away knowledge and established techniques if we can avoid it.

zombot 270 days ago
That may be one failure mode, but another one is more prominent: Half-assing the next feature is more interesting than going back and making the last feature that you half-implemented actually work. That goes for both commercial and open-source software.
jpc0 271 days ago
I have a question for this premise.

How would you design a network interface using your right foundations model? I'm not talking about HTML or whatnot.

I have some sort of medium, copper, fiber whatever and I would like to send 10 bytes to the other side of it. What is the right foundations that would lead to an implementation which isn't overly complex.

Arch-TK 271 days ago
Unfortunately, I am not a network engineer. So I don't know how I would approach this problem other than to try to make sure that the resulting hardware is easy to deal with from firmware and software.

I have worked with hardware directly and there is something inherently simple about some hardware APIs versus others.

What's more, the complexity doesn't entirely relate to the underlying hardware or protocol complexity.

The issue is, though, that reality is complicated. This is where the right foundations are important. It's not necessarily that the right foundations themselves have simple internals, but that the right foundations successfully tame the complexity of reality.

The best place to work on developing the right foundations is therefore precisely at such interfaces between the real world and the software world.

shuntress 271 days ago
> I have some sort of medium, copper, fiber whatever and I would like to send 10 bytes to the other side of it. What is the right foundations that would lead to an implementation which isn't overly complex.

The bane of every project is understanding what you actually need to do

For example, it is entirely possible that the "right foundation" for your proposed scenario is: Hook one end up to a lightswitch, the other to a light bulb, hire two operators trained in morse code. Then once the 10 bytes are sent write them their cheques and shut it down.

me_again 271 days ago
Not a direct answer, but Ethernet is sometimes brought up as a successful example of Worse is Better. At one point Token Ring was a serious competitor - it had careful designs to avoid collisions when the network was busy, prioritize traffic, etc. But it was comparatively slow and expensive. Ethernet just says "eh, retry on collision." And that simplistic foundation has carried on to where we have a standard for 800 Gigabit Ethernet.
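
To make "eh, retry on collision" concrete: classic Ethernet uses truncated binary exponential backoff, where the randomization window doubles on each successive collision. A rough sketch in Go (simplified; real 802.3 counts waits in 512-bit slot times and drops the frame after 16 attempts):

    package main

    import (
        "fmt"
        "math/rand"
        "time"
    )

    // slotTime is the classic 802.3 contention slot: 512 bit times,
    // i.e. 51.2 microseconds at 10 Mb/s.
    const slotTime = 51200 * time.Nanosecond

    // backoff returns how long to wait after the nth consecutive collision:
    // pick a random count of slot times in [0, 2^min(n,10)), and give up
    // entirely after 16 attempts ("excessive collisions").
    func backoff(collisions int) (time.Duration, error) {
        if collisions > 16 {
            return 0, fmt.Errorf("excessive collisions: frame dropped")
        }
        exp := collisions
        if exp > 10 {
            exp = 10 // "truncated": the window stops growing at 2^10 slots
        }
        return time.Duration(rand.Intn(1<<exp)) * slotTime, nil
    }

    func main() {
        for n := 1; n <= 16; n++ {
            d, _ := backoff(n)
            fmt.Printf("collision %2d: wait %v\n", n, d)
        }
    }
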
shuntress 271 days ago
Since speed and expense are relatively important and collisions are relatively rare, this sounds more like "better is better".
musicale 270 days ago
Telegraph/morse code would probably work fine.

For this application I might also consider classic serial/RS-232 (c. 1969), which can be implemented with one signal wire (tx) and can connect to modern USB.

I'm not entirely sure whether they qualify as "right foundation" but they've worked well in practice.
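
To show how little machinery that takes: a UART frames each byte as a start bit, eight data bits LSB-first, and a stop bit (8N1), and the receiver just samples the one wire at the agreed baud rate. A toy Go sketch of the framing (conceptual bit-banging, not a driver):

    package main

    import "fmt"

    // frame8N1 produces the line-level bit sequence a UART drives on its TX
    // pin: for each byte, a start bit (0), eight data bits LSB-first, then
    // a stop bit (1). The idle line sits at 1.
    func frame8N1(payload []byte) []int {
        bits := []int{1} // idle line before the first start bit
        for _, b := range payload {
            bits = append(bits, 0) // start bit
            for i := 0; i < 8; i++ {
                bits = append(bits, int(b>>i)&1) // data bits, LSB first
            }
            bits = append(bits, 1) // stop bit
        }
        return bits
    }

    func main() {
        // The "10 bytes to the other side" from the question upthread.
        bits := frame8N1([]byte("ten bytes!"))
        fmt.Println(len(bits), "bits on the wire; first frame:", bits[1:11])
    }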

gpderetta 271 days ago
> But the difficulty of "the right foundations" is that nobody knows what the right foundations are the first 10 times they implement them

Yes, and it's worse than that. The right foundations might change with time and changing requirements.

tightbookkeeper 271 days ago
good comment. But I question how much you can package up inherent complexity in a simple interface, due to leaky abstraction.

The biggest benefit of simplicity in design is when the whole system is simple, so it’s easy to hack on and reason about.

Arch-TK 271 days ago
Abstraction leaks are usually a result of a worse is better approach. But yes, as I think I said in my original comment, it's very difficult to successfully completely pave over a poor base.

And yes, I agree that simplicity needs to start quite low down the stack (ideally at the hardware, or the firmware, or the drivers, or the kernel as a last resort) for the complexity not to explode as you keep adding layers.

ezekiel68 272 days ago
I'm always happy whenever this old article goes viral. For two reasons: First, learning to accept the fact that the better solution doesn't always win has helped me keep my sanity over more than two decades in the tech industry. And second, I'm old enough to have a pretty good idea what the guy meant when he replied, "It takes a tough man to make a tender chicken."
bbor 272 days ago
I’m glad to now know an article that “everyone knows”! Thanks for pointing out the age.

And, at the risk of intentionally missing the metaphor: they do in fact make automated tenderizers, now ;) https://a.co/d/hybzu2U

hyggetrold 272 days ago
It's a funny expression and it is rooted in advertising: https://en.wikipedia.org/wiki/Frank_Perdue
bccdee 272 days ago
> Both early Unix and C compilers had simple structures, are easy to port, require few machine resources to run, and provide about 50%-80% of what you want from an operating system and programming language.

> Unix and C are the ultimate computer viruses.

The key argument behind worse-is-better is that an OS which is easy to implement will dominate the market in an ecosystem with many competing hardware standards. Operating systems, programming languages, and software in general have not worked this way in a long time.

Rust is not worse-is-better, but it's become very popular anyway, because LLVM can cross-compile for anything. Kubernetes is not worse-is-better, but nobody needs to reimplement the k8s control plane. React is not worse-is-better, but it only needs to run on the one web platform, so it's fine.

Worse-is-better only applies to things that require an ecosystem of independent implementers providing compatible front-ends for diverging back-ends, and we've mostly standardized beyond that now.

xiphias2 271 days ago
There are some differences between your examples in my opinion:

Rust started as an experiment by the Mozilla team to replace C++ with something that would help them compete with Chrome by developing safe multi-threaded code more efficiently. It took a lot of experiments to get to the current type system, which gives real advantages by using affine types, but the compiler is at this point clearly over-engineered for the desired type system (and there are already ideas on how to improve on it). Still, it's too late to restart, as it looks like it takes 20 years to productionize something like Rust.

As for React, I believe it's an over-engineered architecture from the start for most web programming tasks (and for companies / programmers that don't have separate frontend and backend teams), but low interest rates + AWS/Vercel pushed it on all newcomers (and most programmers are new programmers, as the number of programmers grew exponentially).

HTMX and Rails 8 are experiments in the opposite direction (moving back to the servers, nobuild, noSAAS), but I believe there's a lot of space to further simplify the web programming stack.

tightbookkeeper 271 days ago
Rust is not very popular in terms of number of users. It’s just over represented in online discussion.
bccdee 271 days ago
It made it into the Linux kernel, and it's still a relatively young language. I don't think any language has made such a large impact since Java or Javascript, both of which are nearly 30 years old now.
epcoa 271 days ago
Go?? Ruby and Python I think count, Ruby is a JS contemporary and Python is close enough, no one paid it any attention before the late 90s. Rust is still struggling in the domain it wants to live (a C++ replacement), I still have a suspicion that a new new C++ will ultimately come along and dominate before Rust eats the big money where C++ still lives. And I say this as someone who has been rooting for Rust since stumbling upon Graydon's proposal in 2006 while swearing at yet another C++ monstrosity (though a moneymaking one).
bccdee 270 days ago
Go is an incremental improvement over Java and C#. I like it, but it tried extremely hard not to break new ground, and it succeeded at that. Java and Ruby and Python (odd how all these languages came out at almost exactly the same time) had new ideas and new paradigms, like Rust does. Part of what I mean by "making a big impact" is affecting how we think about programs, which Rust has done and Go has not.
epcoa 270 days ago
Go is an incremental improvement over C#?? I think you’re going to find quite a bit of controversy with that take.

I think Go focusing on certain aspects of tooling and deployment early on might be considered improvements on C#/.NET (some, not all). There are aspects of the Golang standard library that are better designed for purpose than C#/.NET (Again even this has to be nuanced because quite a few 3rd party libraries in .NET were basically “standard”). .NET was hobbled by being effectively Windows only, not because of C#.

> Part of what I mean by "making a big impact" is affecting how we think about programs, which Rust has done and Go has not.

A hallmark of Go is the deprioritization of expressiveness compared to its contemporaries, and I argue this does just as much to affect how one thinks about programs.

A further hallmark of golang is CSP, which of course was nearly a 50 year old concept but not something mainstream at the time (which describes just about everything in Rust too, perhaps for a number <50). And IMHO, Rust did not get this one right; async is ugly.

I can’t read your mind and this is too vague. It’s a matter of perspective, but I’m not sure what big new paradigm in Rust you are referring to. Rust doesn’t really break new ground either, which is partly why it isn’t more obscure: widely used languages are almost never the debut of new features; those have been cooking in research and more obscure PLs for years and years. It is at the end of the day still an imperative/mutable language. It just uses a hell of a lot more PL features than Go.

If we’re talking about the borrow checker, which was plan B (plan A was GC), I challenge the idea that it affects how we think about programs. The concepts of ownership, lifetimes, aliasing and move semantics were already thought about heavily in C++ prior to Rust (many things were codified in the 2011 standard). You still have to think about these things; it’s just that C++ lacks the ability to enforce them. And of course Rust lifted most of the design from Cyclone. There are few large scale Rust projects that can avoid Rc either (or some other dynamic lifetime or surrogate reference copium).

You also haven’t mentioned Swift which in many ways is more clever than Rust. Its adoption is more hobbled by the circumstances surrounding its specific corporate driven ecosystem. Maybe that will change.

bccdee 269 days ago
I'm not arguing that Go wasn't well-made. But "having good tooling early on" isn't a new paradigm. Neither is message-based concurrency.

The new paradigm Rust introduced was compiler-enforced memory and concurrency safety in a non-GC language. I'm not saying nobody had thought about these things before, but as you said, C++ can't enforce any of it. Rust can.

When Python hit the market, it was competing with C++ for web dev, and the difference there is massive, because Python is dramatically simpler and more expressive. Java was fast, cross-platform and GC'd, and that also makes a massive difference as compared to C++. When you compare Rust with C++, Rust entirely eliminates dangerous categories of errors without the cost of a GC, and that's a big deal too.

Conversely, if you want to discuss Go vs C#, you can have a long talk about expressiveness and simplicity and it'll ultimately be mostly subjective unless you have a specific use case in mind. I like Go because it's less invested in object orientation than C# is, but Go isn't doing something transformatively new. It's just kinda nicer in some ways.

If Rust didn't guarantee memory safety, it would never have gotten into the Linux kernel. It'd fall into the same category as Go, Swift, D, Clojure, Nim, Zig, and every other programming language with good ideas but no killer value proposition. If Go didn't have Google's backing, it would never have seen mass adoption, because language choice is conservative. Picking something new is risky, and unless it has a big benefit or a big backer, people are going to avoid it where possible.

epcoa 268 days ago
> Neither is message-based concurrency.

The design and mechanics of goroutines are pretty unique to Go among its commonly used contemporaries.

> The new paradigm Rust introduced was compiler-enforced memory and concurrency safety in a non-GC language.

Ada and SPARK? Clearly Rust has become more “mainstream” and popular; that’s not the argument (though both have had significant niche commercial success, they’re not obscure), just that the narrative comes across a bit reductionist.

bccdee 268 days ago
I was familiar with Ada, but I wasn't aware that it had an ownership system like Rust that could guarantee memory safety. So I looked it up, and the first result was a paper called "Safe Dynamic Memory Management in Ada and SPARK" [1]. Here's a quote (emphasis mine):

> In this work, we propose a restricted form of pointers for Ada that is safe enough to be included in the SPARK subset. As our main contribution, we show how to adapt the ideas underlying the safe pointers from permission-based languages like Rust or ParaSail, to safely restrict the use of pointers in more traditional imperative languages like Ada. In section 2, we provide rationale for the rules that we propose to include in the next version of Ada, which takes into account specifics of Ada such as by-copy/by-reference parameter passing and exception handling. In section 3, we outline how these rules make it possible to formally verify SPARK programs using such pointers. Finally, we present related work and conclude.

> Section 2: A Proposal for Ownership Types in Ada

If Ada and SPARK are now taking inspiration from Rust with regard to safe memory management, I think that's enough to demonstrate that Rust's ownership system is substantially novel, relative to Ada and SPARK.

epcoa 268 days ago
> I was familiar with Ada, but I wasn't aware that it had an ownership system like Rust that could guarantee memory safety.

The point of bringing up Ada was more that a non-academic "memory-safe" language without GC existed for decades (predating Linux). Old school Ada doesn't use the ownership system, and, true, it doesn't give all the compile-time safety guarantees of Rust, but it does give some (and Rust in practice can't either; run grep unsafe | wc on the linux/rust source tree); it also provides some affordances lacking in safe Rust. Memory safety is not the binary or univariate concept you seem somewhat close to implying it is. You could have written Linux in Ada from day one and been "safer" than C (imagine that universe).

Related:

https://pling.jondgoodwin.com/post/cyclone/ (which is also referred to in the aforementioned paper).

> I think that's enough to demonstrate that Rust's ownership system is substantially novel

Well, where did anyone say it wasn't, but it certainly wasn't novel in pushing safety as a first class concept for a low level systems language. The fact that it is as (deservedly) successful as it is must be due to other factors. I'm not saying Ada did it first as some kind of gotcha, but Ada did it first and it does not rule the world.

Also, interesting comments from the creator of Rust.

https://www.reddit.com/r/rust/comments/t9972l/comment/hztbsn...

https://www.reddit.com/r/rust/comments/le7m54/comment/gmb4zg...

I'll reiterate, I'm very pro-Rust. I'm just old, and looking at the current governance and state of corporate involvement gives me flashbacks to commercial Unix in the late 80s/early 90s. In any case, I think fetishizing the borrow checker or a specific PL feature as having much to do with its success (or failure) is sort of irrelevant.

bccdee 268 days ago
> Ada did it first and it does not rule the world.

Well I mean, Ada didn't do "it" first—"it" being whatever Rust did that led to it making such a big splash. If Ada had done that first, it would have made the kind of splash that Rust has. Rust isn't the first systems language to ever consider being safer than C, but it is the first one to have such a large impact, and I think that's a product, primarily, of its technical decisions.

> I think fetishizing the borrow checker or a specific PL feature as having much to with its success (or failure) is sort of irrelevant.

I think that's an odd stance. Why would the technical specs of a language be irrelevant to its success? I think that's probably the most relevant factor, besides an institutional backer. C# was pushed hard by Microsoft and Go got a lot of support from Google. Rust wasn't part of a major tech player's pre-existing ecosystem, though, so it had to succeed on technical merit alone. None of its ideas are really that new except the borrow checker, and most of the time, when I've seen an institution talk about adopting Rust, there's been a lot of focus on the safety afforded by the borrow checker. I don't think that can be a coincidence.

That's not "fetishization." When a programming language with one new idea and little institutional backing gains traction in the mainstream, it's probably because that new idea in particular has a lot of merit.

epcoa 268 days ago
> Ada didn't do "it" first

It did the safety thing first, or at least much earlier.

> Rust wasn't part of an major tech player's pre-existing ecosystem,

Completely overlooking Mozilla’s backing early on is too extreme. Mozilla pumped a ton of money that got Rust off the ground. And of course even now the Rust Foundation isn’t poor or something.

Despite Google employing the Go people, it wasn't really part of their “ecosystem” for years either. They really hedged on that one. Go's popularity mostly grew outside of Google for years.

But yes both projects have/had plenty of money. This isn’t Nim or Lua or something.

> when I've seen an institution talk about adopting Rust, there's been a lot of focus on the safety afforded by the borrow checker.

So why did they just start caring about safety this recently?

> it's probably because that new idea in particular has a lot of merit.

The point is “safety” for a low level system language wasn’t a new idea at all, it was just that the world outside did not value/demand it.

> That's not "fetishization."

The fetishization I am referring to is focusing on the memory safety so far above the type system, tooling, open community building, and even the syntax. Conversely Rust can’t manage to have an ABI past extern “C”. Never said technical specs are irrelevant, that’s a straw man.

And finally, again, this is all to say: I think its memory safety story is not what is going to determine its fate and destiny in the next decade. It needs to evolve, and this is more about governance than about a particular feature. That Rust in Linux depends on unstable features isn't the end of the world, but it's not something to be proud of either.

I think it’s naive to think things like Carbon don’t have some competitive effects either.

To touch back on “it's still a relatively young language”: Java, C#, Swift and Go were all well established in their niches when they were younger than Rust is now. But it really doesn't matter one whit if Rust is 5 or 35. I'm more interested in the present trajectory and where and why Rust struggles or has struggled to make inroads; not everything is fair to chalk up to just age IMHO.

Valord 269 days ago
I agree that Go is not an improvement over C#. In my experience the language ergonomics of C# are better than Go's. I've done both in industry and am much more productive with C#. It is too bad that C# and .NET are often perceived as bad because of Microsoft & Windows. .NET Core has a lot of really good things going on.

Not that Go is a bad language. It has its place. I recommend it for embedded systems and as an option for systems programming.

bccdee 269 days ago
Yeah I should clarify that Go considers itself to be an incremental improvement on the Java/C# paradigm. You could certainly argue that it oversimplifies things.
mplewis 271 days ago
Rust isn’t popular in web dev. It’s very popular in embedded.
tightbookkeeper 271 days ago
Is it? It seems popular with people getting into low level for the first time.
psychoslave 271 days ago
Unix and C are still there, and while on a shallow level this can be more or less ignored, all abstractions end up leaking sooner or later.

If the industry could get rid of C and the ridiculously esoteric abbreviations in identifiers, it could almost be a sane world to wander.

karel-3d 272 days ago
I remember when I had a lesson about OSI layers, where the teacher carefully described all the layers in detail

and then said something like "most of this is not important, these layers don't really exist, TCP/IP got popular first because it's just much simpler than OSI was"

pjc50 272 days ago
Oh, there's an entirely different feature-length article to be written/found about how packet switching beat circuit switching and the "Internet approach" beat the telco approach. The great innovation of being able to deploy devices at the edges without needing clearance from the center.

I don't think very many people even remember X.25. The one survivor from all the X standards seems to be X.509?

kragen 272 days ago
The OSI stack was also designed using the packet-switching approach. Rob Graham's "OSI Deprogrammer" is a book-length article about how TCP/IP beat OSI, and how the OSI model is entirely worthless: https://docs.google.com/document/d/1iL0fYmMmariFoSvLd9U5nPVH...

I'm not sure he's right, but I do think his point of view is important to understand.

chuckadams 272 days ago
Wow, you're not kidding about book-length: at 246 pages, that's an epic takedown. I'm learning all kinds of other things about networking along the way, though.

I do remember all nine layers of the OSI stack though: physical, data link, network, transport, session, presentation, application, financial, political.

kazinator 270 days ago
But we freakin have those layers now. Above TCP/IP there is SSL, and then above that HTTPS, and within that there's some JSON RPC or whatever.

The first few OSI layers are fairly readily identifiable in TCP, IP and Ethernet, and some of the rest are built by applications.

kragen 270 days ago
Having just skimmed X.225 thanks to OhMeadhbh (https://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-X.22...) I think Graham is correct that we don't have anything similar to the OSI "session" layer in the stack you're talking about; although the ARPANET TELNET protocol does have a lot of the same functionality, there's nothing analogous in SSL+https+JSON+RPC. I'm guessing the same is true of the "presentation" layer, but I'm certainly open to hearing why you think otherwise, if you do.

Also, if you happen to be familiar with X.225 (or read it following my prompt above), I have some questions in https://news.ycombinator.com/item?id=41789004!

Graham makes a good argument that, although there are some promising analogies between the physical/data-link/network/transport layers and the PHY/MAC/IP/TCP layers of TCP/IP over Ethernet, the model is overall a very poor fit even at those layers; it's better to not try to decompose network stacks into a predetermined set of layers, because the actual set of layers used is variable depending on the application and the environment.

fanf2 272 days ago
chuckadams 272 days ago
I can't claim to have BTDT, but I did get the T-shirt. That's how I learned them :)
lizknope 272 days ago
Wow. That is awesome, I skimmed through it and it looks like something I will enjoy. I've still got my Tanenbaum Computer Networks book from 1996 and the first chapter starts with OSI, then TCP/IP, and explains the differences and why OSI failed.
kragen 272 days ago
I'm interested to hear what you think! I haven't finished reading it yet. It's a bit repetitive.
OhMeadhbh 272 days ago
It doesn't seem like a "take down" as much as a re-iteration of all the things IBM, DEC, GE and various Telcos did wrong when implementing OSI. I could reduce it to one sentence: "Everyone was so intent on monetizing their own networking implementation they never thought enough about interoperability."
kragen 271 days ago
Not only isn't that an accurate summary of Graham's book, it isn't even a topic discussed in the first half of the book, which is all I've read so far. I suspect it isn't a topic discussed in the book at all; can you back up your assertion with some quotes?
OhMeadhbh 271 days ago
Yes, but the book also has enough inaccuracies as to make it... I don't know what. For example, in the first chapter the author says "no one knows what a session is," which is patently false. I myself implemented control logic in telco equipment to respond to X.225 compliant messages to change the state of an abstract state machine used on either side of the connection. And while I'm sure it's possible to use CONS or CLNP to communicate with a CEEFAX terminal, that is far from the only use to which the various OSI compliant protocols were put.

Just because you don't understand something, that doesn't mean it's bad.

kragen 271 days ago
I think he means to be saying that, in the context of TCP/IP, nobody knows what a “session” in the OSI sense would correspond to—not that nobody has ever implemented X.225. Presumably Graham knows that people have written X.225 implementations and isn't trying to convince his readers otherwise?

I don't know enough to judge his assertion that the session layer exists to solve problems created by half-duplex terminals. (He doesn't seem to specifically call out Ceefax.)

OhMeadhbh 271 days ago
From page 11: "OSI was designed primarily around the dumb terminal connected to a mainframe."

Also from page 11: "'session' meant something related to attaching videotex terminals to mainframes."

And yes, all Ceefax terminals were Videotex terminals, but not all Videotex terminals were Ceefax terminals.

I might recommend that instead of guessing what the session layer is, you (or more appropriately, Mr. Graham) go and look at what the specs say it is and what it's used for. The X.225 spec is available for download at https://www.itu.int/rec/T-REC-X.225-199511-I/en. X.215 is available at https://www.itu.int/rec/T-REC-X.215-199511-I/en.

The OSI session layer is not related to HTTP session cookies as the original author proposes.

kragen 270 days ago
Hey, thanks! I had looked for it when writing my previous comment, but I skipped past the ITU links on the assumption that I'd have to sign over my first-born children in order to get a copy. But it turns out that they aren't currently restricting its availability: https://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-X.22...

Clearly "videotex" is wrong. Skimming X.225, though, an awful lot of it does seem to be concerned with ⓐ half-duplex terminals and ⓑ T.62 teletex (not videotex, closer to telex) terminals. So "attaching video terminals to mainframes" does seem like a fair summary. Possibly Graham doesn't know what the word "videotex" means, or thought (as I did, neither of us having lived in Germany) that "teletex" was a kind of videotex.

I agree that it doesn't have the kind of functionality that HTTP cookies provide. That's backwards, I think? HTTP cookies provide session context that persists over multiple TCP-level connections; the reuse of a single transport connection between different X.225 sessions is more like reusing a single TCP-level connection for multiple HTTP requests, or multiple users logging in and out, one after the other, on the same hardwired terminal without resetting it?

I wish it gave examples of usage so I knew what the resynchronization functionality was for. Maybe you know?

Expedited data is a thing that TCP also has, and I've never really understood what it was for there either. Maybe it's for sending a ^C over a remote terminal login session when the keyboard buffer is full because the application you're talking to is hung? The "activity interrupt" stuff seems like it would be a better fit for that, but maybe the expedited data facility was an older design that was retained for backward-compatibility?

fanf2 272 days ago
OSI was two network stacks fighting with each other: the circuit-switched telco X.25 successor and the packet-switched DEC / Xerox anti-Internet.

See also https://computer.rip/2021-03-27-the-actual-osi-model.html and https://dotat.at/@/2024-03-26-iso-osi-usw.html

kragen 272 days ago
This is great, thanks! crawford's post pulls no punches:

> Teaching students about TCP/IP using the OSI model is like teaching students about small engine repair using a chart of the Wankel cycle. It's nonsensical to the point of farce. The OSI model is not some "ideal" model of networking, it is not a "gold standard" or even a "useful reference." It's the architecture of a specific network stack that failed to gain significant real-world adoption.

OhMeadhbh 272 days ago
That's sort of like saying "TCP" is a failure because everyone now uses SSH instead of TELNET. TCP/IP still has to do all of the things the OSI stack does, it just does it in a different manner and there are (thankfully) plenty of well defined wire-formats and processing expectations so interoperability is pretty straight-forward. But I still think IPSec would have been MUCH easier to deploy had TCP/IP maintained a rational distinction between presentation and session layers. I guess what we learned is that SSL/TLS and FreeSWAN's assumptions about routing encrypted payloads were "good enough."

Also, if you're going to compare TCP/IP to various OSI implementations, you should compare the full stack including PEM, MOSS, SMIME, SSL/TLS, SSH. Each muddies the difference between presentation and application layers, but as in the previous paragraph, no one seems to care. Talking SMTP over SSH (or SSL/TLS) is totally fine; you don't need to have a sub-protocol to define how a presentation layer on top of a secure session layer works if you can make certain assumptions about the behaviour of the code on the other side of the network connection.

kragen 271 days ago
Each of the articles linked upthread separately explains why everything you said in your comment is incorrect.
kragen 270 days ago
I withdraw the above comment; they do explain why many things in OhMeadhbh's comment are incorrect, but not everything.
fanf2 272 days ago
LDAP is “lightweight” compared to the X.500 directory access protocol. LDAP DNs are basically the same as X.500 DNs.

SNMP is “simple” compared to X.711 CMIP. But SNMP also uses ASN.1 and X.660 OIDs.

foobarian 272 days ago
I finally understand why OSI failed, it's the naming! Dear lord.
kragen 270 days ago
I think naming quality isn't really a difference between OSI and TCP/IP. I'm sending this message in URL-encoded UTF-8 in MIME over HTTP over TLS and TCP, IP, MPLS, DOCSIS, and an 802.11g CSMA/CA MAC in a CIDR IP block allocated by ARIN via LACNIC to an AS that belongs to a CLEC; ultimately you'll use an URL to read it in HTML, and if your UA is like mine, you'll use ECDHE ECDSA with AES-256-GCM and SHA-384, verified through a CA chain through E5 (which supports OCSP) and ISRG. But nobody bats an eye at that because that alphabet soup has been familiar for decades.
OhMeadhbh 272 days ago
Oh no. The naming is simple compared to ASN.1/BER parsing.
goalieca 271 days ago
I’m always confused about when to use DER and when to use BER. Pretty much have to study the history to get it.
jonmon6691 271 days ago
I found this article helpful when I had that same question. Basically, BER has some less rigid specifications for how to encode the data that can be convenient for the implementation, such as symbol-terminated sequences instead of having to know their length ahead of time. But this means that there are many equivalent serializations of the same underlying object, which is problematic for cryptography, so DER is an unambiguous subset of BER which has only one correct possible serialization for a given object.

https://luca.ntop.org/Teaching/Appunti/asn1.html
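
The classic demonstration is BOOLEAN: BER accepts any nonzero content octet as TRUE, so one value has many encodings, while DER pins it to exactly one, which is what you want before hashing or signing. A hand-rolled Go sketch (tag-length-value bytes written out manually):

    package main

    import "fmt"

    func main() {
        // ASN.1 BOOLEAN: tag 0x01, length 0x01, one content octet.
        // Under BER, any nonzero content octet decodes as TRUE, so all of
        // these are valid encodings of the same value...
        berTrues := [][]byte{
            {0x01, 0x01, 0xFF},
            {0x01, 0x01, 0x01},
            {0x01, 0x01, 0x2A},
        }
        // ...while DER demands the single canonical encoding, 0xFF.
        derTrue := []byte{0x01, 0x01, 0xFF}

        for _, enc := range berTrues {
            fmt.Printf("BER TRUE: % X\n", enc)
        }
        fmt.Printf("DER TRUE: % X (the only form DER allows)\n", derTrue)

        // Lengths work the same way: BER also permits indefinite lengths
        // (0x80 ... terminated by 0x00 0x00) for constructed values; DER
        // forbids them and requires the minimal definite form.
    }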

rwmj 272 days ago
chuckadams 272 days ago
> "How do you scare a Bellhead?" he begins. "First, show them something like RealAudio or IPhone. Then tell them that right now performance is bandwidth-limited, but that additional infrastructure is being deployed."

You'd scare anyone like an amazed rural tribesman if you showed them an iPhone in 1996.

I know, IPhone was a Cisco thing, but my mind went there for a beat ;)

bee_rider 272 days ago
Haha, I didn’t get it until your last sentence
hiatus 272 days ago
X.25 is still used in ham radio with AX.25
OhMeadhbh 272 days ago
I hate to tell you this, X.25 is still used ALL OVER THE PLACE. But thankfully hardly anywhere near data networking customers.
scroot 271 days ago
This might be a fiery take, but I think the X.400 standards for naming and messaging would have been a lot better than the chaotic email situation, and probably would have made more sense from a commercial/legal perspective than making DNS "the" global naming system.
jll29 271 days ago
I enjoyed how we got taught the two models in the 1990s, and why one has the four layers you need and the "standard" has seven layers instead.

The professor asked "How many layers do you count?" - "Seven." - "How many members do you think the ISO/OSI committee had that designed it between them?" - [laughter] - "Seven.".

PhilipRoman 272 days ago
IMO the OSI layer system has some merit in education (even when the course actually uses the TCP/IP suite). To most of us, the concept of layering protocols may seem obvious, but I've talked to people who are just learning this stuff, and they have a lot of trouble understanding it. Emphasizing that each layer is (in theory) cleanly separated and doesn't know about the layers above and below it is a very useful step towards understanding abstractions.
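
To make that concrete, here's a toy Go sketch of the encapsulation idea (made-up header strings, not real formats): each layer just prepends its own header to an opaque payload handed down from above.

    package main

    import "fmt"

    // Each layer treats the payload from the layer above as an opaque blob
    // and only prepends its own (made-up) header.
    func transportWrap(appData []byte) []byte {
        return append([]byte("TCP[src=1234,dst=80]|"), appData...)
    }

    func networkWrap(segment []byte) []byte {
        return append([]byte("IP[10.0.0.1->10.0.0.2]|"), segment...)
    }

    func linkWrap(packet []byte) []byte {
        return append([]byte("ETH[aa:bb->cc:dd]|"), packet...)
    }

    func main() {
        frame := linkWrap(networkWrap(transportWrap([]byte("GET / HTTP/1.1"))))
        fmt.Println(string(frame))
        // ETH[aa:bb->cc:dd]|IP[10.0.0.1->10.0.0.2]|TCP[src=1234,dst=80]|GET / HTTP/1.1
    }
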
ahoka 272 days ago
The problem is that this is not true. There are no such clean strictly hierarchical layers in most of the protocols that make up the internet.
kragen 270 days ago
There are; VNC doesn't know about IP, for example, and TCP doesn't know about Ethernet. IP is just as happy to run over PPP or Wireguard as over Ethernet. HTTP/1 knows about TCP/IP, but only a little bit, and you can easily run HTTP/1 over other protocols like TLS. Character-cell terminal protocols know very little indeed about the protocol layer under them and work almost equally well over telnet, rsh, SSH, a serial port, a modem, or a bare pseudo-TTY, the main surviving exception being window resize handling.

The problem is that ① the layers don't have a fixed relationship to each other the way the OSI model proposes, ② several of the OSI layers don't exist at all in real-life TCP/IP protocols, and ③ there are other layers in current stacks that have no analogue in the OSI model, like Wireguard, MPLS, SSH, TLS, and HTTP. If you want to understand the services HTTP provides to the protocols that ride on top of it, you need to read Roy Fielding's thesis, not X.225.

supportengineer 272 days ago
I've never done any kernel programming but I assumed the OSI model corresponded to Linux kernel modules or a similar division internally.
gpderetta 272 days ago
> The good news is that in 1995 we will have a good operating system and programming language; the bad news is that they will be Unix and C++.

And 30 years later they show few signs of letting go.

ezekiel68 272 days ago
Yep. And nary a tear is shed these days over the death of the so-called superior Lisp machines.
gpderetta 272 days ago
The Spirit of the Machine still lives in some form in emacs.
rwmj 272 days ago
Maybe not in any position to do anything about it, but I'm quite sad :-/
lioeters 270 days ago
I was gonna say, I do shed a tear now and then dreaming of Lisp machines and the future that could have been.
OhMeadhbh 272 days ago
Meh. Lisp machines still exist. They're just simulated in various Lisps' runtime environments. It turns out that a RISC machine running a Lisp interpreter or an executable compiled from Lisp source tends to perform better than a tagged/cdr-coded Lisp Machine w/ hardware GC.

That being said... I've wanted to implement an old Explorer using an FPGA for a while. Maybe if I just mention it here, someone will get inspired and do it before I can get to it.

hayley-patton 271 days ago
Lisp machines didn't have hardware GC, though they had hardware support for read/write barriers.
lispm 271 days ago
Kind of. A lot of the GC support in the Symbolics 3600 architecture is on the microcode level (current CPUs usually don't have such operations in microcode). The word size of the CPU is 36 bits. The CPU operations don't deal with the type and GC tags on the instruction level; this is done on the microcode level. Things like "invisible pointers" are also dealt with on the microcode level.

The ephemeral GC concentrates on garbage collecting objects in RAM. For example, every memory page has a page tag, which marks it as modified or not. The ephemeral GC uses this to scan only over changed pages in memory. The virtual memory subsystem keeps a table of swapped-out pages pointing to ephemeral pages. The EGC can then use this information...
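
A conceptual Go sketch of that page-tag trick (just the idea, nothing like the actual Symbolics microcode): the collector consults per-page modified bits and scans only dirty pages for pointers into the ephemeral generation.

    package main

    import "fmt"

    type page struct {
        modified bool  // the page tag, set by a (simulated) write barrier
        refs     []int // object IDs this page holds pointers to
    }

    // findEphemeralRoots looks only at pages whose modified tag is set,
    // instead of walking all of memory, and collects references into the
    // ephemeral set.
    func findEphemeralRoots(pages []page, ephemeral map[int]bool) []int {
        var roots []int
        for _, p := range pages {
            if !p.modified {
                continue // clean page: no new ephemeral pointers to find
            }
            for _, r := range p.refs {
                if ephemeral[r] {
                    roots = append(roots, r)
                }
            }
        }
        return roots
    }

    func main() {
        pages := []page{
            {modified: false, refs: []int{1, 2}},
            {modified: true, refs: []int{3, 42}},
        }
        fmt.Println(findEphemeralRoots(pages, map[int]bool{42: true})) // [42]
    }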

Jach 272 days ago
At a certain level, sure, but C++ at least has definitely lost out. In the 90s it seemed like it might really take over all sorts of application domains, it was incredibly popular. Now and for probably the last couple decades it and C have only kept around 10% of the global job market.
OhMeadhbh 272 days ago
My gut feeling is there are still the same number of jobs for C++ today as there were in the 90s. It's just that they're hard to find because the total number of programming jobs has exploded. The reason you can't see the C++ jobs is because the newer, non-C++ jobs are crowding them out on job boards. [This is a hypothesis, one I haven't (dis)proven.]

For fun a few weeks ago I went looking for COBOL on VMS jobs. They're definitely still out there, but you do have to look for them. No one's going to send you an email asking if you're interested and if you don't hang out with COBOL/VMS people, you may not know they exist.

I think my point is the total number of C/C++ jobs today is probably the same or slightly higher than in 1994. But the total number of Java and C# jobs (or Ruby or Elixir or JavaScript jobs) is dramatically higher than in 1994, if for no other reason than the fact these languages didn't exist in 1994.

[As an aside... if you're looking for a COBOL / VMS programmer/analyst... I spent much of the 80s as a VMS System Manager, coding custom dev tools in Bliss and some of the 90s working on the MicroFocus COBOL compiler for AIX. And while you would be crazy to ignore my 30+ years of POSIX/Unix(tm) experience, I think it would be fun to sling COBOL on VMS.]

jlarocco 272 days ago
> My gut feeling is there are still the same number of jobs for C++ today as there were in the 90s. It's just that they're hard to find because the total number of programming jobs has exploded. The reason you can't see the C++ jobs is because the newer, non-C++ jobs are crowding them out on job boards. [This is a hypothesis, one I haven't (dis)proven.]

I don't know anything about the total number of C++ jobs, but there's a huge filter bubble effect for job searching. If you don't mention a language on your resume or list it under your skills then you're very unlikely to see any jobs for it or have anybody contact you for a job using it, whether we're talking about C++, Python, Typescript, or even technologies like Docker.

worstspotgain 272 days ago
It's not C++ that has been replaced, it's VB.
pjmlp 271 days ago
Depends on the market; even the C++ wannabe replacements are implemented in compiler toolchains written in C++.

It gets a bit hard to replace something that your compiler depends on to exist in the first place.

stonemetal12 272 days ago
Isn't "Worse is better" just a restatement of "Perfect is the enemy of Good", only slanted to make better\Perfect sound more enticing?

>The right thing takes forever to design, but it is quite small at every point along the way. To implement it to run fast is either impossible or beyond the capabilities of most implementors.

A deer is only 80% of a unicorn, but waiting for unicorns to exist is folly.

OhMeadhbh 272 days ago
Yes and no. "Worse is Better" also implies you allow someone outside your problem domain to define abstractions you use to decompose the problem domain (and construct the solution domain.) So... I mean... that's probably not TOO bad if they're well-understood and well-supported. Until it isn't and you have to waste a lot of time emulating a system that allows you to model abstractions you want/need to use.

But at the end of the day everyone knows never to assume stdio will write an entire buffer to disk and that YOU need to check for EINTR, and C++ allows you to wrap arbitrary code in try...catch blocks so if you're using a poorly designed 3rd party library you can limit the blast radius. And it's common now to disclaim responsibility for damages from using a particular piece of software, so there's no reason to spend extra time trying to get the design right (just ship it, and when it kills someone you'll know it's time to revisit the bug list. (Looking at YOU, Boeing.))

I do sort of wonder what happens when someone successfully makes the argument that C++ Exceptions are a solution often mis-applied to the problem at hand and someone convinces a judge that Erlang-like supervisory trees constitute the "right" way to do things and using legacy language features is considered "negligence" by the courts. We're a long way off from that and the punch line here is a decent lawyer can nail you on gross negligence even if you convinced your customer to sign a liability waiver (at least in most (all?) of the US.)

Which is to say... I've always thought there is an interplay between the "worse is better" concept and the evolution of tech law in the US. Tort is the water in which we swim; it defines the context for the code we write.

th43o2i4234234 271 days ago
The critical point of the article holds true of everything in human social networks (be it religion/culture/philosophy/apps/industry...).

If you don't achieve virality, you're as good as dead. Once an episteme/meme spreads like wildfire there's very little chance for a reassessment based on value/function, because the scope is now the big axis of valuation.

It's actually worse, because humanity is now a single big borg. Even 30-40 years back, there were sparsely connected pools where different species of fish could exist; not any more. The elites of every single country are part of the Anglosphere, and their populations mimic them (eventually).

This tumbling towards widespread mono-memetism in every single sphere of life is one of the deeply dissatisfying things about modern human life, not just for PL/OS/... but also for culture etc.

Anthropocene of humanity itself.

esafak 271 days ago
> If you don't achieve virality, you're as good as dead.

Are you? Maybe the worse solution peaks faster, but can be supplanted by a better solution in the future, like how Rust is displacing C/C++ in new projects. The better solution may never be popular yet persist.

pjmlp 271 days ago
For Rust to fully displace C++, it needs to eventually bootstrap itself; until then, C++ will be around.

Additionally there are no significant new projects being done in Rust for the games industry, AI/ML, HPC, HFT, compiler backends, hardware design,....

th43o2i4234234 271 days ago
Rust is nowhere near displacing C++.

There's typically an "exhaustion" phase with mono-memetism/theories where everyone gets sick and tired of the "one and only way" and it becomes fashionable to try out new things (e.g. Xtianity in Europe). We're not yet at the point where the olds can be toppled.

JohnFen 272 days ago
"Worse is better" has become like "move fast and break things". They're both sayings that reveal an often-overlooked truth, but they have both been taken far too far and result in worse things for everybody.
ezekiel68 272 days ago
I see what you mean. Yet I feel like the first one (at least, as outlined in the article) is more about accepting an inevitability that you probably have little control over, while the second is more often adopted as a cultural process guideline for things you can control. But that's just my impression.
sesm 271 days ago
And then it transformed into "move things and break fast".
i_s 272 days ago
I've read this a few times over the years and I think the argument is sound. But I wonder if it is sound in the same way this statement is:

"It is better to go picking blueberries before they are fully ripe, that way you won't have much competition."

api 272 days ago
I feel like we're seeing a bit of push-back today against worse-is-better in the area of languages. Rust in particular feels more like the MIT approach, albeit with an escape hatch via the explicit keyword "unsafe." Its type system is very thoroughly specified and correct as opposed to C's YOLO typing.
dokyun 272 days ago
Rust is not the MIT approach, because an important aspect of that approach is that it's conceptually simple. Rust is a leviathan of complexity both in interface and implementation. Common Lisp is an MIT approach language, and approaches the same problems like memory and type safety by doing "the right thing" by default and offering more advanced options like type annotations and optimization levels in a "take it or leave it" manner. Rust will force you to program the way the compiler wants in the name of safety, while Common Lisp will allow you to program safely and freely, and decide which parts are important. An observation of this idea is that Rust compilers are terribly huge and slow because they use static type-tetris for everything, while Common Lisp compilers are very fast because they do most type-checking at runtime.
steveklabnik 272 days ago
> An observation of this idea is that Rust compilers are terribly huge and slow because they use static type-tetris for everything,

Rust's typechecking passes are not the reason why the compiler is slow. Code generation dominates compile times. Type checking is pretty quick, and Rust makes some decisions that enable it to do so, like no global inference.

dokyun 272 days ago
...so why is it so slow to generate code?
steveklabnik 272 days ago
The compiler doesn't do a whole lot to try and minimize LLVM-IR, and monomorphization produces quite a lot of it. This makes LLVM do a lot of work. (EDIT: maybe that's being too harsh, but what I mean is, there's been some work towards doing this but more that could possibly be done, but it's not a trivial problem.)

On my current project, "cargo check" takes ten seconds, and "cargo build" takes 16. That's roughly 37.5% of the total compilation time taken by code generation (and linking, if you consider those two to be separate).

In my understanding, there can sometimes be problems with -Z time-passes, but when checking my main crate, type_check_crate takes 0.003 seconds, and llvm_passes + codegen_crate take 0.056 seconds. Out of a 0.269 second total compilation time, most things take less than 0.010 seconds, but other than the previous codegen mentioned, monomorphization_collector_graph_walk takes 0.157s, generate_crate_metadata takes 0.171 seconds, and linking takes 0.700 seconds total.

This general shape of what takes longest is consistent every time I've looked at it, and is roughly in line with what I've seen folks who work on compiler performance talk about.

hollerith 270 days ago
Have you heard of anyone proposing to add to rustc a flag or an attribute that would cause a generic type to be implemented by a dispatch at run time to avoid monomorphization and consequently decrease compile time?
steveklabnik 269 days ago
It’s been discussed, but not every trait is object safe, so it’s not a trivial task.
api 272 days ago
That's an interesting counter-take and it makes me wonder if Rust isn't a third thing... not the clean sparse MIT approach but also not the wild YOLO slop of C and Unix. Something correct but also complex and pragmatic. Maybe it's an attempt at a synthesis of the two -- Unix's gnarly pragmatism but with correctness.

My heart loves the clean sparse MIT approach, but I'm kinda forced to work in the other world because that approach has so far decisively failed.

I have my own thoughts as to why, and they're different from the typical ones:

(1) Worse is free -- the "worse" stuff tended to be less encumbered by patents and closed source / copyrights. This is particularly true after GNU, BSD, and Linux. Free stuff is viral. It spreads fast and overtakes everything else.

(2) Programmers like to show off, and "worse" provides more opportunities to revel in complexity and cleverness.

dokyun 272 days ago
> That's an interesting counter-take and it makes me wonder if Rust isn't a third thing... not the clean sparse MIT approach but also not the wild YOLO slop of C and Unix. Something correct but also complex and pragmatic.

I had supposed this in a previous thread, and I agree it is another thing entirely, however whether Rust is 'correct' or 'pragmatic', I think is a matter of contention.

> (1) Worse is free -- the "worse" stuff tended to be less encumbered by patents and closed source / copyrights. This is particularly true after GNU, BSD, and Linux. Free stuff is viral. It spreads fast and overtakes everything else.

At the time it seems that the "worse" stuff was actually more encumbered--GNU started to exist as a consequence of Unix both becoming an industry standard and solidifying the position of nonfree software in industry, and it didn't get that way by being free--they simply licensed it away en masse to universities. The free software movement was born out of the MIT hacker ethic of sharing software freely (GNU brought the design philosophy of MIT to Unix systems, and largely stands opposed to the "worse is better" approach. It originally sought to replace parts of Unix with superior alternatives, such as Info over man pages). BSD didn't become free until much, much later, at which point Linux had already become relevant.

> (2) Programmers like to show off, and "worse" provides more opportunities to revel in complexity and cleverness.

I think maybe C or C++ lets you show off by hacking around the compiler to do certain things that look impressive, like preprocessor hacks or obfuscated code, but I think many would agree that this kind of style isn't "correct": languages like Go were developed as a consequence of this to suck any and all fun you might get out of hacking C, in order to force you to write more "correct" programs. Lisp, on the contrary, doesn't tell you what a correct or incorrect program is, and gives you every facility to write programs that are infinitely complex and clever.

To me, Rust looks like the result of trying to break the rules of industry languages, by trying to incorporate things like real macros and type systems into something that resembles a real-world language. But its biggest flaw to me is that it doesn't break enough of them; rather, it makes more in the process.

djha-skin 272 days ago
I actually take this as evidence that Rust will always remain niche. It's just a very complicated language. Go or Zig is much easier to learn and reason about.

In Go you can immediately tell what the fields are in a config yaml file just by looking at struct annotations. Try doing that with Rust's Serde. Super opaque in my opinion.
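
For instance (using the common third-party gopkg.in/yaml.v3 package as an assumed dependency), the YAML keys are spelled out right in the struct tags:

    package main

    import (
        "fmt"

        "gopkg.in/yaml.v3" // assumed dependency: go get gopkg.in/yaml.v3
    )

    // The yaml tags document the config file's field names in place.
    type Config struct {
        Host    string `yaml:"host"`
        Port    int    `yaml:"port"`
        Verbose bool   `yaml:"verbose"`
    }

    func main() {
        src := []byte("host: example.com\nport: 8080\nverbose: true\n")
        var cfg Config
        if err := yaml.Unmarshal(src, &cfg); err != nil {
            panic(err)
        }
        fmt.Printf("%+v\n", cfg) // {Host:example.com Port:8080 Verbose:true}
    }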

busterarm 272 days ago
Exactly!

Rust will only protect me from things my customers don't care about and don't understand.

By not using Rust and just dealing with it, I'm making more money faster than if I started with Rust.

Rust only matters in environments where that calculus comes out the other way.

crabmusket 272 days ago
I'm pretty sure your customers care that your software doesn't segfault!
busterarm 272 days ago
My customers aren't running my software, I am. They don't know if it segfaulted or not.

If your customers are running your software you might have a business model problem instead of a software quality one.

warkdarrior 272 days ago
> Rust will only protect me from things my customers don't care about and don't understand.

That's the wrong kind of protection. Rust should protect you from things people other than your customers (who presumably are well behaved) care about.

morning-coffee 272 days ago
> Rust will only protect me from things my customers don't care about and don't understand.

Are you suggesting your customers don't care about CVEs, even indirectly when they affect them?

busterarm 272 days ago
Disagree.

I think this is a self-delusion experienced by Rustaceans because they overvalue a certain type of software correctness and because Rust gets that right, the rest of its warts are invisible.

agumonkey 272 days ago
I assume that 'worse' often means finding a fit with the average and the masses. This ensures a longer existence; later you may absorb the "better" you didn't have early on. Look how dynamic languages started to acquire better data types, various traits (generators, closures...), JITs... all things they could pluck out of the old "glorious" languages that were somehow too advanced for the mainstream. It's a strange, schizophrenic situation.
clarkevans 272 days ago
I think the worse-is-better philosophy is not well encapsulated with the 4 priorities given. Perhaps it is 4 completely different priorities. Here's a strawman.

1. Minimal -- the design and implementation must be as small as possible, especially in scope (which should be deliberately "incomplete")

2. Timely -- the implementation must be delivered as soon as feasible, even if it comes before the design (get it working first, then figure out why)

3. Relevant -- the design and implementation must address an important, unmet need, eschewing needs that are not urgent at the time (you can iterate or supplement)

4. Usable -- the implementation must be integrated with the existing, working and stable infrastructure (even if that integration causes design compromises)

The other dimensions -- simplicity, correctness, consistency, and completeness -- are very nice to have, but they are not the primary drivers of this philosophy.

AnimalMuppet 272 days ago
That seems like a fairly solid strawman.

I would say that Timely and Relevant drive Minimal. I would also say that Minimal and Usable are in tension with each other.

mseepgood 272 days ago
Maybe don't call it "worse", maybe it's just you who has a skewed perception of "good"/"right".
hammock 272 days ago
The distinction between the set of four "right" principles and the four "worse is better" principles is known as compromise in design.

It's the opposite of what marketers want you to think of when they say "uncompromising design."

Dansvidania 272 days ago
Looking at how it played out with JavaScript, one can't help but agree.

(edit: I mean it unironically)

api 272 days ago
JavaScript really illustrates the ultimate path-dependence of evolution. It got widely deployed during a boom time and therefore we are stuck with it forever.
GiorgioG 272 days ago
We don’t have to be stuck with it forever. Start pushing WASM instead of React, etc. We can get there, but as technologists we have to make a concerted effort to do so.
jll29 271 days ago
The task of devising a novel application model for distributed Web applications is left to the reader as an exercise.

(It's not _just_ JavaScript that one should get rid of, but the abuse of the whole HTML+X set of stacks to implement interactive Web applications. There must be a better way than HTML/CGI, HTML/XML, Java applets, etc.; WASM could be part of a solution, but not a solution in itself, as there should be standard UI elements etc.)

adamc 271 days ago
But the essay lays out the reality: For WASM to replace JavaScript, it needs to be better (in the sense of easy to adopt, solves a problem right now) every step of the way. That is not, as yet, true.
adastra22 272 days ago
We will still be stuck with it forever for compatibility reasons
auggierose 271 days ago
WASM instead of React? That does not even make sense.
pjmlp 271 days ago
Only if the tooling was half as good as Flash.
Der_Einzige 272 days ago
But I feel like other languages that we were de facto "stuck with" in certain domains boomed and then busted -- e.g. Lua, Perl, etc.
actionfromafar 272 days ago
JavaScript is the only language which straddled the client-server gap. If it weren't for Node, JavaScript would not have been as popular.
bigstrat2003 272 days ago
It's still absolutely baffling to me that anyone is willing to run JS server side. There are so many options which are much better suited, why are people willing to jam that square peg into the round hole?
sjamaan 271 days ago
Perhaps the siren song of "isomorphic JS", where you can run the same code on the server and on the client? I can see the use case for having complex model and validation code running on the client for speedy feedback and on the server for security, and perhaps also the idea you could render something entirely on the server when needed (eg for indexing and non-JS clients) and on the client when it's capable.

Personally, I wouldn't want to touch server-side JS with a 10 foot barge pole.

homebrewer 272 days ago
Because they don't know and don't want to know anything else. Not a single polyglot developer I personally know has ever chosen JS for the server side, not once.
api 272 days ago
There was something long ago called GWT -- Google Web Toolkit -- that tried to make Java into that language by having it compile to JavaScript.

It actually worked decently well, but, due to Java, was needlessly verbose.

WASM lets us run other languages efficiently in the browser but that just opens the field to a lot of languages, not one language to rule them all.

actionfromafar 271 days ago
Also, GWT apps were pretty slow to load and start, and were very "app"-like as opposed to web-page-like, at a time when that was not as familiar as it is today. That's how I remember it, anyway. And pretty heavy, developer-wise, at a time when "update a file on the FTP" was still normal.
TacticalCoder 272 days ago
Yup, first thing I thought. That pathetic piece of crap conceived in basically 10 days (not kidding)... BUT it is what we have on the front-end for web apps, so there's that. JavaScript is the mediocre turd I love to hate.
karel-3d 272 days ago
And it keeps being polished and improved to the point that it's almost not a turd: it now has types, sort of, and much better engines; there are now ARM machines literally designed to run it faster; and most of your actual PC applications are now written in it.

But honestly it's kind of refreshing to see the original Node.js presentation, where using JavaScript is sort of a side note. He wanted a callback-heavy language, and JS fit the bill.

https://youtu.be/EeYvFl7li9E

GiorgioG 272 days ago
It will always be a turd. Typescript is a nice illusion, but underneath is still the same turd.
actionfromafar 272 days ago
To me, WASM is the wildcard. It lets other languages infect the JavaScript host organism.
Johanx64 271 days ago
The problem with JavaScript is that it is not confined to web browsers and web apps; it and its associated business models (SaaS) find their way everywhere. First desktop apps got enshittified by it, then all sorts of smart devices, all the way to embedded systems in cars, with their laggy, sloppy UIs everywhere.

It probably is the most severe case of "worse is better" I've experienced so far.

Der_Einzige 272 days ago
I know this is an article about Lisp and the specific usage of this term in the context of software acceptance, but when you use a title that provocative, I want to speak specifically about the idea of "Worse is Better" with respect to a more literal idea: "sometimes things get worse over time but you are told they have improved".

For example, why is it that central vacuums are more rare in 2024 than they were in the 1980s, despite them being superior in every way compared to regular ones?

"Worse" vacuums are "better" for the economy? (because Dyson makes jobs and consumes resources?)

AnimalMuppet 272 days ago
Central vacuums are worse in at least one specific way: Cost of fixing or replacing them when they break.
ezekiel68 272 days ago
True; but my family and neighbors had them and they seemed to last forever. Maybe they were made so well that the company couldn't gain repeat sales often enough.
floren 272 days ago
Also, I can take my regular vacuum out into the garage and clean the car.
pjc50 272 days ago
What the heck is a central vacuum? One plumbed into the house? Isn't that spectacularly expensive?
sophacles 272 days ago
> Isn't that spectacularly expensive?

Not spectacularly expensive when installed as the house is built. It's another run of PVC pipe in the walls before you close them up. Approximately a day of labor and some pipe are the added expense -- not much in terms of house-building cost at all. Hardwood floors are much more expensive, and still need to be swept or vacuumed.

stonemetal12 272 days ago
>A main disadvantage of central vacuums is the higher initial cost. In the United States, the average central vacuum system has an installed cost of around $1,000.

https://en.wikipedia.org/wiki/Central_vacuum_cleaner

Considering the price of a house, it isn't "spectacularly expensive". On the other hand, vs. the price of a hoover, yeah, a bit. Since it sits in a closet or garage and doesn't move, weight becomes a non-issue, so it can be a real behemoth of a vacuum.

pm215 272 days ago
Seems like it could be rather expensive to retrofit, compared to cost of putting it in the house to start with (certainly I couldn't imagine getting it retrofitted into my UK house -- the labour alone to dig channels into walls and then make good again and redecorate would dwarf the equipment cost). But the thing about features put in a house when it's built is that they have to meet the economic bar of "do potential buyers value them enough to pay XYZ more than they otherwise would", which can be quite hard. If the average buyer wouldn't value the house at an extra $1000 then it's not worth the builder installing one...
jen20 272 days ago
I have one (in a ~2yo house). It's a nice idea, but practically the hoses are a pain in the ass, and I still use my Dyson almost exclusively.
kragen 272 days ago
Yes, one plumbed into the house.
worstspotgain 272 days ago
EINTR's design is one of computing's absolute classics. To MIT and New Jersey, we should add the McDougals approach: "I cannot work under these conditions." When faced with the PC loser-ing issue, just don't implement the code in question.

McDougals resolves the apparent conflict between the other two. It blames the interrupt hardware as the root cause. It produces non-working, incomplete software. It's kind of a modest proposal.

However, it also produces no ripples in the design fabric. With MIT, the OS source is a maintenance nightmare. With NJ, modern software still has to deal with archaic idiosyncrasies like EINTR. With McDougals, all the "conflict-free" portions of the software advance, those that write themselves.
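To make the EINTR idiosyncrasy concrete, here is a minimal sketch of the retry boilerplate the New Jersey decision pushes onto every caller of a raw system call. Go is used purely for illustration; readRetrying is a made-up helper, while syscall.Read and syscall.EINTR are real (Unix-specific) identifiers from Go's syscall package. Under the MIT approach, roughly speaking, the caller would never see EINTR at all.

    package main

    import (
        "fmt"
        "syscall"
    )

    // readRetrying wraps a raw read(2). Under the "worse is better"
    // contract, a signal may interrupt the call and the kernel returns
    // EINTR, so the burden of retrying falls on user code.
    func readRetrying(fd int, buf []byte) (int, error) {
        for {
            n, err := syscall.Read(fd, buf)
            if err == syscall.EINTR {
                continue // interrupted by a signal: just try again
            }
            return n, err
        }
    }

    func main() {
        buf := make([]byte, 4096)
        n, err := readRetrying(0, buf) // fd 0: stdin
        fmt.Println(n, err)
    }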

The result is likely immediately shelved, perhaps as an open source PoC. Over time, someone might write some inelegant glue that makes interrupts appear to behave nicely. Alternatively, the world might become perfect to match the software.

If nothing else, the software will have mimicked the way we learn. We use imperfect examples to draw the idealized conclusion. Even if it never gets to run, it will be more readable and more inspiring than either MIT or NJ.

shagie 272 days ago
For another discussion on this https://wiki.c2.com/?WorseIsBetter which starts out with:

    RichardGabriel makes this observation on the survival value of software in the paper Lisp: Good News, Bad News, How to Win Big. See http://www.jwz.org/doc/worse-is-better.html for the section on WorseIsBetter.

    For those seeking the first node, see http://web.archive.org/web/19990210084721/http://www.ai.mit.edu/docs/articles/good-news/good-news.html.

    For even more context on WorseIsBetter see http://www.dreamsongs.com/WorseIsBetter.html. My favorite part is RichardGabriel arguing with himself.
jes5199 272 days ago
what's the old saw about "unix design prioritizes simplicity over correctness, and on modern hardware simplicity is also no longer considered necessary"
marcosdumay 272 days ago
There's a really important detail here: simplicity tends to lead to correctness.

Anyway, worse-is-better is about simplicity of implementation versus conceptual simplicity. In principle, that's a much harder choice.

enugu 271 days ago
Ironically, the main feature that separates LISP from other modern languages is homoiconicity/macros(now that features like garbage collection are mainstream).

And this leads to an easier implementation -- parsing is easier (which is why code transformation via macros becomes easy).

kazinator 271 days ago
A language which has that feature tends to get classified as a member of the Lisp family, even if it is horribly "unlispy" under the hood in its semantics.
dang 272 days ago
These look to be the interesting threads on Gabriel's worse-is-better essays:

Lisp: Good News, Bad News, How to Win Big (1990) [pdf] - https://news.ycombinator.com/item?id=30045836 - Jan 2022 (32 comments)

Worse Is Better (2001) - https://news.ycombinator.com/item?id=27916370 - July 2021 (43 comments)

Lisp: Good News, Bad News, How to Win Big (1991) - https://news.ycombinator.com/item?id=22585733 - March 2020 (21 comments)

The Rise of Worse Is Better (1991) - https://news.ycombinator.com/item?id=21405780 - Oct 2019 (37 comments)

The Rise of Worse Is Better (1991) - https://news.ycombinator.com/item?id=16716275 - March 2018 (44 comments)

Worse is Better - https://news.ycombinator.com/item?id=16339932 - Feb 2018 (1 comment)

The Rise of Worse is Better - https://news.ycombinator.com/item?id=7202728 - Feb 2014 (21 comments)

The Rise of "Worse is Better" - https://news.ycombinator.com/item?id=2725100 - July 2011 (32 comments)

Lisp: Good News, Bad News, How to Win Big [1991] - https://news.ycombinator.com/item?id=2628170 - June 2011 (2 comments)

Worse is Better - https://news.ycombinator.com/item?id=2019328 - Dec 2010 (3 comments)

Worse Is Better - https://news.ycombinator.com/item?id=1905081 - Nov 2010 (1 comment)

Worse is better - https://news.ycombinator.com/item?id=1265510 - April 2010 (3 comments)

Worse Is Better - https://news.ycombinator.com/item?id=1112379 - Feb 2010 (5 comments)

Lisp: Worse is Better, Originally published in 1991 - https://news.ycombinator.com/item?id=1110539 - Feb 2010 (1 comment)

Lisp: Good News, Bad News, How to Win Big - https://news.ycombinator.com/item?id=552497 - April 2009 (2 comments)

dang 272 days ago
... and these are some of the threads discussing it or aspects of it. Others welcome!

Worse Is Better - https://news.ycombinator.com/item?id=36024819 - May 2023 (1 comment)

My story on “worse is better” (2018) - https://news.ycombinator.com/item?id=31339826 - May 2022 (100 comments)

When Worse Is Better (2011) - https://news.ycombinator.com/item?id=20606065 - Aug 2019 (13 comments)

EINTR and PC Loser-Ing: The “Worse Is Better” Case Study (2011) - https://news.ycombinator.com/item?id=20218924 - June 2019 (72 comments)

Worse is worse - https://news.ycombinator.com/item?id=17491066 - July 2018 (1 comment)

“Worse is Better” philosophy - https://news.ycombinator.com/item?id=17307940 - June 2018 (1 comment)

What “Worse is Better vs. The Right Thing” is really about (2012) - https://news.ycombinator.com/item?id=11097710 - Feb 2016 (35 comments)

The problematic culture of “Worse is Better” - https://news.ycombinator.com/item?id=8449680 - Oct 2014 (116 comments)

"Worse is Better" in the Google Play Store - https://news.ycombinator.com/item?id=6922127 - Dec 2013 (10 comments)

What “Worse is Better vs The Right Thing” is really about - https://news.ycombinator.com/item?id=4372301 - Aug 2012 (46 comments)

Worse is worse - https://news.ycombinator.com/item?id=437966 - Jan 2009 (3 comments)

germandiago 271 days ago
For me this is just utopia vs real world.

What is better, a perfect design or one that exists?

Our mind tries to trick us into thinking that the choice is between perfect and not so perfect.

No, in real life it is common that aiming for the perfect ruins everything else -- even the existence of that idea in real form, due to other constraints.

Taniwha 271 days ago
Of course the opposite is often stated as "Perfect is the enemy of Good"
dkasper 272 days ago
1991 tag. An all time classic.
orwin 272 days ago
This... Honestly, from the first lines I thought this was an arrogant take, but he made a _really_ good point and now I tend to agree with him.

Still, I am a bit bothered: does a counterargument exist?

ripap 272 days ago
Worse is Better is Worse: https://www.dreamsongs.com/Files/worse-is-worse.pdf

Same author (name is an anagram).

samatman 272 days ago
May as well complete the set: Is Worse Really Better? https://dreamsongs.com/Files/IsWorseReallyBetter.pdf

Also: Nickieben Bourbaki might be an anagram of something, but it is definitely not an anagram of Richard Gabriel, with or without the P. There's no G, there's no h, there's an N and a k, it isn't even particularly close.

That claim is my best interpretation of this sentence:

> Same author (name is an anagram).

Although perhaps it was not your intention to connect the clauses in that way.

jes5199 272 days ago
the name also seems to be a reference to another pseudonym, https://en.wikipedia.org/wiki/Nicolas_Bourbaki
orwin 272 days ago
I wish I could upvote you more than once. This is a very good counterargument.
gpderetta 272 days ago
A pseudonym you mean?
jes5199 272 days ago
both!
AnimalMuppet 272 days ago
An anagram. Same letters, different order.
gpderetta 272 days ago
I might be dense today, but how's Nickieben Bourbaki an anagram of Richard P. Gabriel?
kqr 272 days ago
It is going to come down to context. For the most part, you never know quite what it is you are designing, so an iterative approach to design with fast feedback cycles will get you there quicker. Give people something to play with, see how they use it, and continue from there.

But sometimes you need to know what it is you are designing before giving it to people, because there are large risks associated with improvising. In those cases, making The Right Thing is still expensive, but it may reduce the risk of catastrophe.

I think, however, that the latter cases are rarer than most people think. There are ways of safely experimenting even in high-risk domains, and I believe doing so ultimately lowers the risk even more than doing The Right Thing from the start. Because even if we think we can spend years nailing the requirements for something down, there are always things we didn't think of but which operational experience can tell us quickly.

gpderetta 272 days ago
Empirically, Worse is Better appears to have been correct many, many times.
kayo_20211030 272 days ago
The New Jersey approach has the benefit of dealing with time in a sensible way.

The time to market for the MIT approach is just too long if your revenue relies on actually shipping a product that covers the cost of the next iteration that will move you from 80% to 90%, or even 60% to 70%. It's an old joke, but in the long run we're all dead; and waiting for the production of an ivory tower implementation won't work out well. If it's all academic, and there's no commercial pressure, well, have at it. There's not much at stake except reputations.

Furthermore, in the real world, your users' requirements and your internal goals, theoretically covered by "the" design, will change. Not everything can be reasonably anticipated. The original design is now deficient, and its implementation, which is taking too long anyway, will be a perfect reflection of its deficiency, and not fit for its new purpose.

AnimalMuppet 272 days ago
I'll go further. Even if users' requirements and your goals don't change, you don't adequately understand them. You don't know what your users need perfectly. It's better to fire rapidly and adjust your aim than it is to try to have your first attempt be perfect. (Yes, people take this too far the other way...)

Get something out there and start getting feedback. You won't actually know until you do.

ezekiel68 272 days ago
Yes and it almost serves as more of a coping mechanism (a thought framework that helps us accept what may not seem to be 'the better') than an airtight philosophical position.
NAHWheatCracker 272 days ago
One counterargument is that people who claim "worse is better" are often making excuses for why their preferred technology didn't win.

Often in these arguments, "worse" means "shortcut" and "better" means "won". The difficulty is proving that not taking the shortcut actually had the assumed advantages, as in the article.

tialaramex 272 days ago
Winning is temporary. In 1900 Britain had an empire, for example. Colonialism won, right? The battleship, answering machines, VHS, the compact disc, steam engines.

This "victory" is fleeting. When people people tell you C++, a language which didn't even exist when I was born, is "forever" they are merely betraying the same lack of perspective as when Britain thought its Empire would last forever.

Feathercrown 272 days ago
I think the counterargument is that something being easy to proliferate doesn't mean that it's good.
bee_rider 272 days ago
I don’t get the name New Jersey approach, is it just the general association of New Jersey and poor quality? When I think of New Jersey and CS, I think of Princeton, which has a pretty good program IIRC.

Anyway, I wouldn’t put simplicity on the same level as the other things. Simplicity isn’t a virtue in and of itself, simplicity is valuable because it helps all of the other things.

Simplicity helps a bit with consistency, in the sense that you have more trouble doing really bizarre and inconsistent things in a simple design.

Simplicity helps massively with correctness. You can't check things that you don't understand. Personally, that means there's a complexity threshold past which I can't guarantee correctness. This is the main one I object to. Simplicity and correctness simply don't belong in different bullet points.

Simplicity could be seen as providing completeness. The two ways to produce completeness are to either work for a really long time and make something huge, or reduce scope and make a little complete thing.

It’s all simplicity.

dmansen 272 days ago
bee_rider 272 days ago
Oh. Ok, that makes sense.
ezekiel68 272 days ago
I made the same error when I first read it years ago. It certainly felt like an academic reference.
sebastianconcpt 272 days ago
To Worse is Better I'd say: careful what you wish for.
karel-3d 272 days ago
(1991)
aredox 272 days ago
And then we wonder why everything gets worse and worse.
st_goliath 272 days ago
> ... everything gets worse and worse ...

    They're coming to take me away haha
    they're coming to take me away hoho hihi haha
    to the funny farm where code is beautiful all the time ...
-- Napoleon XIV, more or less...

Via: https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/lin...

ta988 272 days ago
what is getting worse and how?
pulse7 271 days ago
"worse is better" is "the right thing"
hammock 272 days ago
There was an article posted on here[1] a while back that I only just found again, introducing the term "expedience." The idea was that we think we live in a world where people have to have "the best" sweater, be on "the best" social network, drive "the best" car, etc. But when you look at what really WINS, it's not the best, it's the most "expedient" - i.e. sufficiently good, with built-in social proof, inoculated of buyer's remorse, etc.

Is Amazon "the best" place to go shopping? No, you might find better prices on individual items if you put a little more work into it, but it's the most expedient. Is Facebook/Instagram/Tiktok/insert here "the best" social network? No, but it is the most accessible, easy-to-use, useful one. Is a Tesla (perhaps outdated example since X) "the best" car - no, but it is the most expedient.

There is a tangent here that intersects with refinement culture as well. Among the segment of society that (subconsciously) cares about these "expedient" choices, you see everyone and everything start to look the same.

[1]https://tinaja.computer/2017/10/13/expedience.html

dkarl 272 days ago
"Expedient" is a common (or at least not rare) English word that means something like "practical and effective even if not directly attending to higher or deeper considerations."

For example, if two students in a class are having frequent confrontations that bring learning in the class to a halt, and attempts by teachers and counselors to address their conflict directly haven't been effective, the expedient solution might be to place them in separate classes. The "right thing" would be to address the problem on the social and emotional level, but if continued efforts to do so are likely to result in continued disruption to the students' education, it might be better to separate them. "Expedient" acknowledges the trade-off, while emphasizing the positive outcome.

Often a course of action is described as "expedient" when it seems to dodge an issue of morality or virtue. For example, if we solve climate change with geoengineering instead of by addressing thoughtless consumerism, corporate impunity, and lack of international accountability, many people would feel frustrated or let down by the solution because it would solve the problem without addressing the moral shortcomings that led to the problem. The word expedient stresses the positive side of this, the effectiveness and practicality of the solution, while acknowledging that it leaves other, perhaps deeper issues unaddressed.

circlefavshape 272 days ago
> For example, if we solve climate change with geoengineering instead of by addressing thoughtless consumerism, corporate impunity, and lack of international accountability, many people would feel frustrated or let down by the solution because it would solve the problem without addressing the moral shortcomings that led to the problem.

Oof. Now I understand something I didn't before

MikeTheGreat 271 days ago
Are you sure? No offence, but I don't think there's anything to understand here.

If we could solve climate change without "addressing thoughtless consumerism, corporate impunity, and lack of international accountability" we would all be f'ing _thrilled_.

As I type this, Hurricane Helene just destroyed a good chunk of inland North Carolina (!!!) and Hurricane Milton was just upgraded to a "category 5" storm.

If we could solve climate change the easy way we'd all be _thrilled_, because then we'd actually solve climate change.

inglor_cz 271 days ago
"we would all be f'ing _thrilled_."

I think the argument is pretty much the opposite: not everyone would be thrilled. The wannabe priest class (think of Greta and her "How dare you!"), which is always with us, would be frustrated by the lack of something to preach about.

Of course there is always Israel vs. Palestine.

tovej 271 days ago
I myself am convinced that man-made global warming can't be stopped without changing the way our political economies work, and I happen to believe that Palestinians are people and therefore deserve basic human rights (which they do not currently have). I'm also a researcher.

I _would_ be thrilled if there were an expedient way to solve climate change. But the best systems models for the earth all tell us that there's one way to solve the issue: just leave the carbon in the ground. That's it. Stop extracting it. Nothing else will solve the problem; it's a really simple, really bad feedback loop.

This characterization of environmental or Palestinian activists as wanting to have the moral high ground is, imo, a knee-jerk reaction. The people on the street aren't in it for clout; they're doing it because it is the right thing and they feel compelled to act. What gets me moving is not wanting to feel morally superior (a religious aspect, more at home in right-wing politics), but an anxiety for the future, which is projected to include horrible death and suffering due to obvious problems that we could all fix if we just decided to recognize them.

inglor_cz 271 days ago
The motivations of regular participants vs. leaders may be rather different. Only a specific type of person is attracted to leading crowds.
foldr 271 days ago
You can apply the same cynicism to any political or protest movement. Not sure that it tells us very much.
inglor_cz 271 days ago
I agree with your first sentence, only I would replace the word "cynicism" with "skepticism".

And I think it is actually useful. People will try to manipulate other people through emotions, and mobs are easy to manipulate. One should have fairly high barriers before joining a street mob, because its potential destructive power is enormous, and it also tends to elevate unsavory characters to positions of power.

I am not saying that those barriers should be infinitely high, but fairly high.

For us humans, it is easy to succumb to "righteousness in numbers".

foldr 271 days ago
If innocent people being murdered or the threat of an imminent environmental catastrophe don't meet your 'high barriers', then nothing will. So though you claim in principle to approve of some protests, what you're saying in practice is that no-one should protest against anything because they'll probably just make things worse – because people in general are fairly awful and people who take charge of things are even worse. It's impossible to argue against this kind of cynicism as it's self-reinforcing, but it doesn't strike me as an interesting or insightful position to take. Especially when painting in broad brushstrokes rather than addressing issues with particular political or protest movements (which no doubt are not beyond criticism).

It's also important to weigh the harmful effects of apathy in the balance. These are easily forgotten but almost inestimably enormous. Just think of all the damage done in the decades (centuries) where hardly anyone could be bothered to protest against slavery, women's oppression, racial segregation, pollution, etc. etc.

inglor_cz 271 days ago
War is often more complicated than "innocent people being murdered" and we both know it. The Israeli-Palestinian conflict isn't morally black and white, and the current Israeli-Hezbollah conflict is something else entirely.

I think you may be proving my point. Taking one side of a complicated situation because of a black-and-white moralistic thinking is potentially destructive, and organizations like Hamas benefit from that.

As for your slavery example, did slavery disappear because humanity awakened morally and started demonstrating in the streets, or because we gained a new non-human resource of raw power? Previous civilizations didn't engage in slavery because they were profoundly immoral, but because human and animal muscle was the only practical source of power. The specifics varied across the globe, but unfree labor was ubiquitous in pre-modern societies.

For a contemporary situation, imagine a 22nd century activist judging people of 2024 for eating meat from dead animals, when he can get a good steak by pressing a button on a steak-making machine. It wouldn't be demonstrations which made the difference between 2024 and 2124.

foldr 271 days ago
Protestors are protesting against things that they think are seriously wrong. What you think about the Israel-Palestine conflict or the history of the abolition of slavery is completely irrelevant to their decision whether or not to protest about something. (But err, yes, popular anti-slavery movements played an important role in the abolition of slavery. The Haitian revolution didn't happen because we 'gained a new non-human resource of raw power'.)
inglor_cz 271 days ago
"Protestors are protesting against things that they think are seriously wrong."

OK, but that was sort-of my point. The more outrage, the less you need to really think about things.

"err, yes, popular anti-slavery movements played an important role in the abolition of slavery"

That is a chicken-and-egg question. Why did those mass movements only emerge at the time of the Industrial Revolution, and why did they emerge first in places that were influenced by the Industrial Revolution the earliest, while other places (Russia, the Ottoman Empire, the Qing Empire) only followed suit after their own industrialization began?

I don't think the arrow of causality is so simple here. A hypothetical society that abolished slavery, serfdom etc. in the 15th century could easily prove non-viable against its slavery-powered foes, which had more brute force at their disposal. By 1820, the situation was very much turning around and it was the modern, personally freer societies that were more effective in commerce and at war.

Notably, even though Victorian Britain was very anti-slavery, starting with the monarch herself, it had no moral qualms against subjugating a quarter of humanity in another form of submission. Which tells me that it was less about morality (equality) and more about practicality of the situation.

foldr 271 days ago
Everyone agrees that innocent people are being killed in the Israel-Palestine conflict and that this is an outrage. The disagreement is over exactly which people fall into this category and who is to blame. Acknowledging the horror and being outraged by it does not preclude thought, and it is ungenerous and inaccurate (and, indeed, cynical!) to characterize all protests about the conflict as thoughtless.

Your take on slavery is pretty wild. The Industrial Revolution did not replace Haitian slaves with machines for harvesting sugar cane. Nor did Spartacus invent the steam engine.

inglor_cz 271 days ago
Wars are brutal. No doubt about that. Nevertheless the disagreement that you mention ("which people fall into this category and who is to blame") seems to run so deep even here in the West, that I wonder if some of those protests wouldn't end up in an old-fashioned pogrom, if they weren't thoroughly policed from the outside.

Existence of more-or-less successful slave revolts across history doesn't really say much about viability of slavery as an economic institution. I don't think my take is pretty wild. The historic correlation between industrialization and abolition of slavery is rather strong, and while we can argue about whether it was causative, the hypothesis is at least plausible.

"Wild" would be if I attributed abolition of slavery to something that is clearly uncorrelated with it, so, say, the Milankovic cycle.

foldr 271 days ago
>Existence of more-or-less successful slave revolts across history doesn't really say much about viability of slavery as an economic institution

That's the point. The Haitian revolution didn't have anything much to do with the economic viability of slavery, but it still happened, and was a major and very definitely causative event in the broader history of the abolition of slavery.

If you think that slavery ended for purely economic reasons, then perhaps you can point to a mainstream historian who advocates this theory. I don't think you are doing your overall argument any favors by tying it to wild revisionist lost causes.

dmbche 271 days ago
So free association is less than ideal is what you're getting at?
inglor_cz 271 days ago
If you abstract away enough, you will always get to "X is less than ideal".

Food is less than ideal, war is less than ideal, death is less than ideal, HN is less than ideal.

Are you satisfied with this sort of Twitter-like posting and thinking? I am not.

Pixels are basically free and we should strive to post more than one-sentence snarks. For one-sentence snarks and drive-by dismissals, Reddit is the ideal territory.

dmbche 271 days ago
[flagged]
bsenftner 271 days ago
And I'm of the opinion that we'll not change our political economies without a material societal adjustment: a threshold recognition between emotional reasoning and a more controlled, rational level of reasoning. Basically, maturity; we've got a material volume of immature adults who derail any and all public, and many private, conversations with immature observations, short-sighted reasoning, and the belief that their unprofessional opinion carries weight with those whose careers are the issue at hand. Until we do something about these, frankly, idiots, we're left adrift in a culture of unintended chaos.
dmbche 271 days ago
I highly recommend the feeling of living grass on the palm of your hand, my friend. Or the kiss of the sun's rays.
clarity20 271 days ago
I don't see why climate change needs "solving" per se, or how it can be "solved." To take your example, there have always been hurricanes. It's not correct to infer there's a human-induced tendency toward destruction that can be reversed by humans, or that yet another hurricane is actually a change to the climate in the first place.
moomin 271 days ago
It isn’t correct to infer that from the fact that hurricanes exist, no.

Nor is it correct to ignore the decades of peer-reviewed research that concludes that we really are causing more hurricanes on the basis that hurricanes have always existed.

hoseja 271 days ago
It's the meme:

A:"Only Global Communism can solve Climate Change."

B:"Nuclear power also solves climate change."

A:"I don't want to solve Climate Change, I want Global Communism."

downWidOutaFite 272 days ago
No you don't because there is no expedient solution.
raincole 272 days ago
> if we solve climate change with geoengineering instead of by addressing thoughtless consumerism, corporate impunity, and lack of international accountability, many people would feel frustrated or let down

If some people feel frustrated or let down because we achieve a literal miracle (by today's technology standards) that saves millions of lives I'm willing to call them mentally unhinged.

computably 271 days ago
Presuming said miracle is possible, geoengineering at a scale capable of "solving" climate change would be a massive gamble. Humanity's ability to model the climate simply isn't at a point where we could say with any confidence what the long-term effects of any particular geoengineering "solution" would be, and short of an abrupt technological singularity, won't be for centuries. Without any appeal to moral judgments whatsoever, it's safe to say geoengineering would only be seriously attempted out of total desperation and signal persistent unresolved issues.
reshlo 271 days ago
If we could solve climate change purely technologically, I wouldn’t feel let down by the fact that we solved climate change. I would absolutely support solving it in that way.

I would also feel frustrated by the knowledge that there were many people who were willing to sacrifice unimaginable numbers of humans and animals for the sake of making more profit for themselves, who were not held to account for their actions. If a person acts in a way that they should know will lead to future suffering, the development of an unforeseen technological solution to that suffering should not wipe their moral slate clean.

Trying to kill someone using a non-functional weapon, that you believe is functional, is not morally equivalent to taking no action just because it didn’t have an effect.

dkarl 271 days ago
Not unhinged, just human. There's even a logic to their views: when different approaches to a problem are on different sides of a line of political polarization, people see them as competing, and see the success of one as tending to invalidate the other. I've heard/read politically minded people on either end of the spectrum talk about global warming as a challenge that is going to discredit the other side, by which they mean that the success of their preferred approach will discredit other approaches. And I think there's a lot of truth to that, because of how most human beings see things. Thinking from an engineering mindset, we see different ways of solving a problem as complementary rather than mutually exclusive, and to us it's common sense to 1) pursue both of them simultaneously given the high stakes and uncertainty of success, and 2) continue to improve our understanding of different approaches so we can apply them all effectively in the future in situations that call for them. But thinking from an engineering mindset is not normal. That's not how most people see things. I think the way most people see things could be described as a zero-sum competition for mindshare.

You can even see the tendency to see complementary approaches as competing at work in smaller divisions between people who are mostly politically aligned. For example, when one group wants to take immediate direct action to ameliorate a problem and another group wants to focus on longer-term fundamental solutions, they might fight bitterly and badmouth each other's approaches if they feel they are competing for a finite pool of resources such as funding, political backing, or public attention.

For a more concrete example, think of how the issues facing Black Americans recently came to the forefront of public consciousness, books like Between the World and Me and Stamped from the Beginning were shaping the public conversation, and in this context, some people tried to harness that energy to get the public interested in social class as well. When they tried to inject class into the conversation, they often encountered pushback from people who were like, "Hey, the way the public is paying attention to the Black American experience is really special and amazing right now. Let's focus on maintaining this momentum. Save class consciousness for another day." It's not that the majority of people on the left disagreed about class being an important perspective for gaining insight. They simply believed that the public's appetite to learn about a topic like racism was limited and fragile, and it would collapse faster if they tried to add another topic on top.

For a small minority of deeply ideological people, the feeling of competition was stark and intense. Some claimed that talk about class was an attempt to change the topic away from racism and put white people back in charge, and others claimed that identity politics was a class warfare weapon of distraction. But even people who valued both race and class as ways of understanding society perceived a competition for public attention.

hnben 271 days ago
Non-native speaker here.

> "Expedient" is a common (or at least not rare) English word that means something like "practical and effective even if not directly attending to higher or deeper considerations."

I was not aware of the word "expedient" before. From your example I conclude that it has the same meaning as "pragmatic", i.e. if I sed s/expedient/pragmatic/g then your comment still makes perfect sense to me. A quick Google search also seems to support this conclusion.

--> is there a nuance of the word "expedient", that I am missing here?

dkarl 271 days ago
There's a lot of overlap, and you could certainly use "pragmatic" in the two example contexts I gave. The differences as best as I can sum them up are:

- "Pragmatic" can also be used to describe a person (a pragmatic person) or a mindset (a pragmatic approach to a problem.) "Expedient" isn't used to describe people.

- "Expedient" usually acknowledges the existence of a higher or more demanding standard that the solution does not meet, admitting that the solution is not perfect. You might choose a word like "pragmatic" to praise a solution with known shortcomings, but it doesn't imply known shortcomings as strongly as "expedient" does.

- "Expedient" can be used euphemistically. ("Pragmatic" can, too, but not nearly as often, and not as harshly.) "They took the expedient route" might, depending on the context, mean that they did something lazy or unethical because it was easier. The euphemistic usage is common enough that for some people it has an overall unsavory flavor, but I don't think it's tipped over into the euphemistic usage being the assumed one.

mikelevins 271 days ago
"Pragmatic" connotes a choice that achieves a goal by practical means, possibly sacrificing some desirable features that are challenging to achieve.

"Expedient" connotes a choice that achieves a goal quickly and conveniently, possibly by sacrificing some important but difficult features, and incurring some undesirable costs or consequences.

So they're similar, but not identical. A pragmatic choice might be improved when you can afford it. An expedient choice will probably require some cleanup afterward.

KerrAvon 272 days ago
I think this is misleading -- geoengineering + social change will be necessary. You're not going to Scotty your way out of climate change.
ambicapter 272 days ago
What's misleading here? He was using that as an example of usage of the word 'expedient', not actually suggesting climate change solutions.
samplatt 271 days ago
A lot of people in this thread seem to think OP was suggesting that geoengineering is a possible way forward right now, when they were just positing it as an example.
groestl 271 days ago
Since OP provided a different example, an expedient solution would be to edit OPs post and remove that paragraph about geoengineering, and delete all comments referring to it.
CM30 272 days ago
The technical solution is the only practical solution. People aren't gonna give up a large percentage of their lifestyle for the sake of some greater 'good', especially not if their leaders and influencers seem to have zero interest in doing the same.

And anyone trying otherwise will struggle significantly at the polls. Mass carbon removal, renewable energy, recycling and maybe some technological solutions to limit the effects of atmospheric carbon seem like the more practical way to go.

adastra22 272 days ago
Why not? Really, why not? If we had a profitable way to extract CO2 from the atmosphere at scale, to deacidify the oceans, clean up toxic waste, etc., what would be left? How would that not solve the problem?
Hasu 272 days ago
No technology has been invented that doesn't have costs and tradeoffs. Technology that deacidifies the oceans will have other costs, other externalities that we cannot predict now. Determining how we want to deal with those costs/tradeoffs is a social problem, not a technical problem. Technical know-how can only inform us about what tradeoffs are available; it can't tell us what we prefer.
adastra22 272 days ago
The ion pumps enabled by the technology we are working on won’t have external effects. They basically just filter out certain small molecules from the ocean into crystal storage.
thfuran 272 days ago
What kind of flow rate would be necessary to pull CO2 out of the ocean faster than it dissolves from the air, and is that achievable without affecting the surroundings?
adastra22 272 days ago
CO2 wouldn’t be pulled out of the ocean. You’d have to liquify it from air in a different process. The molecular pumps are used to extract dissolved ions or solutes from sea water and only act as fast as diffusion.
thfuran 271 days ago
So this is desalination? That seems unrelated.
adastra22 271 days ago
I listed cleaning up toxic waste and co2 sequestration as separate things, yes.
einpoklum 271 days ago
Because it's a distraction.

Global warming and other environmental crises are unfolding right now. While exploring possible future technological advances which would make coping with them easier is certainly a positive and useful pursuit, it cannot be the _main_ pursuit when facing those crises and challenges. That is:

1. We should not divert the discussion from present to possible fortuitous futures.

2. We must not confuse action with prospects.

3. We must not think of the two as "either-or". We can reduce emissions _and_ do R&D for new tech possibilities.

adastra22 271 days ago
There are plenty of examples of technological advances that in one fell swoop entirely eliminated a class of societal problems. Haber-Bosch, for example, completely eliminated famine as a barrier to world population growth. Penicillin and vaccines eliminated entire categories of terminal disease.

We don’t NEED to reduce emissions. So long as we clean up as much or more than we pollute, what’s the problem?

einpoklum 271 days ago
Those are poor, poor examples; but before examining them - I can simply refer you to my previous comment. You are doing exactly the three things I caution against: Attempting to divert the discussion, confusing prospective futures with reality, and reinforcing a supposed dichotomy.

As for the examples:

* For every example of a technological advance that eliminated a class of societal problems, there are five examples of advances which didn't, and untold examples of advances which just never happened (or never happened the way they were expected to). Where is our transmutation of lead to gold? Airships? Or Dennard scaling, for that matter? No use writing efficient software; our computers will just get faster and it'll be fine.

> Haber-Bosch, for example, completely eliminated famine as a barrier to world populations growth.

And we (= humans) now have to work hard, and suffer through all sorts of problems, to establish barriers to population growth, and to cope with the resource use pressure of the huge population on the Earth.

Not to mention -

* Famines are still alive and well.

* Massive energy requirements; and once the population is up, you can't just give up this agricultural-industrial choice and let people starve.

* More industrialized economies are able to produce a lot more than less-industrialized/poorer economies and areas, exacerbating all sorts of power dynamics, e.g. agricultural "dumping" and the mass destitution of peasants who become unable to compete, without a transition having been planned.

* Some detrimental environmental effects.

That's not to say this process shouldn't be used - it's just that it's not a panacea.

> Penicillin and vaccines eliminated entire categories of terminal disease.

They did not. They reduced the fatality rates significantly, for a long period of time - which, apparently, may be drawing to an end over the next few decades:

https://www.wired.com/story/antibiotics-resistance-useless-p...

but even if that doesn't quite happen, antibiotics have absolutely _not_:

* Reduced the need to keep medical environments clean, carefully sterilize tools, wear gloves and "scrubs", etc. when treating patients.

* Made it uninteresting or unimportant to avoid mass infections -- even those amenable to treatment with Penicillin or vaccines.

* Reduced the interest in or prevalence of other forms of treatment for germs and viruses which are amenable to Penicillin or vaccines.

adastra22 271 days ago
You want there to be fewer people? Why?

I don’t think I can continue this conversation in good faith. I hope someday you can see the intrinsic evil of the world view in which more people = bad.

downWidOutaFite 272 days ago
Why are you fantasizing? None of that is going to exist (caveat: unless we figure out infinite clean energy).
adastra22 272 days ago
It’s not fantasy. I’m working on a startup to enable this technology. It’s a hard problem, but not impossible. And the energy needs are not as great as you think. A relatively small patch of the Sahara or Gobi desert or open ocean would be sufficient.
Log_out_ 271 days ago
It's indistinguishable from fantasy, extrapolating from the given track record. Previous fantasy technology breakthroughs like Haber-Bosch were viable only "after" major disasters and world wars, and brought with them ever more destructive hidden costs.
adastra22 271 days ago
Haber-Bosch predates the world wars.
Log_out_ 271 days ago
But it was only useful with free-trade guarantees by a sea empire. In colonial-empire times it might as well have been alchemy to make free cheese on the moon.
adastra22 271 days ago
I am confused. Haber-Bosch was developed to work around the need for free-trade guarantees. There were enough nitrates in Chile for the foreseeable future in 1911, but Germany wanted a local source of fertilizer to remove its foreign dependency.
downWidOutaFite 271 days ago
You might need to panel over the whole Sahara: https://www.wired.com/story/the-stupendous-energy-cost-of-di...
adastra22 271 days ago
Much less than the whole Sahara.
capitainenemo 272 days ago
Sun shields? There's some testing of those going on right now. You wouldn't need a huge reduction in sunlight, either -- not enough to impact plants.
capitainenemo 265 days ago
I see this got downvoted. Maybe a citation would have helped. https://www.space.com/sunshade-earth-orbit-climate-change
hinkley 272 days ago
See also: war crimes
lisper 272 days ago
> Is Amazon "the best" place to go shopping? No, you might find better prices on individual items if you put a little more work into it, but it's the most expedient.

It's not just that. Every time you do business with a new web site you assume additional risk. Amazon is a known quantity. You can be pretty sure that they are not going to outright scam you, and they aren't going to be hacked by script kiddies. There is a significant risk of getting a counterfeit item, but they have a very liberal return policy, so the real cost to you in this case is a minute or two to get a return code and possibly a trip to the nearest Whole Foods to drop it off.

Amazon sucks in many ways, but at least their suckage is a known quantity. Predictability has significant value.

katbyte 272 days ago
Yep, it’s the return policy that allowed me to gamble on items that may or may not be real/functional, vs. spending a ton of time to find one elsewhere that maybe is -- but which, if it’s not, will be hard to return.
d0mine 272 days ago
There is also "satisficing" (vs. maximizing).

Your model of the world is not perfect, so instead of trying to find a globally optimal solution, you are satisfied with a local optimum that exceeds some threshold -- one that suffices. https://en.wikipedia.org/wiki/Satisficing
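A minimal sketch of the contrast, with made-up scores and threshold (Go, purely for illustration): a maximizer scans every option for the global best, while a satisficer stops at the first option that clears the bar.

    package main

    import "fmt"

    // pickSatisficing returns the index of the first option whose score
    // clears the threshold -- "good enough", so it stops searching.
    func pickSatisficing(scores []float64, threshold float64) (int, bool) {
        for i, s := range scores {
            if s >= threshold {
                return i, true
            }
        }
        return -1, false
    }

    // pickMaximizing scans all options and returns the global best.
    func pickMaximizing(scores []float64) int {
        best := 0
        for i, s := range scores {
            if s > scores[best] {
                best = i
            }
        }
        return best
    }

    func main() {
        scores := []float64{0.4, 0.7, 0.9, 0.6}
        i, ok := pickSatisficing(scores, 0.65)
        fmt.Println(i, ok)                  // 1 true -- the first "good enough" option
        fmt.Println(pickMaximizing(scores)) // 2 -- the global best
    }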

hammock 272 days ago
Love that. Well-done marketing* can orient a consumer into a preferred "local optimum territory", leading to satisfiction(?) and sales.

* For example, the limited selection of candy at the checkout aisle. All you have to do is get your brand in there. (Placement on the 4P's)

* Or, "best credit card for travelers." By offering travel rewards, you can acquire a group of cardmembers even if, e.g. a more valuable cashback card could have gotten them even greater benefits (Promotion on the 4P's)

RcouF1uZ4gsC 272 days ago
>Is Amazon "the best" place to go shopping?

The number one reason I use Amazon, is not for the best prices, but because of their return policy. Amazon returns are actually often more painless than physical store returns.

Being able to return something predictably and easily outweighs a small difference in price.

WalterBright 272 days ago
> you might find better prices on individual items if you put a little more work into it

That extra work costs you money, too. Calculate how much your job pays you per hour, then you can deduce the $cost of spending more time to get a better deal.
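As an illustrative example (numbers invented): at $60/hour, half an hour of deal-hunting costs you $30 of time, so it only pays off if the better deal saves more than $30.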

Gigachad 271 days ago
This is a fairly crappy methodology, though, because the vast majority of people are not substituting paid working time for researching purchases online. So it hasn't "cost" them anything other than their free time, which is far more complex to value than an hourly rate. Maybe they enjoy researching products, and in that situation it wasn't a waste of time at all.
ryandrake 271 days ago
Exactly. I always hate these "your time is worth your hourly wage" arguments. They're often used to argue against things like changing the oil in your car, fixing a clogged drain, or DIYing anything.

Your time is only worth money if you'd otherwise be working at that rate, which is not the case for the vast majority of humans.

WalterBright 271 days ago
Saving money is the same thing as earning it. Are you better off spending an hour to save $100 or an hour to save $10?

I once spent 2 hours negotiating the purchase of a car. It saved $5000. That works out to $2500 an hour. Was it worth it? Hell ya!

I've also worked hourly jobs in the past. There were often opportunities to work more hours or overtime. People often have side hustles, too.

contagiousflow 272 days ago
Are you removing paid working time in doing the extra work? If not, it is just an opportunity cost.
WalterBright 272 days ago
In my line of work, yes.
AnimalMuppet 272 days ago
If you include the cost of gathering information, the expedient solution may in fact be the best.
onlyrealcuzzo 272 days ago
> Is a Tesla (perhaps outdated example since X) "the best" car - no, but it is the most expedient.

The most expedient car? Or BEV in the US?

hammock 272 days ago
Fair point... it was a coastal California-centric point but there is plenty of nuance or adjustment to be made. At some point in the 90's we'd probably have said "Silver E class Mercedes" is the most expedient luxury sedan, if you wanted a different example.
UniverseHacker 272 days ago
The early to mid 90s E class aka W124 definitely went down in history as one of the best quality and best designed cars ever made. Other luxury cars may be faster, more fun, or have more features… but the W124 probably is “the best” if you’re looking just at build quality and well thought out design details.

As a car nerd though, I never felt the need to buy one because they just seem fairly boring: other than a few rare models, most were 4-door sedans with automatics, fairly small engines, and soft, non-sporty suspension.

adastra22 272 days ago
I think Toyota would have gotten your point across better. Tesla is most certainly not expedient. It is a luxury purchase.
foobazgt 272 days ago
It's definitely in a class above Toyota, but once you account for gas savings, the LR RWD costs about as much as the cheapest Corolla you can buy. People bagged on Tesla being too expensive when their $35K car was only available over the phone. Now, adjusted for inflation, the LR RWD is $28K and comes with 100mi additional range to boot. On top of that, it's $13K below the average car purchase price.
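
As a rough sketch of the fuel arithmetic (every number here is an illustrative assumption; actual prices, rates, and mileage vary a lot by region):

    12,000 mi/yr at 35 mpg and $3.50/gal    ~ $1,200/yr on gas
    12,000 mi/yr at 4 mi/kWh and $0.15/kWh  ~ $450/yr on electricity
    difference                              ~ $750/yr, ~$7,500 over ten years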

IMO, it destroys its competitors in the value market, and the media is being awfully silent about it. I guess it's far too easy to focus on Elon instead.

adastra22 272 days ago
Really depends on where you live and what your electric rates are. Here is my breakdown (the Corolla comes out ahead as cheaper to operate): https://www.reddit.com/r/TeslaModel3/comments/14rj3fp/tesla_...
marxisttemp 272 days ago
“In a class above Toyota” in what sense? Certainly not in reliability or interior quality or CarPlay compatibility…
KerrAvon 272 days ago
Neither. Going by the parent poster's gauge, the most "expedient" is probably Lucid; better engineering, better range, and better service.
KerrAvon 272 days ago
but it's really not what "expedient" means
b3ing 272 days ago
I might argue it's the one most known by the most people; the "best" takes time to get there. Google was better than Yahoo, but it took years to become #1 in terms of hits.
jprete 272 days ago
Related - thinking takes a lot of energy, so people prefer options that are cheap to evaluate. This definitely contributes to the preference for expedient options.
aulin 271 days ago
> inoculated of buyer's remorse

Non native here. What's the meaning of inoculated here?

It's not the first time I've struggled to parse this word. In Italian it keeps the original Latin meaning and can be translated as "injected with". You could inoculate a vaccine, but you could also inoculate a poison, so it does not carry the immunity meaning by default. English (US?), as far as I can tell, uses it as a synonym for "immune"; is that so?

pmg101 271 days ago
It should be "inoculated against" but the meaning is clear.

A vaccine inoculates you against a disease by a physical mechanism: that is, it prevents you from getting that disease (to a greater or lesser degree).

Metaphorically being inoculated against something means it can no longer hurt you. For instance, maybe by not owning a car you're inoculated against vehicle depreciation. Or by wearing the same simple but quality outfit every day you're inoculated against the vagaries of fashion.

avidiax 271 days ago
I think the author means "vaccinated". They mean that they've been made resistant to buyer's remorse.
NAHWheatCracker 272 days ago
I'll never understand the obsession with LISP. My guess is it just appeals to a certain type of person, sort of academic in my view. I'm not convinced that LISP was ever the-right-thing. The author didn't express anything about LISP vs C except to assert that C was a 50% solution and LISP was better.

I agree though that for practical purposes, practical solutions are just going to be more successful.

pjc50 272 days ago
Over the years I've developed what I call the "lefthanded scissors" analogy: people assume that everyone's mind is wired the same way and that all good programmers are good in the same way and think in the same way, but what if that's not true? What if different people have a predisposition (like lefthandedness) to prefer different tools?

Then a righthanded person picks up the lefthanded scissors and deems them weird and uncomfortable. Which they are .. for the right hand of a right-handed person.

Other popular examples of such taste controversy are Python's semantic whitespace, the idiosyncrasies of Perl, the very unusual shape of J/APL, and anyone using FORTH for non-trivial purposes.

edit: https://news.ycombinator.com/item?id=41766753 comment about "other people's Lisp" reminds me, that working as a solo genius dev on your own from-scratch code and working in a team inside a large organization on legacy code are very different experiences, and the "inflexibility" of some languages can be a benefit to the latter.

eikenberry 272 days ago
I've always preferred to use classical "fine" artists as a metaphor for development-language preferences. Artists don't just pick up any medium at a whim; most have specific mediums in which they excel and ignore the others. For example, Van Gogh produced his most famous works in oil paints. He had worked in other mediums, but his talent definitely seemed to shine in oils; when he first found them, he took to them quickly and adopted them as his primary medium.

Most people definitely seem to have specific preferences built into their talents. I don't see why programming or programming languages would be any different from any other medium or art form.

Jach 272 days ago
In baseball and other sports, pros and cons of different styles are readily and honestly talked about, even when coaches have a bias. See e.g. https://blog.paddlepalace.com/2014/01/coaching-tip-playing-t... for table tennis. Few comparable articles in programming exist; either people can't conceive of other styles or contexts, or just want to talk about the superiority of their bias regardless of context. If downsides are mentioned at all it's often not about the preferred thing, but some deficiency of something else.

A commonly made-up deficiency attributed to Lisp is that it's particularly bad at large scale, either in teams or in program size. That would surely be surprising news to the teams doing such work today or in the past, some responsible for multi-million-line systems (some still in operation). Or, to use an old example, the documentation for Symbolics computers, pictured here in book form: https://www.thejach.com/public/symbolics-books-EugyAAEXUAUG_... Such a set of books doesn't come from a "lone wolves only" ecosystem and heritage. Not to mention the docs and so on, not shown here, for applications they made for 3D graphics or document editing (https://youtube.com/watch?v=ud0HhzAK30w)

arethuza 272 days ago
I've seen Common Lisp source code that I didn't even recognise as Common Lisp because of over-enthusiastic use of reader macros...

Edit: I should also mention that once I worked out how reader macros worked I went on to enthusiastically (ab)use them for my own ends...

blenderob 272 days ago
If there has been over-enthusiastic use of reader macros, I think we will have to admit that the over-enthusiastic developer is no longer writing Common Lisp code.

Those reader macros have morphed the language into a new bespoke language. So it is then natural for a new developer to face a steep learning curve to learn that new bespoke language before they can make sense of the code.

I'm not condoning overuse of macros. I'd hate to be working with such code too. I'm only stating that Common Lisp is that language that can be programmed to become a different language. They call it a "programmable programming language" for a reason.
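
For a concrete taste, here is the classic toy example (a minimal sketch, not from any real codebase): two reader-macro definitions are enough that source using them no longer looks like stock CL.

    ;; Make [a b c] read as (list a b c).
    (set-macro-character #\[
      (lambda (stream char)
        (declare (ignore char))
        (cons 'list (read-delimited-list #\] stream t))))

    ;; Make ] terminate the form, the same way ) does.
    (set-macro-character #\] (get-macro-character #\)))

    [1 2 (+ 1 2)]   ; now reads as (list 1 2 (+ 1 2)) => (1 2 3)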

kayodelycaon 272 days ago
> What if that's not true?

It is definitely not true. The existence of neurodivergence is proof enough.

I can visualize an entire application in my head like it is a forest. I can fly around and see various parts and how they fit together. I am unable to juggle the tokens necessary to do logic and basic math in my head, but I have no trouble reading a graph and understanding the relationships between numbers.

No matter how many times I try to read it, Lisp is completely inscrutable to me. It requires the same token juggling math does.

sceptic123 272 days ago
I would say the analogy fails because it's not about discomfort or weirdness (or taste); left-handed scissors just don't work if you try to use them right-handed. But right-handed scissors don't work if you try to use them left-handed either.

So it's not about taste or preference: left-handed people learn how to use right-handed scissors, but they can also use left-handed scissors in a way that a right-handed person would struggle to.

All that said, the analogy still works because most people don't understand why it doesn't work.

adamc 271 days ago
As a lefty who has used right-handed scissors in many situations all my life, it isn't that simple. I can cut with right-handed scissors. It requires changing the way I use them in a way that is a bit weird and unnatural, but it works.

It's very much about what is comfortable for people.

sceptic123 270 days ago
100% agree that to make them work in the wrong hand you need to use them in a way that feels weird, but the original comment was:

> a righthanded person picks up the lefthanded scissors and deems them weird and uncomfortable. Which they are .. for the right hand of a right-handed person.

What I disagree with is the idea that you can know, just by picking up a pair of scissors, if they are left- or right-handed. Except for ergonomically shaped ones, you can't immediately tell. It's only obvious that something is wrong when you try to use them, and even then, I don't think many people would know why.

kqr 272 days ago
> I'm not convinced that LISP was ever the-right-thing.

Remember that this has to be read in historical context. At the time C was invented, things like garbage collection, message-passing object-orientation, generics, rich sets of conditionals, first-class functions, etc. were brand spanking new. They were The Right Thing to do (even judged in the harsh light of hindsight), but also quite complicated to implement – so much so that the New Jersey people skipped right past most of it.

Today these things are par for the course. At the time, they were the The Right Thing that made the system correct but complex, and had adoption penalties. As time passes, the bar for The Right Thing shifts, and today, it would probably not be embodied by Lisp, but maybe by something like Haskell or Rust?

kragen 272 days ago
This paper was written in 01991. Garbage collection is from 01959. Message-passing (aka object-orientation) is from 01972. C is from 01973. If by "generics" you mean parametric polymorphism, those were added to Ada and C++ in the mid-80s, after having been invented in ML in 01973; in a sense they're an attempt to bring the virtues of dynamically-typed languages like Lisp to statically-typed languages. Even today, I don't think there's a Common Lisp implementation with any kind of parametrically-polymorphic static type system, not even as an optimization. First-class functions are also from 01959, or, arguably from 01936, in the untyped λ-calculus, or 01920, in combinatory logic.

Some of these things were brand spanking new in 01973, but none were in 01991.

There were Lisp systems for minicomputers like the PDP-11; BSD included one (which I think was ancestral to Franz Lisp) and XLISP ran on CP/M. And of course Smalltalk was developed almost entirely on PDP-11-sized minicomputers. But to my recollection virtually all "serious" software for microcomputers and minicomputers was written in low-level languages like assembly or C into the late 80s, not even PL/M—though Pascal did start to win in the late 80s, in significant part by adopting C's low-level features. Nowadays, microcomputers are big enough and fast enough that Lisp, Haskell, or even Rust is viable.

I don't think "the right thing" is mostly about what features your system has. I think it has more to do with designing those features to work predictably and compose effectively.

Jach 272 days ago
What do you think of Coalton?
kragen 272 days ago
I've never tried it! It does look like it might be the thing I was saying doesn't exist, though :-)
samatman 272 days ago
> > At the time C was invented

> C is from 01973.

> Some of these things were brand spanking new in 01973

...right. The Rise of Worse is Better is a memoir, it's set in the past.

anthk 272 days ago
Ever used SBCL?
kragen 272 days ago
I love SBCL. Does Python optimize with parametric polymorphism now?
pjc50 272 days ago
> things like garbage collection, message-passing object-orientation, generics, rich sets of conditionals, first-class functions, etc. were brand spanking new. They were The Right Thing to do

I quite like this view, because these things have clearly been copied everywhere such as my language of choice C#, but the one thing that nobody copied is the one that Lisp programmers rave about most: homoiconicity (brackets everywhere).

pjmlp 272 days ago
Lisp-2, Prolog, Dylan, Erlang, Julia, R....

They are all homoiconic without being full of parentheses all over the place; the actual meaning is code and data being interchangeable.

BoingBoomTschak 272 days ago
I think following what's written in Wikipedia (https://en.wikipedia.org/wiki/Homoiconicity#Implementation_m...) and only using the adjective for stuff like Lisp, Tcl, Rebol is better than diluting its meaning to the point where it applies to any language with tree-sitter bindings and a parser.
kragen 272 days ago
Prolog, Dylan, Julia, and R are stuff like Lisp, Tcl, and Rebol. I don't know about Erlang, and I don't know what pjmlp means by "Lisp-2", which I normally interpret as meaning a Lisp with different namespaces for functions and variables, following Steele's terminology.
susam 272 days ago
> ... I don't know what pjmlp means by "Lisp-2", which I normally interpret as meaning a Lisp with different namespaces for functions and variables ...

There indeed was a language named LISP 2: https://dl.acm.org/doi/pdf/10.1145/1464291.1464362

I posted it on HN two weeks ago but it didn't get much traction: https://news.ycombinator.com/item?id=41640147

kragen 272 days ago
Aha, thanks!
BoingBoomTschak 272 days ago
Looking at https://docs.julialang.org/en/v1/manual/metaprogramming/, I have a hard time considering it "fully" homoiconic, as the parsed data structure Expr(:call, :+, :a, Expr(:call, :*, :b, :c), 1) isn't the same representation as the code itself, even having some new elements like :call.

How is this different from tree-sitter, except that you can feed the modified code back to the language to evaluate?

I mean, don't get me wrong, Julia metaprogramming seems lovely, but it just seems to me that the word loses meaning if it can be applied to all languages with AST macros, no matter how gnarly the AST data structure is (is Rust homoiconic because of proc_macro?).

kragen 272 days ago
First, I want to point out that I've never written a single line of Julia code, so I'm no expert on this. I could easily be wrong about some of what I just found out by reading a small amount of Julia documentation.

However, that's not a parsed data structure; it's a Julia expression to construct one. Though I don't have Julia installed, apparently you can just as well write that as :(a + b*c + 1), as explained a few paragraphs further down the page than I guess you read: https://docs.julialang.org/en/v1/manual/metaprogramming/#Quo.... That's also how Julia represents it for output, by default. What you've written there is the equivalent of Lisp (list '+ 'a (list '* 'b 'c) 1), or perhaps (cons '+ (cons 'a (cons (cons '* (cons 'b (cons 'c '()))) (cons 1 '())))). The data structure is as simple as Prolog's, consisting of a .head such as :call and a list of .args. Prolog's, in turn, is only slightly more complex than Lisp's. From a certain point of view, Prolog's structure is actually simpler than Lisp's, but it's arguable.

How this is different from having a tree-sitter parser is that it's trivially easy to construct and analyze such structures, and not just at compile time.

Possibly the source of your confusion is the syntactic sugar which converts x + y + z into what we'd write in Lisp as (+ x y z), and also converts back? I would argue that such syntactic sugar is precisely what you want for constructing and analyzing expressions. That is, it's part of what makes Julia homoiconic in a more useful sense than J. Random Language equipped with a parser for its own syntax.

pjmlp 272 days ago
Also, Lisp came out in 1958, and ESPOL/NEWP in 1961; PL/I and its derivatives likewise predate C by a decade.

Had AT&T been allowed to charge real money for UNIX, "Worse is Better" would never have happened.

linguae 272 days ago
Counterpoint: Some other “worse” would have probably taken over. We’ve seen this in the personal computing market, where MS-DOS (later Windows) and the x86 won out against competing operating systems and ISAs not because of technical superiority, but because of market forces and cost. Look at the evolution of the Web, especially JavaScript....

It's likely that Unix might not have spread if AT&T hadn't given it relatively liberal licensing in the pre-divestiture era. But it's also likely that something else that was cheap and readily adaptable would've taken over instead, even if it wasn't technically superior to its competitors.

pjmlp 272 days ago
Or maybe we would be enjoying VMS or something like that, not written in a language that, 60 years after its inception, still doesn't do bounds checking and decays arrays into pointers to save typing four characters.
kragen 272 days ago
BLISS-11 didn't do bounds-checking either. VMS was my favorite operating system until I got access to a Unix. People chose Unix because it was better—and not just by a little. Handicapping Unix wouldn't have improved the alternative systems.
pjmlp 272 days ago
VMS also supported Pascal and BASIC dialects for systems programming, on equal footing with BLISS.

People choose free beer; it always goes, even if warm.

kragen 272 days ago
VAX Pascal is viable for systems programming (though standard Pascal wasn't). I only ever used VAX BASIC a tiny bit; was it interpreted like microcomputer BASICs? That made BASIC a nonstarter for "serious software" in the 80s. Not having records, local variables, or subroutine parameters was also a pretty big problem, but maybe VAX BASIC fixed that.

I wasn't paying for access to VMS either; it just didn't hold a candle to Unix.

pjmlp 272 days ago
No one really uses standard C for systems programming, yet that yardstick is always applied to other languages.

There is always Assembly, compiler extensions, or OS specific APIs, as part of the deliverable.

Funny how UNIX folks always have this two-weights, two-measures approach.

I beg to differ on the wax quality of those candles, but if we get to make juice out of bitter lemons, so be it.

At least it is refreshing during summer.

kragen 272 days ago
Do you know if VAX BASIC fixed the problems I mentioned in BASIC? Was it interpreted?

K&R C has separate compilation, pointer casting, a usable but janky and bug-prone variable-length string type, variadic functions, static variables, literal data of array and record types, bitwise operations, a filesystem access API, and an array iterator type. None of those require "assembly, compiler extensions, or OS specific APIs." (Well, I'm not sure if they specified variadic functions in the book. The first ANSI C did.)

Jensen & Wirth Pascal has none of those, making it completely inadequate for anything beyond classroom use, which was what it was designed for.

Each Pascal implementation that was used for systems programming did of course add all of these features, but many of them added them in incompatible ways, with the result that, for example, TeX had to implement its own string type (in WEB).

pjmlp 271 days ago
VAX BASIC was always compiled; it was its predecessors that were not. Bitsavers or Wikipedia have the background info.

Take all the Assembly routines from libc, and K&R C turns into a macro assembler with nicer syntax. And not a good one, given that real macro assemblers actually have better macro capabilities, alongside their high level constructs.

Quite visible in the C dialects that were actually available outside UNIX, on computers people could afford, like RatC (made available via "A Book on C") and Small-C (DDJ article series).

Well, even the dumbest standard Pascal compilers, like GPC, do allow calling into Assembly. So it should count for Pascal as well, if that is the measure.

Then we have this thing of sticking with ISO Pascal and its dialects, always ignoring that this was seen as an issue, and that that is why Modula-2 has existed since 1978 and Extended Pascal since 1991, one year after ANSI C (C89 got a short C90 revision fix).

Also, following the K&R-C-alongside-Assembly line, several companies were quite successful with a Pascal dialect alongside Assembly, including a famous fruit company.

Back in 2024, C extensions keep being celebrated, to the point that the most famous UNIX clone can only be compiled with a specific compiler, and the second alternative only became possible after a search company burned lots of money making it so.

But hey, let's stick to Pascal and its dialects.

kragen 270 days ago
Do you know if VAX BASIC fixed the other problems I mentioned in BASIC, other than being interpreted? You were the one that brought it up as an alternative. Specifically, did it have record types, local variables, and subroutine parameters? This is the third time I've asked in this thread, but possibly you didn't notice the first two times.

I think "a macro assembler with nicer syntax" is an excellent summary of C. Though it's transitioning to "a retrocomputing programming language we have to support in order to be able to run software that was written long ago".

I agree that many other macro assemblers have more powerful macro capabilities than C. After looking at most of the output of the group that produced Unix, I think that's on purpose: cpp was deliberately less powerful than GPM or m6, but that's not because they weren't familiar with m6 or couldn't figure out how to write it, and ed was deliberately less powerful than QED or TECO, but that wasn't because they weren't familiar with QED. Possibly Plauger's remark about how one of the worst things he'd done in his life was to write a relocating linker in QED provides a clue as to why.

With the benefit of 45–55 years of hindsight, the decision to prioritize clarity over expressiveness in ed and cpp seems to have really paid off. You seem to disagree, but you don't say why; maybe you think it's axiomatic that more expressive languages are better, despite the fact that we're having this discussion in HTML rather than PostScript or TeX, using URLs rather than Smalltalk or Open Firmware bytecode packets, with browsers written mostly in C++ rather than Scheme or Common Lisp, over TCP/IP rather than CHAOSNET. If my suggested axiom were correct, all of those would be the other way around.

I'd like to see RatC, but I haven't been able to find old editions of A Book on C, and the later revisions seem to have removed it.

I don't see how Modula-2 is relevant to a discussion about Pascal. There were lots of Pascal-inspired languages in the 70s and 80s; Modula-2 was neither the most influential one nor Wirth's favorite.

It remains true that you could easily do portable systems programming in C in the late 70s and 80s, and you could do nonportable systems programming in VAX Pascal, but doing portable systems programming in Pascal required major compromises when it was possible at all. (Just to clarify, when I say "systems" I don't mean "kernels"; I mean "not applications", as you clearly also did when you said "VMS also supported Pascal and BASIC dialects for systems programming.")

There were other parts of your comment I wasn't able to make any sense of, but if you feel you said something I haven't responded to, please feel free to clarify.

anthk 272 days ago
Imagine if RMS had brought us a Lisp OS, with Emacs not bound to GNU/Unix. A JITted Emacs since 2000 and Eshell as the default CLI, with a floating WM as an easy desktop. Multithreaded, with its own modules for images/PDF and whatever. No need for security: daily backups and rollbacks would be a given. Today you would be 'rewinding in time' from Emacs if any attack happened.
kragen 272 days ago
Vaporware can always be better than actually existing software because vaporware doesn't have bugs or schedule/scope tradeoffs.
yoyohello13 272 days ago
I like LISP, but I wouldn't say I'm an evangelist. Here's my 2 cents for why LISP has such a following.

1. LISP is easy to start with if you're not a programmer. There is very little syntax to get to grips with, and once you understand "everything is a list" it's super easy to expand out from there.

2. LISP really makes it easy to hack your way to a solution. With the REPL and the transparency of "code is data" model you can just start writing code and eventually get to a solution. You don't need to plan, or think about types, or deal with syntax errors. You just write your code and see it executed right there in the REPL.
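
To give a flavor of point 2, here's a toy REPL session (made-up function, illustrative only):

    ;; Sketch a function, call it immediately, then redefine it
    ;; in the same live session -- no restart, no build step.
    (defun price-with-tax (p) (* p 1.2))
    (price-with-tax 100)                   ; => 120.0 ... wrong rate
    (defun price-with-tax (p) (* p 1.08))
    (price-with-tax 100)                   ; => 108.0, fixed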

For my part, I love LISP when it's just me doing the coding, but once you start adding other people's custom DSL macros or whatever the heck, it becomes unwieldy. Basically, I love my LISP and hate other people's LISP.

turtleyacht 272 days ago
Once you get far enough to need external dependencies, do you use your work as a library in another language or rewrite it?

http://www.winestockwebdesign.com/Essays/Lisp_Curse.html

anthk 272 days ago
QuickLisp solves most of the issues. Quiop and closer-mop if you want a universal CLOS.

Scheme is not Common Lisp; it's kinda the opposite. CL comes with batteries included, and McCLIM is the de facto GUI for it.

>Stuck with Emacs [from the URL]

Well, Lem tries to be Emacs for Common Lisp, but without having to think in two (albeit closely related) Lisp languages at once.

Once you have a REPL, autocomplete, and some docstring-lookup tool, you are god.

turtleyacht 271 days ago
Good point. I neglected to consider standard libraries (stdlib) at the time. If those are allowed, then a lot more can be done with one's code without reaching much farther. Thank you.
anthk 271 days ago
The 'stdlib' in CL is huge.

Scheme is the minimal one. Almost like comparing sh with zsh.

mattgreenrocks 272 days ago
Such a weird take on HN. Lisp should be experimented with if only to appreciate the profound beauty of a small, powerful, cohesive design. It is a wholly different feeling from industry standard languages which are constantly changing.

In Lisp, almost all of the language’s power is in “user space.”

The ramifications for that are deep and your beliefs as to whether that is good are largely shaped by whether you believe computation is better handled by large groups of people (thus, languages should restrict users) or smaller groups of people (thus, languages should empower users).

See this for more discussion: https://softwareengineering.stackexchange.com/a/237523

Blackthorn 272 days ago
Nothing about Common Lisp is small or cohesive. Some schemes, maybe.
anthk 271 days ago
CL is much more cohesive than Scheme. The Schemes are barely compatible among themselves, what with ice-9 on Guile, SRFIs, and whatnot.
pif 272 days ago
> languages which are constantly changing

They change because they are used.

skribanto 272 days ago
Well, the main point is that there are changes that, for most languages, would need a change in the compiler/interpreter itself. In Lisp, however, those kinds of things can be done in userspace.
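
For example (a minimal sketch): a new control construct, which in most languages would mean patching the compiler, is a few lines of ordinary user code in Common Lisp.

    ;; Define a `while` loop as a plain macro.
    (defmacro while (test &body body)
      `(loop (unless ,test (return))
             ,@body))

    (let ((i 0))
      (while (< i 3)
        (print i)   ; prints 0, 1, 2
        (incf i)))
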
taeric 272 days ago
LISP remains one of the only languages where manipulating the code looks exactly the same as executing it. This is often illustrated by pointing out that "eval" in lisp doesn't take in a string of characters. (https://taeric.github.io/CodeAsData.html is a blog I wrote on the idea a bit more.)
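
Concretely (a small sketch): the argument to eval is an ordinary list, so you build and rewrite "code" with the same functions you'd use on any other data.

    (eval (list '+ 2 3))          ; => 5 -- a list, not a string
    (let ((form (list '+ 2 3)))
      (setf (first form) '*)      ; edit the code like any list
      (eval form))                ; => 6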

What this often meant was that getting a feature into your LISP program was something you could do without having to hack at the compiler.

People used to balk at how macros and such would break the ability to step-debug code. That's still largely true, but step debugging is also sadly dead in a lot of other popular languages already.

bbor 272 days ago
It’s for the dreamers. The crazy ones among us that do not think of themselves as experts in programming machines to solve business problems, but rather novices in cajoling machines to think like humans do.
djha-skin 272 days ago
In the words of the "Programmers Are Also Human" YouTube channel, it's just more comfortable. REPL development, ease of refactoring, dynamic typing, good CFFI: it all adds up to a developer experience that I find to be, in a word, chill.
hcarvalhoalves 272 days ago
See these demos for the kinds of interactive systems LISP enabled back in the day:

https://www.youtube.com/watch?v=o4-YnLpLgtk

https://www.youtube.com/watch?v=gV5obrYaogU

It makes working in VSCode today look like banging rocks together, let alone what the alternatives were 30 years ago.

anthk 272 days ago
And today with Lem/Emacs+slime+sbcl
linguae 272 days ago
Keep in mind that this essay was written in the early 1990s. Today there are many programming languages available that offer features that are unavailable in C but have long been available in Lisp. This was not the case in the 1980s during the AI boom of that era. There was a large chasm between classic procedural languages (C, Fortran, Pascal, Simula) and dynamic languages (Smalltalk and Lisp). Prolog was a popular alternative to Lisp in AI circles, especially in Japan back when it was pursuing the Fifth Generation Computing Project. When looking at the language landscape in the 1980s in the context of AI, it makes sense that practitioners would gravitate toward Lisp and Prolog.

Today we benefit from having a variety of languages, each with tradeoffs regarding how their expressiveness matches the problem at hand and also the strength of its ecosystem (e.g., tools, libraries, community resources, etc.). I still think Lisp has advantages, particularly when it comes to its malleability through its syntax, its macro support, and the metaobject protocol.

As a Lisp fan who codes occasionally in Scheme and Common Lisp, I don't always grab a Lisp when it's time to code. Sometimes my language choices are predetermined by the ecosystem I'm using or by my team. I also think strongly typed functional programming languages like Standard ML and Haskell are quite useful in some situations. I think the strength of Lisp is best seen in situations where flexibility and malleable infrastructure are highly desirable.

Jach 272 days ago
It's not a perfectly reliable tell, but people who write LISP instead of Lisp generally give themselves away as knowing nothing about the language. Why not kick the tires with Common Lisp, or even Clojure, and see if you can then understand for yourself why it sparks joy in people? I'm not saying it'll spark joy in you, just that you might understand. (Do you understand Haskell's draw to certain people? Rust's?) At the very least, perhaps you'll lose your notion that it primarily appeals to academic types. Common Lisp is and always has been an industrial language.

"Please don't assume Lisp is only useful for Animation and Graphics, AI, Bioinformatics, B2B and E-Commerce, Data Mining, EDA/Semiconductor applications, Expert Systems, Finance, Intelligent Agents, Knowledge Management, Mechanical CAD, Modeling and Simulation, Natural Language, Optimization, Research, Risk Analysis, Scheduling, Telecom, and Web Authoring just because these are the only things they happened to list." --Kent Pitman

NAHWheatCracker 272 days ago
My only personal experience writing with Lisp was Scheme, which I wrote for two weeks back in college 18 years ago, so I truly know nothing about the language.

In retrospect, saying I don't understand was hyperbolic. Of course I understand that people have their preferred languages. The handful of languages I've used in my career each have their draw.

My comment was meant more to question the assertion that Lisp is the-right-thing, which sounds like asserting the-right-religion.

buescher 272 days ago
It's not really about lisp or even about vague reverse-snobbish ideas of "practicality" but about very specific categories of design compromises.
kragen 272 days ago
The paper was written for an audience of Lisp programmers, so the things you're talking about were sort of out of scope. I'm not going to try to convince you that Lisp and ITS really did aim at "the right thing" in a way that C and Unix didn't; you'll see that it's true if you investigate.

Lisp definitely does depend on personality type. Quoting Steve Yegge's "Notes from the Mystery Machine Bus" (https://gist.github.com/cornchz/3313150):

> Software engineering has its own political axis, ranging from conservative to liberal. (...) We regard political conservatism as an ideological belief system that is significantly (but not completely) related to motivational concerns having to do with the psychological management of uncertainty and fear. (...) Liberalism doesn't lend itself quite as conveniently to a primary root motivation. But for our purposes we can think of it as a belief system that is motivated by the desire above all else to effect change. In corporate terms, as we observed, it's about changing the world. In software terms, liberalism aims to maximize the speed of feature development, while simultaneously maximizing the flexibility of the systems being built, so that feature development never needs to slow down or be compromised.

Lisp, like Perl and Forth, is an extremist "liberal" language, or family of languages. Its value system is centered on making it possible to write programs you couldn't write otherwise, rather than reducing the risk you'll screw it up. It aims at expressiveness and malleability, not safety.

The "right thing" design philosophy is somewhat orthogonal to that, but it also does pervade Lisp (especially Scheme) and, for example, Haskell. As you'd expect, the New Jersey philosophy pervades C, Unix shells, and Golang. Those are also fairly liberal languages, Golang less so. But a C compiler had to fit within the confines of the PDP-11 and produce fast enough code that Ken would be willing to use it for the Unix kernel, and it was being funded as part of a word processing project, so things had to work; debuggability and performance were priorities. (And both C and Unix were guided by bad experiences with Multics and, I infer, M6 and QED.) MACLISP and Interlisp were running on much more generous hardware and expected to produce novel research, not reliable production systems. So they had strong incentives to both be "liberal" and to seek after the "right thing" instead of preferring expediency.

adamnemecek 272 days ago
It’s a signaling mechanism to say “I went to MIT” or at the very least to say “I could have gone to MIT”.
djha-skin 272 days ago
I learned Racket at Brigham Young University in Utah, and Clojure, then Common Lisp on my own. I don't think it's just an MIT thing. It's in the ethos now. Lots of people know about SICP, for example.
Blackthorn 272 days ago
How do you know someone knows lisp? They'll tell you. And tell you. And tell you...

I like lisp for the most part, but holy shit is the enduring dialog surrounding it the absolute worst part of the whole family of languages by far. No, it doesn't have or give you superpowers. Please grow up.

anthk 272 days ago
Do either the book on Symbolic Computation plus PAIP, or SICP, and you'll understand.
Blackthorn 272 days ago
Yes, I've done SICP. Lisp does not give you superpowers.