At least hypothetically, I think there's an approach which is not "the right thing" or "worse is better" but rather more like "the right foundations".
Most interface complexity, in my experience, is inherited from underlying interface complexity, and it takes a lot of work to fix that underlying complexity. This, I think, is where "worse is better" shines. If you try to apply a "the right thing" approach to a system where you're dealing with shitty underlying interfaces (i.e. every popular operating system out there, including every Unix and NT system), you end up with endless complexity and performance loss. So obviously nobody will want to do "the right thing", and everyone who takes the "worse is better" approach will end up way ahead of you in terms of delivering something. People will be happy (because people are almost always happy regardless of how crap your product is).
On the other hand, designing something with "the right foundations" means that "the right thing" no longer needs to involve "sacrifice implementation simplicity in favour of interface simplicity" to anywhere near the same extent because your implementation can focus on implementing whatever interface you want rather than first paving over a crappy underlying interface.
But the difficulty of "the right foundations" is that nobody knows what the right foundations are the first 10 times they implement them. This approach requires being able to rip the foundations up a few times. And nobody wants that, so "worse is better" wins again.
"In welfare economics, the theory of the second best concerns the situation when one or more optimality conditions cannot be satisfied. The economists Richard Lipsey and Kelvin Lancaster showed in 1956 that if one optimality condition in an economic model cannot be satisfied, it is possible that the next-best solution involves changing other variables away from the values that would otherwise be optimal. Politically, the theory implies that if it is infeasible to remove a particular market distortion, introducing one or more additional market distortions in an interdependent market may partially counteract the first, and lead to a more efficient outcome."
How would you design a network interface using your right foundations model? I'm not talking about HTML or whatnot.
I have some sort of medium, copper, fiber whatever and I would like to send 10 bytes to the other side of it. What is the right foundations that would lead to an implementation which isn't overly complex.
The biggest benefit of simplicity in design is when the whole system is simple, so it’s easy to hack on and reason about.
> Unix and C are the ultimate computer viruses.
The key argument behind worse-is-better is that an OS which is easy to implement will dominate the market in an ecosystem with many competing hardware standards. Operating systems, programming languages, and software in general have not worked this way in a long time.
Rust is not worse-is-better, but it's become very popular anyway, because LLVM can cross-compile for anything. Kubernetes is not worse-is-better, but nobody needs to reimplement the k8s control plane. React is not worse-is-better, but it only needs to run on the one web platform, so it's fine.
Worse-is-better only applies to things that require an ecosystem of independent implementers providing compatible front-ends for diverging back-ends, and we've mostly standardized beyond that now.
Rust started as an experiment by the Mozilla team to replace C++ with something that would help them compete with Chrome by developing safe multi-threaded code more efficiently. It took a lot of experiments to get to the current type system, which gives real advantages through affine types, but the compiler is at this point clearly over-engineered for the desired type system (and there are already ideas on how to improve it). It's still too late to restart, as it looks like it takes 20 years to productionize something like Rust.
As for React, I believe it's an over-engineered architecture from the start for most web programming tasks (and for companies/programmers that don't have separate frontend and backend teams), but low interest rates plus AWS/Vercel pushed it on all newcomers (and most programmers are new programmers, as the number of programmers grew exponentially).
HTMX and Rails 8 are experiments in the opposite direction (moving back to the server, no-build, no-SaaS), but I believe there's a lot of space to further simplify the web programming stack.
If the industry could get rid of C and of ridiculous esoteric abbreviations in identifiers, it could almost be a sane world to wander.
And, at the risk of intentionally missing the metaphor: they do in fact make automated tenderizers, now ;) https://a.co/d/hybzu2U
and then said something like "most of this is not important, these layers don't really exist, TCP/IP got popular first because it's just much simpler than OSI was"
I don't think very many people even remember X.25. The one survivor from all the X-series standards seems to be X.509?
I'm not sure he's right, but I do think his point of view is important to understand.
Just because you don't understand something, that doesn't mean it's bad.
I don't know enough to judge his assertion that the session layer exists to solve problems created by half-duplex terminals. (He doesn't seem to specifically call out Ceefax.)
I do remember all nine layers of the OSI stack though: physical, data link, network, transport, session, presentation, application, financial, political.
See also https://computer.rip/2021-03-27-the-actual-osi-model.html and https://dotat.at/@/2024-03-26-iso-osi-usw.html
> Teaching students about TCP/IP using the OSI model is like teaching students about small engine repair using a chart of the Wankel cycle. It's nonsensical to the point of farce. The OSI model is not some "ideal" model of networking, it is not a "gold standard" or even a "useful reference." It's the architecture of a specific network stack that failed to gain significant real-world adoption.
Also, if you're going to compare TCP/IP to various OSI implementations, you should compare the full stack including PEM, MOSS, SMIME, SSL/TLS, SSH. Each muddies the difference between presentation and application layers, but as in the previous paragraph, no one seems to care. Talking SMTP over SSH (or SSL/TLS) is totally fine; you don't need to have a sub-protocol to define how a presentation layer on top of a secure session layer works if you can make certain assumptions about the behaviour of the code on the other side of the network connection.
SNMP is “simple” compared to X.711 CMIP. But SNMP also uses ASN.1 and X.660 OIDs.
You'd scare anyone, like an amazed rural tribesman, if you showed them an iPhone in 1996.
I know, iPhone was a Cisco thing, but my mind went there for a beat ;)
And 30 years later they show few signs of letting go.
That being said... I've wanted to implement an old Explorer using an FPGA for a while. Maybe if I just mention it here, someone will get inspired and do it before I can get to it.
The ephemeral GC concentrates on garbage-collecting objects in RAM. For example, every memory page has a page tag, which marks it as modified or not; the ephemeral GC uses this to scan only over changed pages in memory. The virtual memory subsystem keeps a table of swapped-out pages pointing to ephemeral pages. The EGC can then use this information...
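The page-tag idea can be sketched in a few lines of C. This is a toy illustration only (a software "modified" bit per page plus a write barrier), not the actual Lisp machine mechanism, which relied on hardware/VM page tags; all names and sizes here are made up:

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Toy heap divided into fixed-size "pages", each with a modified tag. */
#define PAGE_SIZE 256
#define NUM_PAGES 64

static uintptr_t heap[NUM_PAGES * PAGE_SIZE];
static bool page_modified[NUM_PAGES];

/* Write barrier: every store into the heap goes through here,
 * marking the containing page as modified. */
static void heap_store(size_t index, uintptr_t value)
{
    heap[index] = value;
    page_modified[index / PAGE_SIZE] = true;
}

/* Scan phase: visit slots only on modified pages, then clear their
 * tags.  Returns the number of pages scanned, to show the saving
 * over walking the whole heap. */
static size_t scan_modified_pages(void (*visit)(uintptr_t *slot))
{
    size_t scanned = 0;
    for (size_t p = 0; p < NUM_PAGES; p++) {
        if (!page_modified[p])
            continue;
        for (size_t i = 0; i < PAGE_SIZE; i++)
            visit(&heap[p * PAGE_SIZE + i]);
        page_modified[p] = false;
        scanned++;
    }
    return scanned;
}
```

The point of the trick is visible in the return value: if only two pages were touched since the last collection, only two pages get scanned, no matter how large the heap is.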
For fun a few weeks ago I went looking for COBOL on VMS jobs. They're definitely still out there, but you do have to look for them. No one's going to send you an email asking if you're interested and if you don't hang out with COBOL/VMS people, you may not know they exist.
I think my point is the total number of C/C++ jobs today is probably the same or slightly higher than in 1994. But the total number of Java and C# jobs (or Ruby or Elixir or JavaScript jobs) is dramatically higher than in 1994, if for no other reason than that these languages didn't exist in 1994.
[As an aside... if you're looking for a COBOL / VMS programmer/analyst... I spent much of the 80s as a VMS System Manager, coding custom dev tools in Bliss and some of the 90s working on the MicroFocus COBOL compiler for AIX. And while you would be crazy to ignore my 30+ years of POSIX/Unix(tm) experience, I think it would be fun to sling COBOL on VMS.]
I don't know anything about the total number of C++ jobs, but there's a huge filter bubble effect for job searching. If you don't mention a language on your resume or list it under your skills then you're very unlikely to see any jobs for it or have anybody contact you for a job using it, whether we're talking about C++, Python, Typescript, or even technologies like Docker.
It gets a bit hard to replace something that your compiler depends on to exist in the first place.
>The right thing takes forever to design, but it is quite small at every point along the way. To implement it to run fast is either impossible or beyond the capabilities of most implementors.
A deer is only 80% of a unicorn, but waiting for unicorns to exist is folly.
But at the end of the day, everyone knows never to assume stdio will write an entire buffer to disk, and that YOU need to check for EINTR; and C++ lets you wrap arbitrary code in try...catch blocks, so if you're using a poorly designed 3rd-party library you can limit the blast radius. And it's common now to disclaim responsibility for damages from using a particular piece of software, so there's no reason to spend extra time trying to get the design right (just ship it, and when it kills someone you'll know it's time to revisit the bug list. (Looking at YOU, Boeing.))
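The two Unix warts mentioned here, short writes and EINTR, are exactly what the classic retry loop papers over. A minimal C sketch (the `write_all` name is just illustrative, not a standard API) might look like:

```c
#include <errno.h>
#include <stddef.h>
#include <unistd.h>

/* write() may be interrupted by a signal (EINTR) or perform a short
 * write, so a robust caller must loop until the whole buffer is out
 * or a genuine error occurs. */
ssize_t write_all(int fd, const char *buf, size_t len)
{
    size_t done = 0;
    while (done < len) {
        ssize_t n = write(fd, buf + done, len - done);
        if (n < 0) {
            if (errno == EINTR)
                continue;   /* interrupted before any bytes moved: retry */
            return -1;      /* real error: report it */
        }
        done += (size_t)n;  /* short write: keep going from where we stopped */
    }
    return (ssize_t)done;
}
```

This is precisely the interface leak the thread is discussing: the kernel keeps its implementation simple by pushing the retry burden onto every single caller.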
I do sort of wonder what happens when someone successfully makes the argument that C++ Exceptions are a solution often mis-applied to the problem at hand and someone convinces a judge that Erlang-like supervisory trees constitute the "right" way to do things and using legacy language features is considered "negligence" by the courts. We're a long way off from that and the punch line here is a decent lawyer can nail you on gross negligence even if you convinced your customer to sign a liability waiver (at least in most (all?) of the US.)
Which is to say... I've always thought there is an interplay between the "worse is better" concept and the evolution of tech law in the US. Tort is the water in which we swim; it defines the context for the code we write.
If you don't achieve virality, you're as good as dead. Once an episteme/meme spreads like wildfire, there's very little chance for a reassessment based on value/function, because scope is now the big axis of valuation.
It's actually worse, because humanity is now a single big borg. Even 30-40 years back, there were sparsely connected pools where different species of fish could exist; not any more. The elites of every single country are part of the Anglosphere, and their populations mimic them (eventually).
This tumbling towards widespread mono-memetism in every single sphere of life is one of the deeply dissatisfying things about modern human life, not just for PL/OS/... but also for culture etc.
Anthropocene of humanity itself.
Are you? Maybe the worse solution peaks faster, but can be supplanted by a better solution in the future, like how Rust is displacing C/C++ in new projects. The better solution may never be popular yet persist.
Additionally, there are no significant new projects being done in Rust in the games industry, AI/ML, HPC, HFT, compiler backends, hardware design, ...
There's typically an "exhaustion" phase with mono-memetism/theories, where everyone gets sick and tired of the "one and only way" and it becomes fashionable to try out new things (e.g. Christianity in Europe). We're not yet at the point where the old ways can be toppled.
"It is better to go picking blueberries before they are fully ripe, that way you won't have much competition."
And this leads to an easier implementation: parsing is easier (which is why code transformation via macros becomes easy).
1. Minimal -- the design and implementation must be as small as possible, especially in scope (which should be deliberately "incomplete")
2. Timely -- the implementation must be delivered as soon as feasible, even if it comes before the design (get it working first, then figure out why)
3. Relevant -- the design and implementation must address an important, unmet need, eschewing needs that are not urgent at the time (you can iterate or supplement)
4. Usable -- the implementation must be integrated with the existing, working and stable infrastructure (even if that integration causes design compromises)
The other dimensions (simplicity, correctness, consistency, and completeness) are very nice to have, but they are not the primary drivers of this philosophy.
I would say that Timely and Relevant drive Minimal. I would also say that Minimal and Usable are in tension with each other.
McDougals resolves the apparent conflict between the other two. It blames the interrupt hardware as the root cause. It produces non-working, incomplete software. It's kind of a modest proposal.
However, it also produces no ripples in the design fabric. With MIT, the OS source is a maintenance nightmare. With NJ, modern software still has to deal with archaic idiosyncrasies like EINTR. With McDougals, all the "conflict-free" portions of the software advance, those that write themselves.
The result is likely immediately shelved, perhaps as an open source PoC. Over time, someone might write some inelegant glue that makes interrupts appear to behave nicely. Alternatively, the world might become perfect to match the software.
If nothing else, the software will have mimicked the way we learn. We use imperfect examples to draw the idealized conclusion. Even if it never gets to run, it will be more readable and more inspiring than either MIT or NJ.
It's the opposite of what marketers want you to think of when they say "uncompromising design."
(edit: I mean it unironically)
Personally, I wouldn't want to touch server-side JS with a 10 foot barge pole.
It actually worked decently well, but, due to Java, was needlessly verbose.
WASM lets us run other languages efficiently in the browser but that just opens the field to a lot of languages, not one language to rule them all.
But honestly it's kind of refreshing to see the original Node.js presentation, where using JavaScript is sort of a side note. He wanted a callback-heavy language, and JS fit the bill.
It probably is the most severe case of "worse is better" I've experienced so far.
Anyway, worse-is-better is about simplicity of implementation versus conceptual simplicity. In principle, that's a much harder choice.
RichardGabriel makes this observation on the survival value of software in the paper Lisp: Good News, Bad News, How to Win Big. See http://www.jwz.org/doc/worse-is-better.html for the section on WorseIsBetter.
For those seeking the first node, see http://web.archive.org/web/19990210084721/http://www.ai.mit.edu/docs/articles/good-news/good-news.html.
For even more context on WorseIsBetter see http://www.dreamsongs.com/WorseIsBetter.html. My favorite part is RichardGabriel arguing with himself.
Lisp: Good News, Bad News, How to Win Big (1990) [pdf] - https://news.ycombinator.com/item?id=30045836 - Jan 2022 (32 comments)
Worse Is Better (2001) - https://news.ycombinator.com/item?id=27916370 - July 2021 (43 comments)
Lisp: Good News, Bad News, How to Win Big (1991) - https://news.ycombinator.com/item?id=22585733 - March 2020 (21 comments)
The Rise of Worse Is Better (1991) - https://news.ycombinator.com/item?id=21405780 - Oct 2019 (37 comments)
The Rise of Worse Is Better (1991) - https://news.ycombinator.com/item?id=16716275 - March 2018 (44 comments)
Worse is Better - https://news.ycombinator.com/item?id=16339932 - Feb 2018 (1 comment)
The Rise of Worse is Better - https://news.ycombinator.com/item?id=7202728 - Feb 2014 (21 comments)
The Rise of "Worse is Better" - https://news.ycombinator.com/item?id=2725100 - July 2011 (32 comments)
Lisp: Good News, Bad News, How to Win Big [1991] - https://news.ycombinator.com/item?id=2628170 - June 2011 (2 comments)
Worse is Better - https://news.ycombinator.com/item?id=2019328 - Dec 2010 (3 comments)
Worse Is Better - https://news.ycombinator.com/item?id=1905081 - Nov 2010 (1 comment)
Worse is better - https://news.ycombinator.com/item?id=1265510 - April 2010 (3 comments)
Worse Is Better - https://news.ycombinator.com/item?id=1112379 - Feb 2010 (5 comments)
Lisp: Worse is Better, Originally published in 1991 - https://news.ycombinator.com/item?id=1110539 - Feb 2010 (1 comment)
Lisp: Good News, Bad News, How to Win Big - https://news.ycombinator.com/item?id=552497 - April 2009 (2 comments)
Worse Is Better - https://news.ycombinator.com/item?id=36024819 - May 2023 (1 comment)
My story on “worse is better” (2018) - https://news.ycombinator.com/item?id=31339826 - May 2022 (100 comments)
When Worse Is Better (2011) - https://news.ycombinator.com/item?id=20606065 - Aug 2019 (13 comments)
EINTR and PC Loser-Ing: The “Worse Is Better” Case Study (2011) - https://news.ycombinator.com/item?id=20218924 - June 2019 (72 comments)
Worse is worse - https://news.ycombinator.com/item?id=17491066 - July 2018 (1 comment)
“Worse is Better” philosophy - https://news.ycombinator.com/item?id=17307940 - June 2018 (1 comment)
What “Worse is Better vs. The Right Thing” is really about (2012) - https://news.ycombinator.com/item?id=11097710 - Feb 2016 (35 comments)
The problematic culture of “Worse is Better” - https://news.ycombinator.com/item?id=8449680 - Oct 2014 (116 comments)
"Worse is Better" in the Google Play Store - https://news.ycombinator.com/item?id=6922127 - Dec 2013 (10 comments)
What “Worse is Better vs The Right Thing” is really about - https://news.ycombinator.com/item?id=4372301 - Aug 2012 (46 comments)
Worse is worse - https://news.ycombinator.com/item?id=437966 - Jan 2009 (3 comments)
Rust's typechecking passes are not the reason why the compiler is slow. Code generation dominates compile times. Type checking is pretty quick, and Rust makes some decisions that enable it to do so, like no global inference.
On my current project, "cargo check" takes ten seconds, and "cargo build" takes 16. That's six of the 16 seconds, roughly 37.5% of the total compilation time, taken by code generation (and linking, if you consider those two to be separate).
In my understanding, there can sometimes be problems with -Z time-passes, but when checking my main crate, type_check_crate takes 0.003 seconds, and llvm_passes + codegen_crate take 0.056 seconds. Out of a 0.269 second total compilation time, most things take less than 0.010 seconds, but other than the previous codegen mentioned, monomorphization_collector_graph_walk takes 0.157s, generate_crate_metadata takes 0.171 seconds, and linking takes 0.700 seconds total.
This general shape of what takes longest is consistent every time I've looked at it, and is roughly in line with what I've seen folks who work on compiler performance talk about.
My heart loves the clean sparse MIT approach, but I'm kinda forced to work in the other world because that approach has so far decisively failed.
I have my own thoughts as to why, and they're different from the typical ones:
(1) Worse is free -- the "worse" stuff tended to be less encumbered by patents and closed source / copyrights. This is particularly true after GNU, BSD, and Linux. Free stuff is viral. It spreads fast and overtakes everything else.
(2) Programmers like to show off, and "worse" provides more opportunities to revel in complexity and cleverness.
I had supposed this in a previous thread, and I agree it is another thing entirely; however, whether Rust is 'correct' or 'pragmatic' is, I think, a matter of contention.
> (1) Worse is free -- the "worse" stuff tended to be less encumbered by patents and closed source / copyrights. This is particularly true after GNU, BSD, and Linux. Free stuff is viral. It spreads fast and overtakes everything else.
At the time, it seems, the "worse" stuff was actually more encumbered: GNU started to exist as a consequence of Unix both becoming an industry standard and solidifying the position of nonfree software in industry, and Unix didn't get that way by being free; AT&T simply licensed it away en masse to universities. The free software movement was born out of the MIT hacker ethic of sharing software freely. (GNU brought the design philosophy of MIT to Unix systems, and largely stands opposed to the "worse is better" approach; it originally sought to replace parts of Unix with superior alternatives, such as Info over man pages.) BSD didn't become free until much, much later, at which point Linux had already become relevant.
> (2) Programmers like to show off, and "worse" provides more opportunities to revel in complexity and cleverness.
I think maybe C or C++ lets you show off by hacking around the compiler to do things that look impressive, like preprocessor hacks or obfuscated code, but I think many would agree that this kind of style isn't "correct": languages like Go were developed as a consequence of this, in order to suck any and all fun you might get out of hacking C and force you to write more "correct" programs. Lisp, on the contrary, doesn't tell you what a correct or incorrect program is, and gives you every facility to write programs that are infinitely complex and clever.
To me, Rust looks like the result of trying to break the rules of industry languages by incorporating things like real macros and type systems into something that resembles a real-world language. But its biggest flaw, to me, is that it doesn't break enough of them; rather, it makes more in the process.
In Go you can immediately tell what the fields in a YAML config file are just by looking at the struct annotations. Try doing that with Rust's Serde. Super opaque, in my opinion.
Rust will only protect me from things my customers don't care about and don't understand.
By not using Rust and just dealing with it, I'm making more money faster than if I started with Rust.
Rust only matters in environments where that calculus comes out the other way.
If your customers are running your software you might have a business model problem instead of a software quality one.
Are you suggesting your customers don't care about CVEs, even indirectly when they affect them?
That's the wrong kind of protection. Rust should protect you from things people other than your customers (who presumably are well behaved) care about.
I think this is a self-delusion experienced by Rustaceans because they overvalue a certain type of software correctness and because Rust gets that right, the rest of its warts are invisible.
Still, I am a bit bothered: does a counterargument exist?
Same author (name is an anagram).
Also: Nickieben Bourbaki might be an anagram of something, but it is definitely not an anagram of Richard Gabriel, with or without the P. There's no G, there's no h, there's an N and a k, it isn't even particularly close.
That claim is my best interpretation of this sentence:
> Same author (name is an anagram).
Although perhaps it was not your intention to connect the clauses in that way.
But sometimes you need to know what it is you are designing before giving it to people, because there are large risks associated with improvising. In those cases, making The Right Thing is still expensive, but it may reduce the risk of catastrophe.
I think, however, that the latter cases are rarer than most people think. There are ways of safely experimenting even in high-risk domains, and I believe doing so ultimately lowers the risk even more than doing The Right Thing from the start. Because even if we think we can spend years nailing the requirements for something down, there are always things we didn't think of but which operational experience can tell us quickly.
The time to market for the MIT approach is just too long if your revenue relies on actually shipping a product that covers the cost of the next iteration that will move you from 80% to 90%, or even 60% to 70%. It's an old joke, but in the long run we're all dead; and waiting for the production of an ivory tower implementation won't work out well. If it's all academic, and there's no commercial pressure, well, have at it. There's not much at stake except reputations.
Furthermore, in the real world, your users' requirements and your internal goals, theoretically covered by "the" design, will change. Not everything can be reasonably anticipated. The original design is now deficient, and its implementation, which is taking too long anyway, will be a perfect reflection of its deficiency, and not fit for its new purpose.
Get something out there and start getting feedback. You won't actually know until you do.
Often in these arguments, worse means "shortcut" and better means "won". The difficulty is proving that not taking the shortcut had some other advantages that are assumed, like in the article.
This "victory" is fleeting. When people tell you C++, a language which didn't even exist when I was born, is "forever", they are merely betraying the same lack of perspective as when Britain thought its Empire would last forever.
—
Anyway, I wouldn’t put simplicity on the same level as the other things. Simplicity isn’t a virtue in and of itself, simplicity is valuable because it helps all of the other things.
Simplicity helps a bit with consistency, in the sense that you have more trouble doing really bizarre and inconsistent things in a simple design.
Simplicity helps massively with correctness. You can't check things that you don't understand. Personally, that means there's a complexity threshold past which I can't guarantee correctness. This is the main one I object to: simplicity and correctness simply don't belong in different bullet points.
Simplicity could be seen as providing completeness. The two ways to produce completeness are to either work for a really long time and make something huge, or reduce scope and make a little complete thing.
It’s all simplicity.
For example, why is it that central vacuums are more rare in 2024 than they were in the 1980s, despite them being superior in every way compared to regular ones?
"Worse" vacuums are "better" for the economy? (because Dyson makes jobs and consumes resources?)
https://en.wikipedia.org/wiki/Central_vacuum_cleaner
Considering the price of a house, it isn't "spectacularly expensive". On the other hand, versus the price of a hoover, yeah, a bit. Since it sits in a closet or garage and doesn't move, weight becomes a non-issue, so it can be a real behemoth of a vacuum.
Not spectacularly expensive when installed as the house is built. It's another run of PVC pipe in the walls before you close them up. Approximately a day of labor and some pipe are the added expense; not much in terms of house-building cost at all. Hardwood floors are much more expensive, and still need to be swept or vacuumed.
They're coming to take me away haha
they're coming to take me away hoho hihi haha
to the funny farm where code is beautiful all the time ...
-- Napoleon XIV, more or less...Via: https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/lin...
Is Amazon "the best" place to go shopping? No, you might find better prices on individual items if you put a little more work into it, but it's the most expedient. Is Facebook/Instagram/Tiktok/insert here "the best" social network? No, but it is the most accessible, easy-to-use, useful one. Is a Tesla (perhaps outdated example since X) "the best" car - no, but it is the most expedient.
There is a tangent here that intersects with refinement culture as well. Among the segment of society that (subconsciously) cares about these "expedient" choices, you see everyone and everything start to look the same.
For example, if two students in a class are having frequent confrontations that bring learning in the class to a halt, and attempts by teachers and counselors to address their conflict directly haven't been effective, the expedient solution might be to place them in separate classes. The "right thing" would be to address the problem on the social and emotional level, but if continued efforts to do so are likely to result in continued disruption to the students' education, it might be better to separate them. "Expedient" acknowledges the trade-off, while emphasizing the positive outcome.
Often a course of action is described as "expedient" when it seems to dodge an issue of morality or virtue. For example, if we solve climate change with geoengineering instead of by addressing thoughtless consumerism, corporate impunity, and lack of international accountability, many people would feel frustrated or let down by the solution because it would solve the problem without addressing the moral shortcomings that led to the problem. The word expedient stresses the positive side of this, the effectiveness and practicality of the solution, while acknowledging that it leaves other, perhaps deeper issues unaddressed.
Oof. Now I understand something I didn't before.
If we could solve climate change without "addressing thoughtless consumerism, corporate impunity, and lack of international accountability" we would all be f'ing _thrilled_.
As I type this, Hurricane Helene just destroyed a good chunk of inland North Carolina (!!!) and Hurricane Milton was just upgraded to a "category 5" storm.
If we could solve climate change the easy way we'd all be _thrilled_, because then we'd actually solve climate change.
I think the argument is pretty much the opposite: not everyone would be thrilled. The wannabe priest class (think of Greta and her "How dare you!"), which is always with us, would be frustrated by the lack of something to preach about.
Of course there is always Israel vs. Palestine.
I _would_ be thrilled if there was an expedient way to solve climate change. But the best systems models for the earth all tell us that there's one way to solve the issue: just leave the carbon in the ground. That's it. Stop extracting it. Nothing else will solve the problem, it's a really simple, really bad, feedback loop.
This characterization of environmental or Palestinian activists as wanting to have the moral high ground is, imo, a knee jerk reaction. The people on the street aren't in it for clout, they're doing it because it is the right thing and they feel compelled to act. What gets me moving is not wanting to feel morally superior (a religious aspect, more at home in right wing politics), but an anxiety for the future, which is projected to include horrible death and suffering due to obvious problems that we could all fix if we just decided to recognize them.
And I think it is actually useful. People will try to manipulate other people through emotions, and mobs are easy to manipulate. One should have fairly high barriers before joining a street mob, because its potential destructive power is enormous, and it also tends to elevate unsavory characters to positions of power.
I am not saying that those barriers should be infinitely high, but fairly high.
For us humans, it is easy to succumb to "righteousness in numbers".
Food is less than ideal, war is less than ideal, death is less than ideal, HN is less than ideal.
Are you satisfied with this sort of Twitter-like posting and thinking? I am not.
Pixels are basically free and we should strive to post more than one-sentence snarks. For one-sentence snarks and drive-by dismissals, Reddit is the ideal territory.
The time and attention of your fellows is valuable and merits some thought before writing. Conciseness and clarity are more valuable than the number of pixels used to type a sentence.
Good luck going forward
Edit0: And no, most would agree that free association is an ideal of the human condition - you're welcome to disagree. Feel free to chat with a lawyer.
It's also important to weigh the harmful effects of apathy in the balance. These are easily forgotten but almost inestimably enormous. Just think of all the damage done in the decades (centuries) where hardly anyone could be bothered to protest against slavery, women's oppression, racial segregation, pollution, etc. etc.
I think you may be proving my point. Taking one side of a complicated situation because of a black-and-white moralistic thinking is potentially destructive, and organizations like Hamas benefit from that.
As for your slavery example, did slavery disappear because humanity awakened morally and started demonstrating in the streets, or because we gained a new non-human source of raw power? Previous civilizations didn't engage in slavery because they were profoundly immoral; they did so because human and animal muscle was the only practical source of power. The specifics varied across the globe, but unfree labor was ubiquitous in pre-modern societies.
For a contemporary situation, imagine a 22nd century activist judging people of 2024 for eating meat from dead animals, when he can get a good steak by pressing a button on a steak-making machine. It wouldn't be demonstrations which made the difference between 2024 and 2124.
OK, but that was sort-of my point. The more outrage, the less you need to really think about things.
"err, yes, popular anti-slavery movements played an important role in the abolition of slavery"
That is a chicken-and-egg question. Why did those mass movements only emerge at the time of the Industrial Revolution, and why did they emerge first in places that were influenced by the Industrial Revolution the earliest, while other places (Russia, the Ottoman Empire, the Qing Empire) only followed suit after their own industrialization began?
I don't think the arrow of causality is so simple here. A hypothetical society that abolished slavery, serfdom etc. in the 15th century could easily prove non-viable against its slavery-powered foes, which had more brute force at their disposal. By 1820, the situation was very much turning around and it was the modern, personally freer societies that were more effective in commerce and at war.
Notably, even though Victorian Britain was very anti-slavery, starting with the monarch herself, it had no moral qualms against subjugating a quarter of humanity in another form of submission. Which tells me that it was less about morality (equality) and more about practicality of the situation.
Your take on slavery is pretty wild. The Industrial Revolution did not replace Haitian slaves with machines for harvesting sugar cane. Nor did Spartacus invent the steam engine.
Nor is it correct to ignore the decades of peer-reviewed research that concludes that we really are causing more hurricanes on the basis that hurricanes have always existed.
A:"Only Global Communism can solve Climate Change."
B:"Nuclear power also solves climate change."
A:"I don't want to solve Climate Change, I want Global Communism."
> "Expedient" is a common (or at least not rare) English word that means something like "practical and effective even if not directly attending to higher or deeper considerations."
I was not aware of the word "expedient" before. From your example I conclude that it has the same meaning as "pragmatic", i.e. if I sed s/expedient/pragmatic/g then your comment still makes perfect sense to me. A quick google search also seems to support this conclusion.
--> is there a nuance of the word "expedient" that I am missing here?
- "Pragmatic" can also be used to describe a person (a pragmatic person) or a mindset (a pragmatic approach to a problem.) "Expedient" isn't used to describe people.
- "Expedient" usually acknowledges the existence of a higher or more demanding standard that the solution does not meet, admitting that the solution is not perfect. You might choose a word like "pragmatic" to praise a solution with known shortcomings, but it doesn't imply known shortcomings as strongly as "expedient" does.
- "Expedient" can be used euphemistically. ("Pragmatic" can, too, but not nearly as often, and not as harshly.) "They took the expedient route" might, depending on the context, mean that they did something lazy or unethical because it was easier. The euphemistic usage is common enough that for some people it has an overall unsavory flavor, but I don't think it's tipped over into the euphemistic usage being the assumed one.
If some people feel frustrated or let down because we achieve a literal miracle (by today's technology standards) that saves millions of lives I'm willing to call them mentally unhinged.
I would also feel frustrated by the knowledge that there were many people who were willing to sacrifice unimaginable numbers of humans and animals for the sake of making more profit for themselves, who were not held to account for their actions. If a person acts in a way that they should know will lead to future suffering, the development of an unforeseen technological solution to that suffering should not wipe their moral slate clean.
Trying to kill someone using a non-functional weapon, that you believe is functional, is not morally equivalent to taking no action just because it didn’t have an effect.
And anyone trying otherwise will struggle significantly at the polls. Mass carbon removal, renewable energy, recycling and maybe some technological solutions to limit the effects of atmospheric carbon seem like the more practical way to go.
Global warming and other environmental crises are unfolding right now. While exploring possible future technological advances which would make coping with it easier is certainly a positive and useful pursuit - it cannot be the _main_ pursuit when facing those crises and challenges. That is:
1. We should not divert the discussion from present to possible fortuitous futures.
2. We must not confuse action with prospects.
3. We must not think of the two as "either-or". We can reduce emissions _and_ do R&D for new tech possibilities.
We don’t NEED to reduce emissions. So long as we clean up as much or more than we pollute, what’s the problem?
It's not just that. Every time you do business with a new web site you assume additional risk. Amazon is a known quantity. You can be pretty sure that they are not going to outright scam you, and they aren't going to be hacked by script kiddies. There is a significant risk of getting a counterfeit item, but they have a very liberal return policy, so the real cost to you in this case is a minute or two to get a return code and possibly a trip to the nearest Whole Foods to drop it off.
Amazon sucks in many ways, but at least their suckage is a known quantity. Predictability has significant value.
Your model of the world is not perfect, so instead of trying to find a globally optimal solution, you are satisfied with a local optimum that exceeds some threshold that suffices. https://en.wikipedia.org/wiki/Satisficing
* For example, the limited selection of candy at the checkout aisle. All you have to do is get your brand in there. (Placement on the 4P's)
* Or, "best credit card for travelers." By offering travel rewards, you can acquire a group of cardmembers even if, e.g. a more valuable cashback card could have gotten them even greater benefits (Promotion on the 4P's)
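The candy-aisle and credit-card examples above are both instances of satisficing: stop at the first option that clears a threshold instead of scanning the whole market. A toy sketch (the prices and threshold are made up for illustration):

```python
# Toy contrast between satisficing and optimizing over a stream of
# quotes for the same item.
prices = [47, 52, 39, 44, 61, 35, 49]

def satisfice(offers, good_enough):
    """Return the first offer at or below the threshold."""
    for p in offers:
        if p <= good_enough:
            return p
    return None

def optimize(offers):
    """Return the global best, at the cost of scanning everything."""
    return min(offers)

print(satisfice(prices, 45))  # 39 -- "good enough", found after 3 looks
print(optimize(prices))       # 35 -- the true optimum
```

The satisficer walks away with a slightly worse price than the optimizer, but it stops looking as soon as the threshold is met, which is the whole trade.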
That extra work costs you money, too. Calculate how much your job pays you per hour, then you can deduce the $cost of spending more time to get a better deal.
Your time is only worth money if you'd otherwise be working at that rate, which is not the case for the vast majority of humans.
I once spent 2 hours negotiating the purchase of a car. It saved $5000. That works out to $2500 an hour. Was it worth it? Hell ya!
I've also worked hourly jobs in the past. There were often opportunities to work more hours or overtime. People often have side hustles, too.
The most expedient car? Or BEV in the US?
As a car nerd though, I never felt the need to buy one because they just seem fairly boring: other than a few rare models, most were 4-door sedans with automatics, fairly small engines, and soft, non-sporty suspension.
IMO, it destroys its competitors in the value market, and the media is being awfully silent about it. I guess it's far too easy to focus on Elon instead.
The number one reason I use Amazon, is not for the best prices, but because of their return policy. Amazon returns are actually often more painless than physical store returns.
Being able to return something predictably and easily outweighs a small difference in price.
Non-native here. What's the meaning of "inoculated" here?
It's not the first time that I struggle to parse this word. In Italian it keeps the original Latin meaning and can be translated as "injected with". You could inoculate a vaccine but you could also inoculate a poison, so it does not carry the immunity meaning by default. English (US?), as far as I can tell, uses it as a synonym of "immune", is that so?
A vaccine inoculates you against a disease by a physical mechanism: that is, it prevents you from getting that disease (to a greater or lesser degree).
Metaphorically being inoculated against something means it can no longer hurt you. For instance, maybe by not owning a car you're inoculated against vehicle depreciation. Or by wearing the same simple but quality outfit every day you're inoculated against the vagaries of fashion.
I agree though that for practical purposes, practical solutions are just going to be more successful.
Then a right-handed person picks up the left-handed scissors and deems them weird and uncomfortable. Which they are ... for the right hand of a right-handed person.
Other popular examples of such taste controversy are Python's semantic whitespace, the idiosyncrasies of Perl, the very unusual shape of J/APL, and anyone using FORTH for non-trivial purposes.
edit: https://news.ycombinator.com/item?id=41766753 comment about "other people's Lisp" reminds me that working as a solo genius dev on your own from-scratch code and working in a team inside a large organization on legacy code are very different experiences, and the "inflexibility" of some languages can be a benefit to the latter.
Most people definitely seem to have specific preferences built into their talents. I don't see why programming or programming languages would be any different from any other medium or art form.
A commonly made up deficiency attributed to Lisp is that it's particularly bad at large scale, either in teams or program size. That would surely be surprising news to the teams doing stuff today or in the past, some responsible for multi-million lines of code systems (some still in operation). Or to use an old example, the documentation for Symbolics Computers, pictured here in book form: https://www.thejach.com/public/symbolics-books-EugyAAEXUAUG_... Such a set of books doesn't come from a "lone wolves only" ecosystem and heritage. Not to mention doc and so on not shown for applications they made for 3D graphics or document editing (https://youtube.com/watch?v=ud0HhzAK30w)
Edit: I should also mention that once I worked out how reader macros worked I went on to enthusiastically (ab)use them for my own ends...
Those reader macros have morphed the language into a new bespoke language. So it is then natural for a new developer to face a steep learning curve to learn that new bespoke language before they can make sense of the code.
I'm not condoning overuse of macros. I'd hate to be working with such code too. I'm only stating that Common Lisp is that language that can be programmed to become a different language. They call it a "programmable programming language" for a reason.
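Python has no reader macros, but a loose analogue of that "new bespoke language" effect can be sketched with the stdlib ast module. Everything here is invented for illustration: SwapMacro is not a real library, and swap(a, b) is a made-up pseudo-form that only means something after the rewrite runs. The point is the one made above: once source is rewritten before it executes, a newcomer can no longer read the code at face value.

```python
import ast

# A toy "macro": rewrite every statement of the form `swap(a, b)`
# into a tuple assignment before the code runs. Only a loose analogue
# of a Lisp macro, since it happens at the AST (not reader) level.
class SwapMacro(ast.NodeTransformer):
    def visit_Expr(self, node):
        call = node.value
        if (isinstance(call, ast.Call)
                and isinstance(call.func, ast.Name)
                and call.func.id == "swap"
                and len(call.args) == 2
                and all(isinstance(a, ast.Name) for a in call.args)):
            a, b = call.args
            # Build: (a, b) = (b, a)
            new = ast.Assign(
                targets=[ast.Tuple(
                    elts=[ast.Name(id=a.id, ctx=ast.Store()),
                          ast.Name(id=b.id, ctx=ast.Store())],
                    ctx=ast.Store())],
                value=ast.Tuple(
                    elts=[ast.Name(id=b.id, ctx=ast.Load()),
                          ast.Name(id=a.id, ctx=ast.Load())],
                    ctx=ast.Load()))
            return ast.copy_location(new, node)
        return node

src = "x = 1\ny = 2\nswap(x, y)\n"
tree = ast.fix_missing_locations(SwapMacro().visit(ast.parse(src)))
ns = {}
exec(compile(tree, "<macro>", "exec"), ns)
print(ns["x"], ns["y"])  # 2 1
```

Note that `swap(x, y)` looks like an ordinary function call but is nothing of the sort: a reader who hasn't seen the transformer will misread the program, which is exactly the onboarding cost being described.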
So it's not just about taste or preference: left-handed people learn how to use right-handed scissors, but they can also use left-handed scissors in a way that a right-handed person would struggle to.
All that said, the analogy still works because most people don't understand why it doesn't work.
It is definitely not true. The existence of neurodivergence is proof enough.
I can visualize an entire application in my head like it is a forest. I can fly around and see various parts and how they fit together. I am unable to juggle the tokens necessary to do logic and basic math in my head, but I have no trouble reading a graph and understanding the relationships between numbers.
No matter how many times I try to read it, Lisp is completely inscrutable to me. It requires the same token juggling math does.
Remember that this has to be read in historical context. At the time C was invented, things like garbage collection, message-passing object-orientation, generics, rich sets of conditionals, first-class functions, etc. were brand spanking new. They were The Right Thing to do (even judged in the harsh light of hindsight), but also quite complicated to implement – so much so that the New Jersey people skipped right past most of it.
Today these things are par for the course. At the time, they were The Right Thing that made the system correct but complex, and had adoption penalties. As time passes, the bar for The Right Thing shifts, and today it would probably not be embodied by Lisp, but maybe by something like Haskell or Rust?
Some of these things were brand spanking new in 01973, but none were in 01991.
There were Lisp systems for minicomputers like the PDP-11; BSD included one (which I think was ancestral to Franz Lisp) and XLISP ran on CP/M. And of course Smalltalk was developed almost entirely on PDP-11-sized minicomputers. But to my recollection virtually all "serious" software for microcomputers and minicomputers was written in low-level languages like assembly or C into the late 80s, not even PL/M—though Pascal did start to win in the late 80s, in significant part by adopting C's low-level features. Nowadays, microcomputers are big enough and fast enough that Lisp, Haskell, or even Rust is viable.
I don't think "the right thing" is mostly about what features your system has. I think it has more to do with designing those features to work predictably and compose effectively.
> C is from 01973.
> Some of these things were brand spanking new in 01973
...right. The Rise of Worse is Better is a memoir, it's set in the past.
I quite like this view, because these things have clearly been copied everywhere such as my language of choice C#, but the one thing that nobody copied is the one that Lisp programmers rave about most: homoiconicity (brackets everywhere).
They are all homoiconic without being full of parentheses all over the place; the actual meaning is code and data being interchangeable.
There indeed was a language named LISP 2: https://dl.acm.org/doi/pdf/10.1145/1464291.1464362
I posted it on HN two weeks ago but it didn't get much traction: https://news.ycombinator.com/item?id=41640147
How is this different from tree-sitter, except that you can feed the modified code back to the language to evaluate?
I mean, don't get me wrong, Julia metaprogramming seems lovely, but it just seems to me that the word loses meaning if it can be applied to all languages with AST macros, no matter how gnarly the AST data structure is (is Rust homoiconic because of proc_macro?).
However, that's not a parsed data structure; it's a Julia expression to construct one. Though I don't have Julia installed, apparently you can just as well write that as :(a + b*c + 1), as explained a few paragraphs further down the page than I guess you read: https://docs.julialang.org/en/v1/manual/metaprogramming/#Quo.... That's also how Julia represents it for output, by default. What you've written there is the equivalent of Lisp (list '+ 'a (list '* 'b 'c) 1), or perhaps (cons '+ (cons 'a (cons (cons '* (cons 'b (cons 'c '()))) (cons 1 '())))). The data structure is as simple as Prolog's, consisting of a .head such as :call and a list of .args. Prolog's, in turn, is only slightly more complex than Lisp's. From a certain point of view, Prolog's structure is actually simpler than Lisp's, but it's arguable.
How this is different from having a tree-sitter parser is that it's trivially easy to construct and analyze such structures, and not just at compile time.
Possibly the source of your confusion is the syntactic sugar which converts x + y + z into what we'd write in Lisp as (+ x y z), and also converts back? I would argue that such syntactic sugar is precisely what you want for constructing and analyzing expressions. That is, it's part of what makes Julia homoiconic in a more useful sense than J. Random Language equipped with a parser for its own syntax.
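For comparison, here is roughly what that quote-analyze-evaluate workflow looks like using Python's stdlib ast module. This is only a loose analogue: Python's node zoo is far gnarlier than Julia's uniform .head/.args pairs or Lisp's conses, which is part of the argument above.

```python
import ast

# Loose Python analogue of Julia's :(a + b*c + 1) -- parse the
# expression once, then both analyze and evaluate the same tree.
expr = ast.parse("a + b*c + 1", mode="eval")

# Analyze: collect every variable name appearing in the expression.
names = sorted({n.id for n in ast.walk(expr) if isinstance(n, ast.Name)})
print(names)  # ['a', 'b', 'c']

# Evaluate the same tree against an environment.
code = compile(expr, "<quoted>", "eval")
print(eval(code, {"a": 1, "b": 2, "c": 3}))  # 1 + 2*3 + 1 = 8
```

The construct-and-analyze loop works, but each node type (BinOp, Name, Constant, ...) has its own shape, whereas in Julia or Lisp one generic walk over head/args or car/cdr covers everything.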
Had AT&T been allowed to charge real money for UNIX, Worse is Better would never have happened.
It’s likely that Unix might not have spread if AT&T didn’t give it relatively liberal licensing in the pre-divestiture era. But it’s also likely that something else that was cheap and readily adaptable would’ve taken over instead, even if it wasn’t technically superior to its competitors.
People choose free beer; it always goes down, even if warm.
I wasn't paying for access to VMS either; it just didn't hold a candle to Unix.
There is always Assembly, compiler extensions, or OS specific APIs, as part of the deliverable.
Funny how UNIX folks always have this "two weights, two measures" approach.
I beg to differ in wax quality for candles, but if we get to do juice with bitter lemons, so be it.
At least it is refreshing during summer.
K&R C has separate compilation, pointer casting, a usable but janky and bug-prone variable-length string type, variadic functions, static variables, literal data of array and record types, bitwise operations, a filesystem access API, and an array iterator type. None of those require "assembly, compiler extensions, or OS specific APIs." (Well, I'm not sure if they specified variadic functions in the book. The first ANSI C did.)
Jensen & Wirth Pascal has none of those, making it completely inadequate for anything beyond classroom use, which was what it was designed for.
Each Pascal implementation that was used for systems programming did of course add all of these features, but many of them added them in incompatible ways, with the result that, for example, TeX had to implement its own string type (in WEB).
Take all the Assembly routines from libc, and K&R C turns into a macro assembler with nicer syntax. And not a good one, given that real macro assemblers actually have better macro capabilities, alongside their high level constructs.
Quite visible in the C dialects that were actually available outside UNIX, on computers people could afford, like RatC (made available via "A Book on C") and Small-C (DDJ article series).
Well, even the dumbest standard Pascal compilers, like GPC, allow calling into Assembly. So it should count for Pascal as well, if that is the measure.
Then we have this thing of sticking with ISO Pascal and its dialects, always ignoring that this was seen as an issue, and that that is why Modula-2 has existed since 1978, and Extended Pascal since 1991, one year after C90 (the short revision fix of ANSI C89).
Also, following the K&R C alongside Assembly line, several companies were quite successful with a Pascal dialect alongside Assembly, including a famous fruit company.
Here in 2024, C extensions keep being celebrated, to the point that the most famous UNIX clone can only be compiled with a specific compiler, and the second alternative is only possible after a search company burned lots of money making it possible.
But hey, let's stick to Pascal and its dialects.
1. LISP is easy to start with if you're not a programmer. There is very little syntax to get to grips with, and once you understand "everything is a list" it's super easy to expand out from there.
2. LISP really makes it easy to hack your way to a solution. With the REPL and the transparency of "code is data" model you can just start writing code and eventually get to a solution. You don't need to plan, or think about types, or deal with syntax errors. You just write your code and see it executed right there in the REPL.
For my part, I love LISP when it's just me doing the coding, but once you start adding other people's custom DSL macros or whatever the heck, it becomes unwieldy. Basically, I love my LISP and hate other people's LISP.
Scheme is not Common Lisp; it's kinda the opposite. Common Lisp comes with batteries included, and McCLIM is the de facto GUI for it.
>Stuck with Emacs [from the URL]
Well, Lem tries to be Emacs for Common Lisp, but without having to think in two Lisp languages (albeit closely related) at once.
Once you have a REPL, autocomplete and some docstring looking up tool, you are god.
What this often meant was that getting a feature into your LISP program was something you could do without having to hack at the compiler.
It used to be that people balked at how macros and such would break people's ability to step-debug code. Which is still largely true, but step debugging is also sadly dead in a lot of other popular languages already.
Today we benefit from having a variety of languages, each with tradeoffs regarding how their expressiveness matches the problem at hand and also the strength of its ecosystem (e.g., tools, libraries, community resources, etc.). I still think Lisp has advantages, particularly when it comes to its malleability through its syntax, its macro support, and the metaobject protocol.
As a Lisp fan who codes occasionally in Scheme and Common Lisp, I don’t always grab a Lisp when it’s time to code. Sometimes my language choices are predetermined by the ecosystem I’m using or by my team. I also think strongly typed functional programming languages like Standard ML and Haskell are quite useful in some situations. I think the strength of Lisp is best seen in situations where flexibility and malleable infrastructure are highly desirable.
In Lisp, almost all of the language’s power is in “user space.”
The ramifications for that are deep and your beliefs as to whether that is good are largely shaped by whether you believe computation is better handled by large groups of people (thus, languages should restrict users) or smaller groups of people (thus, languages should empower users).
See this for more discussion: https://softwareengineering.stackexchange.com/a/237523
They change because they are used.
https://www.youtube.com/watch?v=o4-YnLpLgtk
https://www.youtube.com/watch?v=gV5obrYaogU
It makes working on VSCode today look like banging rocks together, let alone 30 years ago.
"Please don't assume Lisp is only useful for Animation and Graphics, AI, Bioinformatics, B2B and E-Commerce, Data Mining, EDA/Semiconductor applications, Expert Systems, Finance, Intelligent Agents, Knowledge Management, Mechanical CAD, Modeling and Simulation, Natural Language, Optimization, Research, Risk Analysis, Scheduling, Telecom, and Web Authoring just because these are the only things they happened to list." --Kent Pitman
In retrospect, saying I don't understand was hyperbolic. Of course I understand that people have their preferred languages. The handful of languages I've used in my career each have their draw.
My comment was meant more to question the assertion that Lisp is the-right-thing, which sounds like asserting the-right-religion.
Lisp definitely does depend on personality type. Quoting Steve Yegge's "Notes from the Mystery Machine Bus" (https://gist.github.com/cornchz/3313150):
> Software engineering has its own political axis, ranging from conservative to liberal. (...) We regard political conservatism as an ideological belief system that is significantly (but not completely) related to motivational concerns having to do with the psychological management of uncertainty and fear. (...) Liberalism doesn't lend itself quite as conveniently to a primary root motivation. But for our purposes we can think of it as a belief system that is motivated by the desire above all else to effect change. In corporate terms, as we observed, it's about changing the world. In software terms, liberalism aims to maximize the speed of feature development, while simultaneously maximizing the flexibility of the systems being built, so that feature development never needs to slow down or be compromised.
Lisp, like Perl and Forth, is an extremist "liberal" language, or family of languages. Its value system is centered on making it possible to write programs you couldn't write otherwise, rather than reducing the risk you'll screw it up. It aims at expressiveness and malleability, not safety.
The "right thing" design philosophy is somewhat orthogonal to that, but it also does pervade Lisp (especially Scheme) and, for example, Haskell. As you'd expect, the New Jersey philosophy pervades C, Unix shells, and Golang. Those are also fairly liberal languages, Golang less so. But a C compiler had to fit within the confines of the PDP-11 and produce fast enough code that Ken would be willing to use it for the Unix kernel, and it was being funded as part of a word processing project, so things had to work; debuggability and performance were priorities. (And both C and Unix were guided by bad experiences with Multics and, I infer, M6 and QED.) MACLISP and Interlisp were running on much more generous hardware and expected to produce novel research, not reliable production systems. So they had strong incentives to both be "liberal" and to seek after the "right thing" instead of preferring expediency.
I like lisp for the most part, but holy shit is the enduring dialog surrounding it the absolute worst part of the whole family of languages by far. No, it doesn't have or give you superpowers. Please grow up.