What "Worse is Better vs The Right Thing" is really about

August 11th, 2012

I thought about this one for a couple of years, then wrote it up, and left it untouched for another couple of years.

What prompted me to publish it now – at least the first, relatively finished part – is Steve Yegge's post, an analogy between the "liberals vs conservatives" debate in politics and some dichotomies in the professional worldviews of software developers. The core of his analogy is risk aversion: conservatives are more risk averse than liberals, both in politics and in software.

I want to draw a similar type of analogy, but from a somewhat different angle. My angle is this: in politics, one thing people view rather differently is the role of markets and competition. Some view them as mostly good, others as mostly evil. This is loosely aligned with the "right" and the "left" (with the caveat that the political right and left are very overloaded terms).

So what does this have to do with software? I will try to show that the disagreement about markets is at the core of the conflict presented in the classic essay, The Rise of Worse is Better. The essay presents two opposing design styles: Worse Is Better and The Right Thing.

I'll claim that the view of economic evolution is what underlies the Worse Is Better vs The Right Thing opposition – and not the trade-off between design simplicity and other considerations as the essay states.

So the essay says one thing, and I'll show you it really says something else. Seriously, I will.

And then I'll tell you why it's important to me, and why – in Yegge's words – "this conceptual framework became one of the most important tools in my toolkit" (though of course each of us is talking about his own analogy).

Specifically, I came to think that you can be for evolution or against it, and I'm naturally inclined to be against it, and once I got that, I've been trying hard to not overdo it.


Much of the work on technology is done in a market context. I mean "market" in a relatively broad sense – not just proprietary for-profit developments, but situations of competition. Programs compete for users, specs compete for implementers, etc.

Markets and competition have a way of evoking strong, polarized opinions in people. The technology market and technical people are no exception, including the most famous and highly regarded among them. Here's what Linus Torvalds has to say about competition:

Don't underestimate the power of survival of the fittest. And don't ever make the mistake that you can design something better than what you get from ruthless massively parallel trial-and-error with a feedback cycle. That's giving your intelligence much too much credit.

And here's what Alan Kay has to say:

...if there’s a big idea and you have deadlines and you have expedience and you have competitors, very likely what you’ll do is take a low-pass filter on that idea and implement one part of it and miss what has to be done next. This happens over and over again.

Linus Torvalds thus views competition as a source of progress more important than anyone's ability to come up with bright ideas. Alan Kay, on the contrary, perceives market constraints as a stumbling block insurmountable for the brightest idea.
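Torvalds' "ruthless massively parallel trial-and-error with a feedback cycle" can be made concrete with a toy sketch (my own illustration, not anything from the post or from Linus): a population of random strings, a fitness function playing the role of the market, and nothing but selection and mutation driving progress. The target string, population size, and mutation scheme are all arbitrary choices made for the sake of the example:

```python
import random

random.seed(0)  # deterministic run for reproducibility

ALPHABET = "abcdefghijklmnopqrstuvwxyz "
TARGET = "worse is better"  # the "environment": fitness is defined relative to it

def fitness(s):
    # Number of positions matching the target -- the feedback signal.
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s):
    # A single random point mutation.
    i = random.randrange(len(s))
    return s[:i] + random.choice(ALPHABET) + s[i + 1:]

# Start from pure noise: no design at all.
population = ["".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
              for _ in range(200)]

for _ in range(2000):
    population.sort(key=fitness, reverse=True)
    if population[0] == TARGET:
        break
    survivors = population[:20]  # ruthless selection
    # Survivors persist; the rest of the pool is mutated copies of them.
    population = survivors + [mutate(random.choice(survivors)) for _ in range(180)]

best = max(population, key=fitness)
print(best)
```

No individual step is clever, yet the feedback cycle reliably reaches the target; that is the intuition behind Torvalds' position, while Kay's objection is about what such a process filters out along the way.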

(The fact that Linux is vastly more successful than Smalltalk in "the market", whatever market one considers, is thus fully aligned with the creators' values.)

Incidentally, Linux was derived from Unix, and Smalltalk was greatly influenced by Lisp. At one point, Lisp and Unix – the cultures and the actual software – clashed in a battle for survival. The battle apparently followed a somewhat one-sided, Bambi meets Godzilla scenario: cheap Unix boxes quickly replaced sophisticated Lisp-based workstations, which became collectible items.

The aftermath is bitterly documented in The UNIX-HATERS Handbook, groundbreaking in its invention of satirical technical writing as a genre. The book's take on the role of evolution under market constraints is similar to Alan Kay's and the opposite of Linus Torvalds':

Literature avers that Unix succeeded because of its technical superiority. This is not true. Unix was evolutionarily superior to its competitors, but not technically superior. Unix became a commercial success because it was a virus. Its sole evolutionary advantage was its small size, simple design, and resulting portability.

The "Unix Haters" see evolutionary superiority as very different from technical superiority – and unlikely to coincide with it. The authors' disdain for the products of evolution isn't limited to development driven by economic factors, but extends to natural selection:

Once the human genome is fully mapped, we may discover that only a few percent of it actually describes functioning humans; the rest describes orangutans, new mutants, televangelists, and used computer sellers.

Contrast that to Linus' admiration of the human genome:

we humans have never been able to replicate something more complicated than what we ourselves are, yet natural selection did it without even thinking.

The UNIX-HATERS Handbook presents in an appendix Richard P. Gabriel's famous essay, The Rise of Worse Is Better. The essay presents what it calls two opposing software philosophies. It gives them names – The Right Thing for the philosophy underlying Lisp, and Worse Is Better for the one behind Unix – names I believe to be perfectly fitting.

The essay also attempts to capture the key characteristics of these philosophies – but in my opinion, it focuses on non-inherent embodiments of these philosophies rather than their core. The essay claims it's about the degree of importance that different designers assign to simplicity. I claim that it's ultimately not about simplicity at all.

I thus claim that the essay discusses real things and gives them the right names, but the wrong definitions – a claim somewhat hard to defend. Here's my attempt to defend it.

Worse is Better – because it's simpler?

Richard Gabriel defines "Worse Is Better" as a design style focused on simplicity, at the expense of completeness, consistency and even correctness. "The Right Thing" is outlined as the exact opposite: completeness, consistency and correctness all trump simplicity.

First, "is it real"? Does a conflict between two philosophies really exist – and not just a conflict between Lisp and Unix? I think it does exist – that's why the essay strikes a chord with people who don't care much about Lisp or Unix. For example, Jeff Atwood

...was blown away by The Rise of "Worse is Better", because it touches on a theme I've noticed emerging in my blog entries: rejection of complexity, even when complexity is the more theoretically correct approach.

This comment acknowledges the conflict is real outside the original context. It also defines it as a conflict between simplicity and complexity, similarly to the essay's definition – and contrary to my claim that "it's not about simplicity".

But then examples are given, examples of "winners" at the Worse Is Better side – and suddenly x86 shows up:

The x86 architecture that you're probably reading this webpage on is widely regarded as total piece of crap. And it is. But it's a piece of crap honed to an incredibly sharp edge.

x86 implementations starting with the out-of-order implementations from the 90s are indeed "honed to an incredibly sharp edge". But x86 is never criticized for its simplicity – quite the contrary, it's criticized precisely because an efficient implementation cannot be simple. This is why the multi-billion-dollar "honing" is necessary in the first place.

Is x86 an example of simplicity? No.

Is it a winner at the Worse is Better side? A winner – definitely. At the "Worse is Better" side – yes, I think I can show that.

But not if Worse Is Better is understood as "simplicity trumps everything", as the original essay frames it.

Worse is Better – because it's more compatible?

Unlike Unix and C, the original examples of "Worse Is Better", x86 is not easy to implement efficiently – it is its competitors, RISC and VLIW, that are easy to implement efficiently.

But despite that, we feel that x86 is "just like Unix". Not because it's simple, but because it's the winner despite being the worse competitor. Because the cleaner RISC and VLIW ought to have been The Right Thing here.

And because x86 is winning by betting on evolutionary pressures.

Bob Colwell, Pentium's chief architect, was a design engineer at Multiflow – an early VLIW company which was failing, prompting him to join Intel to create their out-of-order x86 implementation, P6. In The Pentium Chronicles, he gives simplicity two thumbs up, acknowledges complexity as a disadvantage of x86 – and then explains why he bet on it anyway:

Throughout the 1980s, the RISC/CISC debate was boiling. RISC's general premise was that computer instruction sets ... had become increasingly complicated and counterproductively large and arcane. In engineering, all other things being equal, simpler is always better, and sometimes much better.

...Some of my engineering friends thought I was either masochistic or irrational. Having just swum ashore from the sinking Multiflow ship, I immediately signed on to a "doomed" x86 design project. In their eyes, no matter how clever my design team was, we were inevitably going to be swept aside by superior technology. But ... we could, in fact, import nearly all of RISC's technical advantages to a CISC design. The rest we could overcome with extra engineering, a somewhat larger die size, and the sheer economics of large product shipment volume. Although larger die sizes ... imply higher production cost and higher power dissipation, in the early 1990s ... easy cooling solutions were adequate. And although production costs were a factor of die size, they were much, much more dependent on volume being shipped, and in that arena, CISCs had an enormous advantage over their RISC challengers.

...because of having more users ready to buy them to run their existing software faster.

x86 is worse – as is quite clear now that, in cell phones and tablets, easy cooling solutions are not adequate and the RISC processor ARM wins big. But in the 1990s, because of compatibility, x86 was better.

Worse is Better, even if it isn't simpler – when The Right Thing is right technically, but not economically.

Worse is Better – because it's quicker?

Interestingly, Jamie Zawinski, who first spread the Worse is Better essay, followed a path somewhat similar to Colwell's. He "swum ashore" from Richard Gabriel's Lucid Inc., where he worked on what would become XEmacs, to join Netscape (named Mosaic at the time) and develop their very successful web browser. Here's what he said about the situation at Mosaic:

We were so focused on deadline it was like religion. We were shipping a finished product in six months or we were going to die trying. ...we looked around the rest of the world and decided, if we're not done in six months, someone's going to beat us to it so we're going to be done in six months.

They didn't have to bootstrap the program on a small machine as in the Unix case. They didn't have to be compatible with an all-too-complicated previous version as in the x86 case. But they had to do it fast.

Yet another kind of economic constraint meaning that something else has to give. "We stripped features, definitely". And the resulting code was, according to jwz – not simple, but, plainly, not very good:

It's not so much that I was proud of the code; just that it was done. In a lot of ways the code wasn't very good because it was done very fast. But it got the job done. We shipped – that was the bottom line.

Worse code is Better than not shipping on time – Worse is Better in its plainest form. And nothing about simplicity.

Here's what jwz says about the Worse is Better essay – and, like Jeff Atwood, he gives a summary that doesn't summarize the actual text – but summarizes "what he feels it should have been":

...you should read it. It explains why mediocrity has better survival characteristics than perfection...

The essay doesn't explain that – the essay's text explains why simple-but-wrong has better survival characteristics than right-but-complex.

But as evidenced by jwz's and Atwood's comments, people want it to explain something else – something about perfection (The Right Thing) versus less than perfection (Worse is Better).

Worse is Better evolutionarily

And it seems that invariably, what forces you to be less than perfect, what elects worse-than-perfect solutions, what "thinks" they're better, is economic, evolutionary constraints.

Economic constraints are what may happen to select for simplicity (Unix), compatibility (x86), development speed (Netscape) – or any other quality that might result in an otherwise worse product.

Just like Alan Kay said – but contrary to the belief of Linus Torvalds, the belief that ultimately, the result of evolution is actually better than anything that could have been achieved through design without the feedback of evolutionary pressure.

From this viewpoint, Worse Is Better ends up actually better than any real alternative – whereas from Alan Kay's viewpoint, Worse Is Better is actually worse than what's achievable.

(A bit convoluted, no? In fact, Richard Gabriel wrote several follow-ups, unable to decide whether Worse Is Better was actually better or actually worse. I'm not trying to help decide that – just to show what makes one think it's actually better or worse.)


That's the first part – I hope to have shown that your view of evolution has a great effect on your design style.

If evolution is at the center of your worldview, if you think of viability as more important than perfection in any area, then you'll tend to design in a Worse Is Better style.

If you think of evolutionary pressure as an obstacle, an ultimately unimportant, harmful distraction on the road to perfection, then you'll prefer designs in The Right Thing style.

But why do people have a different view of evolution in the first place? Is there some more basic assumption underlying this difference? I think I have more to say about this, though it's not in nearly as finished form as the first part, and I might write about it all in the future.

Meanwhile, I want to conclude this first part with some thoughts on why it all matters personally to me.

I'm a perfectionist, by nature, and compromise is hard for me. Like many developers good enough to be able to implement much of their own ambitious ideas, I turned my professional life into a struggle for perfection. I wasn't completely devoid of common sense, but I did things that make me shiver today.

I wrote heuristic C++ parsers. I did 96 bit integer arithmetic in assembly. I implemented some perverted form of thread migration on the bare metal, without any kind of OS or thread support. I did many other things that I'm too ashamed to admit.

None of it was really needed – not if you ask me today. It was "needed" in the sense of being a step towards a too-good-for-my-own-good, "perfect" solution. Today I'd realize that this type of perfection is not viable anyway (in fact, none of these monstrosities survived in the long run). I'd choose a completely different path that wouldn't require any such complications in the first place.

But my stuff shipped. I was able to make it work. You don't learn until you fail – at least I didn't. Perfectionists are stubborn.

Then at one point I failed. I had to throw out months' worth of code, having realized it wasn't going to fly.

And it so happened that I was reading Unix-Haters, and I was loving it, because I'm precisely the type of perfectionist that these people are, or close enough to identify with them. And there was this essay there about Worse Is Better vs The Right Thing.

And I was reading it when I wrote the code soon to be thrown out, and I was reading it when I decided to throw it out and afterwards.

And I suddenly started thinking, "This is not going to work, this type of thing. With this attitude, if you want it all, consistency, completeness, correctness – you'll get nothing, because you will fail, completely. You're too dumb, I mean I am, also not enough time. You have to choose, you're not going to get it all so you better decide what you want the most and aim at that."

If you read the Unix-Haters, you'll notice a lot of moral outrage – perfectionists have that, moral outrage at something imperfect. Especially at someone who knowingly chooses to aim at less than perfection. Especially if it's due to the ulterior motive of wanting to succeed.

And I felt a counter-outrage, for the first time. "What do you got to show, you got nothing. What good are your ideals if you end up dead? Dead bodies smell bad to us for a reason. Technical superiority without evolutionary superiority? Evolutionary inferiority means "dead". How can "dead" be technically superior? What have the dead ever done for us?"

It was huge, for me. I mean, it took a few years to truly sink in, but that was the start. I've never done anything Right since. And I've been professionally happy ever after. I guess it's a kind of "having swum ashore".

1. Tolomea, Aug 11, 2012

There is a lot of truth here. In some ways I got off easier, my career taught me early and repeatedly that there is the best technical solution and the best business solution and the business solution wins, because if it's not selling then it's game over.
This is my big complaint with TDD. Don't get me wrong – testing is good and most of us aren't doing enough. But TDD takes it too far: the goal is not to write tests, the goal is to ship product; testing is a means to an end.
And don't make the mistake of thinking OSS is immune to this, being paid in cred and kudos instead of $ doesn't change the fact that if nobody is using your code then it might as well not exist.

2. Dmitry Novik, Aug 11, 2012

Good one (though it could be much shorter :-)

3. Terry A. Davis, Aug 11, 2012

You are a ugly sick atheist like I used to be. Trust me atheists are less fit — get no girls. You cannot argue that it's not fair Germany lost the war. Natural selection is blind.

4. Yossi Kreinin, Aug 11, 2012

@Tolomea: agreed about OSS; generally "viable" is about users of some sort, not necessarily money.

5. Sean Jensen-Grey, Aug 11, 2012

Once you make the leap to evolutionary construction, it will guide much of what you do. The first version is a mad dash to complete that feedback loop; without the feedback loop nothing you do is viable. Then you iterate and increase the fitness, all while maintaining the feedback loop. If you spend too much time increasing the fitness without getting it into the market to compete, you are optimizing the wrong thing. The market will tell you what you need to fix.

Any extra engineering that goes into making it better is energy wasted in not shipping sooner. Things can only be better in context of what they are not, what they are competing against.

Write drunk, edit sober.

@Tolomea, testing is vital to completing the feedback loop and optimizing the fitness. If the testing isn't increasing the fitness relative to the expenditure, then stop. One can certainly over-test. You need the least amount of energy to go from one local minimum to another.

6. IT, Aug 11, 2012

> completeness, consistency and correctness all trump simplicity

It's very difficult to claim that Lisp is on the left side of this comparison (maybe C++ would be a better example). Would anybody seriously claim that C is simpler than Lisp?

I think the wikipedia article gives a very good explanation of how simplicity fits into Worse is Better:


7. Barry Kelly, Aug 12, 2012

In this post you've also captured something of why I have a certain amount of disdain for the programming language dilettantes who frequent forums like LtU. They arrogantly proclaim such and such features of their pet languages as clearly superior to what has won in the marketplace, without seeming to understand market constraints, the whole messy hairy beast of it. The smug moral high ground is actually a trap.

8. Yossi Kreinin, Aug 12, 2012

@IT: actually, Gabriel claims that C is simpler than Lisp – simpler to implement efficiently and in some ways simpler to write efficient code in. It's all in the original essay (and in the larger text that it's part of that is concerned mostly with the future of Lisp).

9. Z.T., Aug 12, 2012

I think you mis-characterize what Linus Torvalds meant when you say that he meant that competition is a source of progress. It isn't. Iteration of change and a real world fitness function are the source of progress.

Alan Kay said that if not for the need to ship or to pay programmers for their time, he could spend an infinite amount of time polishing a product until it was perfect. Deadlines and budgets and backwards compatibility all make products less than perfect. This is true.

Linus spoke against "big design up front", and in favor of iteration, because during iteration you discover information you did not have before (true in all circumstances), and specifically in open source, because releasing an early (barely working) version sooner lets you attract more help sooner.

Multi-player development (cooperative or competitive) is useful because no single person knows what the whole human race knows, and for scalability. But it's not necessary: a lone genius who is financially secure, given enough time, can iterate a design and develop the perfect product, without any help from economic forces. This design would not be simple, would not be backwards compatible and would not be constrained by deadlines or budgets. Exactly what Alan Kay wished. The question is would it ever ship and would it even matter.

Design process using iteration ("Every large system that works started as a small system that works") and economic forces influencing a design for the better are orthogonal.

10. Aaron Davies, Aug 12, 2012

The most perfect software ever written is conventionally assumed to be TeX. Any thoughts on how it fits into this paradigm?

11. M.R., Aug 12, 2012

It's not about evolution, but about the criteria to decide what "better" means: whether "better" means useful to a mass of people or whether the creator's mind is the sole judge of quality. It's the same conflict that exists in Art between the Artist's view and that of the rest of the world.

12. Joshua Bowles, Aug 12, 2012

Nice. Many examples from human evolution parallel this, i.e.: We don't have great speed, strength, or sharp claws or teeth. But, having lost a lot of hair (which was also a disadvantage) we could run long distances without overheating — due to perspiration. This gave us a distinct advantage in getting away from predators. Many of our worst attributes helped support one attribute we could exploit for advantage. Others abound too: upright walking, formation of our larynx, nutritional needs of our brains....

13. Yossi Kreinin, Aug 12, 2012

@Z.T.: I guess you're right that it doesn't have to be competition; you do need a fitness function though, and it has to be (quote) "ruthless". A rich designer facing no economic pressure would submit his work to what ruthless fitness function? Generally, if you can think of any real-world fitness function that doesn't involve the concept of a practical alternative being better (which is what competition brings, and which is where "ruthlessness" comes from), then you're right and "competition" isn't necessarily what was meant.

@Aaron Davies: most people, including me, don't use TeX and would go to great lengths to avoid using it. I'm not very informed about TeX, but as a guess, I think it's a good example of The Right Thing design. (I mean TeX as the entire system, including the editing interface and the macro language, and not parts of the rendering algorithms).

@M.R.: well, it's related, to the extent that "useful to a mass of people" is what evolution pushes you towards. Likewise, the disdain for the masses and the belief in the extraordinary talents of a select few (such as capitalized Artists) is related to the disdain for economic evolution.

14. Paramedic, Aug 12, 2012

Do not hesitate to settle for the 2nd best solution

15. superbowlpatriot, Aug 12, 2012

Natural selection had a bit of a head start...

16. M.R., Aug 12, 2012

@Yossi: you're entirely wrong in considering "evolution" the primary issue here. That would be true if those you say are against "evolution" built everything in one go and declared the result perfect, but in fact (and provably so) that's not how things happened: Symbolics didn't just build their OS without any kind of feedback loop, without making & abandoning prototypes and trying different approaches; painters would make many sketches (études) and change their minds before setting out to complete the final version*, and poets would likewise rewrite verses even hundreds of times before publishing.
The essence of the matter is whether the maker/builder is ego-centric or exo-centric, i.e. it's all about who decides when the result is fine and no further work is necessary.

* Except for "modern" abstract painters who throw a few buckets of paint on a canvas and decide that the result is a masterpiece, but modern art is a joke anyway

17. Zimmie, Aug 12, 2012

Still reading, but I had to comment on a problem I have with the part I have read. Information theory puts the lie to Linus' thoughts on the human genome. The classic example used in young-earth-creation versus evolution arguments is the 747. Unfortunately, people fail to realize that a 747 is *enormously* more complicated than a human. We have lots and lots of redundancy. That's the whole point of research involving undifferentiated cells.

Our genetic code fits in a few hundred megabytes (roughly 3 billion quaternary digits, at two bits each, comes out to ~750 MB). If you trimmed the sections that don't code for any proteins we actually manufacture, it would probably be less than a hundred megabytes of actual information. The 747, meanwhile, takes hundreds of megabytes just for the software to run it. The mechanical specifications add hundreds more on top of that. Keep in mind, you can't simply reference parts like a 555 timer. You essentially have to include the VHDL files for every chip used. Our genetic code specifies about 20 different amino acids. That's how many fundamental parts/operations you get to use to describe the plane.

Sure, specifying a particular human would take a lot of information, but specifying a particular 747 would also. In each case, you would have to describe wear patterns and the like, but in the 747's case, its sheer bulk means it would still take more information to describe to the same level of detail.

18. Steve Klabnik, Aug 12, 2012

Don't forget that in the Real World, markets came about with the rise of states, and often were used as a tool of domination. So while in a theoretical, math sense, markets make sense, in the world of atoms rather than bits, 'worse is better is markets' may not actually make very much sense.

David Graeber's "Debt: The First 5,000 Years" is pretty fantastic if you've never read about this anthropology.

19. wannabe editor, Aug 12, 2012

Yosef, shouldn't Alan and Linus be switched in the paragraph that starts out "Just like Alan Kay said" ?

20. Don Hopkins, Aug 12, 2012

Great article, and lots of good points!

I wrote the Unix-Haters chapter on X-Windows, which was kind of like shooting fish in a barrel. At the time I would have been quite astonished to know that X-Windows would be alive and well today. It has however adopted some of the ideas from NeWS as higher level libraries, like Cairo's stencil-paint / Porter-Duff imaging model. But it never got the extensibility thing right. However that problem has been solved again at an even higher level: the web browser.

One way I explain what NeWS was is this, which I contributed to the wikipedia page http://en.wikipedia.org/wiki/NeWS :

NeWS was architecturally similar to what is now called AJAX, except that NeWS:
used PostScript code instead of JavaScript for programming.
used PostScript graphics instead of DHTML/CSS for rendering.
used PostScript data instead of XML/JSON for data representation.

NeWS had a lot of flaws (most importantly the fact that it was not free), but the thing it got right was having a full-fledged programming language in the window server (what we now call the web browser). Yes, PostScript is a very high level language, more like Lisp than Forth, and NeWS had a nice dynamic object oriented programming system that was a lot like Smalltalk. The thing NeWS really needed was to have that same language on the client side (what we now call the web application server). So I like the approach that node.js has taken — there's a huge advantage to being able to share the same libraries and data structures between the client and the server. And it takes a lot of mental energy to switch between different languages and data models when you're writing code on both the client and the server side.

Another example of the perils of "worse is better" that I've had experience with is pie menus — http://en.wikipedia.org/wiki/Pie_menu . Research has proven that they're faster and less error prone than linear menus, yet they haven't been widely adopted. There are many reasons, some technical, but one of the major non-technical problems seems to have been the cargo-cult approach to user interface design that the industry has taken, and the "not invented here" attitude of user interface standards pushers.

I gave a demo of pie menus to Steve Jobs right after he released NeXT Step, and he jumped up and down yelling "That sucks! That sucks! Wow, that's neat! That sucks!" and claimed that the NeXT Step menus were superior because they did lots of user testing, even though they never compared them to pie menus. So Apple has never adopted pie menus, even though the "swipe" gesture is so common on the iPhone and iPad — yet they never provide a "self revealing" pie menu to prompt you which swipe directions perform what actions.

I gave up trying to convince user interface standards pushers to adopt pie menus for standards like OPEN LOOK and applications like Microsoft Word, and decided a better approach to making them popular would be to use them in a game, whose user interfaces are more open to innovations, and whose users are more accepting of novelty.

I joined Maxis to work on The Sims, and implemented pie menus for controlling the people. That worked out pretty well, and exposed a lot of people to pie menus.

I had another experience in developing The Sims, which confirms the "Worse is Better" hypothesis, which is a harsh reality of the games industry and the software development industry in general: I pointed out to my manager that the code was shit, and we really needed to clean it up before shipping. So he sat me down and explained: "Don, your job is TURD POLISHING. If you can just make your turd nice and shiny, we will ship it, and everybody will be happy with you, because that is what we hired you to do." But then at least he gave me a few weeks to clean up and overhaul the worst code. The moral is be careful what you ask for, or you might have to be the one who shovels out all the shit.

21. Fabrizio, Aug 12, 2012

Nice article, thanks for sharing.

@ M. R.
I view it as different degrees of openness to feedback. Zero is the hermit, one is the for-profit business. Artists, researchers and FLOSS movement leaders fall in between at various degrees.

All of them are necessary, and it is a matter of personal preference.

But the results of zero-feedback efforts are much more dependent on the personal ability of the proponent.

On the other hand, when you ask someone for feedback, you implicitly enter into a negotiation about the overall objective of the work.

Pure, uncompromised ideas (and terse code) are more likely to come from zero-feedback (if the proponent is not very smart it will be a bad idea, if he/she is very smart it will be a shiny, inspiring idea).

Products (and crufty code) are more likely to come from one-feedback.

22. Aristotle Pagaltzis, Aug 12, 2012

M.R. beat me to the punch in #16. I would go on to say it depends on what the maker considers the source of worth in his creation: do they seek to create something that in itself embodies some measure of rightness, or do they seek to create something... “effective” (for want of a better term)? In an essential sense, then: is what they are doing art, or design?

23. Aristotle Pagaltzis, Aug 12, 2012

Now that said, I’ve got me here a hand grenade, so let me pull the pin and throw it into this argument (*cackle cackle*):

Do Apple practice “Worse is Better” or is it “The Right Thing”, and are they succeeding with that or not?

24. Ryan, Aug 12, 2012

I don't feel that you're making the right dichotomy. I think the major distinction is not between those who think competition and markets are good or bad, but between those who think markets and competition are tools to be employed by rule-makers to aid human achievement, and those who feel markets and competition are independent natural forces to which no human concern need bow.

25. John, Aug 12, 2012

I stopped after a few paragraphs as there was an initial contradiction. If conservatives are more risk averse than liberals, then it would follow that the former would be anti-marketplace and the latter pro-marketplace, as nothing is more risky and uncertain than the open market.

26. Yossi Kreinin, Aug 12, 2012

@Don Hopkins: thanks for Unix-Haters! If you have some mailing list archives (I only found archives from 90 to 93), and/or some stories on how the book came about (were you really the first in the genre? I was certainly inspired by Unix-Haters, in part, when writing the C++ FQA), I'd be delighted.

Node looks interesting; JavaScript ought to be the absolute worst-is-better piece of software ever though...

I saw pie menus just yesterday in an Android tablet's camera app – perhaps touch gave them a push.

27. Yossi Kreinin, Aug 12, 2012

@M.R., Fabrizio, Aristotle Pagaltzis: I guess evolution without any sort of market pressure is like evolution without natural enemies or competition for food. Perhaps evolution but not quite as "ruthless" a "feedback cycle". And, I didn't say The Right Thing was "against evolution" – at least not everywhere; I specifically said "economic evolution". Sure, The Right Thing is all for evolution from a Right Thing to a Righter Thing – but it's a different kind of evolution.

As to Apple – I know too little about the history of its products or what's under the hood to say much about this... Also, Apple is arguably a product company first and only secondarily a software company, a chip company, etc., and I'm absolutely not qualified to discuss physical end-user products. Nice hand grenade though.

28. Yossi Kreinin, Aug 12, 2012

@Zimmie: you're looking at program size as a measure of complexity; I'd rather look at functionality. That our specs are more verbose doesn't compliment us – "information theory" interpreted that way would rank a petabyte worth of random noise as still more "complicated"/"irreducible" than the 747, but so what? As to functionality – hard to measure, but, I dunno, humans can build the 747 but 747s cannot build humans, I really don't know how to argue about this. I think anyone who's been around complex machinery intuitively feels a certain revulsion at how dumb and common-sense-lacking it really is and would never claim it to have "exceeded" humans.

29. Yossi Kreinin, Aug 12, 2012

@John: if you read those few paragraphs more carefully, then you'd realize that the part about risk aversion isn't mine but Steve Yegge's, and the fact that I summarize someone's writing and his writing contradicts my own words that follow isn't a contradiction. As to "nothing is more uncertain than the open market" – try living under a communist government and predict its moves for a while and then we'll see what you think...

30. Yossi Kreinin, Aug 12, 2012

@Ryan: you mean the distinction in politics, not in software, right? There, I think the right distinction is between people who think "rule makers" are likely to improve market outcomes by rule tweaking – "rule makers are wiser and better than markets when left alone" – and those who think "rule makers" are likely to only make things worse by tweaking – "rule makers are dumber and more evil than markets when left alone". Which could be summarized for brevity as how I put it or how you put it.

31. Yossi Kreinin, Aug 12, 2012

@Steve Klabnik: let's talk about the real-world tech markets of today, and then you can point out how whatever theory of markets you think I assume differs from the reality that is relevant to us today as you perceive it.

@wannabe editor: Linus and Alan should stay as they are in that paragraph.

32. Oleg, Aug 12, 2012

@Yossi Kreinin: Minor nitpick about TeX. The core typesetting engine may be the Right Thing, but the whole infrastructure around it (LaTeX etc.) is actually the typical evolutionary design, consisting of many intermingled parts.

33. Yossi Kreinin, Aug 12, 2012

As I said, I know little about TeX; I do know that LaTeX isn't technically a part of TeX though. How many people use it out of those using TeX I don't know. I think the core of TeX does include the macro language that LaTeX is built atop and that I find rather awful. An evolving program would grow something more decent, I think, much like gdb 7 finally added Python scriptability in addition to gdb's own scripting facilities.

34. Jonas, Aug 12, 2012

In your effort to make stuff up as you go along, you make both Torvalds and Kay look like they have vague guiding principles and put yourself as the arbiter between them.

But if you want to be productive and learn something out of this, you first have to assume that both Torvalds and Kay are way more intelligent than you are. Because that is the probable situation after all. Both persons may understand the other's viewpoint perfectly, but they have very different goals.

(Also, Smalltalk inspired by Lisp? Wtf? They have the idea of syntactical simplicity in common, but differ in every other possible way, and Kay is not known to be a Lisper.)

Better examples might be found in the comments, which include TeX and a few select pieces of Apple stuff. What they have in common is that they are for a niche market, but one which they completely own even thirty years later with legacy code. That's something. So worse is not always better.

Don Hopkins: I really loved those stories. If you have a blog I'd love to read more!

35. neleai, Aug 13, 2012

The same idea was described at


36. Robert M, Aug 13, 2012

Much of this discussion reminds me of Barry Schwartz' 'The Paradox of Choice', and the entire maximizers vs satisficers line of discussion.

Schwartz gave a tech talk at Google a few years back, and it can be viewed here:


Wikipedia's article on satisficing is a worthy read as well.


37. Yossi Kreinin, Aug 13, 2012

@neleai: not quite the same idea, but a related one, which is why I linked to that page right in the second paragraph.

@Robert M: it is related, somewhat; the thing is, "good enough" doesn't look like 70% of "the best" – sometimes it's going in an entirely different direction, a step back if you hope to reach "the best" eventually – this is why a "maximizer" is so grieved by the work of a "satisficer" – here's my inappropriately-titled piece on that one.

38. Wandspiegel, Aug 13, 2012

If he is a perfectionist and has been for many years, then his worst can only be so bad...

39. Daniel Lee, Aug 13, 2012

Reading that your motives have "evolved" away from perfection, I couldn't help but feel both sorry for, and jealous of, you all at once. I mean, to have had the devotion to perfection beaten out of you by the constraints of software development in the real world sounds akin to realizing that even the deepest love is still only driven by the biological imperatives of its participants. Fellow idealists, cry out! At the same time, it must be nice not to feel the angst I feel over the fact that everything I write turns out to be something less than completely satisfying to me. The way I have come to reconcile myself to this is that, instead of your "realistic" lowering of standards, I am resigned to my own dissatisfaction, as long as it results in a better product for my users.

40. Joshua Drake, Aug 13, 2012

"if you want it all, consistency, completeness, correctness – you’ll get nothing" – isn't that what Gödel said? http://en.wikipedia.org/wiki/G%C3%B6del%27s_incompleteness_theorems

41. JS, Aug 13, 2012

As a politically left-oriented person, I see the problem with markets not in competition or evolution, but in the monetary inequalities they create, which translate into power inequalities. And I believe most leftists have the same problem with free markets, i.e. the inequality they create, so the article is one giant strawman.

And Steve Yegge's liberal/conservative distinction is yet different from the right/left view of the world.

42. Yossi Kreinin, Aug 13, 2012

@JS: the problem you see is inequalities, but what is your solution that is consistent with competitive markets? If objecting to an outcome effectively leads you to object to its cause, then I think one might say you object to the cause.

@Daniel Lee: biological imperatives, my ass. As to devotion to perfection – it is indeed a rather sweet drug, just one that is rather hard to afford.

@Joshua Drake: ...unless of course you're willing to consider simple enough sets of axioms, or infinite, non-computable sets of axioms.

43. gus3, Aug 13, 2012

If you're going to compare Worse-Is-Better vs. The Right Thing, and refer to The UNIX-HATERS Handbook, how could you omit mentioning the OS with everything *and* the kitchen bit-sink, VMS?

Wasn't it Ken Olsen who said, "The beauty of Unix is it's simple, the beauty of VMS is that it's all there"?

44. unhuman, Aug 13, 2012

An interesting article! Like everything else on this blog.

But here's what I think about perfectionism: the whole point is that finding the optimal compromise between several significant factors takes far more neurons than pushing on just one of them :D

Since the significant factors are, as a rule, inversely related (e.g. the famous time-quality-money "project triangle"), it becomes clear why it's so hard to get anything done properly.

And you still have to find those significant factors in the first place (by the Pareto principle, the truly significant ones are a minority)! And they are different for every project.

Surprisingly, every now and then the most significant factor turns out to be precisely the one the "perfectionist" focused on, and, if he hasn't harmed the project too badly, a deceptive feeling of The Right Thing arises.

But I don't think one should despair: if an organism is the optimal balance of factors for its evolutionary situation (and it is), then it won't strain its neurons for nothing. And if it does strain them, and hasn't yet been weeded out by selection, then evidently it strains them for a reason. ))

45. Aaron Davies, Aug 13, 2012

I should note that TeX itself, not LaTeX or any of the rest of the TeX ecosystem, is what I was thinking of in my comment—what makes TeX special is the amount of time, both in design and implementation, that Knuth has put into it.

46. Jeremy Thorpe, Aug 13, 2012

Substituting your definition of "worse is better" for Gabriel's renders the thesis tautological: "Those things which have better survival characteristics are better, for they are more likely to survive."

Read the original article again. It says that simplicity wins.

47. Yossi Kreinin, Aug 13, 2012

@Jeremy Thorpe: I realize that the original essay is about simplicity, and I point this out very clearly in my text above. Two things though: one, apart from your tautology, there's the bit about the "best" things in the evolutionary sense tending to be "worse" than "the right thing" in some other significant senses – this is not tautological. And two, what I tried to show, based on real examples of "survivors that look worse than non-survivors" and on others' perception of the essay, was that it's not just me who tends to de-emphasize simplicity in the essay and come away with a different takeaway, centered more on "survival of the worse".

@Aaron Davies: I was also talking about TeX, not LaTeX; the what-you-say-is-what-you-get editing model and the macro language, which put off most users including me, are part of TeX's core (though the macros making up LaTeX aren't).

@gus3: I didn't mention VMS because of being utterly ignorant about VMS...

48. Fadi El-Eter, Aug 14, 2012

LISP is definitely much simpler than C. I remember one-line programs in LISP that were the equivalent of pages in C. (Remember higher-order functions.)

The nice thing about LISP is that you can write code that generates code – and that's why it was the language of choice for AI. LISP was dismissed because it was memory-hungry and slow.
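
The expressiveness point can be sketched outside Lisp too; here is an illustration in Python (the function and all names are mine, not from the comment) of the kind of higher-order one-liner that would take a function definition, an explicit loop, and an accumulator variable in C:

```python
from functools import reduce

# Sum of the squares of the even numbers: one composed expression built
# from higher-order functions (filter + reduce), no explicit loop.
sum_even_squares = lambda xs: reduce(
    lambda acc, x: acc + x * x,       # fold: accumulate x squared
    filter(lambda x: x % 2 == 0, xs), # keep only even elements
    0)                                # initial accumulator

print(sum_even_squares([1, 2, 3, 4, 5, 6]))  # 4 + 16 + 36 = 56
```

In C, the same computation needs the iteration and accumulation spelled out by hand, which is roughly what the commenter means by "pages in C".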

Nowadays, LISP is used in research labs and to teach university students AI.

I wonder how the world would be if LISP prevailed.

49. Yossi Kreinin, Aug 14, 2012

@Fadi El-Eter: Gabriel, who said C came out of a design style favoring simplicity, didn't mean that C was simpler as in "more expressive", but simple as in "simpler to implement efficiently and write efficient code in".

50. Nick Bauman, Aug 14, 2012

The two positions are what is known as a dialectic. One is the thesis (probably "The Right Thing"), the other is the antithesis (arguably "Worse is Better"). What's always happening is some kind of synthesis.

Because you cannot know in advance, cannot definitively prove, the correct thesis for a problem that involves something as complex as human and machine interaction. The Halting Problem alone says this. And you cannot just throw a bunch of bits at a computer and hope for the best. You must work somewhere along the continuum between the two. The extremes are wrong, both of them. They are myths. Myths are best when taken as metaphor (which everyone around this issue seems to be doing, which is good).

51. RobD, Aug 14, 2012

Thanks for the insight. I followed the "Worse is Better" debate for a long time at the time it was written. I always wondered what criteria the debaters were using to decide "better". "consistency, completeness, correctness" of course, but as applied to what definition of the problem? That's the kicker, I think: do we want a compromised definition of the problem or do we want to stick to the aesthetically more pleasing definition which is in our individual designer's head?

The problem definition in the designer's head is more aesthetically pleasing because it is in itself "consistent and complete" and is of course then solvable "correctly". The more compromised (i.e. more inclusive) definition will include other views of the problem as well as questions of profitability, timeliness, etc. and is not itself (and can never be) "consistent and complete" and so can never have a "correct" solution.

So my view is not that the question is tied to views of evolution or whatever, but that there are no isolated problems: all problems are interrelated and the definition of any problem is infinitely expandable in all directions, so that the selection of an isolated issue to solve "correctly" always ends up imposing arbitrary boundaries. Including a larger public in the definition of the problem always fuzzies up the edges and makes the problem both less attractive and incapable of an obviously "correct" solution. But the solutions to these fuzzy problem definitions are always more useful than the solutions to the smaller cleaner definitions because of the synergy created in expanding each definition to be inclusive of more points of view.

52. Jeremy Thorpe, Aug 14, 2012

@Yossi: It's not my tautology either. It's the tautology that is left once you generalize as much as you have. Okay maybe it's not a complete tautology: "The right thing doesn't always win." You don't say!

I realize that you're not the only one to do this, and Atwood is equally wrong to lump x86 in with "Worse is better". There are some things that neither MIT nor New Jersey would be proud of, and they may even have won in the market.

Gabriel connects two dots: simplicity and fitness. "The right thing" is in the background. Why you'd want to build something fit for survival is also an exercise left to the reader, though you seem to have taken it up.

53. Yossi Kreinin, Aug 14, 2012

@Jeremy Thorpe: it's more like "the right thing never wins"; still one could reply, "you don't say" I guess...

@RobD: I'm not sure that The Right Thing is about solving more isolated problems; frequently it's actually more about expanding your solution to handle every imaginable case. Much of the outrage in Unix-Haters is against all the parts of the problem Unix ignores that other systems handle, and how real problems (such as irretrievable data loss after typing rm *>o instead of rm *.o) are ignored by Unix aficionados because in the Unix aesthetics, there is no problem here. That The Right Thing is "right" in the designer's head more so than in the real world could be considered true from a Worse is Better perspective, I guess; but it's not necessarily because of ignoring more real-world uses or points of view.
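
The rm typo mentioned above is easy to reproduce harmlessly in a throwaway directory; this is a sketch (the file names are made up), and it relies on the shell expanding the glob before it performs the redirection:

```shell
# Reproduce the classic "rm * >o" typo (the intent was "rm *.o") in a scratch dir.
dir=$(mktemp -d)
cd "$dir"
touch important.c notes.txt main.o util.o

rm * >o   # the glob expands to all four files; ">o" then creates an empty "o"

ls        # everything is gone; only the empty redirect target "o" survives
```

Because word expansion happens before the redirection creates `o`, rm deletes every pre-existing file, sources and object files alike, and the only thing left is the empty file `o`.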

54. Drog Alt, Aug 15, 2012

If anything, I think Linux has proved the problem with evolution, and with being too "progressive". I guess it could apply to politics too – good analogy.

Good stuff gets thrown out or mauled due to people not understanding or caring why it was there in the first place, or even people who have an agenda they want to push onto computing.

Everything gets thrown in just because, regardless of how much cruft you introduce.

I don't think there is a real issue about "simplicity" but when you choose to do one thing then it sets the design off in a certain direction, however so slightly.

So look at the uncanny valley, for example. Is it really true that the more nearly humanlike something is, the creepier it gets? I don't think so.

To me it's always been obvious that the cartoony images are sort of a least common denominator. It only shows what's there and more or less correct (if exaggerated) in everyone. The more detail you add, the more fine detail, the further it deviates from correctness or in this case what you personally need or want.

At the high end you get Windows 8: completely insane, over-the-top complexity masquerading as simplicity because it does it all for you. Well, I guess it does – but instead of giving you tools to do what you need, it tries to be something that does everything for the least knowledgeable users and caters to handhelds.

And success has just nothing to do with it. They are making an argument that one thing is better based on success, but a lot of that is based on who does the deciding and how they decide.

So probably just a pointless comment, sorry, but interesting post though I'm not sure how much of what you're saying really has any connection.

55. rus, Aug 16, 2012

Nicely written, well researched, and well concluded. Thanks :)

I had a similar experience of watching my beautifully simple, smug architecture totally fail to handle an important use case. I had to hack it up to make it work, and that was humbling, and an important lesson.

56. Yossi Kreinin, Aug 17, 2012

@rus: glad you liked it!

57. Ken, Aug 20, 2012

Fast, cheap, or easy: pick one. Better or worse is always relative to some measure of merit. Change the measure and you change the perceived value. Want to destroy a meeting, a product, or an organization? Keep changing the target. When everyone is focused on quality, raise concerns about schedule or cost. When everyone focuses on schedule, bring up quality and cost issues. If the discussion addresses cost, consider quality or schedule.

Why Unix/C? Simply put, the price was/is right. It was effectively a $0-cost option for educational institutions. AT&T couldn't sell it, so they gave it away. This "gift" sent many generations of computer scientists, computer engineers, and software engineers down the road from college or university with an interesting OS and language bias. A classic continuing case of the cheap limiting the available options. Better? Worse? Just different? On what dimension are we determining "merit"? I am now retired, and can just throw darts at the balloons as they drift by.

58. Xah Lee, Aug 26, 2012

i can't condone any good things said about unix.

about Yegge's article categorizing coders as American political left/right, i think it was silly, and i half expect him to declare it being a hoax.

lots of these discussions are philosophy (as opposed to science), and without a philosopher's stringent training.

It reminds me of the tale of the 7 blind men feeling an elephant ( http://en.wikipedia.org/wiki/Blind_men_and_an_elephant ). Each person sees it according to his experiences, seemingly fitting, and arguably not incorrect. Steve Yegge wants to fit American left/right political thought. In this essay we have economic evolution as the framework. I don't see value in either of these two descriptions. It seems like politicians validating opposite viewpoints with the same data.

of all the essays mentioned, i do highly admire Richard P Gabriel's Worse Is Better section of his lisp talk. It's not scientific analysis or analytic philosophy, but it hits my spot. Because i felt his description of unix vs lisp design mindset is perfect, and he used virus to describe the survival advantages of the unix mindset.

i do not believe that Worse Is Better has better survival characteristics in the long run. Nor do i see today's software as dominated by Worse Is Better. Of course, this is again all babbling of gut feelings, until one scientifically defines what's really “Worse Is Better” or “the right thing”.

PS Second Life used pie menus from at least 2006 to 2010, and i loved it – it's easier.

TeX is Worse Is Better. I can't hate it more. ☺

enjoyed your C++ criticism very much. It was how i found you a few years ago.

59. GD, Sep 8, 2012

I am quite late to the party, but feel obliged to leave my response here, on what seemed like the longest list of responses. I do not know if it was on your mind, and I guess you certainly did not want to push discussion on this, but doesn't all this remind you a bit of the Evolution vs. Intelligent Design culture clash?

The timeless Religious/Agnostics debate notwithstanding, I guess you could also put it as Idealist vs. Cynic, etc....

It's all far too philosophical for somebody in one camp to ever convince the other party, that's for sure....

60. Yossi Kreinin, Sep 8, 2012

@GD: it is related somehow, but in many different ways or so it seems to me, which is why I didn't want to go there.

One such relationship that I did mention was, people who do believe in evolution and don't believe in a single bit of intelligent design ever taking place – those people differ among themselves in the extent of "awe" they have for what they believe are results of evolution. Some think the results are amazing (implying they could never achieve anything like it if they were tasked with an "intelligent design" of this kind) and some see a tangled mess of genetic bugs (implying that they, or someone not unlike them, could in fact do better).

The upshot being, some people have a much higher esteem of human ability and those tend to sneer at evolution, both biological and economical, while others are much more pessimistic in that regard, and those tend to think of evolution as a good thing.

61. SteveP, Sep 10, 2012

Unless you can predict the future, there is no such thing as "The Right Thing". There is only "Attempting to Guess The Right Thing" — which fails more often than doing that and just solving immediate problems in a way that is simple, effective, and as open as possible with the limited perspective you have at the time.

62. Yossi Kreinin, Sep 10, 2012

@SteveP: If you believe that you know what the Right Thing is based on your own aesthetics and value system, then this belief will not be shattered when it turns out, at whatever point in the future, that people choose to use something else. Then the people are simply wrong, or misinformed, or robbed of their choice by some force or circumstance. If, to you, "The Right Thing" means a correct prediction rather than recovering a timeless truth, then you have a kind of a Worse is Better attitude.

63. trijezdci, Jan 16, 2013

And what if I don't care if anybody uses my stuff? What if I am doing it for *myself*, not for anybody else? What if I am doing it because I am sick of using crap tools. What if my own stuff actually does work better for *me*? To me that is superiority enough. I don't need to be a world shaker. I just want to work with tools that suit me.

64. Yossi Kreinin, Jan 16, 2013

In that case, it's a bit unclear why you care to tell this to the world.

65. Jecel Assumpcao Jr, Jun 18, 2013

To me, the important thing about "worse is better" solutions is that they are simplified. I mean that in the same way a physicist might calculate the outcome of a horse race by supposing perfectly spherical horses running in a vacuum. These solutions are also simpler, but that is a side effect rather than a goal.

Simplicity is just a starting point. The 8086 was very simple compared to the "right thing", which was the Intel iAPX432. It had half the transistors of a Motorola 68000 or a National 16032. But once it gets its foothold in the market, the "worse is better" solution grows and grows in complexity. How long did it take Unix to become far more complex than Multics?

Another good example of the struggle between these two alternatives is Xanadu vs the World Wide Web.

66. Yossi Kreinin, Jun 19, 2013

Sure simplicity is (sometimes) just a starting point; sometimes it's not – JavaScript is intrinsically not that simple and it wasn't all that simple from day one. At any rate, sure the 432 is more complex than the x86 implementations of the time, but Itanium is less complex than the x86 implementations of its time; both were eaten by x86. When we call x86 "worse is better" when comparing to both, it's pretty clear that complexity is not what makes us classify things as we do. "The Right Thing" may be more or less complex; what makes it "The Right Thing" is the focus on doing the right thing through deliberate design vs the focus on evolutionary pressures and where they push you.

67. Paul B, Oct 23, 2013

My take on the matter is that, as in evolution, the more flexible is typically the survivor. Taking the human genome example, we have a vast surplus of genes for different traits that either have no clear merit or are actively detrimental in common circumstances. What this gives us is the flexibility for some of the species to survive in just about any possible circumstance. Likewise with code: the code that can be modified to suit changing needs is longer lasting than the code that solves one problem perfectly. Occasionally a problem is identified that is persistent but has only a handful of useful solutions. TeX seems to solve one of these, and succeeds despite having an unpleasant macro language.

Unix succeeds because it has been able to evolve to keep up with both user needs and changing hardware where less successful systems typically could not be easily ported and more portable systems had to sacrifice even more features. X86 succeeds because by rigorously maintaining backwards compatibility it allows multiple generations of software to benefit from the latest hardware.

68. Big Mac, Sep 18, 2014

"Worse is Better" is a loaded phrase. It presumes the existence of something superior...but isn't that what this debate is about?

Perhaps it is more accurate to describe this side of the debate as "Good Enough".

69. g, Feb 15, 2016

I know I'm coming to this years after the fact, but I wanted to comment on Jonas's WTF about Lisp and Smalltalk. Alan Kay is in fact a big fan of Lisp, and has said in so many words that Lisp was a big influence on Smalltalk. A few examples:

"We had two ideas, really. One of them we got from Lisp: late binding. The other one was the idea of objects." (So, Smalltalk was built on two ideas, and one of them came from Lisp.)

[This is from a description of a talk AK gave in 2006.] "Alan uses John McCarthy and Lisp as an example of real science in computer science. He showed us that you can build a system that’s also it’s own metasystem. [...] Alan used McCarthy’s method to design an object oriented system." (So, the other big idea in Smalltalk was objects; and Kay designed the object system in Smalltalk using a technique he learned from the original Lisp paper.)

I think it's pretty fair to say that Smalltalk was inspired by Lisp. (Alan Kay also called Lisp "the single greatest programming language ever designed", and he called "The art of the metaobject protocol" — a book describing the metaobject system in Common Lisp — "the best book written in computing for ten years". As I say, a big fan.)
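
For readers wondering what "a system that's also its own metasystem" looks like concretely, here is a deliberately tiny sketch (mine, written in Python for illustration — not McCarthy's or Kay's actual code) of an evaluator for a Lisp-like expression language. McCarthy's trick is that an interpreter of this shape is small enough to be written in the very language it interprets:

```python
# A minimal evaluator for a tiny Lisp-like expression language,
# represented with Python lists: ['op', arg1, arg2, ...].
def evaluate(expr, env):
    if isinstance(expr, str):           # a variable reference
        return env[expr]
    if not isinstance(expr, list):      # a literal (e.g. a number)
        return expr
    op, *args = expr
    if op == 'lambda':                  # ['lambda', [params], body]
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    if op == 'if':                      # ['if', test, then, else]
        test, then, other = args
        return evaluate(then if evaluate(test, env) else other, env)
    fn = evaluate(op, env)              # function application
    return fn(*(evaluate(a, env) for a in args))

env = {'+': lambda a, b: a + b, '*': lambda a, b: a * b, '<': lambda a, b: a < b}
# ((lambda (x) (* x x)) (+ 3 4))  =>  49
print(evaluate([['lambda', ['x'], ['*', 'x', 'x']], ['+', 3, 4]], env))
```

Smalltalk applies the same move to objects rather than expressions: the object system is itself described by objects, which is the sense in which Kay "used McCarthy's method".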

70. Maxim Khailo, Feb 15, 2016

I think the history of technology is this: grand visions that are publicly funded (computers, the Internet, etc.) have since been privatized and destroyed by the market.

I don't think you should compare Smalltalk vs Linux, but Alan Kay's Desktop GUI and Dynabook to today's UIs and the iPad.

It isn't a history of competing ideas where the market chose the correct one. It is a history of grand ideas incubated in the public sector and then further distilled and misunderstood by the private sector.

The reality is that the market has little say in the decision between the grand idea and the distilled one. They never had the opportunity to decide. They were always given worse ideas to choose from.

71. Maxim Khailo, Feb 15, 2016

And by "They" in my previous remark, I mean the consumer.

72. Maxim Khailo, Feb 15, 2016

Just like when you walk into a supermarket: you are given options that were decided for you, filtered through thousands of decisions made outside the market beforehand. The market only chooses among things available to it. But it doesn't have to be this way. We can choose to make things available outside the market. The market would never have produced computers or the internet, for example. Those things came about outside the market.

73. Nick P, Feb 15, 2016

You beat me to it, as I'm mentally working on a similar essay. I already covered backward compatibility and shipping pressure a lot in my posts elaborating on this on Schneier's blog. See Steve Lipner's Ethics of Perfection essay for a great take on the "ship first, fix later" mentality. He had previously done a high-assurance, secure VMM, so he has been on both sides.

On backward compatibility, you need to explore lock-in and network effects. These are the strongest drivers of the revenues of the biggest tech firms. Once you get the market with shipping, people will start building on top of and around your solution. They get stuck with it after they do that enough to make it hard to move. Familiarity with language or platform matters here, too. The economics become more monopolistic where you determine just enough additions to keep them from moving.

I agree with the other poster on OpenVMS: it's a great example of Right Thing vs Worse is Better that *won* in the market – while their management was good. ;) It had a better security architecture, individual servers went years without a reboot, mainframe-like features (e.g. batch & transactions), cross-language development of apps, clustering in the 1980's, a more English-like command language, management tech, something like email... the whole kitchen sink, all integrated pretty well. The reason was that it was a company of engineers making what they themselves would like to use, then selling it to others. They also mandated quality: they'd develop for a week, run tests over the weekend, fix problems for a week, and repeat. That's why sysadmins sometimes forgot how to reboot them. ;)


Here are a few others that fall under the Cathedral and Right Thing model, that got great results with vastly fewer people than Worse is Better, and/or were successful in the market. Burroughs and System/38 still exist as Unisys MCP and IBM i, respectively. The Lilith/Oberon tradition of safe, easy-to-analyze, and still fast systems lives on in the Go language, which was designed to recreate it. There's nothing like Genera anymore, but Franz Allegro CL still has a consistent, do-about-anything experience. QNX deserves mention as a Cathedral counter to UNIX: they implemented a POSIX OS with real-time predictability, fault isolation via a microkernel, and self-healing capabilities, and it's still very fast. It's still sold commercially and was how the Blackberry Playbook smashed the iPad in comparisons I saw. They once put a whole desktop (with GUI and browser) on a floppy with it. Throw in the BeOS demo showing what its great concurrency architecture could do for desktops. Remember this was the mid-1990's, mentally compare it to your Win95 (or Linux lol) experience, and let your jaw drop. Mac OS X, thanks to Nextstep, could probably be called a Cathedral or Right Thing that made it in the market, too.

So, more food for thought. The thing the long-term winners had in common is that they (a) grabbed a market, (b) held it long enough for a legacy code/user base to build, (c) incrementally added what people wanted, and (d) stuck around due to the legacy effect from there. It seems to be the only proven model. It can be The Right Thing or Worse is Better so long as it has those components. So, we Right Thing lovers can continue trying to make the world look more Right. :)

Nick P
Security Engineer/Researcher
(High assurance systems)

74. aminorex, Feb 16, 2016

Sine qua non, a success of scale must first be successful in a raw land-grab. The ability to grab land is not really a technical or an evolutionary merit. I would not consider an invasive species of rat that overruns its small island, destroying the habitat of every other animal species, to be a success — and much less so if it destroys its own food supply, dooming the line to extinction. Nor would I consider the death of the last island reptile to indicate that the rat was 'better'.

I am not even sure what the criteria of evolutionary success are, especially for parasites. True success should include succession. Perhaps subspeciation into distinct populations in homeostasis with their environments would be unambiguous in general. If so, Unix is a manifest success, and Windows a failure.

I suspect most readers are more interested in commercial criteria of success. While the commercial competitive landscape may share some features with an evolutionary competitive landscape, let's not press the analogy beyond its breaking point. Instead, we should usefully clarify the range and limits of the analogy.

75. Yossi Kreinin, Feb 16, 2016

@Maxim: well, you certainly made it clear where you stand on these issues :-) Your CV however says you've worked a lot in the private sector. If it blows so much and all the good stuff originates in publicly funded projects, why don't you work on one of these instead? Also, why didn't the fastest (or pretty much any...) computers come out of the USSR whose public sector was always larger than that of the US and who had plenty of educated people in the relevant areas?

@Nick: I don't think I really "beat you to it", in that you're looking at it from a different angle. My main points were (1) "worse is better" is really about evolutionary forces, regardless of the details of what these forces are, and even when people don't realize that's what they're arguing about, and (2) the reasons different people side with or against "worse is better" – and I didn't get to that second part yet... (I promised a follow-up which has failed to arrive in the 4 years since.)

@aminorex: it's simple: rats won the first round, reptiles lost it. If rats then became extinct, they've lost the second round. (Incidentally, rats rarely do.)

Maybe it's different from someone else's point of view, especially if they don't like rats, but it's certainly as simple as that for the rats. Maybe I don't like to use Linux and I sure as heck wouldn't like to have to contribute to it, but I end up using it and people working for various companies end up having to contribute to it, and getting subjected to verbal abuse by Torvalds, and I bet he rather likes how it all came out.

So when it comes down to whether you're the rat or the reptile, make sure to narrow down your definition of success enough to be the rat, is all I'm saying.

76. Nikos Chavaranis, Dec 26, 2016

What is measured as worse or better here is technology that has to sell to the widest audience possible. It is common knowledge in microeconomics that when you target the average consumer, you go for a marketable, low-cost solution so that you can be competitive in the market. This is because the average consumer cannot pay just any price you set, even if the product deserves it. You don't need a degree from a business school to know that quality drives up prices. In the examples above, Worse is Better because it is more affordable.

77. zxq9, Dec 9, 2017

For future reference, the UNIX-Haters' Handbook has been moved here:

Still relevant.
