A Generation Lost in the Bazaar (acm.org)
519 points by mahmud on Aug 20, 2012 | 344 comments



OK, I'll make a couple of general observations here.

First: It would be a big help for this discussion if we could have the informal convention that people who were employed in an IT job before 1990 marked their posts. I think it would show a quite clear divergence of attitude.

Second: It's very obvious that a lot of you have never been anywhere near the kind of software project posited in the "cathedral" meme; instead you project into the word whatever you have heard or feel or particularly hate. That's not very helpful, given that there is an entire book defining the concept (I believe it's available online on ESR's homepage; how about you read it?)

Third: No, I'm not of the "either you are with us, or you are against us" persuasion. The bazaar is here to stay, but having everybody in need of transportation buy the necessary spare parts to build a car is insane.

Fourth: Related to point two really: A lot of you seem to have little actual ambition of making things better, I guess that is what happens if you grow up in a bazaar and never even experience a cathedral. I pity you.


First: Have you ever looked at a plot of people for/against marriage rights for gays vs. age? I suspect that the clear divergence does exist, but may not mean what you think it means...

Second: I worked on OS X. In fact, I worked on Snow Leopard, and the start of Lion. What's interesting about that, is that Snow Leopard was the last version of OS X developed according to the "Cathedral" model. Also, while the Snow Leopard cathedral was being built, iOS was being developed firmly using the bazaar model...

Third: You know, I wonder if you've ever been to a bazaar before? I live in Turkey, where the bazaar is a way of life (and where one of the largest, oldest bazaars in the world is located). I've never found "spare parts" at a bazaar. What I have found is some of the highest quality jewelry, tapestries, rugs, and other hand-made goods you'll find anywhere.

Fourth: I think you're conflating "good quality"=>Cathedral and "poor quality"=>Bazaar. The only thing that distinguishes the Cathedral and the Bazaar is whether or not there is one single individual in whose head the only valid vision of the completed project exists. You might do well to read up a bit on the history of Kapalıcarşı. Throughout its history there were guilds to enforce authenticity and all manner of quality control mechanisms. It is possible to have a Bazaar and a very high quality product.


I also worked on OS X, and I think you're misinterpreting "Cathedral" vs "Bazaar", especially given your later definition:

> The only thing that distinguishes the Cathedral and the Bazaar is whether or not there is one single individual in whose head the only valid vision of the completed project exists.

Core OS and related groups have long consisted of VERY different teams, each working in their own fiefdoms with their own methodologies. There was a somewhat coherent vision overall, but everybody achieved it in their own way, with very mixed results -- OS X has some tremendously terrible code: the Security framework, the Installer team, mDNS, anything related to Server...

To be honest, I'm not even sure what your point actually is.


My point is that ideas flowed as often from "the bottom up" as from "the top down". Yes, there was a vision of the product held by those in charge, but there was also a thriving ecosystem of ideas. Some of the most successful engineers I knew at Apple made a habit of developing things they thought were amazing and should be included in the product. These were not components that were part of the original design.

The point is that, if one of the workers building the Notre Dame had gotten hot after a day of work and said "You know what we should include here? A swimming pool!" well...he probably would've been committed.

Coherent vision and a priori design are not the same thing. Apple has coherent vision. Bazaars can have coherent vision (should have coherent vision if they hope to be successful). But that's not the same thing as a Cathedral's a priori design...


> Coherent vision and a priori design are not the same thing. Apple has coherent vision. Bazaars can have coherent vision (should have coherent vision if they hope to be successful). But that's not the same thing as a Cathedral's a priori design...

A priori design can adapt to new ideas. I'd argue that's what Apple did/does, in many cases.

Likewise, I'd argue that the truly ad-hoc bazaar development is responsible for some of the worst ideas and bad code that can be found at Apple.

The historical lack of good centralized vision on aspects of the Core OS -- such as Objective-C -- has led to staggering missteps and ridiculous inefficiencies on the part of both the framework and compiler teams. This has been to nobody's benefit, and the sum result is clearly inferior to better-designed language work done elsewhere (e.g., MS).

Likewise, the ability for applications teams to drive forward ill-conceived OS and framework hacks has led to some terrible long-lasting implementation failures, which is something a coherent top-down vision could have prevented.

However, the fact is that products can succeed despite their poor implementation. Costs may be higher, bug counts may be higher, and user satisfaction may be lower, but that hasn't always stopped Apple from building successful products. Where I take umbrage is at the notion that there's a dichotomy -- either you do things poorly and let intellectually lazy engineers take the lead, or your product does not succeed. That's not accurate.

I don't think Apple is a good case study for your point.


If a priori design adapts, then it was never more than a coherent vision to begin with. The notion of a priori design is "we're going to do A, B, and C, and don't even think of changing the plan until those are done". As ESR talks about in his original essay, the "Cathedrals" were mostly developed in chunks. What was different about the Linux "Bazaar" was the way everyone could see and give input to the development along the way.


That's a false dichotomy -- a straw man -- that you're using to discredit the idea of planning ahead.

Compare FreeBSD kernel design versus Linux.

kqueue vs. dnotify/inotify/???

Mach-descended VM vs. a string of linux-vms

BSD scheduler, ULE scheduler vs. how many different schedulers?

Linux churns through ill-conceived solutions to problems until they find one acceptable enough. FreeBSD grinds on one until it definitively works.

FreeBSD almost invariably winds up with the better solution, in less time. See kqueue, for example -- the foundation upon which Apple's GCD is built.


It's very interesting that you bring up kqueue. I was about to bring up kqueue, but for a different reason.

Planning ahead can lead to great things. Notre Dame and the dozens of other cathedrals throughout Europe are positively stunning...

I love and I hate kqueue. I mean, I love kqueue. I love the way it can be used from C, I love the way it's integrated in MacRuby...I spent the weekend studying Clojure's reducers and was dying to have some time to work on ClojureC just so I could implement reducers with kqueue...

I hate kqueue because, in all likelihood, I'll never get to use it in a production system, because it's not in Linux.

Cathedrals can be nice to look at. Bazaars are often more functional.


> I hate kqueue because, in all likelihood, I'll never get to use it in a production system, because it's not in Linux. Cathedrals can be nice to look at. Bazaars are often more functional.

Except that FreeBSD is functional in production, so what is the actual problem? That market effects and accidents of history resulted in Linux becoming more widely adopted?

What does that argue for, exactly?


If Linux became more popular/more widely supported by mere chance, then there's nothing to argue about, and nothing to learn...


It's not chance, but it was very likely due to market conditions that have little to do with development methodologies or outright code quality.

http://en.wikipedia.org/wiki/USL_v._BSDi was hugely crippling at a very critical moment.

In a similar vein, PostgreSQL lost out to MySQL in no small part due to:

- PHP supported MySQL out of the box.

- MySQL was slightly easier to get running.

If there's a lesson we ought to learn, it's that market success may be partially or fully disassociated from actual merit relative to other market entrants. We've seen this in commercial software. It would be foolish to think it doesn't apply to open-source.


> FreeBSD grinds on one until it definitively works.

And what happens if it never works?


An example:

libkse was an attempt to implement M:N thread scheduling. This is something that can, in theory, provide significant benefits over 1:1 thread scheduling, but in practice, is extremely complicated to implement and has fallen out of favor across the board: Solaris abandoned the approach in Solaris 9, and FreeBSD abandoned it in (IIRC) FreeBSD 7.

Concurrent to libkse, libthr was developed by David Xu, and was also included in the FreeBSD base system. It implemented 1:1 threading, is far simpler than libkse, and has replaced libkse as the default threading library.

I would argue that in this case, FreeBSD's cathedral model failed; M:N threading ideally would never have been attempted, and it was wasteful to implement two distinct threading libraries. However, considerable thought and expertise went into the work, and the decisions around it were not made flippantly or taken lightly.

It was simply a case where, despite best intentions and best effort, the wrong choice was made. At the same time, in what some might call "bazaar-like" fashion, David Xu maintained libthr as an alternative. It remained there, ready for adoption as the default threading library, until such time as it became apparent that M:N via libkse was a dead end.

This was a mistake, but no entity is perfect, and the success rate of FreeBSD's considered decision-making remains statistically high. In this case where something never worked, it was replaced with something equally well-considered and far more successful.

Moreover, compared to Linux's gross missteps with their threading implementations (including such gems as setuid() being thread-local as an implementation side-effect), libkse was a minor fender-bender.


I was intimately involved in those discussions and decisions.

There seem to be some underlying assumptions of omniperfection hung on the "cathedral" meme in these parts. That is simply unfounded in both theory and practice.

Second, cathedral vs. bazaar is not really about governance, but about architecture, and they are two separate power-structures, although most organizations, including FreeBSD, mingle them, to their own disadvantage.

We can argue if attempting M:N was a wise or a factually based decision, but that has nothing to do with cathedral vs. bazaar, certainly not in this particular case: What happened was that M:N came first and that was that.

Only once it clearly transpired that it could not sustain its promises in practice (partly because every thread programmer assumed 1:1) did the actual decision-making apparatus of FreeBSD kick into gear.

In Linux (which is much more cathedral than FreeBSD) or OpenBSD (at the time even more so), that decision would probably have been executed practically instantly, but in FreeBSD, which is mostly consensus driven, it took some time (and angry words, etc.)

But overall the libkse vs. libthr saga has almost nothing to do with cathedral vs. bazaar, because architecture was not the driving force at any time during that saga.


I just wanted to quickly point out that you have referred to Linux as "much more cathedral than...". Considering that Linux was the original motivation for "The Cathedral and the Bazaar", identified as "the Bazaar", I highly suspect that you are not using the terms in the same way the original essay intended.

Indeed, throughout this discussion it has become clear that you regard anything with a sense of design and some amount of quality control as a "Cathedral". If you want to discuss the benefits of design and quality control, that's a perfectly fine discussion to have, but it also seems to be completely orthogonal to the original discussion of cathedrals and bazaars, which was wholly focused on the development process.

In fact, I wonder if your arguments would be better focused on the concept of "craftsmanship" (or lack thereof) in programming...keeping in mind, of course, that both cathedral builders and bazaar artisans have historically had a notion of craftsmanship.


ESR seemed to target Linux userland and the GNU toolchain/utilities in his description of Linux as a Bazaar, although not to the exclusion of the kernel.

I think it is plausible to describe the kernel as much more Cathedral-like, at least in parts where Linus is very strict about how components should be integrated. I can see where parent could have got this impression.

At the same time, there are parts of the kernel that are bazaars. Virtualization/hypervisor support is the area I'm most familiar with; the various hypervisor vendors basically crammed their paravirt calls and passthrough drivers in, without unifying the interface at all. Yes, Linus and others were very aggressive about maintaining code quality, but less so about the architecture. We've seen similar situations in other areas of the kernel (particularly drivers).


Thanks for weighing in. I worry about going out on a limb describing FreeBSD in a thread you're a part of. While I'd never assume omniperfection, I expect top-down enforced careful consideration to more often produce correct decisions.

That said, given your response, I think perhaps I still don't understand the meaning behind your cathedral/bazaar dichotomy.


Thank you for actually dignifying my reply with a reply :)

While it's been years since I mucked about in Linux kernel space, I always remember it as being a very open and egalitarian community. Although I never made the attempt to contribute to any of the BSDs, I got the impression that they were much less open to newer contributors and to divergences from "accepted wisdom" (this could be put more politely as being more discriminating (as in taste) and careful).

That being said, the majority of decisions I was witness to on LKML appeared to be well-considered, and make no mistake: patches and design ideas were not just accepted willy-nilly. While Linux has made many missteps, I still personally prefer the wild possibilities and endless choices that eventually shake out a solution, while keeping to a distribution (Debian stable) that filters out the issues for end users (myself included).

I also find very interesting the parallels between this discussion and the "worse is better" essay, where UNIX was supposed to be in the "worse" camp, and now it's Linux.


> I always remember it as being a very open and egalitarian community

It definitely is, in my experience. Most importantly, despite the obvious presence of "celebrities," even completely unknown people can jump into the middle of a conversation on an important topic, and if they have their shit together, will be accorded pretty much instant respect and be treated as an equal. There's very little sense of "needing to pay one's dues."

But you definitely have to have your shit together. If you don't, you will be quickly eviscerated.

It's really rather nice, especially compared to many dev communities where there's often much more sense of entrenched factions.


According to phkamp, iOS is a cathedral kind of project.

I was hoping someone would bring up OS X and how horrible its cathedral-born API became around 10.5, but I was born in '85 and therefore have no right to speak in this thread.


While Apple would certainly like you to believe that the image of iOS sprang, fully-formed, from the mind of Steve Jobs like some sort of medieval cathedral, the reality is anything but. In fact, much of Apple's success is due to the extent to which it functions like a confederation of very well funded startups. Cupertino is very much a bazaar, wherein a shopper with very refined tastes (Jobs, for example) can pick and choose the finest wares.

This article concludes that quality only happens if someone takes responsibility for it. Yes. You'll find no argument on that point from me. But cathedrals are not (edit: should say "not only") about one person taking responsibility for quality. They are about a priori design.

Quality control can happen ex post facto but creativity, once strangled, dies.


I'm going to drop the cathedral and bazaar analogy, because I don't feel like debating the exact meaning of the terms.

It seems to me that quality comes from having someone at the top who is willing to give direction and vision to a project, instead of giving every person who scratches their own itch full influence. Contributors are great. But letting everyone pull in their own direction doesn't lead to something that feels well engineered.

PHK seems to be calling this vision and cohesive design (a priori or not) the "cathedral model". I agree strongly that for a project to work well and feel "high quality", it needs some source of a unified idiom.


The problem is that coherent vision and a priori design are NOT equivalent. Obviously, I cannot know what was going through the head of ESR when he wrote the original essay, but what I've always taken from it is that a priori design is inherently inflexible, prone to becoming disconnected from reality, and ultimately less inviting to creativity.

If the suggestion of the article is that the only way to retain quality is to move back to the "single vision" world of a priori design, I'm sorry...that ship has sailed, the cat is out of the bag...whatever your favorite analogy, PHK is very right that the new generation has gotten used to not having instructions handed to them.

Of course, anyone is free to start a project with a single vision, recruit new members, and do their best to grow the project. I suspect, however, that such an effort would lose out to one that figures out how to develop a coherent vision without the need for a priori design.


Brooks spends some time discussing these issues in the book, and I'm pretty much aligned with him:

Nobody believes in "a priori design", and I somehow doubt that anybody ever did. Brooks points out that the original publication of the "waterfall model" was meant as "how not to..." and people got that wrong.

But cathedrals are not about a priori design, they are about style, elegance, economy of means and coherency of design.

But my point in the piece is that the lost generation doesn't even know what a cathedral is in the first place, having grown up in the bazaar.


But did they grow up in the Bazaar? or in the Mad House?

ESRs original prototype for the "Bazaar" was Linux. Do you feel that Linux is lacking in coherency of design? You refer to the dot-com bubble, but the problem with the bubble wasn't, I think, that it was the "Bazaar". Indeed, I don't think the dot-com bubble of the late 90s was characterized by much open source development at all!

It was, rather, consumed with "flashiness" and "wow factor". I definitely see the continued obsession with these things as a problem. I would say that, for example, much of the obsession with Node.js today is a consequence of this obsession. But that isn't a Bazaar.

It's a disco.


Can I, the customer, pick and choose the finest wares? If not, it's not a bazaar.


The original essay is about how you build a cathedral versus how you build a bazaar, not what you use them for.


> According to phkamp, iOS is a cathedral kind of project.

This is true. The OP has made several somewhat-arbitrary and vague assertions about what may or may not constitute a "Cathedral", on top of lambasting others for not agreeing with his definition ("Read Brooks Book")

Excerpts from the HN discussion :

"Windows and Office are actually not examples of cathedrals, because the architectural focus of Microsoft was not on software but on a near-monopoly market." -phkamp

"...iOS is very much a Cathedral and has a designer and architect who cares and who is in control." -phkamp

Does Apple not care about achieving market dominance? Did no version of Windows or Office have a lead architect who "cared"?

Frankly, the entire concept of a "Cathedral" - rigidly defined as it appears to be here - is a little nebulous and hand-wavy.


You never read ESR's essay/book, did you?


I read ESR's essay. He was protesting how hard GNU makes it for outside contributors such as himself. Somehow the essay got construed as open source vs. closed source, and ESR pivoted to align with the popular interpretation of his essay. I've never read the book (someone would have to give me a really good reason), but I assume it doesn't focus on GNU.

You're calling a lot of people out on using Microsoft as a cathedral example, so I assume you want the original bazaar-vs-cathedral reading of the paper. But if you do that, how can autoconf be a bazaar example, when it's GNU software?


In a world where software is designed by cathedrals, there is no need for autoconf. You read the POSIX spec and write software that conforms to it.


That is true. But the article decries autoconf's implementation, not just the need for it. It implies that cathedral designers would never implement it, or if they did implement it, do a better job.


I suppose this is the essence, because however much he may chafe at autoconf's implementation (and it surely is ugly), it actually works pretty well. Users (in this case, "user" = "developer") never see the ugliness, for the most part.

With more open development communities there are tradeoffs, but often the result is that you can do stuff you simply wouldn't have the manpower to do in closed communities.

I think most of us would love it if every library and tool we used was pretty and elegant, but in the end, it's often better to have something, however much you'd ideally wish for it to be prettier.

Moreover, a community in which different components are often the result of disparate teams with different thinking probably results in more attention paid to robust and simple interfaces between them, simply because that's the only way you can get everything to work. Systems composed of heterogeneous components with robust and simple interfaces between them are, I think, a good thing. I suspect they tend to be more future-proof than systems with everything designed by one person from the top down, because in the end change is inevitable, so having to deal with heterogeneity from the beginning is an advantage.


No, stop it! Don't defend autoconf; it is not worth it and the software does not deserve it. Users of autoconf, that is, developers who try to use that POS to build their software, definitely experience its pains.

It is not just that M4 is a horribly ugly, badly designed, inconsistent language. You also have about a million leaky caches in the build process to deal with. Autoconf rewrites the build scripts in three stages, each of which is cached, and it also keeps an autom4te.cache directory around. So what happens is that you update something in the M4 build script and then have to spend hours hunting down why the change didn't change anything, because some stupid autoconf cache wasn't flushed.

What autoconf has done is to separate free software developers into two categories: those who can hack the source (the .c and .h files) and those who understand the "autoconf magic." It should be telling that people refer to it as "magic" - it's a f'ing build system, not rocket science.

The only reasons this ugly piece of software has lived on are that people treat it as some kind of magical device, and inertia. Configuring and building software is an easy problem. It gets even easier if you finally decide to drop support for 30-year-old unices that no one uses. The autoconf team consistently refuses to do that, which is why the software won't ever become better.

CMake, SCons, Waf and probably half a dozen more tools all do a much better job configuring software than autoconf ever will. People really need to try those alternatives and I'm sure they will realize autoconf needs to die, die, DIE.


Shrug. You don't like autoconf, great; that's clear. As a user of it, I think it's sometimes ugly, but basically fine for its intended purpose (easing portability amongst POSIX-compatible systems). [If you like Windows, you're probably outside its target audience.]

CMake, SCons, etc, all have their pros and cons. They are hardly paragons of virtue though. [Yes, even as a mere "consumer" of CMake (someone who occasionally needs to build packages that use it), I've been bitten and frustrated by it.]

As far as I've found, there really isn't any build tool that really gets everything right -- which suggests that the problem is harder than you suggest. [Although judging from the number of people who try to write their own build tool (most of which languish and eventually die, but cause some pain along the way), it's clear enough that many people think it's a simple problem...]

> autoconf needs to die, die, DIE

Decaf?


You sound a lot like my colleagues when I tell them why I much favor dynamic languages like Ruby and Python over Java. :) They listen politely to my arguments, then shrug their shoulders and go "meh, Java works fine for me". CVS users are the same when it comes to Subversion users who are the same when it comes to Git users.

It doesn't matter to a "works fine" person how many faults their current tool has or how much more smoothly "your" tool solves all those problems; they won't listen. Logic doesn't work, maybe because it is very hard for them to get used to something new. Maybe because, if you haven't experienced any alternatives, it is very hard to imagine that something could be better.


I am sorry, but I would like to come in on that first comment. I think I understand the point you are making (if you grew up in a time of cathedrals, you will rate them highly), but it can be taken as an ad hominem attack - one I am sure you did not intend, and which would be beneath the HN quality level (it's a Cathedral of debate around here).


Not intended as an ad hominem as much as a reductio ad absurdum. It is frequently true that the views held by those older are also more likely to be correct, but it is not guaranteed. This is especially true when the views relate to social norms, views on which few people change as they get older. Cathedral vs Bazaar is probably as close to an argument of social norms as the software world gets...


My reading of PHK here is that Cathedrals often lead to better systems than the anarchy-bazaar that he feels has come to be most of the OSS world.

I think bazaars like the Turkish one referenced above (with guilds and standards of quality) are the best of all worlds. But to get there, a guild needs a strong, opinionated leader. Torvalds, perhaps. (PHK seems to fit the bill too :-)

So, what I suggest is the issue here: too many people trying to build a cathedral in a bazaar. They want everyone to build it their way but cannot persuade enough others to follow them. As such, the bazaar is full of many half-finished carpets instead of fewer, perfect carpets made by engaged teams.

Software needs many talented people, working together, mentoring new ones in "their way". A bazaar is great for getting different talented people to find each other and choose between many different projects. A Cathedral is great for encouraging apprenticeships, the promulgation of culture and so on.


I think Torvalds does play that role, but only for the Linux kernel. For user-space, which the OP focuses on, there isn't one.


> It is frequently true that the views held by those older are also more likely to be correct, but it is not guaranteed.

I think that one of the best lessons to take away from this whole flame fest is that history is something to learn from, and not just blindly revere. But first, that requires knowing about history, which far too many people don't. The whole question of whether cathedrals or bazaars produce better software is orthogonal, and needs to be answered separately for different situations.

First, are there still places where cathedral-style development is a very good idea, verging on necessary? Yes, but it has to be done right, and unfortunately, cathedrals aren't (usually) very flexible. The Mars rovers were all cathedral-programmed and work pretty darn well; on the other end of the spectrum you can find all sorts of big projects that went awry, even with "coherent design".

Second, is cathedral design necessary for all software today? No. I'm fortunate enough to have the perspective of a second generation programmer; my father started in middle school by sending punch cards to the university and would get his results a week later. Careful planning was not just a good idea then; it was absolutely necessary. These days he has a computer with more power and connectivity in his pocket than was available when he was born. His turnaround time for seeing results on software changes these days is well under five minutes.

So the computing industry has come so far so fast that it's not just unnecessary to make grandiose plans in advance, it may actually be a bad idea in a number of cases (what was it Paul Graham said? "If you want a recipe for a startup that's going to die, here it is: a couple of founders who have some great idea they know everyone is going to love, and that's what they're going to build, no matter what.").

It's kind of nice that anyone can just pick up a computer and start writing software these days. Sure, sometimes it's scary too, to think of all the inefficiencies, and wrong results and security holes that plague software. But it's so nice and liberating to think: "you know what? I can do it better, I have the freedom to try to make something better." And you can develop all cathedral style if you want!

Could software be improved? Definitely! Should people learn from history? Yes, including the mistakes. Would more design and coherent vision help make better software? Maybe; go ahead and prove it (a quick note: unfortunately, the market doesn't select the best software or designs, so you will have to prove it some other way, but being successful market wise is a good start).


It's interesting that a lot of people (myself included) think that Snow Leopard is a better OS than both Lion and Mountain Lion.


I wonder why you think that? I suspect it might be related to the focus on "consumer-centric" (as opposed to geek-centric) features. That is one interesting side-effect of the bazaar model: it's great at building what the majority want. If you are trying to build something that nobody is asking for, you can't beat the cathedral model! ;-)

I will tell you, though, that Snow Leopard missed its ship date by 8 months (of course, you never heard about that because Apple is smarter than to announce an OS ship date before it goes GM). It was also nearly un-usable for about a month, and only marginally usable for two months beyond that, during development. Lion suffered from none of these issues...


I can't speak for the GP, but I certainly would still be on Snow Leopard if development for the current version of iOS were still supported on it. I'm now seriously considering switching back to Linux as a desktop OS, as I hardly do any iOS work anymore. (I've rather fallen out of love with that platform too; it's pretty clear that the only objective is selling me a new iPad+iPhone every year, as this year's iOS always runs like crap on last year's hardware.)

I guess post-Snow Leopard OS X and Apple apps just feel cheap, unpolished and rushed. It actually feels a lot like Windows, where the quality of different parts of the system varies wildly.

Examples: Finder and anything file browsing related beachballs much more than before (particularly when dealing with network shares). Visual design seems to have taken a hit - everything seems to just consist of either different murky shades of grey, or jarring skeuomorphism. The 10.6-era AHCI kernel panic hasn't happened for a while now, so maybe that was fixed in a newer revision of 10.7 or 10.8, but userspace stability seems much worse. Some of it is crashes, some just weird UI glitches that require a restart or Force Quit. Old problems (like Mail.app reliability or the heap of shit that is iTunes) are ignored completely in favour of sexing up the UI. (The runaway dynamic pager issue FINALLY seems better on 10.8, but again we get loads of other regressions.)

Add to that what you say: the "improvements" leave most people I know completely cold or even get in the way, and that's not just power users/developers. Since we're "paying" for the (subjectively) unnecessary bling with regressions, the overall impression is that of a negative change.

Not that any of this is particularly relevant to the development model. You can end up with festering layers of crap with either model if nobody feels responsible for overall quality. I suspect the objective for Lion was "make OSX look kind of like iOS and get rid of all of that GPLv3 software" and that's basically what was achieved. I guess bug-free software isn't good business, Microsoft were printing money for decades.


"I suspect the objective for Lion was "make OSX look kind of like iOS and get rid of all of that GPLv3 software" and that's basically what was achieved. I guess bug-free software isn't good business, Microsoft were printing money for decades."

You can say that again. Frankly, after using Mac for 10 years, this is my last one.


Just out of curiosity, would you use newer Apple hardware if you could run older OS X versions on it?

Hackintosh or something even worse (and I am with you 100% on the horrible state of OS X) -- I wouldn't want to discount the quality of the hardware.


As someone who's been on OSX since Panther, I think you may have a slightly rose-tinted view of the past - lots of stuff in OSX has been improved - I remember pretty horrible kernel panics in 10.4 and userland issues in 10.5 that no longer exist since 10.6/10.7 (or have yet to show themselves). Driver support from vendors is vastly improved, and overall, though I agree that quality has been a bit hit-or-miss, the overall usability is improved, IMHO.


I've been here since 10.2 and IMO the only great releases of OS X were Panther and Snow Leopard, though Panther did begin the whole idiotic metal UI thing. Maybe 10.9 will rock?


I'm actually finding Mountain Lion much more stable than Lion or Snow Leopard. Lion had some really weird memory things and I did get multiple kernel panics in early Snow Leopard.


Ok, so who's telling the truth? Was Snow Leopard good or did it kernel panic? Is Mountain Lion stable or have users just not reported the issues yet?


I think we are both telling the truth as we saw / see it. It might come down to machine, configuration, and type of jobs.


Sounds reasonable to me.


Not to fork this discussion further, but it bears repeating how much things changed after SL.

In fact, it actually reminds me a lot of the change from FreeBSD 4.x to 5.x ... things worked, people were happy ... and then boom.

That was when all of my desktops and laptops stopped running FreeBSD and I "switched" to the mac. I am fairly certain that once running SL stops being practical, I will switch again.


My feelings exactly! I now do virtually all my development in Linux (or ssh'ed into a Linux box from my MacBook Air). I hate what OS X has become. The only thing I now find superior about OS X to Linux is that it runs Logic Pro 9 (which I can't do without); yay for vendor lock-in!


> it's pretty clear that the only objective is selling me a new iPad+iPhone every year as this year's iOS always runs like crap on last year's hardware

Nothing against a good conspiracy theory, but if you truly think this is intentionally planned then this thought is a bit … dumb.


I've not experienced this "bug-free software" you speak of. Where might I find such a thing?

It does feel that Apple's development model is changing to an annual release cycle, wherein larger/systemic bug fixes are deferred to the next major release, rather than in a point release. In fact, most OS X releases these days feel like "minor tweaks that make consumers happy + a truckload of bug fixes and plumbing". The fact that they deferred so much 10.0-era plumbing to 10.6 and beyond is indicative of why there's a lot of regressions, IMO.


For me, it's stuff like my quad core iMac becoming unusable under heavy disk activity, or issues like scrolling web pages being impossible when running (quasi) full screen video in a dual monitor setup, or mouse lag under ML.

Minor irritations like the reworked Spaces and full screen mode making a second monitor pointless are annoying, but I wouldn't regard them as reflective of the quality of Lion and ML.


I think you're confusing two largely-unrelated issues.

Whether products are developed according to what users want vs what builders want has almost nothing to do with bazaar vs cathedral.

Indeed, many bazaar-style projects (the linux ecosystem in general, gimp, etc) are almost cautionary tales about "builders creating what they want for their own niche purposes and if anyone disagrees they're free to fork off".


The majority wanted to have scrolling inverted?


No, the majority wanted to stop it from continuing to be inverted, as it has been for decades. If you do a significant percentage of your computing on a touchscreen, having the touchpad work the same way is really, really nice. After a few days of Lion, I switched all my other machines to do scrolling the same way. Now that I'm moving to Linux instead of Mountain Lion, I was happy to note that I can swap the up and down scroll with a simple command, and almost everything respects it (oddly, some things don't).


Have any actual data on that (to be clear, data that says that people didn't like / wanted to change the way that computer scrolling has worked for the last 25 years)?


I have no data. The first part of that bit ("... majority wanted ...") was intended to be somewhat hyperbolic, but I would expect that a survey would show that heavy smartphone and tablet users tend to keep the Lion default, rather than switch it back to the traditional scroll behavior. Haven't actually done the survey, though. :)


You are aware that the same “simple command” is a checkbox in the OS X system preferences, right?


Yes, I am. I was just pointing out something about Linux that was easier than I'd expected, not dissing OS X. If you read the last coupla years of my history, you'll note some bitter rants from me about how 2010's Linux hadn't improved nearly as much as I'd hoped since I moved to OS X in 2003. So, I'm often pleasantly surprised these days. :)


The majority have never known an Apple product that worked any other way.


?


I think he's referring to the OS X users that started out with the iPhone, iPod or the iPad as their first Apple device and then bought a Mac. (Not that I agree with the argument).


I know, but if he's speaking to expectations, the brand of computer isn't relevant.


I don't think this follows.


The software that replaced Spaces in Lion is so painful to use that I just don't use virtual desktops on OSX any more. I basically only moved to Lion because Snow Leopard would not work on my new machine.


Pretty much all your arguments about the Cathedral model being superior fail in the face of the most common cathedrals known to man: Microsoft Windows and Office.

Their inscrutable beauty is buried under tons of libraries nobody will ever touch for fear of breaking 20 years of development efforts, exactly like what happens in the Unix world. Their move to the 64-bit world was painfully slower than what their fellow merchants accomplished in the Unix bazaar. They still provide compatibility layers for programs built with technologies that have been thought of as extinct, like monks still praying to the gods of ancient Greece. Whenever they went for the "total reuse" mantra, they built terrible and insecure specifications (DCOM) that still saddle us 20 years later. And let's not even talk about portability, which is anathema: to each his own Cathedral and his own Faith, touch ye not any unbelievers!

So yeah, making mistakes and keeping the cruft around is something every long-running IT project can experience. Unixes are, arguably, the longest-running of them all, so they are naturally the ones that tend to show it most. Besides, there's a whole new world of applications to be built out there; if we were rewriting libtool every three months we'd move even more slowly than we do now.

<ad-hominem>Oh, and I wish I could say your rant is unbefitting of professionals of your age, but I'm afraid it's actually quite matching the grumpy-old-man stereotype you're clearly striving for.</ad-hominem> Hey look, I can do ad-hominem too, and I was born in the late 70s!


Windows and Office are actually not examples of cathedrals, because the architectural focus of Microsoft was not on software but on a near-monopoly market.

Your diagnosis of its qualities is spot on.

"Ad hominem" means to "attack the man", i.e. a single identified man; saying some generalities about identifiable groups is not "ad hominem".

And yes, I am a grumpy old man, and a surprisingly cheerful one at that.


This sounds a bit "No True Scotsman" to me. Most commercial projects that I've seen eventually grapple with the same sorts of issues that you mention. When they've faced diverse deployment environments, the results have often been worse than autoconfigure, from what I've seen. I guess that's because they aren't cathedrals, since the businesses behind them were focusing on market-demands instead of whatever it is that Cathedrals focus on?

Your remarks are essentially ad hominem regardless of whether you single out 1 man or 100 individual men.


Half of this thread is unfortunate terminological noise, but phk did specify what he means by "cathedral" – software governed by a coherent design vision – and Windows and Office are not only not that, they are the poster children for not that.

The words "cathedral" vs. "bazaar" have obviously gotten too vague for people to meaningfully argue about. But "coherent design vision" has a much clearer meaning, and an enormously important one.

I think the pendulum is bound to swing back from incoherent, hypercomplex software, because at some point the reductio that we've got today simply won't be able to adapt. Who knows what will trigger that, or when. But intelligent people will always care about simplicity, beauty, efficiency, and the other qualities that come from good design. And among the ignorant programmers there are more than a few whose eyes light up when they are finally exposed to good design, and want to learn to work that way. I'm an example. So there's no point of no return here.


Once again, you're veering into "No True Cathedral" territory here.

The UNIX design philosophy -- small utilities loosely joined into a coherent whole -- is a coherent vision for a platform, under the definition you've repeated. And the Microsoft Office design philosophy, if what you're saying is true, is a bazaar because it's "incoherent."

If the UNIX ecosystem is a Cathedral and the Windows ecosystem is a Bazaar, I think we can safely say that you're not using the words that way the classic essay used them, and that the definitions you're using are malleable enough that any argument about aesthetics could bend and twist them into synonyms for "stuff I like" and "stuff that annoys me."


Well, as I meant my comment to convey, I don't think "cathedral" and "bazaar" are well defined, so most of the disagreement in this thread is just crosstalk. We can argue about what the original essay meant, but I don't find that very interesting. Software design on the other hand interests me very much. I don't have any problem provisionally accepting phk's definition of "cathedral" in order to hear his thoughts about design, most of which strike me as sensible.


So a "cathedral" must be "coherent" to be a true Scotsman?

Coherency cannot be measured, and it's extremely subjective. The *nix world can be extremely coherent: for example, you'll always find libtool, whether you want it or not, and your '70s-like filesystem layout. Isn't that "coherent" with its history?

This is not true even of Android and iOS, the modern wonderchildren he cheers on: a quick look at the Android filesystem hierarchy will show its adherence to outdated conventions (etc?) and incoherent repetition (sys? system?), and I bet you'd find something similar in iOS. So there might be a "coherent vision" behind, but the practice is quite incoherent; and mind, we're talking about very basic OSs that delegate any but the most basic functionality to third party apps. I'm happy to bet that in 10 years, the Android codebase will be as shitty and crufty as any Unix.

The truth is that a cathedral, in phkamp's rant, is "software that can all fit in one's head". For all operating systems, we're well past the stage where a single genius architect could envision the totality of a massive cathedral, for a number of reasons (time, working culture, legacy tech etc). Clearly phkamp, being more intelligent than most, only just passed that threshold with the very latest FreeBSD release, and felt the urge to tell the world.


I honestly can't tell what you're trying to say. (Other than the last bit, and that part is needlessly noxious.)

"Software that can fit in one's head" and "coherent design vision" are closely related things. I'm in favor of both. If you're saying we're past the point where that's feasible in real-world systems, that's just assuming the conclusion – the wrong conclusion, in my view.


"Windows and Office are actually not examples of cathedrals, because the architectural focus of Microsoft was not on software but on a near-monopoly market."

So, you are defining "cathedral" as "good"?


You seem to be saying that an ad hominem is no longer an ad hominem when you substitute "a lot of you" for "you". The Latin may be singular, but I don't think you can expect English speakers to treat it as strictly singular.


Not a cathedral because it was not on a software but a near-monopoly market? I love taking metaphors too far, so I'll just notice that "near-monopoly" is what cathedrals are all about.


Try reading again ?

Microsoft didn't architect their software as software, they shaped it as tools of monopoly enforcement.

There are numerous accounts from people involved about how marketing decisions related to cutting off 3rd parties or trying to bludgeon somebody into submission caused them to cripple their own software and its architecture.

See the M$/Novell case for ironclad evidence of this.


I hate to point this out, but that is just one of the benefits of the cathedral model: you get to inject alternate agendas into the project. The more bazaar-like a development model is, the less outside priorities factor in.


Actually, the amusing thing is you're still reinforcing the cathedral metaphor. If the primary target of the cathedral was an event hall capable of housing a bunch of people, it could have been built in a far more reasonable way. The point of a cathedral is to be the biggest and most impressive kid on the block. Therefore I'd say that MS was designed in a cathedral way to achieve a cathedral goal.


M$? Really? I thought HN and seasoned OSS developers were above this kind of petty name calling and sneering.


You know, the first time I encountered the "M$" shorthand was in a TELEX (look it up if you don't know what that is) from Commodore Corporate about pricing of the PC10 computer. I have used it ever since, based in part on what was not very diplomatically expressed in that missive about M$'s attitude to licensing.


> a TELEX (look it up if you don't know what that is)

This kind of passive-aggressiveness only reinforces the grumpy-old-man stereotype and makes it really hard to take you seriously.


Just because you read it a long time ago doesn't make it less petty or silly.


as long as I can use the term I coined: "open shit"


Actually, the proper traditional way to insult Open Source is "Open Sores". Use that and the feeling I got during flame wars between high schoolers on IRC channels in the late nineties will be complete.


A friend of mine is really big on the term "freetard" for open source software.


I don't put the blame all on MS. Let's face it, MS's biggest customer is corporate. And they never EVER want to see change. They expect Word to be able to open files dating back to ancient times (like Works and WordPerfect files). So MS left the old libraries in.

I do think that was a mistake. But MS didn't (doesn't) have the luxury that Apple did (and still does). MS can't just shut 30 years of compatibility off and expect to keep their corporate customers happy.

Oh, and just for the record (since this thread seems to be grouped by age), I was born in 1963. I remember when MS was cool and "us" nerds were running from CA and IBM.


>Pretty much all your arguments about the Cathedral model being superior fail in the face of the most common cathedrals known to man: Microsoft Windows and Office.

It's been twelve long years since OpenOffice moved from the cathedral model to the bazaar model. And it still suffers from the same criticisms you level against Office, and more.


Read the whole comment: I said these are problems every long-running software project suffers from, regardless of being a cathedral or a bazaar.


Do you think turning a cathedral into a bazaar is a minor architectural change?


You'd be surprised at how well a bazaar fits _inside_ a cathedral :)


To be completely sincere, I've hidden a bazaar inside a cathedral a couple times myself.


Ok, pre-1990 person here, and the piece resonated quite strongly with me. But I note that there is a third axis which isn't well covered, which is 'volunteer' vs 'paid'.

It is important to note the distinction between FreeBSD's package system and, say, Debian's apt. In FreeBSD I can build a package from source, and in Debian I can apt-get install a package; because the Debian packages are prebuilt, the install just comes over in several chunks and doesn't need to build anything. (Yes, you can pull prebuilt packages for FreeBSD too.) But my point is that the packaging system of FreeBSD, as used, conflates building the packages and using the packages.

So if, as an example, I write a perl script that goes through the source code and changes all the calls to fstat() to match your configuration, then I need perl to build, even though you do not. To run the result, I don't care whether you have perl or not.

But lets get back to the volunteer/paid thing again. People who volunteer rarely volunteer to clean the shit out of the horse stalls, no they volunteer to ride in the races and build a new horse. So you end up with a lot of stuff around you don't want.

Sadly for operating systems, and the original rant is really about operating systems, there really isn't a cathedral/bazaar model; it's more of a democracy/feudalism kind of thing. Nobody 'owns' the final user experience for the OS in FreeBSD/Linux discussions.


Yes. This.

It's also a matter of what people are willing to pay for. Even (or perhaps especially) in commercially supported software, cleanups only happen if there is a strong business case for them. And if the cost of buying an extra build machine to run long complicated configure scripts is significantly less than the engineer time to re-engineer a new autoconf system from scratch, in many companies (and certainly most startups) --- it won't happen.

The OP cares very much about code quality as a good and important thing in and of itself. But that view isn't shared by many business people, or by many programmers in general, for better or for worse. Some have argued that OSS code tends to actually be _better_ about code cleanliness, because it's public, and people do care about making sure that their code is clean. (I've never seen the proprietary source code for the Oracle DB, but there are many stories out there about how horrible it is from a code cleanliness perspective.)

Also, the OP seems to care a lot about extra library dependencies. The big problem here is that a lot of people don't really care if their package uses perl or python as part of their build scripts/makefiles --- or even if their package uses perl _and_ python scripts. The OP cares, but for better or for worse, most people don't. And I would wager this is likely true at most companies where programmers are paid to maintain the source tree as well!


> conflates building the packages and using the packages.

Yes. Much of his poor experience is due to build systems.

Anyone who's developed anything can tell you how fragile and finicky build systems can be.

Since build systems are used by developers, who can deal with complexity much better than "ordinary" users, they tend to be rough around the edges.

They're also unglamorous infrastructure, so volunteers rarely choose to spend time on them. They have a tendency to break the entire application when broken, and are often part of the "interface" of the software (since every distro that builds package X from source uses build scripts that rely on, for example, ./configure with particular options), so maintainers tend to take an "if it's not broke, don't fix it" attitude.

GNU projects in particular -- whose autoconf the author complains of -- have quite a bit of built-up cruft, since many of them are very old and also vital infrastructure for most FOSS OSes.


I think a lot of people, especially around here, eschew college education in the computer sciences, because you can get a job without it and you can build a website without it, but I really think the decline of formal education in computer programming concepts has led to a lack of cathedral thinking.

Of course, I'm biased, because I have such a degree. However, when I compare what I build to what is built by a business school grad who thinks anyone can learn computers, so everyone should be taught business, I really cringe at loss of formality in the industry.

Needing to build a computer processor from AND and OR gates really drives the concepts home. So does building an operating system in C++: it creates a framework for thinking about computer-based problem solving that's lost in many of the systems I look at today.


I'm going to go ahead and strongly disagree on this first point -- I've seen no correlation whatsoever between education background and propensity for strong system design. In fact, my background is mathematics with very little formal CS-related education and I'm firmly in the cathedral camp.

I really think it is a matter of exposure -- at one time all anyone ever saw were cathedrals, and so that was all anyone ever built. I feel now we've swung too far the other way, where there are many in the craft who've never seen a cathedral and have only known the bazaars. The reality is we need both, and the craft of building cathedrals is becoming endangered.


my background is mathematics

You are a terrible example because you come from a formal background of a hard science. Even worse, you come from one where proofs (the cathedral in the mathematics sense) are required or you aren't taken seriously.

The parent was talking about joe schmo off the street or the high school wunderkind who is building "twitter for teens" or "pinterest for social good". They are just slapping it together from the get go without thinking about any aspect of the design. Eventually they'll have to hire the formally schooled to come in and clean it all up, if they actually get anywhere with it.

Which makes me wonder: if more formally educated individuals were in the startup game, would the failure rate be as high as it is now?


"proofs (the cathedral in the mathematics sense)"

Uh, we are talking about the same Cathedral and Bazaar thing here, right? Cathedral in the sense that only the anointed master architects get to make decisions; everyone else does what they're told and keeps their mouths shut; bazaar in the sense that everybody brings whatever they've got and hopefully the good stuff hangs around?


Huh? No, Cathedral in the sense that:

Everyone brings whatever they've got, and the anointed master architects sift through it, determine the best course of action, redirect known-bad avenues of exploration and possibly anoint new master architects from the best of the candidates.


The startup game has tons of formally educated people, and their failure rate is just as spectacularly high, if not higher. In my experience, the formally educated tend to build "science projects" at a higher rate than businesses, when compared to those building "twitter for teens".


their failure rate is just as spectacularly high, if not higher

I asked a question; you are making a statement. If you have numbers that prove that the formally educated are failing more, then I think we'd all love to see them. Otherwise, this is just conjecture.

As for 'science projects' failing, it could be a number of things. I would assume that hard science startups fail a lot of the time because of cost: "twitter for teens" is much cheaper to start than a company doing alternative energy. Also, hard science is less sexy than a lot of the social stuff that currently dominates the landscape, so it can be much more difficult to find the money you need due to visibility. The social startups will probably pay out sooner than long-tail hard science companies.


I'm not sure I'd like to get on either side of this argument, but I'd like to point out a steady decline in systems and operating systems (both research and teaching) across the academic world. Rob Pike wrote a polemic touching on it, and he wasn't entirely off base.

This isn't to say that they don't exist, but there are a wealth of examples demonstrating the transition of major progress in those fields to companies like Google and Amazon. For anecdotal examples, see GFS, MapReduce, and Dynamo; there are many unpublished/trade-secret examples that will take years to come out.

Additionally, many schools have reduced their requirements for what exposure students should have to systems and OS:

A Harvard CS concentrator, for example, is not required to take either the introductory systems or operating systems classes; he/she may opt to take a high-level mobile programming class instead. Indeed, the operating systems and distributed systems courses are taught every other year, further reducing enrollment. The last time CS 161 (intro OS) was taught, there were only 23 students. 25 in CS 153 (Compilers). This is a stark contrast to the 100-200 students who took other upper-level CS classes (189 in mobile, for example).

While I will not contest the value of a formal CS education, I see some of the 'naive' (for lack of a better term) CS proliferating even there.


Please, what was Rob Pike's polemic? Can you share a link?

From googling around, might you be referring to:

"Systems Software Research Is Irrelevant"

http://doc.cat-v.org/bell_labs/utah2000/

And there's a little slashdot discussion:

http://interviews.slashdot.org/story/04/10/18/1153211/rob-pi...


That's the one, though I prefer this format: http://herpolhode.com/rob/utah2000.pdf

I don't agree strictly with everything he says. It is a polemic rather than a careful argument. But still, he has a point and the data about OS research is strong (even beyond his data).


I suspect that "a lack of cathedral thinking" has absolutely nothing to do with the quality or lack thereof of modern systems. I've seen both well- and poorly-built results from bazaar-like development approaches, and well- and poorly-built results from cathedral-like approaches.

Of course, I may be biased because I've seen more poorly-built results from cathedral approaches.


Started in IT in 1980. I'm 50 this year. First FOSS contributions were in Nov 87 (patches to gcc/gdb/emacs for convex machines). This pre-dates any contribution by ESR, and it enrages him when I point it out. :-)

My company ships quite a bit of FreeBSD (as pfSense). Could have gone linux, but linux is a mess (much worse than ports.)

I think OpenBSD is a mistake, at best it belongs as a group focused on security inside the netbsd project, but of course, Theo got kicked out of netbsd, thus: OpenBSD.

I'm also the guy who appointed Russ Nelson the "patron saint of bike shedding." Just FYI. :-) (None of Eric Raymond, Russ Nelson or Theo de Raadt like me much.)

The first time I read 'Cathedral & Bazaar' I thought ESR was illustrating and contrasting the BSD vs. linux development models. Only later did I understand that he was pointing fingers at GNU/FSF, not BSD.


To reply to point three, which is the only one not containing an ad hominem...

There exist several larger OSS projects, such as Apache, Boost, the Linux kernel, etc., which accept contributions but are also curated. Thus they represent a sort of hybrid between cathedral and bazaar. People who use these projects know they are getting some (varying) standard of quality.

I think these sorts of projects -- often shepherded by some kind of noncommercial Foundation or Organization -- are the best way to get a mix of openness and quality going forward.


Those are essentially ESR's original, canonical examples of "bazaar" projects. They're not middle-of-the-road. They're precisely what he meant by "bazaar".


Why don't you talk about a few current "Cathedrals" and how they're different from "Bazaars", that might help the discussion too.

The only example that jumps out at me is the original Unix, and I think you'll agree that comparing software from 30 years ago that does vastly less than ... pretty much anything out there these days is not an entirely fair, nor useful comparison.


I agree with phkamp that it is weird that you can't think of any.

Android is the most original-UNIX-like in its development, but I imagine you could extend it to systems in which large portions of source are available (since that is the kind of project C&B discussed), such as iOS and perhaps Java. Google Chrome. vBulletin. The Pine MUA. The ssh.com RFC 4253 implementation. Various parts of RHEL. And probably numerous "Free" programs that end up following the Cathedral model merely due to the culture of their maintainers.


> I agree with phkamp that it is weird that you can't think of any.

We can think of many but the point is to make sure we're on the same page. It's not like all software projects in the world are neatly divided into "bazaar" and "cathedral". I don't know that Google Chrome is a "cathedral", for instance.

It's ridiculous that the author refuses to give a single example, instead opting to say basically "see! You people don't know what a cathedral is, just like I said."



The fact that you and others cannot even spot any more recent cathedrals is sort of my entire point here...


I held IT jobs before 1990. Might I make the observation that when a software project is no longer able to be maintained by a single person or dynamic duo, it will tend to become a bazaar, no matter how cathedral-like it may have been before, and that the alternative is stagnation.

I will cite as evidence two of my favorite programs ever:

WriteNow (for Macintosh) an excellent early word-processor written entirely in assembler. Unequaled for many years for its combination of stability, raw performance, and ease of use, ultimately it simply could not add new features (let alone make the jump to PowerPC) and died on the vine.

HyperCard, which was perhaps one of the most dazzling, innovative, and influential products ever to ship, which pretty much stopped evolving once its original programmer lost interest.

Today we have the phenomenon of the incredible version one product, usually developed by one person, which never really makes it to 2.0. These seem very much like cathedrals.


There is a third alternative - split the project into manageable parts.


Good point but the problem is fundamentally fractal. For any project P it will either (a) stagnate, (b) go bazaar, or (c) be split into new sub-projects.

Most projects go nowhere, but once a project gains momentum it's going to split into pieces and each piece has a chance of "going bazaar" at each step.


Okay, I'll name one: the F-35 flight control software. Take of that what you will.

Now, I've not been working in the software industry since before 1990, so I'm obviously not qualified to comment, but I'll say this: I can understand why a cathedral model might be warranted. Even using something like CMMI might be a good idea in some cases.

But for a lot of things, especially exploratory/experimental things, the bazaar model is really nice and can reap significant benefits. And the nice thing about the bazaar model is that if you want to follow a cathedral model, no one is stopping you! Go off and be your own little dictator with a "grand unifying vision". Come to think of it, that seems to be what many of the most successful open source projects are: one (or a few) people have a vision of an itch they want to scratch, and they pursue it with a bloody-minded persistence. The bazaar only comes in when someone forks or in the fact that anyone can compete or (try to) contribute.

And BTW, I do know who you are, and have a lot of respect for you, but in some ways this article (and your comments here) could be read as reactionary against the success of Linux and other more open OSS; the BSDs have always been more insular (or discriminating, depending on your POV) and developed more along cathedral lines than Linux; oddly enough this has resulted in three distinct BSDs while there is still only one Linux kernel. I will agree that reading Brooks (and other computer history) is almost always a good idea; just MMM was enough to open my eyes to how little the industry has progressed (VirtualBox/VMWare? That's nice; IBM was designing full system emulators for hardware that didn't yet exist in the sixties).


I most certainly can think of some things that would qualify in my mind, but since we're talking about rather vague notions, why don't you go ahead and name some projects you think are good examples? You seem like a bright guy; I'm trying to understand your point of view better, and so far, you are failing to communicate it very well.


> Fourth: Related to point two really: A lot of you seem to have little actual ambition of making things better, I guess that is what happens if you grow up in a bazaar and never even experience a cathedral. I pity you.

I think people do want to make things better, but it's happening in a much more decentralized way, with small teams taking small safe steps on tools, libraries & frameworks.

Media, on-line banking, 3d printing, accessible hardware hacking, e-commerce sites, scientific/engineering software. The majority of these things are made by small teams plugging the best libraries and platforms out there together, where they already solve part of the software problem. Android, Cloud, Linux etc. It would be silly to say these were without ambition and don't contribute to the community.

(The average software dev's impact factor may be diluted by the increasing number of people working with computers, but computers are so globally useful it's inevitable. If good things are still getting made then who cares.)


Instead of giving us pity and RTFM, it would help if you clearly specify what you think are the defining and distinguishing features of the cathedral and the bazaar models. There seems to be a lot of confusion around that.


I think you are missing a couple of things.

In this comment I will equate "Rug Market" with "Ready, Fire, Aim" and "Cathedral" with "4 Year Plan".

Firstly, the "Rug Market" beats the "Cathedral" when you haven't formulated the problem properly, and so you have bad specs.

Secondly, the "Rug Market" beats the "Cathedral" when bad software is more profitable than good software. Google for "Worse Is Better" and "The Innovator's Dilemma".

Sad but true.


As someone who grew up as a programmer with assembly language, C, and C++, learning the good practices needed to make a million-line C++ application work and be maintainable:

I've been equally disgusted by the evolution of programming in the last few years. HTML - which is, to a first approximation, always invalid. JS - which needs jQuery to make it mostly-but-not-completely cross-compatible among browsers. CSS, which hides so many hacks on top of each other, and which even needs a reset to be compatible. And now - distributed systems with pieces in PHP, Python, Objective C, Dalvik-Java, etc... and sustained by awesome-but-hackish fixes like Varnish and FastCGI and Nginx, where everything seems to be put together with duct tape.

But I'm slowly getting to the next stage, and having to admit - no, actually, admitting - that if it has spread like wildfire, there must be something to it, no matter how much it hurts our taste.

And if you look well into it, it's very similar to biological evolution. Our own genetic codes and body plans are full of ancient pieces that are not needed any more (or at all: male nipples, anyone?), but it seems that it was more economical to "patch" things than to fix things properly. Or at least, it didn't hinder the current designs enough that they wouldn't succeed over the alternatives evolution probably also tried.

And now Google translates text statistically, without even trying to understand things.

One fear I have is that we will be able to create a working AI in a few years... and due to the way we do it, we may even not understand how it works.


Nitpick: jQuery exists to make the HTML DOM manageable, not JavaScript the language. (There are libraries targeted at JavaScript the language, but you could make an argument that that's the point of all libraries for all languages...)

I find jQuery more analogous to tools like configure and autoconf, in that they aren't actually needed for "modern", standards-compliant browsers. As an example: there are already many lightweight drop-in replacements that assume a sane browser to begin with.

CSS is remarkably hack-free and the resets are just to remove the default styling that browsers have built in. Obviously browsers need builtin styles otherwise all the old pre-css pages would stop working.

And why are you calling programming languages and web servers hackish? How is Nginx hackish and Apache not? How is Objective C hackish and C++ not? Why stop inventing new languages at assembly, C, or C++?

Other than that I generally agree with the rest of your comment. I think if we ever get to some kind of technological singularity we'll almost certainly and just about by definition not understand how it works. And true/strong AI would definitely be a singularity. Even if we understood version 1 we would likely not be able to understand whatever it dreams up 5 seconds after we overclock it. And we will - Moore's law and all that. ;)


I don't mean Nginx or Varnish are more hackish themselves than Apache. I could have said Apache instead of Nginx, but I think

Back in the day, a "serious" application usually involved a large amount of quite homogeneous source code in a single language (or 2 or 3 different languages for different tasks: C/C++ for the main code/engine/logic, assembly for performance-sensitive code, and maybe some custom high-level script for high-level application logic). This was built usually on a single machine with a single build script, and resulted in some binary which could be deployed.

Even early web apps were more akin to this model - Java apps, or even Perl apps.

Nowadays, an "app" lives distributed among dozens of servers and client types. People describe what they have: "12 memcaches, 5 varnishes, 8 nginxes, 5 app servers with Ruby, 4 static content servers, and a MySQL master server with a fallback master and 5 slaves for reads. Separately, an Android app, an iPhone app and an HTML5 front-end."

These systems started as a single app in a single box, and have "grown" for scalability, reliability and security, being patched up with different pieces of technology put together with often unreliable methods but fallback mechanisms that make it more resilient than if reliable methods were used but no fallback. It's not so important nowadays that the application code be clean, elegant, or failproof, but that measures are put in place so that the service will keep chugging along most of the time.
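That trade-off (unreliable pieces wrapped in fallbacks beating reliable pieces with no fallback) can be sketched in a few lines. Everything here is hypothetical and made up purely to illustrate the pattern:

```python
# Hypothetical sketch of "unreliable method + fallback": a flaky backend
# that fails about half the time, fronted by a stale-cache fallback so
# the service still answers every request.
import random

cache = {}  # stale-but-usable copies of earlier answers

def fetch_from_backend(key):
    """The 'unreliable method': fails often, like a flaky app server."""
    if random.random() < 0.5:
        raise ConnectionError("backend down")
    value = f"fresh:{key}"
    cache[key] = value  # refresh the fallback copy on every success
    return value

def fetch(key):
    """Resilient wrapper: try the backend, then fall back gracefully."""
    try:
        return fetch_from_backend(key)
    except ConnectionError:
        if key in cache:
            return cache[key]        # serve stale data
        return f"default:{key}"      # last-ditch placeholder

random.seed(1)
results = [fetch("page") for _ in range(10)]
# All 10 requests got an answer, even though the backend
# failed roughly half the time.
print(results)
```

The point is that availability comes from the wrapper, not from any single component being correct, which is exactly why the individual pieces are allowed to stay hackish.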

And setting up the whole system from scratch involves a manual with probably hundreds of steps putting together haphazard technologies, installing different types of Linuxes and packages for each piece. And may even involve difficult-to-replicate steps, such as using an AMI to launch EC2 instances, where the engineer that created the original AMI doesn't even work here any more and recreating it from scratch would involve 20 packages, Googling for 5 of them as they're not available in normal repositories, 5 manual patches, two secret incantations and a tribal dance around the chair while singing and praying to arcane gods of long-forgotten package managers.

{{Side-note: although it's not the main point, I'd definitely say Objective C is more hackish than C++. C++ has its own amount of weird stuff due to being built on top of C and its evolution with templates, etc... but the evolution of Objective C is done much more in the form of "patches". Even the original syntax takes advantage of awkward gaps in the syntax of C (@interface, @implementation, #import, [object msg:param]). But it's even worse how new features are piled on top of old ones: properties, ARC 2.0, etc... it's telling that you just wouldn't feel comfortable teaching programmers new to Objective C how to use ARC without teaching the underlying memory management model first. In C++, people can learn "new" and "delete", and never hear about malloc()/free(), and there's no problem at all.

Anyway, my point about hackishness was not about ObjC, but about how whole systems are engineered nowadays.}}


You're basically describing the very nature of distributed systems :) We see many more of them nowadays because of the scale of some modern web apps / sites whereas in the 1990s the internet just wasn't that big or that complicated. Whether every app that ends up being that complicated actually needs to be like that is a different question.

It also comes back to "don't reinvent the wheel" and "not invented here syndrome". You _could_ implement many of these things in your homogeneous codebase or you could just reuse something that already works. I'm just saying there are serious and good counterarguments to the "large amount of quite homogeneous source code written in a single language" approach.

I agree with some of the details (like AMIs, EC2, different types and versions of linux, different package managers in the same system), but I think in general the things you think of as hacks aren't hacks at all and just the signs of progress. Or at least the nature of large, complex modern distributed systems. You basically described the term "agile development" and you're remembering a time many developers would rather forget with rather rose-tinted glasses.

(I don't really know enough about objective-c to comment.)


>And now Google translates text statistically, without even trying to understand things.

This comment seems to mostly be, "I'm going to complain about software disciplines and lines of CS research that I don't understand because their practical approaches don't conform to my finicky aesthetics."

If you want to formulate Strong AI to solve NLP as a problem, go for it. The rest of us are happy to use Google as it is.


This article reads like an old timer feeling left behind by the current rate of progress who thinks that the problem is really that the rest of the world is doing it all wrong.

He's probably right that lots of software could be designed better, but I think he's wrong that that's of paramount importance. We need lots of software these days and we simply don't have the resources to build it to Kamp's standards. Also, we've learned that our requirements change so fast that his beautiful design would quickly be twisted into the pile of hacks that he hates.

He notes how long it takes to compile the software that runs on his work machine---how long would it take to compile the software on a Windows machine? (Or whatever he would claim is the standard-bearer of his cause).

He also attacks open-source software. The truth is, we have an amazing amount of free and open source software available. Some of it may be flawed in design or usability, but it enables us to solve so many problems (and look at/modify the software when needed). I don't think that all software needs to be "free", but I do think free software has made us all much richer. I have a hard time seeing all this value that the Bazaar has created as inferior to slow-moving, centralized, big, up-front design development.


I wonder if you bothered to check who wrote that article, before you started speculating about things you could have found out with a few google searches ?

You seem to assume that cathedrals are "slow-moving, centralized, big up-front" designs, where did you get that idea ?

Ever looked into how USA put a man on the moon ?

Maybe you should. Also: Read Brooks book, if you can.


Why does it matter who you are? I've tried to respond to what you wrote.

Why do you think most of the software we build now is like putting a man on the moon? I think it's nothing like that.

Maybe I should read Brooks' book (I've enjoyed MMM), but your article doesn't make a strong case for it.


> Why does it matter who you are? I've tried to respond to what you wrote.

Because you claim he's attacking open source software, and you said that maybe he thinks MS is the standard bearer of his cause.

The (easy to find) facts are that he has committed a lot of code to open source and has been involved in FreeBSD for many years. These are not small trivial bits of code used by a few people or for insignificant reasons. Crypt and Varnish are bits of code getting billions of uses.


If the author's open-source credentials are part of his argument, then he needs to make them part of the story he tells in his article. He shouldn't assume that people will connect the dots (he doesn't have the name recognition of a Torvalds or Stallman). Further, even with the context, I don't think the dots add up to enough of a picture to understand what he's arguing for/against.

To make a convincing argument, he should have made it clearer what he was arguing against ("the bazaar" is too nebulous and refers to many things, .com development refers to other things) and what he was arguing for (which was entirely missing, other than a reference to Brooks' book). If he doesn't have a clear vision of how development should work (or a "standard-bearer"), he should have at least provided some examples of things working the "right" way (or better).

He should probably also have left off a bunch of the insults and hyperbole: "clueless" / "hacks" / .com period a "disaster" for code quality. I'm frankly pretty surprised that ACM Queue would publish this.


> If the author's open-source credentials are part of his argument, then he needs to make them part of the story he tells in his article.

PHK is one of the most prolific contributors to Queue. Most regular ACM readers know who he is.

> I'm frankly pretty surprised that ACM Queue would publish this.

Again, see above.


In addition to what chrisaycock said, phk's writings are typically very "lacking" and assume a certain knowledge about whatever he has chosen to write about. He writes interesting stuff, but a lot is implied and/or not sufficiently explained.


Or maybe, just maybe I don't write for the lowest common denominator, but for people who can think for themselves.


He's also a little bit fond of personal attacks.


A bizarre feature of some of the developer mailing lists and Usenet groups is the absolutely hateful vicious toxic nature of them.

OP's site bikeshed.org (http://bikeshed.org/) has some nice proposed features for software that posts to large audiences.

    +------------------------------------------------------------+
    | Your email is about to be sent to several hundred thousand |
    | people, who will have to spend at least 10 seconds reading |
    | it before they can decide if it is interesting.  At least  |
    | two man-weeks will be spent reading your email.  Many of   |
    | the recipients will have to pay to download your email.    |
    |                                                            |
    | Are you absolutely sure that your email is of sufficient   |
    | importance to bother all these people ?                    |
    |                                                            |
    |                  [YES]  [REVISE]  [CANCEL]                 |
    +------------------------------------------------------------+

    +------------------------------------------------------------+
    | Warning: You have not read all emails in this thread yet.  |
    | Somebody else may already have said what you are about to  |
    | say in your reply.  Please read the entire thread before   |
    | replying to any email in it.                               |
    |                                                            |
    |                          [CANCEL]                          |
    +------------------------------------------------------------+
Perhaps any forum software needs these, as well as buttons saying [really post this?] [save this angry version locally, and give you time to write a calmer version] etc.


I remember that my newsreader back in the early 1990s did exactly this. I had trouble posting the first few times because I took the message to heart and didn't want to waste other people's time. But I quickly realized that other people didn't follow the same suggestion, which meant that I should ignore the message, and even interpret it as wrong.

Why is it wrong? For example, consider the second message. Suppose you haven't read all the messages in a thread because you've been on holiday for the previous few days. But a friend pointed you to a message in the thread which specifically mentions your name, asks "could you verify this for me?", and doesn't have any followups?

Why should your newsreader force you to read all of the thread in order to answer something which isn't in the rest of the thread? Yet the only option there is "cancel".

Furthermore, "at least 10 seconds" is completely wrong. People killthread, plonk people, and develop other ways to ignore discussions. Assuming 4 hours of reading per day at 10 seconds per message means people can handle at most 1440 messages. There are 276 comments already in this thread, but I expect most people sampled a few messages on each major branch, skimmed a bit, and perhaps did a text search to see if someone mentioned a key word. Most assuredly, the entire HN readership did not read this entire thread.
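Spelling out the arithmetic behind that 1440 figure (a quick sanity check using the numbers from the comment itself):

```python
# Reading budget implied by the dialog box's "at least 10 seconds"
# figure, combined with the 4-hours-per-day reading assumption above.
seconds_per_message = 10
reading_seconds_per_day = 4 * 60 * 60  # 4 hours = 14,400 seconds

max_messages_per_day = reading_seconds_per_day // seconds_per_message
print(max_messages_per_day)  # 1440
```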


I'm really grateful for you writing this. I don't know if I'd couch it in the Cathedral/Bazaar terminology, but I've had similar thoughts.


I think Munksgaard is supporting you, and that there are some problems with different languages.


Indeed, it was not meant as an attack, merely trying explain his style. I enjoy reading his blog posts, but I readily admit that I often have to seek out some sort of elaboration, as it's not always evident what his stance is, or even what he's talking about (I often wish he'd provide some more context).


Because no words are without context.


Actually, the nicest thing about HN for me is that I get to see my words evaluated on their own merit rather than what people know about me. HN has made me a better thinker and better communicator. Over the past year or so that I have contributed, I have clearly seen the way my thoughts diverge from the reality of what others believe. Sometimes I disagree and stop trying to convince people, other times I present arguments which are stubborn, but backed up.

You took the time to write a long essay, and present a novel idea. A lot of posters here disagree with you, but the mindset on HN has been to build first, ask questions later. Convincing people in a single day to change their world view would be a huge undertaking regardless of whether you are thinking about it correctly or not.


And that is probably a good way to filter through the million monkeys.

But some topics would require entire books to present in a context free way, this is one of them, and I don't have the time and money to write that book, so you will have to make do with a column @ACM.

When I read something, and have the nagging feeling that there is context I'm missing, I go looking for that context, usually starting with "who the heck wrote this", rather than assume that the writer is a clueless bozo.

But whatever blows your hair back buddy...


I enjoyed your post, but I also understand why people hated it. They hated it because you presented an idea that is completely at odds with the way many HN readers think. Then, rather than sitting back and watching the discussion you became offended and went on an aggressive warpath.

You didn’t even consider that what I wrote was meant as a compliment to the fact that you are taking a view that no one else currently takes. An idea that has real merit and is worth discussion. You took my comment as an insult and wrote:

“But whatever blows your hair back buddy...”


That is not an insult in my book.

To me it means: If you're happy what you're doing, then who am I to tell you're wrong ?


> I wonder if you bothered to check who wrote that article,

I don't think the appeal to authority fallacy will fly here. We are all too smart for that.

Your arguments should be evaluated for their coherence and, in the end, it really doesn't matter who you are or what you did if you are wrong.

To use your example, it doesn't matter if you are Von Braun himself. If you don't bring enough propellant for your descent engine, you'll still crash on the Moon.


The point isn't an appeal to authority, it's that the author is a supporter of open source. Not Microsoft.


That is, unfortunately, rather the point. Microsoft is one of the best current examples of the cathedral. The extent that FreeBSD is not-Microsoft is the exact extent that it is more bazaar-like.


Your essay was unbalanced, yes open source sucks in the way you complained about, but Microsoft sucks even worse in some ways (if not quite as badly in others). Complaining about something in isolation is easy; everything sucks compared to perfection. But how is it compared to the real alternatives.


It amazes me that the argument against this is by attacking some "standard-bearer of his cause." Let's not ask how it is compared to existing alternatives, or to 'perfection', but rather the potential alternatives. Your argument is akin to "stop talking about 'engines'; horse-drawn carriages might suck, but so does walking."


He's not talking about engines though, but about some golden age in Unix's history when All was Right. Or at least that's part of what I got from it.

He also mentions 'one person having responsibility'. Does he mean a benevolent dictator? Linux and Python have that. They're imperfect in their own ways too.


Who said anything about Microsoft ?


I think it's a cultural reflex. Microsoft has been paying so many people to criticize open source for so long that the first reaction when people see what they interpret as baseless criticism is to look for Ballmer's sweaty palm prints.

I understand where you stand and agree some of your criticism is valid, but "quality" doesn't have the same meaning for everyone.


it's a prominent example of the "cathedral" model.


First, it's actually not, because their architecture is focused on architecting a market, not a computing solution.

Second, just because there is one bad example of a cathedral, doesn't mean the cathedral model is bad.

Many years ago, a fella named Gettys (look him up!) wrote as part of sage advice for OSS philosophy:

"The only thing worse than generalizing from one example, is generalizing from no example at all."


> Second, just because there is one bad example of a cathedral, doesn't mean the cathedral model is bad.

right, but this could be applied to your example of the "configure" script as an indictment of the "bazaar" model too. Surely there's software that's come from the "bazaar" model that you feel is of high quality ? There's none whatsoever ?


Some software is extensively designed up front, like embedded avionics programs. The resources are made available because folks like the FAA demand it.

Why don't we employ rigorous software engineering principles to, say, iOS games? It would be a waste. It could be done, but in practice, people just don't care as much if a game on their phone crashes as they do if an airplane they're riding in crashes.

There's a wide spectrum of software out there, needing varying levels of robustness. There's not a lot that can be said about software development as a whole along these lines; inadequate design for an online banking site might be excessive design for a llama costume competition voting site.


You are not even in the same zip code as the point I'm trying to make, which, I guess, sort of makes my point.

Nobody says you have to write iOS games using the same methodology as avionics, what somebody is trying to say is that random walk will only get you so far...

iOS is a pretty bad example btw, because iOS is very much a Cathedral and has a designer and architect who cares and who is in control.

That's precisely why you don't need to use autoconf to compile stuff for iOS.


> That's precisely why you don't need to use autoconf to compile stuff for iOS.

You don't need it to compile stuff for one specific version of Debian with packages X, Y and Z installed, either. You're comparing apples to oranges.


But Debian v.x.y.z isn't a target; "Linux" is.


Not necessarily. Plenty of commercial packages say "made for Redhat X.Y.Z".

The fact that something only runs on iOS means it's a fairly limited piece of software in terms of portability - which is an acceptable tradeoff for many things, but not for others.


While I (a younger developer) may not have much experience with cathedrals, bazaars, or what have you, I've noticed one major issue come up again and again in this discussion here on HN: no one seems to be able to agree on what these terms even mean, and everyone is talking past one another. I've seen arguments using Office, Windows, iOS, and OSX as examples of Cathedrals, and I've seen just as many using them as examples of Bazaars.

It's very difficult to learn when everyone is just ranting.


iOS, and most of the stuff that comes out of Apple other than the hardware, does not have a single designer. This is one of the greatest myths that Jobs promulgated. He did not hand down directives from on high. Quite the opposite. He waited for new ideas to come from his engineers, then he would wield veto, but that's all.


This absolutely accords with my experience.


I almost posted that at the top-level of this thread, but it was your comment, not the original article, that prompted the train of thought behind my comment. I wasn't arguing with you.


Architects don't build cathedrals anymore, either. Neither do most software architects. Yet, look what they have built: an incredibly diverse collection of structures of all shapes and sizes, all working together to form an imperfect yet efficient system. Surely its success is due in part to its flexibility and imperfection, and the fact that they are no longer over-engineered and inflexible behemoths made from stone.

If you try too hard to design a rigid structure around anything, it can come back and bite you in the ass. Part of the reason why this seemingly disorganized and haphazard collection of buildings makes a working and thriving metropolis is because of its diversity and resilience to change and progress. If you attempt to over-engineer something of such complexity, you might just end up with Pyongyang instead of New York. Instead, we have a competing ecosystem of libraries and options with the good ones theoretically rising to the top. Better communities and communication of these values make this process work even better, just as in the greater economy. For all its complexity, this process works surprisingly well.

For all the explaining Kamp did in this article, the one thing he failed to account for was the resounding success of the current model. He did use a surprisingly apt metaphor, however. Cathedrals are no more; their old place as the center of society had to be explained by a series of myths and lies that placed ideals above reality. The thriving Bazaar easily replaced them at the center of any thriving modern city, based on the simple truth that the dynamic edge of reality was ever-changing, and that quality based on that reality would indeed be more successful.

Could it be better? Of course. But it is reality and truth that will move us forward—not mythology. Good riddance to cathedrals.


So you don't count the iOS and Android OSes as cathedrals ?

Have you never wondered about their surprisingly coherent APIs and wondered why that was so different from, say, UNIX ?

See my other comment about knowing what a cathedral is to begin with.


Fair, but I would describe them more as well-engineered skyscrapers. Or even simply good city planning. They're large structures designed around the idea of flexibility and growth from the outside. Ironically, both of these described platforms have orders of magnitude more functionality in their 'bazaar' app-stores than the foundation itself. The foundation enables the market to work, but in itself it is simply a very well-designed support system.

I guess you're saying that there's no reason UNIX should be different—that it has somehow become a disorganized mess of libraries and code re-use, and that it is broken.

I don't see this. Android and iOS are like brand new cities; Dubai and Abu Dhabi. UNIX is Paris. Perhaps you can make some updates to the city infrastructure, but in the end the streets are still cobblestone underneath, designed for horse carts and not cars. But you have the Notre Dame at the center (ironically, a cathedral), the architecture, tons of history, you have a thriving culture and countless people working and playing in this city every single day with great success. You can complain about it all you want, but there is a reason that people are fundamentally attracted to it.

I think you're right about a lot of things, but if you propose rebuilding Paris, well, you may have some opposition. I think you might want to build a new city, rather than trying to build over very old history with some ideal of the cathedral you think it should have been. And to use your examples again, that strategy has been very successful for Android and iOS.


And now you are starting to see what is meant by "cathedral": Not a place of religious worship, but a construct with a coherent vision.

Read Brooks book.


I knew exactly what you meant by "cathedral," I simply extended the metaphor even further because I thought it was quite strikingly appropriate.

Will do, thanks for the thought-provoking article.


Paris was substantially redesigned as a planned city by Haussmann[1]. London is a better example where the street plan is pretty much unchanged.

[1] http://en.wikipedia.org/wiki/Haussmanns_renovation_of_Paris



> an imperfect yet efficient system.

Citation needed. :) What metric did you use to measure the efficiency of the software bazaar?


> an incredibly diverse collection of structures of all shapes and sizes, all working together to form an imperfect yet efficient system.

Efficient?


This article confused and dismayed me.

Since 2001, we've started in on the XP/Agile movement, which, think what you will of it, test-drives and version-controls relentlessly, and fosters a constant dialog about what quality is and how to achieve it.

Furthermore, some marvelous tools have been written in the last decade; I can hardly see how the author can complain about version-control systems, as Git and Mercurial--both miles better than what was available in 2001--are not least among them.

Other quality-enhancing things that have happened since 2001: continuous integration, build pipelines, Selenium and friends, behavior-driven development, REST architecture triumphant over the cathedral-like SOAP and XMLRPC, JSON over XML, lightweight messaging, and the adoption of small, focused open-source tools over "quality-focused" large-vendor bullshit-enterprise tools.

To the extent that UNIX sucks now, which it doesn't, it's because the hackers who work on it now have lagged the industry--not the other way around. So go find some other group of kids to shoo off your lawn.


You mean: "In 2001 we reinvented XP/Agile because we couldn't be bothered to read the old literature to see if somebody else had done something like that before" ?


I don't think that's quite fair on Fowler. C&B was published only 2 years before his book on XP, and The Toyota Way was published in 2001 as well.

Really, many of the ideas we've come to associate with "Agile"-like methods only started to become codified around the end of the 1980s in more traditional manufacturing. I don't think it's too surprising that it took the software industry another 10 years for this stuff to become mainstream.

Also from your other comment above...

> Ever looked into how USA put a man on the moon ?

I'm sure FreeBSD would be a lot better if you gave it $25 billion USD over 11 years as well.

It takes more than just good will and intelligent people to produce high quality results.


I am afraid that the principles of Agile were clearly espoused by Tom Gilb in 1988, long before the Manifesto etc. He called it Evolutionary Programming, but it is pretty much Scrum.

see: Tom Gilb Principles of Software Engineering Management (1988)

We do have an awful tendency to reinvent the wheel, but I think that building cathedrals vs. building bazaars is the wrong metaphor - I prefer building simple vs. building complicated.

The "bad" people PHK describe are building complicated. The "good" people build as simple as possible, but no simpler.

edit: 1992. Only been building bungalows.


I think it's perfectly fair.

UNIX was originally written as pair-programming.

Brooks refers to agile in The Mythical Man-Month (he didn't invent it, check his references)


Right, and a lot of the lean manufacturing (and 8-hour work day) research that led up to MMM came from work done by Ford and Toyota before World War 1. If there hadn't been two world wars and the cold war since then, I imagine this sort of knowledge might have spread around faster.

You are being overly optimistic about the pace of human cultural development. The rest of the world is only just starting to realise again that perhaps we have cultural issues that lead to low-quality results. Bell Labs had numerous advantages over the rest of the world when it came to fostering the sort of culture that produces high-quality results. Just reading the history books really isn't enough for that to come about.

TLDR; I blame Windows ME on World War 1.


UNIX was originally written as pair-programming.

From http://www.drdobbs.com/open-source/interview-with-ken-thomps... :

KT: I did the first of two or three versions of UNIX all alone.

Later, he emphasizes that they didn't even look at each others' code:

DDJ: Was there any concept of looking at each other's code or doing code reviews?

KT: [Shaking head] We were all pretty good coders.


>>Later, he emphasizes that they didn't even look at each others' code:

I remember reading somewhere (Dennis Ritchie's C history article?) that they once both programmed a solution without consulting each other, and when they later read it, the solutions were exactly the same, right down to the variable names.

Power of C? Two people who thought alike?

>>KT: [Shaking head] We were all pretty good coders.

Which the modern world doesn't have enough of. Demand is so high that the quality of supply doesn't matter, only that there is enough to satiate the demand. This is in stark contrast to the early days, when computer science was a research field open only to scientists, and later to hackers and crackers. When it went mainstream and grew into an industry, it's no wonder that it acquired everything that comes with one.


I'm another grumpy old man, much like I imagine the OP. I have mixed feelings about this topic.

I found the autoconf comments amusing, we have supported a pretty broad range of platforms, from arm to 390's, IRIX to SCO, as well as Linux, Windows, MacOS, and our configure script is 157 lines of shell. Autoconf had its place but I think it was much more useful in the past and now it is baggage. The fact that it is still used (abused?) as much as it is sort of speaks to phk's points.
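For the curious, a hand-rolled check in that spirit takes only a few lines of shell; everything below (the file names, the macro emitted) is illustrative, not our actual script:

```shell
#!/bin/sh
# Hand-rolled configure sketch: compile a tiny test program to probe for
# a header, then emit a config.mk fragment for make to include.
CC=${CC:-cc}

check_header() {
    header=$1 macro=$2
    tmp=$(mktemp -d) || exit 1
    printf '#include <%s>\nint main(void){return 0;}\n' "$header" > "$tmp/t.c"
    # On a system with a working compiler this emits the CFLAGS line.
    if "$CC" -o "$tmp/t" "$tmp/t.c" 2>/dev/null; then
        echo "CFLAGS += -D$macro=1"
    fi
    rm -rf "$tmp"
}

check_header stdio.h HAVE_STDIO_H > config.mk
cat config.mk
```

The whole-project version is just this loop repeated for each feature you actually depend on, which is how a configure script stays around 150 lines instead of 15,000.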

I think the jury is still out on whether the bazaar approach is better or not. It sure is messier but it also seems to adapt faster. I worked at cathedral places like Sun, and while I continue to emulate their approach I also question whether that approach is as nimble as the bazaar approach.

I've voted for the older style of more careful development and I think it has worked pretty well for us, we can support our products in the market place and support them well. The cost of that is we move more slowly, we tend to have the "right" answer but it takes us a while to get there. Bazaar approaches tend to come up with answers of varying quality faster.

It's really not clear to me which way is better. I'd be interested in hearing from someone / some company that is supporting a lot of picky enterprise customers with some sort of infrastructure product (database, source management, maybe bug tracking) and making a success of it with a bazaar approach. Seems like it would be tough but maybe I'm missing some insight.


The odd part about this “get off my lawn” article is that PHK has already shown how to fix the problem: Varnish is both a very good tool and one which has gotten attention and compliments for rejecting obsolete convention (e.g. relying on the VM, requiring a C compiler to be installed on a server, etc.). I would love to see autoconf massively simplified or outright avoided for most projects and the best way to do that would be to start providing good examples of how unnecessary it is even for major projects.

The way to address oversights in the bazaar model isn't to cram everyone back into the cathedral but to build support for change by showing where something is clearly better.

One area where PHK might see this is the way Linux has flown past FreeBSD - not due to endlessly-debated questions of kernel superiority but rather because Linux distributions like Debian provided a clearly superior experience for the overall system by rejecting the decades of accumulated hacks (i.e. the ports system, monolithic config files, etc.) which had been the status quo for years and building better tools to reduce management overhead.


>Linux has flown past FreeBSD . . . because Linux distributions like Debian provided a clearly superior experience for the overall system by rejecting the decades of accumulated hacks

Debian contains tons of accumulated hacks that I wish would be rejected.


A debian system from a user's perspective is so easy to use. It's when I dug into the packaging world that I reeled back in horror and shock.


Digging into /etc (particularly, the scripts that run on boot-up) made me reel back.


I won't disagree with that entirely but the Debian / Ubuntu community has shown considerably above average willingness to make major changes to fix that.


They mostly seem to make a mess while never achieving a design anywhere near as coherent as that of OS X, or as straight-forward as that of the BSDs.


Debian has also introduced new regrettable hacks, like run-parts's special handling (i.e., ignoring) of files whose names contain a period.


Linux has flown past BSD for reasons other than technical superiority, and more to do with the politics and ecosystem of software, especially related to lawsuits, uncertainty, network effects and momentum.

See also: MySQL vs. PostgreSQL.


> See also: MySQL vs. PostgreSQL.

This comparison doesn't hold true: while we might debate the technical superiority of the BSD kernel (and I have some NFS deadlocks to trade against ZFS), running many BSD systems is so much more work that it's more like comparing Postgres to Berkeley DB, where you have to take on responsibility for everything outside of the innermost core.

I ran FreeBSD and OpenBSD on a number of systems for years but was increasingly unable to justify the sysadmin overhead. There were only a couple of times where kernel issues were significant (NFS stability and SMP scalability, esp. when amd64 came out), and since Linux actually came in ahead of FreeBSD for our applications it was really unrealistic to locally duplicate all of the Debian infrastructure.


A very thought-provoking article. (Been programming for pay since 1966 here.)

A lot of the commentary seems to be confusing the waterfall model with the idea of the cathedral.

Another way to make this point is to note how few programmers these days read Dijkstra, or Knuth's TAOCP. (Knuth would have us believe that even he doesn't read it. See Coders At Work.) Among other things, Dijkstra taught understanding the entire program before setting down one line of code. Contrast this with TDD (which, believe it or not, some take to mean Test Driven Design).

Lately I have been in the Application Security business, and nowhere has the issue highlighted by the article been more obvious.

Edit: Mark Williams Company, inventor of Coherent OS, was not a paint company. It started out as Mark Williams Chemical Company, manufacturing Dr. Enuff, a vitamin supplement.


So if I wanted to start reading Dijkstra, what's the best place to start?


His hand-written notes are here http://www.cs.utexas.edu/~EWD/. One biting example is http://www.smaldone.com.ar/documentos/ewd/EWD707_pretty.pdf.

His Discipline of Programming http://www.amazon.com/Discipline-Programming-Edsger-W-Dijkst...

There are some here http://cs-exhibitions.uni-klu.ac.at/index.php?id=31 that might overlap with the texas ones. One famous one is about the cruelty of teaching computer programming: http://www.cs.utexas.edu/users/EWD/ewd10xx/EWD1036.PDF.

His wikipedia page has links to a number of his seminal ideas.


Yes, the bazaar is like evolution: messy, inefficient, and slow. Lots of bad ideas are tried; a lot of them stick around for as long as they provide more value than they subtract; and progress takes a long time. Just as the human body has components that are useless today (e.g., the coccyx), evolving software ecosystems always carry a lot of useless baggage. That's how evolution works.

But evolution copes better than intelligent, top-down design with the evolving constraints of a market landscape that is constantly shifting.


... but evolution will also quite happily run you over a cliff which you would have avoided if intelligence were applied.

Not to mention the fact that the bazaar is never going to put a man on the moon.

And yes, I wrote that piece.


Except that evolution did put a man on the moon...unless you are one of those who don't believe humans are the result of 3 billion years of evolution?

What you have to consider is that individual cells evolved, until they created a whole greater than themselves. No individual cell was in control of that first step Armstrong took onto the moon. Likewise, what we end up creating from so much "bazaar" development will almost certainly accomplish something amazing...

...without any of us being able to take credit for it.


Right. Not never, but after 3 billion years.


And an incredible amount of energy from the Sun.


> Not to mention the fact that the bazaar is never going to put a man on the moon.

I think we can see abundant proof evolution can go a long way towards that specific goal.

The reason your work machine, running BSD, works the way it does (and self-assembles that way) is because each part of its software environment is independent, designed for different goals and constraints and it just happens to work together well enough you can work with it. It would be easy to standardize Unixes, but then you wouldn't be able to compile M4 for Windows or use one of the 26 FORTRAN compilers absent from your system. Also, what happens when you realize your design led you to a dead end? Evolution usually has plans B to Z built in. It's really difficult to make something evolve into a corner.

Having plans B to Z built-in has an impact on the overall elegance and purity of the design.


>The reason your work machine, running BSD, works the way it does (and self-assembles that way) is because each part of its software environment is independent, designed for different goals and constraints and it just happens to work together well enough you can work with it

No, that's not the case at all actually. FreeBSD like the other BSDs is a complete OS, designed and built as a complete OS. It is not a collection of random tools by various people thrown together like a linux distro is.


  arcueid ~ $ du -hs /usr/src
  728M    /usr/src
  arcueid ~ $ du -hs /usr/src/contrib/
  307M    /usr/src/contrib/
So actually almost half of the FreeBSD OS is "random tools by various people thrown together".

Compared to that, in something like Debian the GNU tools, designed and built by the GNU with the intention to form a complete OS, probably constitute a greater proportion of the core OS.


Going by the size of the source code, including the configure scripts, makefiles, etc. for other operating systems, is a rather disingenuous way to measure.


It was the simplest 30-second way to compare I could think of. If you have a more accurate measure then by all means post your results.
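One arguably less naive measure (still rough, and the paths assume a FreeBSD /usr/src) is to count only lines of C source and headers, so that configure scripts, docs and test data inside contrib/ don't inflate the numbers:

```shell
#!/bin/sh
# Count lines of .c/.h files under a tree; errors from a missing
# directory are suppressed, so the script degrades to printing 0.
count_c_lines() {
    find "$1" \( -name '*.c' -o -name '*.h' \) -exec cat {} + 2>/dev/null | wc -l
}

total=$(count_c_lines /usr/src)
contrib=$(count_c_lines /usr/src/contrib)
echo "total C lines:   $total"
echo "contrib C lines: $contrib"
```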


The expression "a complete OS" is not what it used to mean. Are you implying FreeBSD has an X server, a dozen compilers and interpreters, a GUI and other components equally "designed and built" as a cohesive thing?


No, I am implying that those things aren't part of the OS. You can tell, because they aren't part of the OS, they are installed with the 3rd party package management tools. In linux there is no base OS, everything is 3rd party packages. There is no difference between firefox and ls. In BSDs there is, ls is not 3rd party software packaged up to be installed, it is part of the OS, written by, maintained by, documented by the same people as the kernel, the standard library, etc.


I would suspect GNU ls is more a part of a Debian install than, say, Gnome desktop. While not as clear-cut as in the BSD world, it's still the same idea.

Except that GNU ls was not designed to run on top of the Linux kernel, which is, on my book, a feature rather than, as you imply, a bug.


The only thing keeping ls on your debian system is other packages having a dependency on it. There is no distinction between operating system and 3rd party packages, all of it is 3rd party packages. I have no idea where you are getting this weird notion about ls and where it was designed to run being a bug. I simply corrected an erroneous statement about what makes up a BSD operating system.


Why would it make a difference whether ls is part of the operating system or not? It works.

And if, next week, I write a better ls, I can install it and use it.


Would you mind simply reading the thread? You are inventing an argument that does not exist. I did not make a value judgement, I corrected an erroneous statement. Nothing more.


I did read it. I just think the "complete OS" thing doesn't really make sense. Is it superior because of it? More elegant?


I do not know how to make this any clearer for you. You said the reason his FreeBSD system works is because a bunch of independent components happen to fit together. This is false, and that is what I said. That's it. If you said 1+1=3 and I said "actually, 1+1=2" would you then insist on a giant thread of trying to create an argument about whether or not two is a good number?


How is that a fact? "Open Source" space endeavors already exist.

Also I know NASA is famous for their solid code, but personally I still question the amount of effort that goes into it. Just because their approach works (if you can afford it) doesn't prove that there couldn't be a more effective way.

I suppose with infinite money you can make quite a lot of approaches work. You could have an army double check every line of code. Instead of pair programming, why not have 10 programmers look over each others shoulders?

Lastly, lots of intelligent beings happily drive over cliffs all the time, just look at the financial crisis.


> I suppose with infinite money you can make quite a lot of approaches work. You could have an army double check every line of code. Instead of pair programming, why not have 10 programmers look over each others shoulders?

Bingo: it's all about the economics of various tradeoffs. If I have a company that puts together web sites for small companies in the area, and we have NASA style coding standards, we're going to completely price ourselves out of the market. If NASA applied "little web shop" coding standards to their projects, they'd be lucky to get anything up in the air before it exploded.

The important thing is to be on the curve though, not behind it, meaning to make the best of the tradeoffs the economics of what you're doing push you towards.


> Not to mention the fact that the bazaar is never going to put a man on the moon.

Spacex's "careers" page has a whole lot of "linux" jobs listed, so the Bazaar at least seems to be contributing heavily to potential future space travel: http://www.spacex.com/careers.php


I've seen you mentioning some variant of "put a man on the moon" several times in this thread, apparently referring to the cathedral-like project to do so, along with several comments along the lines of "Microsoft software isn't a cathedral because it's designed to create a monopoly market, not software".

Unfortunately, those statements seem to be contradictory; the United States ran the Apollo program for national pride, to cow the Soviets, not out of some grand desire to visit the moon. The science done by Apollo was secondary, and if you believe exploring space was a primary goal, you are going to have a hard time with the question, "why haven't we done it lately?"


Evolution put a man on the moon already in 1969.

Maybe you will not accept this statement, for reasons similar to the ones driving your essay.


phkamp: "never" is a looong time. Who knows what the Bazaar can achieve through 'dumb' trial and error over the coming decades and centuries?


http://en.wikipedia.org/wiki/Infinite_monkey_theorem comes to mind. Both efforts (cathedral and bazaar) have produced really useful software, but sometimes I have the feeling that the whole idea of "thinking about a problem" has been replaced by "hit the keyboard until something happens".


Lambasting libtool for providing a consistent experience across *NIX is, imo, not the wisest move for a FreeBSDer.

Article: This is a horribly bad idea, already much criticized back in the 1980s when it appeared, as it allows source code to pretend to be portable behind the veneer of the configure script, rather than actually having the quality of portability to begin with. It is a travesty that the configure idea survived.

Good high-minded notions here. But configure, with its standardized parameters for how to do stuff, is near irreplaceable at this point. Certainly a more stripped-down version, one not built on M4, would be wise, but libtool/autoconf itself is used too broadly and with too much familiarity by developers and upstream maintainers: in spite of so much of it being old, deprecated, no-longer-useful cruft, the best we can hope for is duplicating a wide swath of the existing functionality in a cleaner manner.

But at what cost would reimplementation come? How many weird build targets would go broken, unnoticed, for months or years?

The place where we escape these painful histories is where we leave the old systems programming languages behind. Node's npm I'd call out as a shining beacon of sanity, enabled by the best, most popular code distribution format ever conceived: source distribution, coupled with a well-defined, not-completely-totally-batshit algorithm for looking up said sources when a program runs and at runtime goes off to find its dependencies: http://nodejs.org/docs/latest/api/modules.html#modules_all_t...
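The core of that lookup is simple enough to sketch in a few lines of shell (this ignores core modules, package.json "main" entries and file extensions, and the paths below are made up for the example):

```shell
#!/bin/sh
# Walk from a starting directory up to /, looking for
# <dir>/node_modules/<pkg>, the way node resolves a require()d name.
resolve_module() {
    dir=$1 pkg=$2
    while :; do
        if [ -d "$dir/node_modules/$pkg" ]; then
            echo "$dir/node_modules/$pkg"
            return 0
        fi
        [ "$dir" = "/" ] && break
        dir=$(dirname "$dir")
    done
    echo "module '$pkg' not found" >&2
    return 1
}

# Fake a tree and resolve from a nested directory.
mkdir -p /tmp/app/node_modules/express /tmp/app/src/deep
resolve_module /tmp/app/src/deep express   # prints /tmp/app/node_modules/express
```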


Why should I as a FreeBSD person not be allowed to lambast the worst hack-on-hack-on-hack-on-hack I have to suffer ?

And trust me, libtool is replaceable; all it takes is agreement on a compiler and loader flag for producing shared libraries, and you suddenly don't need it at all.


But that agreement would be based on our current best understanding of how computers should work. When you factor in the future (and evolving an existing codebase is a huge problem) we have to assume any understanding we have now is incomplete and flawed.

That's why things like libtool or autoconf evolved (or, better, were "iteratively designed") to be able to grow and encompass varying and different goals.


So what you're saying is that progressively tidying up a codebase to simplify fulfilling current requirements is always a bad idea because you might remove something that might make a hypothetical future requirement easier to fulfill?


No. I'm just reminding ourselves that we don't know what we'll need in the future, that any decisions we make now are subject to change down the road, and that it's foolish to assume we can design now what we'll be using ten years from now.


Dynamic linking has not changed in substantially interesting ways in the past 10-15 years.

We understand the pitfalls and complications, and can come up with some solutions on how to handle them in a forward-thinking way.

If things break in the future, we can revisit this then.

In the meantime, libtool is one of the most ridiculous time-sinks I've ever had the displeasure of working with.


Compiling and linking a library for XLC/C++ v7.0:

http://publib.boulder.ibm.com/infocenter/comphelp/v7v91/inde...


That's just defeatism, not arguing against cathedral-style development. Just because we're likely to be wrong to some degree doesn't mean we shouldn't try to be as close to right as possible.

Anyway, I get the impression that for the original example of libtool, cathedral vs bazaar is the wrong question to be asking. [1] The problem isn't one of design or implementation. It seems a problem of distribution. I posit you could build a simplified, cross-platform library linker via either a centralised or distributed process. The question is rather, could you get everyone to agree to actually use it? Considering the success of clang, I think you'd stand a reasonable chance. (speaking of which, clang doesn't exactly stand out as a pure-bred bazaar-model example; neither does GCC for that matter)

Actually, I think that's what's wrong with this whole thread and the article that triggered it. I'm as fed up as anyone with the layers of crap that we're building our software on. I seem to spend most of my time yak shaving because of some legacy decision, not contributing new stuff. In fact, my main output seems to be adapters for crappy interfaces. So while I find myself nodding along with the original essay, I think cathedral vs bazaar has nothing to do with the quality of the code we use all the time. It's a question of having the resources and balls to replace the layers of crap when necessary instead of carrying on piling more of them. Whether or not that's possible is mainly a question of whether the systems you absolutely need to retain are open or not. (and open source software can still be developed cathedral-style - I'm pretty sure ESR considered GNU to be cathedral software)

As an aside, when was the last time the format of .a/.so/.dylib/.lib/.dll files drastically changed on the respective operating system? We change even prevailing processor architecture a lot more often than that!

[1] I'll have to admit ignorance on the specific detail on the problem that libtool solves (inelegantly). I hope that doesn't derail my argument too much.


> Just because we're likely to be wrong to some degree doesn't mean we shouldn't try to be as close to right as possible.

Of course not! We should try to get as much of what we are doing right. We also must realize we are not as clever as we think we are (or would like to be). Cathedrals are monuments to imaginary deities. As such, they aren't subject to the laws of reality. Our efforts should, OTOH, be guided by what's real and measurable and take into account that what we "know" but can't measure is nothing more than an educated guess.


I was quipping that I don't wish to see FreeBSD drift any further away from what unification there is.

My fear is that the awful hack-on-hack-ad-infinitum we do have at least binds us together, even if it is in misery. I'm not at all uncomfortable with replacing libtool, and I'm certain a very good job could be made of it. But I can hear 18 m68k aficionados crying out in pain that the new toolchain doesn't work on their NetBSD system, or the guy who wrote his own C compiler that needs some extra rules feeling totally shafted because now he has to write M4 and "M5," which for some reason he cannot stand any more than M4, even though it looks nothing like it.

We stand unified. Not by good things, but by the woe we all suffer. To those out there seeking to crush that and make better, I wish you the best! Please keep in mind a simple mantra as you go about inventing the future: do no harm. Libtool may be awful, but it's how distributions are made.

phkamp, sorry for missing the reply. M4 is ugly, libtool takes forever, everything in it is a hack. It's proven to be an at least adequately flexible hack that has kept *nix unified, more or less. I respect that, but I'm not a systems programmer that gets burned by it on a regular basis either, and I don't mind that my OpenWRT compiles of all packages take 32 hours and it's all libtool's fault.


That "stripped down version, one not built on M4" exists and is called CMake. I've used it and it's very, very good.


I think FreeBSD and the Linux distributions try to cater to too many different people, and quality and coherence suffer a lot from this. I think we can get past this, though. The culture of testing and good code is on the ascendant again in many quarters. You need more people to understand build, packaging and distribution better, sure. You also need autotools to die, as the use cases for it are mostly dead. You can generally write portable code for the systems that matter if you want to now, and it just works.

A lot of the problems are due to poor integration between languages, so for example the JVM people have reimplemented almost everything, as have the C++ people.


nodding violently.


"Later the configure scripts became more ambitious, and as an almost predictable application of the Peter Principle, rather than standardize Unix to eliminate the need for them, somebody wrote a program, autoconf, to write the configure scripts."

I'm not sure I understand how the Peter Principle applies here? Autoconf seems a rational solution in the economic sense: to standardise Unix, you need to have lots of influence to effect buy-in, whereas to write autoconf, you need a big dose of hacking talent. If you have the latter and not the former...


I had a hard time understanding it as well. I interpreted it as the Peter Principle also applying to software, which rises to the level of its incompetence, not just people.


Right, libtool is the same - the people that wrote it weren't in a position to demand that all the UNIX-likes out there standardise their ld flags, so they routed around the problem instead.


>the people that wrote it weren't in a position to demand that all the UNIX-likes out there standardise their ld flags

Agree, but once Linux became the dominant Unix-like, the major Linux distros like Debian and Redhat were probably in a position to replace uses of libtool in upstreams with a distro-wide standard for ld flags.


They could, but what would be the advantage? Such patches couldn't be accepted by upstream, and the Debian/Redhat source packages are mostly only built by Debian/Redhat maintainers, so would the juice be worth the squeeze?


I don't claim to know what other upstream maintainers would've done, but if I had been one in the 1990s, and Debian and Redhat had agreed on a std for ld flags, I would've announced that libtool would be removed from my project in 24 months, so other Unix-likes should get on the train.


Autoconf is an easy target for these kinds of rants, but you know what? It does its job, and it does it very well. The ratio of autoconf to non-autoconf programs on my system is probably 10:1, but the ratio of build problems is something like 1:20.

If anyone ever managed to write a genuinely better build system, the bazaar would let it rise to the top; the gradual rise of e.g. cmake is testament to this. Trying to impose one solution top-down, e.g. the LSB standardization of RPM, has a far worse track record than letting the bazaar do its thing.


When OpenBSD replaced GNU libtool with a home-grown perl version, it was so much faster that I believe it literally cut days of machine time off a full ports build. For smaller packages, with tiny C files, running libtool.sh takes longer than running the compiler does. The majority of build time for some of those packages is still running configure, testing for things like <stdio.h>, which the package provides no workaround for when missing. The OpenBSD project alone has spent years of machine time running configure and libtool.

As for doing its job well, the failure mode of configure "you can't build this" is abysmal. Just give me a fucking Makefile, I'll fix it myself. I love packages that come with Makefiles that don't work. I pop "-I/usr/local/include" into CFLAGS, run make again, and boom. Done. Trying to do the same with configure? Forget about it. --with-include-dir or whatever doesn't work, because it's really running some bogus test in the background which expects /bin/bash to exist and so on and so forth.


>When OpenBSD replaced GNU libtool with a home-grown perl version, it was so much faster that I believe it literally cut days of machine time off a full ports build. For smaller packages, with tiny C files, running libtool.sh takes longer than running the compiler does. The majority of build time for some of those packages is still running configure, testing for things like <stdio.h>, which the package provides no workaround for when missing. The OpenBSD project alone has spent years of machine time running configure and libtool.

Sounds like the bazaar in action. I hope they succeed, but I know two gentoo projects that tried to do the same thing and were eventually abandoned as unworkable.

>As for doing its job well, the failure mode of configure "you can't build this" is abysmal. Just give me a fucking Makefile, I'll fix it myself. I love packages that come with Makefiles that don't work. I pop "-I/usr/local/include" into CFLAGS, run make again, and boom. Done. Trying to do the same with configure? Forget about it. --with-include-dir or whatever doesn't work, because it's really running some bogus test in the background which expects /bin/bash to exist and so on and so forth.

Sounds like you know make better than you know autoconf; I find it easier to fix autoconf problems on the rare occasions when they fail.


Familiarity with make may be part of it, but I picked up that level of familiarity in about five minutes. Common problem: a program adds -ldl to its linker flags (which doesn't work on OpenBSD). Makefile fix: vi Makefile, /-ldl, xxxx, :wq. Done. Autoconf fix: I dunno, but I'm sure it involves a lot more typing. Most Makefiles are simple, and most Makefile problems and their fixes are simpler still. The dumbest thing that could possibly work often does.
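The vi recipe above, sketched as a sed one-liner for anyone following along (the Makefile below is a hypothetical stand-in, not from any real port):

```shell
# Hypothetical Makefile fragment with the Linux-only -ldl flag.
printf 'LDFLAGS = -lpthread -ldl -lm\nall:\n\t@echo ld $(LDFLAGS)\n' > Makefile
sed -i -e 's/ -ldl//' Makefile    # same effect as vi's /-ldl, xxxx, :wq
cat Makefile                      # LDFLAGS line now reads: -lpthread -lm
```

One line of sed versus rerunning an entire configure script is the asymmetry the parent is describing.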


I'm nodding so hard my neck hurts.


Have you seen redis?


I hear this statement quite a bit: that build problems appear as soon as someone does not use autoconf.

However, in the past I’ve had the opposite experience: trying to port software such as Apache, PHP, or Bacula to UNIX systems such as SGI IRIX, I always ended up writing a simple Makefile to compile the software instead of putting up with the multitude of autotools fixes that would have been required. I reported one or two clear issues upstream and they have been fixed, but some time will pass before the relevant fixes arrive in the projects (PHP in particular shipped with an old version of autotools).

As a counter-example, take i3-wm: it ships with a GNU Makefile (OK, multiple Makefiles in subdirectories, one for each tool) and compiles on Linux, Mac OS X, FreeBSD, OpenBSD, NetBSD. Now let’s have a look at each of these:

• NetBSD: No patches required to the makefiles: http://cvsweb.se.netbsd.org/cgi-bin/bsdweb.cgi/wip/i3/patche...

• FreeBSD: No patches required to the makefiles: http://www.freebsd.org/cgi/cvsweb.cgi/ports/x11-wm/i3/files/ (they do their usual change to /usr/local in the ports Makefile)

• OpenBSD: several patches because OpenBSD lacks SHM support, doesn’t want (?) to use pkg-config at compile-time and quite a few bugfix backports: http://www.openbsd.org/cgi-bin/cvsweb/ports/x11/i3/patches/

I would argue that porting i3-wm to another platform is easy because you can understand the Makefiles and it’s very clear what they do.

As a conclusion, I just wanted to show you that there are counter-examples to both: situations where autotools really does not do a good job and situations where you can deliver good Makefiles without using autotools at all.


This resonates with me. I've recently inherited an application that uses autoconf. Makefile.am files generate Makefile.in files, which become Makefiles, which actually do the build. Then there's autogen.sh, configure.ac, and acinclude.m4. If I want to change something in the build process, I don't have a clue where to start.

The real kicker? It's a Python application, so there isn't much of a build process anyway. You can run it directly from source.
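For anyone equally lost, the generation chain in a conventional autotools setup runs roughly like this (these are the customary file names and tools; individual projects vary):

```
configure.ac + acinclude.m4  --(aclocal)-->      aclocal.m4
configure.ac + aclocal.m4    --(autoconf)-->     configure
Makefile.am                  --(automake)-->     Makefile.in
Makefile.in                  --(./configure)-->  Makefile
```

autogen.sh is typically just a wrapper that runs aclocal, autoconf, and automake in order (or calls autoreconf, which does the same).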


>However, in the past I’ve had the opposite experience: trying to port software such as Apache, PHP, or Bacula to UNIX systems such as SGI IRIX, I always ended up writing a simple Makefile to compile the software instead of putting up with the multitude of autotools fixes that would have been required. I reported one or two clear issues upstream and they have been fixed, but some time will pass before the relevant fixes arrive in the projects (PHP in particular shipped with an old version of autotools).

You know that you can replace config.guess and/or libtool with updated versions? Or run make -f Makefile.cvs to regenerate configure etc. from aclocal.m4?

And I would argue that the fact that you had the option of writing a Makefile is itself an advantage of autoconf, because it was written to generate Makefiles, with exactly the kind of layering the OP complains about as cruft. A more "cathedrally" solution would probably have replaced Makefiles entirely, but by having autoconf generate them, it remains possible for people who know make to, in extremis, edit the generated files.

>As a counter-example, take i3-wm: it ships with a GNU Makefile (OK, multiple Makefiles in subdirectories, one for each tool) and compiles on Linux, Mac OS X, FreeBSD, OpenBSD, NetBSD.

Last time we had a post from the same guy I offered up two challenges: a) cross-compile the program for ARM, b) build the program on a version of SUA (the windows posix compatibility layer) that didn't even exist when the program was released. Both of these are things I can and have done trivially with autoconf-based builds; for most projects it requires no editing of files at all, just replacing config.guess with the latest version for challenge b. I'd be interested to see how building i3-wm was in those two cases (though not interested enough to go to the effort of actually doing it).


a) From the looks of things, "make CC=gcc-arm" should just work.

b) It's an X11 window manager. It's not likely to compile with SUA, and I doubt any amount of autofucking is going to change that.


SUA comes with a perfectly good set of X11 client libraries. You can ssh into a Windows box and run your window manager from there with no problems (other than wondering why you'd want to).


Oh, cool. Sorry, I didn't know that.


Autoconf is certainly extremely functional but it is not Good.

The irony in all of this, of course, is that autoconf was created to solve the portability issues between platforms that came from the Cathedral model.


No, autoconf happened because UNIX abandoned the cathedral model and went bazaar.


More accurately, it went multi-cathedral: major vendors were building temples to their self-perceived importance and unless you were a major customer they weren't interested in borrowing a better idea from someone else.

The bazaar at least opened up the option for most users to have some influence over the process.


Indeed, the first "Free" UNIX was released in the early '90s. The UNIX fragmentation of the '80s doesn't really describe the Bazaar of CatB to me either.


It doesn't change the fact that it's terrible and that better alternatives exist (for instance, CMake), making it yet another piece of cruft from the past.


The bazaar is perfectly capable of adopting improvements that prove themselves, and eliminating cruft from the past, if the cost is worth it. See the recent systemd fuss. If CMake really is better then eventually all important projects will shift over to it, and distributions will stop installing autoconf by default, or eventually at all. The fact that this hasn't happened already tells us that the practical difference isn't big enough to justify expending a lot of effort on switching.


Disagree.

The bazaar model SUCKS at getting rid of cruft. New stuff gets added, sure, but old dependencies never seem to go away. It's just accretion. The prevailing attitude is that "hard drive space is cheap."

This inevitably leads to calcification and the perpetuation of sub-par solutions, like auto-crap. It gets used because "everything else uses it".


There are many old libraries that have been phased out and replaced in the OSS community. It's not fast, but it's pretty good compared to the commercial world, where companies like Microsoft or Oracle keep shipping everything, gaping security holes and all, until the last customer stops using it.


  The fact that this hasn't happened already tells us that 
  the practical difference isn't big enough to justify 
  expending a lot of effort on switching.
Are you arguing that it can't be a good idea, because if it was a good idea it would already have been done?


Yes, you're quite right, I can't use that argument. But the fact that similar ideas have been done tells us that the system is capable of implementing good ideas; autotools' continued existence is not the result of a general problem with the bazaar model; there must be some narrower reason why it hasn't been replaced.


Yeah, but CMake (with roots in GE corporate development and large government contracts) isn't exemplary of bazaar development. If anything it underscores the inferiority of the bazaar.


> The ratio of autoconf to non-autoconf programs on my system is probably 10:1, but the ratio of build problems is something like 1:20.

I guess you're running Linux, right? I'd be VERY interested to see this ratio if you ran Solaris.


FreeBSD for what it's worth


Interestingly, git just uses a Makefile. No cmake/autoconf.


I write this as a thankful and proud user of Varnish, which I assume phkamp would place as a cathedral type of software. I agree on quality requiring someone being responsible, on the importance of code reuse, and on the observation of a full 10 years of dumb copy/paste (though I believe in copy/paste). Eric Raymond's Cathedral/Bazaar distinction is a somewhat useless dichotomy for the problems mentioned. Here are two examples of how Varnish could do better (independent of being developed as bazaar or cathedral):

1) Varnish does not support SSL, which complicates deployment architectures; part of the blame goes to the lack of a good SSL implementation to copy from, or to build an SSL proxy around: https://www.varnish-cache.org/docs/trunk/phk/ssl.html I find this an example of neither bazaar- nor cathedral-based approaches yielding something "acceptable" in the past 15 years to copy and paste from.

2) Varnish VCL, as a DSL, reads like an obscure M4 macro language, and the functions are not well planned; another guy (from the bazaar) could design a more coherent language.

=> To my mind the culprit is the scarcity of monetization opportunities for infrastructure components, which pushes talent (candidates for "being responsible") elsewhere, to where the money is. I mean, nginx/Varnish developers should earn hundreds of millions, according to their merit. People are making billions by using Ruby, Python, Rails, Sinatra, Django, Debian, Varnish, nginx, OpenSSL, and hundreds of libtool-built libraries... but their everyday creators don't get enough reward back to feel responsible. Passion and hobby, benevolence or PR are the only drivers for most infrastructure and library development. Some developers lose their faith within a man's lifespan, fight back, or quit in full despair.


A restatement of the old dichotomy: the rebel alliance of young, creative, anarchist hackers who are more interested in fun versus the empire of old, methodical, hierarchical business-programmers who bean-count every line of documentation and analysis.


Other related, though not entirely equivalent ways of putting it: the Richard Gabriel distinction between the 'MIT' and 'New Jersey' approaches, and the old debate between scientifically optimized central planning versus the chaos (and possibly emergent order) of the marketplace.


I doubt the dichotomy is between the cathedral and the bazaar.

All software projects have a list of gatekeepers or maintainers, who decide which changes should go in and which ones shouldn't. Some of them have a long list of people with commit access (like Subversion), while others have a handful of maintainers (like Linux). Some of them have grand roadmaps and bug trackers (like Firefox), while others have no agenda or bug trackers (like Git). Some projects have many dependencies, use libtool and a lot of auto-magic (like Subversion), while others are still adamant about not having a configure script, maintaining a handwritten Makefile, and having one or two dependencies (like Git).

From what I've noticed, as a general rule of thumb, the projects with a few strict maintainers are better off than those that give away commit access easily or have lazy maintainers. It seems obvious when I put it like that.

The way I understand it, the cathedral model is about carefully planning out the project, assigning various people chunks of the total work. The bazaar model is open to accepting random contributions from a wider audience, and runs on a looser agenda. It's just that the cathedral model works better for certain kinds of software (games, office suites, professional audio/video suites), while the bazaar model works better for other kinds of software (web servers, version control systems, development toolchains). When it comes to an operating system, having a curator decide APIs and standards (OS X) certainly wins over a mashup of billions of packages (Debian).


having a curator decide APIs and standards (OS X) certainly wins over an mashup of billions of packages (Debian).

That's your opinion, and you're entitled to it, but it certainly doesn't make it true. To me, Debian has always been the most elegant of operating systems, with a lot of thought put into how things are laid out and making sure that literally thousands and thousands of programs play very nicely together, not to mention compile and run properly on a multitude of architectures. And while I'm sure you can drag up some anecdotes about how Debian didn't detect some guy's wireless chipset, I can find complaints of things in OSX (right in the replies to this article!) that I've never had a problem with in Debian. Not to mention I don't like being told what I can and cannot do with my hardware.


True, I use Debian myself and can't stand OS X. I meant that for an average end-user, OS X makes more sense.


I hadn't realized that autoconf used M4. M4 is amazingly hard to work with, mainly because it uses ` and ' as delimiters, making it very hard to even read unless you have syntax highlighting set up to show those quotes as different characters.

Here's an example from the web:

   define(`START_HTML',
   `<html>
   <head>
     <meta http-equiv="Content-Type" content="text/html;
      charset=iso-8859-1">
     <meta name="Author" content="D. Robert Adams">
     <title>$1</title>
   </head>
   <body text="#000000"
     ifdef(`BACKGROUND_IMAGE',
           `background="BACKGROUND_IMAGE"')
     bgcolor="#e5e5e5" link="#3333ff"
   vlink="#000099"
     alink="#ffffff">
   ')


m4 allows you to redefine the delimiters (yes, it's that crazy). Autoconf uses square brackets.


It's a matter of perspectives. The article author's is that of taste. The bazaar folks' (including web startup folks) is that of practicality. To illustrate using the same example from the piece: so what if a bunch of crypto code is copy/pasted? It's all out in the open; if anyone ever comes up with an actual problem with the copy/pasted code, how hard can it be to fix?

The fact of the matter is that the world doesn't have a critical mass of people with agreeing tastes and fundamentals to get them to come together and build something moderately complex.

What we do have however is a world full of reasonable people who just want to build something and make it work so they can solve some problem of theirs. There are also people who will happily reuse what the folks before them built - improving it in the process.

If we were to impose a high barrier to entry, requiring people to code and design systems in the absolute best possible way, we would effectively be saying that only a few will build software for the whole world. That'd be a net loss IMHO.

The point about bad design is worth arguing when software doesn't do what you want it to do, i.e. fails in practice. But that is mitigated by the fact that each individual stall in the bazaar does enforce some design, some testing, some sort of sanity check to ensure it at least works most of the time.

Is it a sad situation? Yes, we would be much better off if every programmer were perfect. But as long as that is impractical, the bazaar alternative works, creating tons of fixable, mostly usable, continuously improved software, which is why I can hardly call it a lost cause.


> To illustrate using the same example from the piece - so what if a bunch of crypto code is copy / pasted? It's all out in the open - if anyone ever comes up with an actual problem with the code copy / paste, how hard can it be to fix it?

Crypto is a bad example to use. Crypto is hard. I don't want random people writing their own crypto, I want them to carefully use other crypto packages, after having spent a long time reading the documentation.

Copy/pasting is better than writing their own from scratch, but it still leaves them potentially open to weird edge cases, now with a false sense of security because they copy/pasted.


I think the big problem with all the copy/paste code is that you wind up with many redundant dependencies and equivalent pieces of software of varying quality. Given that it's open source, you can't rely on any kind of competitive (i.e. market-driven) mechanism to improve all of these redundant pieces of software over time. This is an inefficient part of how things are done... but that's not to say there's an obvious, more efficient alternative to what we're doing now.


Absolutely. It is a problem in that it is inefficient and error-prone. And @DanBC: yes, no doubt people shouldn't be writing their own crypto (though to be fair, copy/paste is not akin to writing your own crypto; it's just a wrong way to reuse).

But my point was that this isn't an "it doesn't work" type of problem; it's an "it can get better" type of problem. I've seen many instances where these types of problems are discussed on the mailing lists of several projects, and people with an itch to fix them go ahead and do it. It just highlights the differences in perspective: get it to work first, and even if we had to hack it, we can make the design better later on if enough people are bothered by it.


People are getting confused between two different categories for software.

Software License metric: [F] Free and Open Source, [P] Proprietary/Closed source

This metric can also be modeled as a continuous variable rather than a discrete variable. But let us stick to two values for simplicity.

Development model metric: ranges from extremely [C] Cathedral-type, .........., to extremely [B] Anarchism/Bazaar-type

FOSS proponents don't care about development model as long as it's FOSS.

Let [x][y] denote the Software License metric (x) and Development model metric (y) of a software project.

Observations:

1. [P][C] is the combination that FOSS proponents hate the most.

2. [x][B], where x ∈ {F, P}, is less peer-reviewed (anyone can commit anything), so there is less accountability/responsibility; it is highly decentralized, so there is no guarantee of quality.

2.1. [P][B] sounds like a contradiction!

2.2. [F][B] Poul-Henning Kamp seems to have problems with this kind of setups.

3. [F][y] where y -> B (i.e. closer to [B] than it's closer to [C]). Mostly same as [F][C] except that it is mildly better.

4. [F][C] In this setup, the cathedral authority is the bottleneck in improving the project.

5. [F][y] where y -> C (i.e. closer to [C] than it's closer to [B]). This kind of setup "works like a charm!" See the overall success of GNU/Linux in the industry! There are some people who act as maintainers of Linux, but come on, you too can become one! The more popular such a software is the more thoroughly it is reviewed. "Given enough eyeballs, all bugs are shallow."


This comment thread is a bit depressing, but ignoring that.

One thing about the piece that has me scratching my head is whether the mess that is dependency management in OSS operating systems (and OSS software distribution models generally) matters _enough_. While I very much share the author's reaction of "can't we do better?", it also feels that fixing that mess isn't just a design exercise as much as a social one, because OSS isn't really a bazaar but a constellation of connected more-or-less-cathedralish bazaars. And I don't know that this mess is deeply problematic (although the author does find some egregious issues).

Design happens at a specific scale, and some scales reward investment in design more than others. Designing a chair that can be mass-produced is more effective than designing a room, which can't. One could argue that designing a self-contained piece of software (e.g. the Python runtime) matters more than designing a deliberately open system (e.g. the ecosystem of Python libraries), and that the alternative (random competition, forking, etc.) is Good Enough.

[for carbon dating: had my first IT job in the early 80s, as a teenager]


"Quality happens only when someone is responsible for it."

I admire that as an example of good headline-writing: it tells you one of the important points that the article wants to make. I find it interesting to read this piece in light of recent discussions about Worse-Is-Better vs. The-Right-Thing: I read PHK as arguing definitely from a The Right Thing perspective. I find his argument appealing, but I'm not entirely sure that I'm persuaded by it. On the appealing side, I look at Apple, Python, and emacs: things I use every day, things shaped unmistakeably by someone with vision, authority, and taste, and think that there's got to be something to this argument.


> Brooks offers well-reasoned hope that there can be a better way.

I'd prefer to see the better way, in terms of a working system, rather than read about it. That doesn't mean something that's better than, say, Linux in whatever niche: that's not too hard. I mean something that can be used and repurposed for things the original authors had no idea about.


I'm not a coder. And that piece illustrates why Maemo went nowhere with the general public and why Nokia's Internet Tablets failed. I tried to download software only to find out they required "dependencies." Then I had to chase those down. And once I did, I found the software to be buggy and unreliable -- and that's why people go buy iPads. They don't want to jump through so many hoops to get so little. The journey is no reward.

Edited to add: See "The Paradox of Choice" by Barry Schwartz too.


The Nokia Internet Tablet failed for the masses because they did not invest in it. Instead they had an internal fight, Symbian vs. Linux. Neither side won; Apple and Samsung did instead.

BTW: Did you know that Maemo has an automatic Package Manager that resolves dependencies automatically?


You are getting into side issues here instead of the user experience. And such Package Management did not exist when I had the 770 -- at least if you were getting Maemo apps outside of whatever "curation" Nokia might have been attempting at the time.


That's why you need a curated package repository. I don't have to even know that "dependencies" exist to use Debian or Ubuntu; the package manager is nice enough to inform me that it determined that these-and-these other packages are required to install the package I wanted, and that it's now going to automatically install them for me, but that's just details that could be easily abstracted out.


Curated package repositories are just a band-aid on the disaster. They improve things by centralizing the ridiculous and unnecessary cost of dealing with the mess.

It would be better to simply not have that cost.


Eh? The OP here was lauding the ipad as some sort of "solution" to the "mess," but Apple's app-store is pretty much the purest and most absurdly restricted example of a "curated package repository" around!

Debian's (and Ubuntu's, etc) package repositories are a breath of freedom and flexibility by comparison...


Curated Linux package repositories are all about making painful things slightly less painful, including:

- Managing complex dependency graphs. There are often incompatible changes in the nodes of the dependency graph that require wholesale updating of every node linked to the changed one.

- Managing complex installation. As per UNIX standards, the files which comprise a single software package are spread across the file system in such a way that it's effectively impossible to clean them up without a package manager properly tracking them.

Apple's "curated package repository" is something else entirely.


>>>The OP here was lauding the ipad as some sort of "solution" to the "mess,"

Note I was not lauding the iPad. I was using it as an example of the kind of Click & Go experience people want. I could have very well said Windows machines vs Linux boxes, because most Windows programs contain everything you need to go without searching for "dependencies."


Maemo was apt-based as well.


The bazaar was possible because cathedrals like Unix existed which were robust enough to gloat over failures of sound design on the part of newbie programmers. Yes, we are now coming to increasingly complex systems on the web too, which are still held together with nothing more than duct tape. Scripting languages which are not type-safe, cut-and-paste, and tolerant browsers which hog memory are all manifestations. Not a great way to build a huge edifice, I would say.


It seems like FOSS and Unix-related software development was never held accountable to customers, so it never made uniformity, or even reliability, a requirement. If something doesn't work, it's not my fault! It's one of the thousands of other products I depend upon that aren't shipped with the operating system that's to blame. "It worked for me" is the cry of the guilty developer.

You'll notice that Windows development follows a model based on at least the principles that code isn't cheap and you are accountable to a customer. Even Freeware is developed with a mentality that ensures the shipped application will work as well as it can without the customer needing to do anything. Duplicated code becomes a requirement (when you can find code to duplicate) and dependencies are non-existent.

The idea that you have go get some other product and install it to make a Windows app work would be absurd to a user. Yet in the FOSS world, it's absurd not to assume this. There's also some expectation of payment from one of these users. Could it not be that the quality and usability of FOSS could benefit from more commerce?


You're not old enough to remember how "InstallShield" came about, are you ? :-)


Nope. Care to enlighten?

(Regardless of what may have prompted its development, its use since around 1992 doesn't really discredit my point... It takes care of things for the user that FOSS doesn't deal with, or at least not in a portable way)


InstallShield was created by a startup to give Windows apps a non-insane way of installing themselves.

Microsoft was forced to buy them, almost at gunpoint, by third-party software developers.


I don't think they were ever acquired by Microsoft. Perhaps you're thinking of Macrovision.

They've gone through several company names and changes in ownership: http://en.wikipedia.org/wiki/InstallShield


A minor point, but this is wrong in several ways.

InstallShield was never acquired by Microsoft, and was never actually very good. You can think of it as a leaky abstraction over the underlying horror of Windows Installer, allowing just enough to poke through that it was unreliable. There has never been a really sane way to install Windows apps, and there still isn't; the closest approximation today is WiX.

Possibly the manifest-based Windows 8 application packages provide a better story, I'm not familiar enough to say.


Cool. Now if only FOSS would adopt a non-insane way of installing software...


"apt-get install <name of package>" is about as good as software installation gets


Actually there's much better, but none that are freely available. And it's still distribution-specific which is retarded.


100% agreement from here. Android's idea of a uid per app is a good place to start.


Part of me wonders if git provides a hint here... What about a sha1 hash per app?


You mean like by having package managers that are built into the OS distribution that can install the applications you want with a single command?


That's not a software installer. That's a distribution-specific package manager. Huge difference.

Commercial software for Linux typically does not come as a package. It starts with some kind of shell script and shoves itself into a corner of the filesystem (usually /opt/$PRODUCT/), and generally ships all its dependencies and does the work of mangling init to work on whatever distribution it's being shoehorned into. If it's a particularly expensive product it might have a GUI or ncurses installer or configuration wizard.

That's retarded, but it's still better than having a half dozen independent binary formats and a hundred distribution-specific packages.

If distribution managers would stop being asshats they would all agree on a set of APIs that they could support to perform operations in a platform-agnostic way, and software developers could in a free-market way make installer tools that work with the API. So you could have a shitty Gnome installer, a shitty KDE installer, a shitty ncurses installer, and an installer that uses Java just for the hell of it.

Packages are not things which users should ever have to touch. They are a specific collection of meta-information for the operating system to do mediocre actions like removing files, copying files, understanding if it has everything it needs to run some files, and telling you what the hell is installed. Packages certainly are not intended to hold a user's hand or provide all dependencies in and of themselves. But that's what you need if you want your users to be able to just install the damn software.

Solaris DStream packages have this wonderful ability to contain multiple packages in themselves, so you can install one or more select packages, or all of them if you need to. The easiest way to ship your application is to provide all the dependent packages in the DStream, and if you need one, you install it. This works great for Solaris (even if you have a bunch of different Solaris versions), but would be completely impractical for Linux because there's so many damn Linux distributions even if they have the same binary package format.

Most Linux software out there (assuming the kernel version is right and the glibc ABI hasn't changed) will work on other Linux distributions without code changes. But we waste all this time just making it work on other distros after the code has been compiled! Does that seem right to you? That because distributions want to live in a silo and do things their own way, you have to spend hours (or days) finicking with independent distributions just to release a new minor version of your app that everyone can use? For a bunch of really smart people we end up doing some stupid things.


Commercial software can be distributed through the package managers, though you do have to tell users to add archive sources for it, or the distributions have to provide those packages. We do see some of this being done.

One thing it's useful to keep in mind is that the different parent distributions are more or less different operating systems. They don't have differences because they are special snowflakes. Comparing the deployment story of trying to write a package compatible with every Linux distribution to that of Windows 32-bit applications is not really apples to apples.


The fundamental flaw in CatB is that it was essentially an adaptation of ESR's libertarian creed: that the Free Market is infallible and will always converge on the optimal solution. While that may work on the scale of a single project, it fails when applied to a full ecosystem, and thus we get dozens of competing projects aiming at solving the same problems (WMs, desktops...).

It also postulated that code being open would incite devs to be at their technical best, in order to gain peer respect. That didn't happen either; quite the contrary, the OSS community has proved to be rather conservative and traditional, Unix being seen as "The Right Way", not to be deviated from.

In short, the Bazaar model completely failed in its promise to always let the best solution win. What we got instead was perpetual chaos, and less than 1% of the desktop share.

I attended the 1st GUADEC back in 2000. If at that time we had known where we'd actually be 12 years later, we'd all have left in disgust.

I moved to OS X in 2008; my only regret is not doing it sooner.


Serious question for phk: Despite your rant, why do you continue to use autotools in Varnish? Is it because libraries that Varnish depends on require autotools? Or did you simply follow convention?


The reason varnish uses autotools is that I delegated the build/release infrastructure to somebody else.

From my own time as FreeBSD release engineer I know that task must come with freedom to pick tools and methods, and therefore it does.

That doesn't mean I have to like the choice, but I have to respect it (or take over the build/release responsibility myself).


But pretty much everyone is in the same boat. NOBODY likes autotools, but until something comes along which is better and does the same job, we won't switch.

Don't underestimate the size and demands of the job though.


It seems that people would like to think that cathedral > bazaar is like waterfall > agile, but I'd also posit that it can mean walled garden > OSS... the iPhone is a beautiful object. I'm also post-90s, so I've never been involved with a cathedral project, but I see his point.

On the other hand, why can't I watch a flash movie on my iphone? Sometimes it'd be convenient to do that. I think it's funny that there is no engineering restriction on being able to do that, it's simply an effect of cathedral vs. bazaar culture at apple. In fact people forget that the true cathedral would be an iphone without the app store, as Jobs had originally envisioned.

I definitely don't agree with the assumption that the cathedral is always better and preferable.


Thanks, phkamp, for the best explanation I've seen for why I switched from Linux to OS X. I was employed in an IT job before 1990.


Here's an optimistic thought: one great company can change things. Google's engineering-driven culture did that. And Apple has made people care about design, albeit only the surface-aesthetic sense of "design", which is what they identify with Apple despite Jobs' dictum "design is how it works". Well, suppose a new company came along that succeeded massively by means of good software design. A lot of things would turn on their head. People would have to stop saying "that's not feasible" or "things are too complicated" or "market forces demand otherwise", just as Apple's success has caused them – against all odds! – to have to stop saying those things about elegance and beauty.

Such a company could have a profound effect because many of the best and/or most aptitudinous (sorry) programmers would be inspired by it. Right now we have no great example to point to, just a bunch of intuitions and (by now) raggedy traditions that we barely understand, which is what I take to be the point of the OP. But great design feeds the soul in a way that the steaming landfills many of us spend our days adding code to never can. So there's a fair amount of talent out there waiting to be kindled. One bolt of lightning and who knows.

Yeah it hasn't happened yet, but it's easy to imagine how it could. Historical accident plays a role in such things.


There's an element to that article that is complaining about the process that Unix took to get to where it stands now. I feel the author needs to skim some books on evolution to see that there is nothing inherently wrong with this approach.

For instance, his example of Unix is not unlike any large organism who, through years of evolution, has many structures that are no longer needed (e.g. appendix) or structures that seem unnecessarily complex (e.g. recurrent laryngeal nerve) when our 20/20 hindsight can design a better solution.

I didn't find too much advice in the article, but I wish he had proposed something better. Even life can potentially create a mutant that contains no archaic structures in it, but it will only last if it passes a fitness test. Why doesn't he try rewriting some of the gnarled pieces of Unix code to make it better (like removing the need for autoconf)?


> I feel the author needs to skim some books on evolution to see that there is nothing inherently wrong with this approach.

This is broken thinking.

You encountered,

"Process X has attribute A, and this is bad."

and you respond with,

"No, because Process Y also has attribute A."

This is only valid if we are either 1) guaranteed that there is nothing bad about process Y, or 2) considering processes X and Y as alternative options whose merits we are comparing.

Dispensing quickly with 2 - we are considering "Cathedral-mode development" vs. "Bazaar-mode development", not "Evolution" vs. "Bazaar-mode development", so this is not applicable.

Without further support, option 1 strikes me as confusion of is-ought. It is the case that evolution works this way, so any design process ought to work this way. There is, however, plenty "wrong with" evolution as an approach to producing designs for a given purpose. It is horribly inefficient compared to design, suffers from greater path dependence, doesn't keep notes about systems that may have worked that went extinct for unrelated reasons, etc. The attribute A that you are comparing is in fact one of the things "wrong with" evolution and "wrong with" Bazaar-mode development.

(Note, of course, that a problem with evolution in terms of producing optimal designs for our goals is not the same thing as a problem with evolution as an explanation of the world around us.)


All of this is just such antagonistic linkbait if you ask me. No offense to OP, but it really just seems like "oh, I had a controversial thought, so I'll play up two sides."

Maybe one isn't better than the other? Maybe they are just different kinds of things, and some people work better in one or the other. Maybe some projects do too. We live and work in dynamic environments, and there is no big-T Truth in design. People change, the requirements change, the market changes.

Indulging in this kind of petty software holier-than-thou, while highlighting only flaws and failures as the defining characteristics of another generation, is only destructive for the entire community.

If OP was serious this would have actually been about showing a new generation of coders good cathedrals, and not just banal whining.


According to Wikipedia, 'Cathedral' and 'Bazaar' are supposed to describe open-source project development models. So how can MS Windows, Office, and OS X / iOS be 'Cathedrals' or 'Bazaars'? AFAIK none of them is (fully) open source.

From Wikipedia: "The Cathedral model, in which source code is available with each software release, but code developed between releases is restricted to an exclusive group of software developers. GNU Emacs and GCC are presented as examples."

Sounds more like Android is a 'Cathedral' and Linux is a 'Bazaar' ?

http://en.wikipedia.org/wiki/The_Cathedral_and_the_Bazaar


I wrote this analogy to explain this situation to my Facebook friends:

Hypothetical cooking analogy for the non-techies: Back in the day, if you wanted to cook up a fancy meal, there was only one butcher in town, and he did things the "right" way. He only used one variety of beef, because it was the "best", and he only sold the "best" cut of that beef. Similarly, there was only one farmer, who only sold the "right" crops, because they had the "right" flavor. When you wanted to cook a fancy meal, you didn't really have any decisions to make, because there was only the "One True Way": one "correct" cut of meat, one "correct" vegetable selection, etc.

Since then, an explosion of variety has occurred. However, perhaps we have swung too far in the other direction. Rather than choosing among six different makers of pasta sauce, you now must choose among six varieties of pasta sauce from each of the six makers. This way lies decision fatigue.

However, the community of grocers got together and came up with an ingenious artificial intelligence agent built into a pick-packer system. Now, you don't need to even know a single brand name of pasta sauce. You can simply arrive at the store, tell the system what recipe you are following, tell it that you prefer your tomatoes from Spain, your beef the organic grass-fed variety, and your garlic from the Mediterranean, but you'll take whatever variety of pasta is local and most fresh. The system then assembles your grocery cart automatically for you, and you can choose to remain blissfully ignorant of all of the decision-fatigue inducing choices which occurred under the hood.

However, if you are of the crotchety-old-man sort, and you "Only buy my rib-eyes from Lambert and Sons, because he's the only butcher who does it right", you can of course choose not to use the AI pick-packer, and you can take it upon yourself to select the "best of breed" ingredients for your meal, and pretend that the explosion of variety never happened.

So here's the deal. Poul-Henning Kamp wrote an article lamenting the fact that the explosion of variety happened and that the AI pick-packer was invented, because now everything is a giant mess, and back in his day, there was only one right way to do it, dammit.


His most recent project is the Varnish HTTP accelerator, which is used to speed up large Web sites such as Facebook.

The Cathedral speeding up the Bazaar? Oh sweet irony.


Nature does pretty well with bazaar-style development, and she usually wins, doesn't she?

There's a lot to be said about building via rapidly iterating around feedback.


But that process has a way of ruthlessly culling the herd. Does not seem to be happening in this world of infinite backwards compatibility.


And I have a tail-bone, but no tail ...


Not exactly. The result of "hops through the local minima" style design is hardly optimal. Yes, it works, but it is the best of the worst more often than not.


Thirteen years ago, Eric Raymond's book The Cathedral and the Bazaar (O'Reilly Media, 2001)

Off by two?

Seriously, though, how do you get 13 out of that (unless calculating from when the author conceivably began working on the book per se)?

I'm starting to read the article; I found mildly disconcerting that the first sentence appears to make a mis-statement, but I don't expect that to foreshadow the meat of the message. (We'll see. :-)


The editors put in that reference, the essay was available before the book was published.


Fair enough. But a reader not familiar with this is going to wonder. (Maybe none, given that this is in the ACM Queue.)

Maybe the following edit?

"Thirteen years ago, Eric Raymond's essay and subsequent book The Cathedral and the Bazaar (O'Reilly Media, 2001)..."

P.S. Done reading now. Enjoyed/appreciated your essay. Amongst other things, perhaps useful for the younger / newer to the technology generation to appreciate where this "configure soup" comes from.


That is the revised edition, the first edition of the book was published in 1999, which was 13 years ago.


CatB was an essay on a web page for years before the book of that name.


I wish there were an English word for Bazaar; we used to have the word Market, but now that word means aggregate demand, i.e. the job market, not the fish market.

Also I wish we could understand "Cathedral vs Bazaar" as an analogy and stop comparing Notre Dame Cathedral and Istanbul Rug Market in this thread.

"Rug Market" software = I recompile my kernel to upgrade = Debian

"Cathedral" software = the central planning organization sell me my OS upgrade = everything Apple

It should be obvious that Cathedral development can guarantee higher quality. Critical systems get written in Ada and VLISP and run on QNX. You would be insane to try to deploy a critical system in C++/Ruby on Debian, and, depending on the circumstances, you might get arrested if you tried.

"Rug Market" however can scale to larger codebase sizes, because "Cathedral" costs more per line of code. So "Rug Market" wins, when software correctness doesn't really matter.


"bazaar" is pretty much an english word by now.


How ironic that this complaint of poor quality and lowered standards comes from a FreeBSD developer who actively attacks people who care about producing quality software. Cathedral vs bazaar isn't the issue. It is simply putting quality as a priority vs not making it a priority. Which is why some cathedrals produce poor quality software (freebsd), and others produce high quality software (netbsd).


I do ?


Remember when freebsd included a closed source binary blob to support atheros wireless chipsets? And some people in the openbsd and netbsd groups decided that since their OSes have quality standards and don't allow garbage like that, they would make an open source driver? And then you yelled shit at them at a conference and called Reyk a terrorist? It is one thing to hate everyone who works on an OS other than the one you prefer, that's just typical childishness. But to attack them for having higher standards than you do? Yes, I would say that puts you in the position of "not being one to talk about quality".


Lets just say that maybe I have a slightly different recollection about that meeting ? :-)


"First talk was OpenBSD's wireless talk. Much rethoric about freedom and little appreciation for the way the world actually works, as opposed to how it looks like it works from the headquarters of OpenBSD.

Asked directly if they thought they could defend their reverse engineering of for instance the Atheros HAL. The answer as I heard it was "Laws don't apply to us".

I guess I'm not the right person to appreciate the valor of their crusade, but somehow their rhetoric reminds me too much of Rote Armee Fraktion and similar terror groups of the seventies which tried to change society by bombing politically correct holes in it.

If OpenBSD aims to corner the paranoid/radical part of the market they're welcome to it for all I care.

Doesn't sound like it is doing anything good for their wireless support however."

I dunno, sounds to me like you bitterly despise anyone who dares to do things correctly instead of doing whatever is expedient. OpenBSD's stance on the wireless issue was correct, has been proven to be correct, and should be admired. And I'm a netbsd guy, so I am "supposed to" hate openbsd.


The fact that OpenBSD put any users and source-code holders in the USA in direct violation of (stupid!) US laws on spectrum access is the context you are missing here.


Even if we were to throw common sense and reason to the wind and pretend that is even remotely close to the truth, it would in no way explain your behaviour. You outright stated that "the way the world works" is accepting closed source garbage that can not be verified, debugged or fixed. You think the way the world works is compromising security and stability for convenience. And you belittle people who do care about quality, as if it somehow hurts you that such people exist.


In the article he uses the example of compiling Firefox.

This is a lengthy affair on any of the BSD's.

The question is, is it even worth it? I mean, what does Firefox give me that is worth all that time compiling?

One has to have a big bloated browser to do online banking and similar things, but it is entirely unnecessary for lots of other things, like YouTube. To get content and play it, which are two separate tasks, you certainly do not need Firefox.

That makes kids upset. They have invested all their energy into learning the browser and how to manipulate it. 100% web everything. Client-server.

Anything that threatens these ideas is offensive.

Well, that's what's keeping back progress.

You will not find freedom and flexibility to explore the possibilities by trying to do everything through a browser, over the web, and solely within a client-server construct. Those are self-imposed limitations that today's developers readily accept.

As phk says, he's writing for people who can think for themselves. If you cannot think for yourself, if you're just good at repeating the same old things: "OSS is great!" "The web is amazing!" "Google!" "Facebook!" then his writing will not make sense to you and will probably be offensive.

If I'm not mistaken, he's implying we can do better. And the evidence to support that is that we have done better. But it was a long time ago, and bubble-seeking programmers do not want to pay attention to history. Go figure. The sad truth is most programmers today are caught in the mediocre moment, thinking it's Nirvana.

Please don't wake me up. I'm enjoying this dream.


phk how dare you question autoconf, libtool and package collections. Simplicity? Are you kidding? Nothing is simple. Simplicity is HARD. You are a grumpy old man. Leave us kids alone, we're having fun.

End sarcasm.

I can compile my bloated BSD kernel loaded down with every available driver I'll never need in a fraction of the time it takes me to compile Firefox: that is, measured in _minutes_. On an underpowered computer it can take _days_ to compile Firefox with all its supposed dependencies. I say supposed because no one is really sure. It's too much work to check and really find out.

Anyway, negative coding has been outlawed. It's inconsiderate. Even if you found something that could be removed, you would sacrifice every ounce of your karma if you dared to remove it.

End more sarcasm.

The FreeBSD ports collection, like OpenBSD's or pkgsrc, is a black box.

Unfortunately, with that last line, I'm dead serious.

My solution to autoconf and libtool is to run make >log 2>log2 and then do some sed transformations on the logs to produce my own "compile script". Sometimes I even change libtool's script to make the "quiet" flag do what the "verbose" flag does, so it can't hide anything it's doing. I then save my "compile script" for the next time I need to compile the program on that type of system.
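A minimal sketch of that log-to-script trick (the filenames and the sed pattern here are assumptions; a real run would produce the log via `make >log 2>log2` rather than the canned lines below):

```shell
#!/bin/sh
# Simulate a make log so the example is self-contained; a real build
# would produce build.log via: make >build.log 2>build.err
printf 'checking for cc... yes\ncc -O2 -c foo.c\ncc -o foo foo.o\n' > build.log

# Keep only the lines that actually invoke the compiler, producing a
# replayable "compile script" for the next build on this type of system.
sed -n '/^cc /p' build.log > compile.sh
chmod +x compile.sh
cat compile.sh
```

The real pattern you filter on depends on the project (cc, gcc, libtool, ld invocations), which is exactly why a verbose log is needed in the first place.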

One of my todo projects is to analyse the sum total of *BSD patches, looking for patterns. One of my pet peeves is having to make patches just to compile a program on each BSD. Silly little MI differences that totally defeat automation and require silly little patches even for the simplest of programs.

Anyway, you can't say that the youth has completely abandoned "traditions" from ye olde UNIX, no matter how much they seem to hate "old things". They stand by the old relics of autoconf and libtool year after year and even vigorously defend using them.

The reason they can't replace these crusty old things is because they refuse to put in the effort to learn how they work. If you want to build a better mousetrap, first you have to understand how the old ones work, and specifically how and where they could be improved.



