Lambasting libtool for providing a consistent experience across star-NIX is, imo, not the wisest move for a FreeBSDer.
Article: This is a horribly bad idea, already much criticized back in the 1980s when it appeared, as it allows source code to pretend to be portable behind the veneer of the configure script, rather than actually having the quality of portability to begin with. It is a travesty that the configure idea survived.
Good high-minded notions here. But configure, with its standardized parameters for how to do things, is near irreplaceable at this point. Certainly a more stripped-down version, one not built on M4, would be wise, but libtool/autoconf is used too broadly & with too much ingrained familiarity by developers & upstream maintainers: in spite of so much of it being indeed old, deprecated, no-longer-useful cruft, the best we can hope for is duplicating a wide swath of the existing functionality in a cleaner manner.
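To make the "standardized parameters" point concrete: the value of configure is less the feature probing and more the uniform command-line convention that every packager scripts against. A typical invocation looks something like this (paths and package name are just illustrative, not from the thread):

```shell
# Conventional autoconf-style build: the same flags work for
# thousands of unrelated packages, which is why packaging tools
# can drive them all identically.
./configure --prefix=/usr/local --disable-static CC=clang
make
make install DESTDIR=/tmp/staging   # staged install for packaging
```

Any replacement would have to honor (or translate) this interface to avoid breaking every downstream packaging system at once.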
But at what cost would a reimplementation come? How many obscure build targets would sit broken, unnoticed, for months or years?
The place where we escape these painful histories is where we leave the old systems programming languages behind. Node's npm I'd call out as a shining beacon of sanity, enabled by the best, most popular code distribution format ever conceived: source distribution, coupled with a well-defined, not-completely-batshit algorithm for looking up said sources when a program runs & goes off to find its dependencies:
http://nodejs.org/docs/latest/api/modules.html#modules_all_t...
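The core of that lookup algorithm is simple enough to sketch in a few lines of shell. This is a simplified illustration (function name `resolve_module` is mine; the real algorithm also handles core modules, file extensions, and package.json `main` fields):

```shell
# Simplified sketch of Node's node_modules resolution: starting from
# the requiring file's directory, check for node_modules/<name> in
# each parent directory, walking up until the filesystem root.
resolve_module() {
  name=$1
  dir=$2
  while :; do
    if [ -e "$dir/node_modules/$name" ]; then
      echo "$dir/node_modules/$name"   # found: print the match
      return 0
    fi
    [ "$dir" = "/" ] && return 1       # hit the root: give up
    dir=$(dirname "$dir")              # otherwise try the parent
  done
}
```

The trick is that there is nothing to configure: the search path is derived entirely from where the requiring file sits on disk.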
Why should I as a FreeBSD person not be allowed to lambast the worst hack-on-hack-on-hack-on-hack I have to suffer ?
And trust me, libtool is replaceable; all it takes is an agreement on a compiler and loader flag for producing shared libraries, and you suddenly don't need it at all.
But that agreement would be based on our current best understanding of how computers should work. When you factor in the future (and evolving an existing codebase is a huge problem) we have to assume any understanding we have now is incomplete and flawed.
That's why things like libtool or autoconf evolved (or, better, were "iteratively designed") to be able to grow and encompass varying and different goals.
So what you're saying is that progressively tidying up a codebase to simplify fulfilling current requirements is always a bad idea because you might remove something that might make a hypothetical future requirement easier to fulfill?
No. I'm just reminding ourselves that we don't know what we'll need in the future, that any decisions we make now are subject to change down the road, and that it's foolish to assume we can design now what we'll be using ten years from now.
That's just defeatism, not arguing against cathedral-style development. Just because we're likely to be wrong to some degree doesn't mean we shouldn't try to be as close to right as possible.
Anyway, I get the impression that for the original example of libtool, cathedral vs bazaar is the wrong question to be asking. [1] The problem isn't one of design or implementation. It seems a problem of distribution. I posit you could build a simplified, cross-platform library linker via either a centralised or distributed process. The question is rather, could you get everyone to agree to actually use it? Considering the success of clang, I think you'd stand a reasonable chance. (speaking of which, clang doesn't exactly stand out as a pure-bred bazaar-model example; neither is GCC for that matter)
Actually, I think that's what's wrong with this whole thread and the article that triggers it. I'm fed up as anyone with the layers of crap that we're building our software on. I seem to spend most of my time yak shaving because of some legacy decision, not contributing new stuff. In fact, my main output seems to be adapters for crappy interfaces. So while I find myself nodding along with the original essay, I think cathedral vs bazaar has nothing to do with the quality of the code we use all the time. It's a question of having the resources and balls to replace the layers of crap when necessary instead of carrying on piling more of them. Whether or not that's possible is mainly a question of whether the systems you absolutely need to retain are open or not. (and open source software can still be developed cathedral-style - I'm pretty sure ESR considered GNU to be cathedral software)
As an aside, when was the last time the format of .a/.so/.dylib/.lib/.dll files drastically changed on the respective operating system? We change even prevailing processor architecture a lot more often than that!
[1] I'll have to admit ignorance on the specific detail on the problem that libtool solves (inelegantly). I hope that doesn't derail my argument too much.
> Just because we're likely to be wrong to some degree doesn't mean we shouldn't try to be as close to right as possible.
Of course not! We should try to get as much of what we are doing right. We also must realize we are not as clever as we think we are (or would like to be). Cathedrals are monuments to imaginary deities. As such, they aren't subject to the laws of reality. Our efforts should, OTOH, be guided by what's real and measurable and take into account that what we "know" but can't measure is nothing more than an educated guess.
I was quipping that I don't wish to see FreeBSD drift any further away from what unification there is.
My fear is that the awful hack-on-hack-ad-infinitum we do have at least binds us together, even if it is in misery. I'm not at all uncomfortable with replacing libtool, and I'm certain a very good job could be made of it. But I can hear 18 m68k aficionados crying out in pain that the new toolchain doesn't work on their NetBSD system, or the guy who wrote his own C compiler that needs some extra rules feeling totally shafted because now he has to write "M5" instead of M4, which for some reason he cannot stand any more than M4 even though it looks nothing like it.
We stand unified. Not by good things, but by the woe we all suffer. To those out there seeking to crush that & make something better, I wish you the best! Please keep in mind a simple mantra as you go about inventing the future: do no harm. Libtool may be awful, but it's how distributions are made.
phkamp, sorry for missing the reply. M4 is ugly, libtool takes forever, everything in it is a hack. It's proven to be an at least adequately flexible hack that has kept *nix unified, more or less. I respect that, but I'm not a systems programmer that gets burned by it on a regular basis either, and I don't mind that my OpenWRT compiles of all packages take 32 hours and it's all libtool's fault.