Allow me to confirm that the loose coupling comment is about modularity. Two pieces of code are tightly coupled if there are complex interconnections between them. They are loosely coupled if they interact through well-defined interfaces that expose very little of the internals.

A large system built of loosely coupled pieces is generally going to be much more reliable, maintainable, etc., than one built in a tightly coupled fashion. Therefore loose coupling is generally a good thing. Code Complete has lots more to say on this topic.
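To make the distinction concrete, here is a minimal sketch (in Python, since the thread has no code of its own; the class and function names are invented for illustration). Both versions compute the same total, but the tightly coupled one reaches into the store's internal data layout, while the loosely coupled one only uses a small, well-defined interface:

```python
# Tightly coupled: the report function depends directly on the store's
# private attribute and its exact structure. Any change to that internal
# layout (say, switching the dict to a list of records) breaks the caller.
class TightStore:
    def __init__(self):
        self._rows = {"alice": 3, "bob": 5}  # internal layout, not an interface

def tight_report(store):
    return sum(store._rows.values())  # reaches into internals

# Loosely coupled: the store exposes a narrow interface (totals()) that
# hides the internals. The report only knows about that interface, so the
# store's internal layout can change freely without touching the caller.
class LooseStore:
    def __init__(self):
        self._rows = {"alice": 3, "bob": 5}

    def totals(self):
        """The only thing callers are allowed to rely on."""
        return list(self._rows.values())

def loose_report(store):
    return sum(store.totals())

print(tight_report(TightStore()))  # both print 8 today...
print(loose_report(LooseStore()))  # ...but only one survives a redesign
```

The point is not that the loose version is longer; it is that changing `LooseStore`'s internals only requires keeping `totals()` working, whereas changing `TightStore`'s internals requires hunting down every caller that touched `_rows`.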

Exactly. After all, to make an apparently minor change in a tightly coupled system, one often has to redesign much of the system just to get it all working together again.

I'm even of the opinion that the performance benefits of tight coupling are mostly imaginary, due to unforeseen indirect consequences of tight coupling in large projects: when things need to be altered during development, the difficulties created by tight coupling encourage sloppier fixes.

Of course, I'm far from the foremost expert in code modularity, and could well be out of my depth here.

The time when tighter coupling makes sense is when you are pushing the frontier of what is possible. In that situation, attempting to enforce loose coupling pushes you to choose a way of modularizing before you understand the problem space well enough to pick a good modularization. Maintaining that loose coupling also prevents you from using certain optimizations that you might come up with after you create your design.

This does not mean that people who are trying to accomplish the novel have free license to throw out the rule-book. On the contrary, they are unlikely to successfully push those limits unless they try hard to make the complexities they will encounter manageable. However, they will also need to selectively violate normal principles just to make what they want to do possible.

However, history shows that, when technology catches up to what they were trying to do, the people who created the integrated technology generally lose in the long run to latecomers with more modular technologies. Does this mean that they did something silly? Not at all! History suggests two other things as well. The first is that recognizing the right way to modularize the problem is often only possible after someone has solved it badly first. The second is that there is a lot of money to be made by being the first to be able to do something that people want to do.

For more on this, I highly recommend The Innovator's Solution. Be warned, it is a management book and not a technology book per se. However I think that what it has to say about which technologies are likely to replace others is useful for people who work with technology.

I don't think I'd say that my "conclusion" was "wrong" so much as that it was incompletely stated. To round it out nicely, I should have made reference to the idea that first, one should create a throw-away.

I know that the idea of creating a throw-away version first is an oft-cited principle of software design in general, but it applies even more so to innovations: first you innovate, then you create something useful. Sure, tightly couple in the throes of creative frenzy if you must, but then recreate your innovation with more modularity. There's a big difference between screwing around with new ideas and writing good code.

So, yeah, I think that tight coupling has its place in experimenting with new ideas, but that doesn't mean you should be calling something a release version before you've refactored and restructured so that it's more modular.

But, again, I could well be speaking too vehemently of things I don't understand well enough. Considering our relative levels of experience with Perl code, I'd be inclined to say that, all else being equal, yours is more likely the "right answer" than mine.