Hmm... I've never actually run into any optimization-related issues. I'm always so curious why people don't crank it up to 11 when they're compiling. Granted, I'm kind of a wimp now and just use whatever `CMAKE_BUILD_TYPE=Release` does :P

That's not OO; it's just SVO-style dispatch notation. You can have one without the other.

I think Epy was being simplistic for the sake of argument. He didn't mean that he liked OOP exclusively because of the notation, but because of the meaning (semantics) behind that notation.

Unsurprisingly, however, I disagree. I think the procedural model is much, much better. I can appreciate the dot-notation semantics in OOP and the powerful vision of program logic and organization that object.action gives. It's appealing, and it adds a thin layer of abstraction over real-world objects that procedural programming lacks entirely. I was certainly drawn to it when I first met OOP. But these days I firmly believe I was drawn not to a solution but to a trap, like a moth to a flame. Because for object.action to be fully realized, you necessarily have to buy into the whole OOP shebang, which is exactly my beef with the paradigm.

The procedural semantics may seem less attractive and without any real-life counterpart (unless you like mathematics). But in the end it makes projects of any size easier to understand and follow, because of the linear way in which it organizes your code and the much simpler, more predictable execution paths and resulting tracebacks. Consider how much of our time goes into maintaining or fixing code, and how much those qualities matter for such tasks. And even then, I think OOP hasn't shown its uglier side until the time comes for us to learn an already existing project of some complexity. Then, boy oh boy, do you need an object tree viewer...

Originally Posted by brewbuck:Reimplementing a large system in another language to get a 25% performance boost is nonsense. It would be cheaper to just get a computer which is 25% faster.

I'm more on the grounds that 0 + 0 is still 0 and that programming is, generally speaking, lame. Although I can generally agree with you, and believe that a procedural codebase punctuated here and there with small doses of OO and just enough functional constructs where they make sense is a far superior solution, it still strikes me that the reason all these paradigms individually suck is that our computer architectures suck.

We tend to forget how seminal and immature our computer architecture really is, blinded as we are by a false sense of modernity. I have no doubt that there is no real solution to the problem of code organization and maintenance on this architecture, only rough patches. The escalation of team sizes and bug counts in the software industry as software complexity has risen, and how lenient and accepting of that fact we have grown, has all the characteristics of a bubble that can't be sustained indefinitely. Sometime within the next 100 years we will arrive at a new computer architecture, driven either by the constant pressure for processing power or simply because software requirements will have grown so complex that our archaic programming models (already proven inadequate) can no longer cope. Then it's a safe bet to expect our first software development revolution.

Last edited by Mario F.; 02-21-2017 at 05:40 AM.

I'm very excited to see what future computers will look like and all the kinds of things they'll allow us to do. Unfortunately, I'm not imaginative enough to conceive any real possibilities, but it is a fun thought.

I just hope the phrase "Resistance is futile" is NOT used.

Tim S.

"...a computer is a stupid machine with the ability to do incredibly smart things, while computer programmers are smart people with the ability to do incredibly stupid things. They are, in short, a perfect match." Bill Bryson

Resistance will not be futile. I predict the next wave of hardware architectures will be less open and much more complex to manufacture and operate. Quantum computers, or the (more probable) biocomputers, will take a long time to reach the common household after they reach the market. They will get there. But when they do, they will very likely come with the added benefit of being of a much more technical nature, doing away with the old paradigm of Programming Is For All, which will be the best thing to happen to this market. For this reason, this old and decrepit architecture will remain a viable alternative, especially among hobbyists, even after you can buy a biocomputer for the same price. But your grandsons, or their sons, probably won't have much of a say in the matter. In the end the new generation will become cheaper to produce, incredibly more powerful, and will interface better with the rest of our advancing technology. And the days of programming in the jungle will be over.

From my perspective, it's great! The more crappy coders there are, the more job security I'll have. You know what's worse than not having code? Having crappy code! People will gladly pay for someone to come in and fix the mess created by someone else.