> Mistake 1: Switch from DRY to premature optimization.
"Premature optimization" is largely a bogus concept, because the meaning of "optimization" has shifted a lot since the concept was first created.
People now use optimization to mean "sensible design that does not needlessly waste resources".
In this meaning of optimization, "premature optimization" is a bogus concept.
You should absolutely ALWAYS write non-pessimized code by default.
What the original concept referred to is what people now call "micro optimizations". Sure, premature micro optimizations are often a waste of time. But this is irrelevant to the context of this discussion.
> In this meaning of optimization, "premature optimization" is a bogus concept.
The idea is that you can end up optimizing before you know the entire use-case, because software engineering isn't like building bridges or skyscrapers.
I'm a performance geek, but I love code I can easily change rather than code that is fast until some customers have touched it. Mostly out of experience with PMs with selection bias on who they get feedback from ("faster horses" or "wires we can hook phones to").
The first thing to optimize is how fast you can solve a new problem that you didn't think about - or as my guru said "the biggest performance improvement is when code goes from not working to working properly".
The other problem with highly optimized code is that it is often checked-in after all the optimizations, so the evolution of thinking is lost entirely. I'd love to see a working bit + 25 commits to optimize it rather than 1 squashed commit.
Optimized code that works usually doesn't attract this commentary, so my biggest opponents here are the most skilled people, who write code with barely any bugs. I don't bother fighting them much, but the "fun" people I work with understand my point even if they write great code first time around.
These two are mostly why I talk to people about not prematurely optimizing things, because I end up "fixing" code written by 15 or more people which has performance issues after integration (or on first contact with customer).
The reasoning behind discouraging premature optimization makes no distinction between "micro optimizations" and any other kind; the purpose of this guidance is to minimize wasting time building unnecessarily complex solutions based on untested performance assumptions.
If you're writing an enterprise app and lean back in your chair and start thinking about speeding things up with loop unrolling and AVX instruction sets, then you're doing the premature optimization thing.
But trying to limit large nested loops is low-hanging fruit that doesn't take much effort to pick.
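As a toy sketch of that kind of easy win (the data and names here are made up for illustration): a nested membership scan becomes linear once you build a set first, with no extra cleverness required.

```python
# Hypothetical data: 1000 orders, 15 VIP customer ids.
orders = [{"id": i, "customer": i % 100} for i in range(1000)]
vip_ids = list(range(0, 100, 7))

# Pessimized: `in` on a list makes this a nested loop,
# O(len(orders) * len(vip_ids)).
vip_orders_slow = [o for o in orders if o["customer"] in vip_ids]

# Non-pessimized: build a set once, then each lookup is O(1),
# so the whole thing is O(len(orders) + len(vip_ids)).
vip_set = set(vip_ids)
vip_orders_fast = [o for o in orders if o["customer"] in vip_set]
```

Same output, same readability; the only change is one line that avoids the accidental quadratic.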
The typo is "non-pessimized code", which should be "non-optimized code".
I see humor in thinking if my code is pessimistic enough. Have I assumed that the edge cases will happen and worked around them? Do I expect (and handle) crashes, i/o failures, network timeouts, etc?
"code pessimism" could be an interesting metric.
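A toy sketch of what "pessimistic" code in that sense might look like (the function name, retry counts, and timeouts are all hypothetical, just to illustrate assuming failures by default):

```python
import socket
import time

def fetch_line(host: str, port: int, retries: int = 3, timeout: float = 2.0):
    """Pessimistic fetch: assume the network WILL fail, and handle it."""
    for attempt in range(retries):
        try:
            with socket.create_connection((host, port), timeout=timeout) as conn:
                conn.settimeout(timeout)  # also bound the recv, not just connect
                return conn.recv(1024)
        except OSError:  # covers refused connections, timeouts, resets, ...
            time.sleep(2 ** attempt)  # back off before retrying
    return None  # total failure is an expected outcome the caller must handle
```

Every edge case in the list above (crash, I/O failure, timeout) has an explicit answer: retry with backoff, then report failure instead of raising unexpectedly.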
The typo in the other post was "superb owl" which should have been "super bowl". Several people on that thread enjoyed the typo, including a comment from CostalCoder saying "Please, please do not correct that typo!"
I think they used that term on purpose. Non-pessimized in this case is the same as optimized, and I believe it's a reference to this video https://youtu.be/7YpFGkG-u1w