Some foundational computer science relating to performance that I found poorly explained by the programming books I read as a child:
In compsci, to reason about performance we derive a formula for how many loop iterations or bytes of memory a datastructure or algorithm uses as a function of its input size n, then keep only the fastest-growing term & drop constant factors. The common classes, from best to worst: O(1), O(log n), O(n), O(n log n), O(n^2), & O(2^n). O(2^n) means we're undesirably bruteforcing a solution, whilst O(1) or O(log n) are ideal.
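To make that concrete (a minimal sketch of my own, not from any particular book): searching an unsorted list is O(n) because the worst case touches every element, whilst binary search on a sorted list is O(log n) because each comparison halves what remains.

```python
from typing import List, Optional

def linear_search(items: List[int], target: int) -> Optional[int]:
    """O(n): in the worst case we visit every element once."""
    for i, item in enumerate(items):
        if item == target:
            return i
    return None

def binary_search(sorted_items: List[int], target: int) -> Optional[int]:
    """O(log n): each comparison halves the remaining search space."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return None
```

Doubling the list adds one more step to binary search but doubles the work of linear search, which is the whole point of the notation.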
If you have otherwise squeezed all the performance out from your code & need more, take measurements: profile & benchmark so the data tells you where the real bottleneck is. Know what you're doing.
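For instance (assuming Python here purely as an illustration), the standard-library timeit & cProfile modules are enough to get started measuring:

```python
import cProfile
import timeit

def slow_sum(n: int) -> int:
    """Deliberately O(n): sums 0..n-1 one element at a time."""
    total = 0
    for i in range(n):
        total += i
    return total

# timeit: run the snippet many times & report the total wall-clock time.
print(timeit.timeit("slow_sum(10_000)", globals=globals(), number=1_000))

# cProfile: break the runtime down per function call.
cProfile.run("slow_sum(1_000_000)")
```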