A former colleague of mine had a saying: 'Computers don't get tired'. By that he meant that developer time is more precious than compute time. In modern web development, it's generally better to ship sub-optimal code that delivers features than to optimize until your face turns blue. It would be interesting to figure out how long an optimized piece of code has to run to make up the time (and money!) spent on optimizing it.
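As a back-of-the-envelope sketch of that question: if the only cost of slow code is the compute bill, the break-even point is just developer cost divided by the savings per hour of runtime. All the numbers below (rates, speedup) are made-up assumptions for illustration.

```python
def break_even_hours(dev_hours, dev_rate, compute_rate, savings_fraction):
    """Hours the optimized code must run before the compute savings
    repay the developer time spent optimizing it.

    dev_hours: hours spent optimizing
    dev_rate: developer cost per hour ($)
    compute_rate: cost of the machine per hour of runtime ($)
    savings_fraction: fraction of compute cost the optimization saves
    """
    saved_per_runtime_hour = compute_rate * savings_fraction
    return (dev_hours * dev_rate) / saved_per_runtime_hour

# Hypothetical example: a day (8 h) of optimization at $100/h,
# running on a $2/h instance, making the code 30% cheaper to run.
print(break_even_hours(8, 100, 2.0, 0.30))
```

With those made-up numbers the optimized code has to run for over 1,300 hours (about two months, nonstop) before the effort pays for itself, which is the arithmetic behind "computers don't get tired".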