It's been suggested that programmers need to master only three numbers to properly implement computer logic: 0, 1, and ∞.
Yes, your first concern as a programmer is correctness. Performance matters nearly as much, since you write code once and it runs thousands, millions, or billions of times. But writing programs that are merely functional and fast is analogous to writing a program with no branches or loops: in both cases, you're focused only on the "first pass". At this point you're operating in the "aspiring to master level 1" arena of programming.
But making your code easy to understand, modify, and extend is like discovering the power of branches and loops. It's when you realize that your source code has iterative power, not just your executable. The code you write may very well be iterated over many times, and given that programmers generally cost more than computer cycles, optimizing for future iterations of your code elevates you to the "aspiring to master level ∞" arena. That's why we use high-level languages and source control!
If your only concern is correctness and performance, then why even bother checking your source in? Compile it to binary (or write it in machine language) and check that in instead. It'll be faster, and it will save your unfortunate maintainers (who may even be you) the trouble of figuring out what you originally did.
When you realize that the iterative value of your source code is orders of magnitude greater than the iterative value of your executable, you'll pour as much effort into making your program easy to understand and maintain as into making it correct and efficient.
Note: The inspiration for this post was a quote from one of the gods of Computer Science, Brian Kernighan:
"Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it."
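To make the quote concrete, here's a small illustration (my own example, not from the post, written in Python): two functions that count the set bits in an integer. The first uses Kernighan's own `n & (n - 1)` bit trick, which is clever and fast but opaque; the second does the obvious thing a maintainer can verify at a glance.

```python
def popcount_clever(n: int) -> int:
    # The bit trick: n & (n - 1) clears the lowest set bit,
    # so the loop runs once per set bit. Fast, but not obvious
    # to a reader who hasn't seen the idiom before.
    count = 0
    while n:
        n &= n - 1
        count += 1
    return count


def popcount_clear(n: int) -> int:
    # The obvious version: inspect each bit in turn. It does more
    # iterations, but its correctness is evident at a glance.
    count = 0
    while n:
        count += n & 1
        n >>= 1
    return count


# Both agree, but only one of them will be easy to debug at 2 a.m.
assert popcount_clever(0b101101) == popcount_clear(0b101101) == 4
```

Both are correct; the difference is how much of your future self's cleverness budget each one consumes.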