Amazing article, I love how well he explains the different levels of 'optimization'.
Very much reminds me of Casey Muratori's "Philosophies of Optimization" talk, where he describes what he calls "pessimization" - instead of optimizing (making a program do a thing faster), you pessimize it: make it do less.
The less a program does/needs to do, the faster it will be.
No bytecode compilation by default. pip compiles .py files to .pyc during installation. uv skips this step, shaving time off every install. You can opt in if you want it.
So it makes installation faster by making the first run slower: the .pyc compilation still happens, just lazily at import time instead of at install time.
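To make the trade-off concrete, here's a small sketch of the precompilation step pip performs and uv defers (uv lets you opt back in, via a `--compile-bytecode` flag if I remember right). The directory and module names are made up for illustration; `compileall` is the stdlib tool that does the actual .py-to-.pyc work.

```python
import compileall
import pathlib
import tempfile

def precompile(src_dir: str) -> list[str]:
    """Precompile every .py under src_dir to .pyc, roughly what pip
    does at install time. Returns the generated cache file names."""
    compileall.compile_dir(src_dir, quiet=1)
    cache = pathlib.Path(src_dir) / "__pycache__"
    return sorted(p.name for p in cache.iterdir())

with tempfile.TemporaryDirectory() as d:
    pathlib.Path(d, "example.py").write_text("X = 1\n")
    # Produces something like ['example.cpython-312.pyc'], tagged
    # with the running interpreter's version.
    print(precompile(d))
```

Skip this step and install is faster, but the interpreter pays the compile cost on first import and writes the .pyc then (assuming the install directory is writable at runtime, which it isn't always).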
Ignoring requires-python upper bounds. When a package says it requires python<4.0, uv ignores the upper bound and only checks the lower. This reduces resolver backtracking dramatically since upper bounds are almost always wrong. Packages declare python<4.0 because they haven’t tested on Python 4, not because they’ll actually break. The constraint is defensive, not predictive.
So it makes installation faster by installing untested code.
Sounds like a non-starter to me.
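For anyone curious what "ignoring the upper bound" means mechanically, here's a toy sketch. This is not uv's actual code (uv is written in Rust); it's a hypothetical helper showing the idea of keeping only the lower-bound clauses of a requires-python specifier before handing it to the resolver.

```python
def relax_requires_python(spec: str) -> str:
    """Keep only lower-bound clauses (> and >=) of a requires-python
    specifier, dropping upper bounds like <4.0. Illustrative only."""
    clauses = [c.strip() for c in spec.split(",")]
    kept = [c for c in clauses if c.startswith(">")]
    return ", ".join(kept)

# The defensive '<4.0' is discarded; only the tested floor remains.
print(relax_requires_python(">=3.8, <4.0"))  # prints '>=3.8'
```

With the upper bound gone, the resolver never has to backtrack through old package versions just because a new Python release nominally falls outside a `<4.0` cap nobody ever tested against.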