

Not a game dev either, but my guess is that the main reason is server performance/compute cost.
Any check done on the client runs on the users' hardware, so the publisher doesn't have to pay for more/better servers and the electricity to run them.
I think the disconnect with most other kinds of developers stems from the respective goal hierarchies. In most fields of computing, correctness isn't just a high-value goal - it's a non-negotiable prerequisite. In online multiplayer games, one of your chief concerns is latency, and it can make sense to trade some cheating for a decrease in lag - especially if you have other ways of reducing cheating that don't cost you any server processing power.
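To make the compute trade-off concrete, here's a toy Python sketch (all names and numbers are made up, not from any real engine): a server-authoritative movement check the server has to run every tick for every connected player, versus just trusting whatever the client reports and leaving cheat detection to client-side software.

    # Toy illustration of the trade-off; not real engine code.
    from dataclasses import dataclass

    MAX_SPEED = 7.0  # hypothetical max movement speed, units per second

    @dataclass
    class Player:
        pos: tuple  # (x, y)

    def server_authoritative_update(player, claimed_pos, dt):
        # The server recomputes whether the claimed move is physically
        # possible. This runs every tick, for every connected player.
        dx = claimed_pos[0] - player.pos[0]
        dy = claimed_pos[1] - player.pos[1]
        if (dx * dx + dy * dy) ** 0.5 <= MAX_SPEED * dt:
            player.pos = claimed_pos
        # else: reject the move (rubber-banding); a speed hack gets caught here

    def trust_the_client_update(player, claimed_pos, dt):
        # Near-zero server cost: just accept the client's word. Cheating is
        # (hopefully) caught instead by anti-cheat running on the user's PC.
        player.pos = claimed_pos

    p = Player(pos=(0.0, 0.0))
    server_authoritative_update(p, (100.0, 100.0), dt=0.016)  # rejected: too fast
    print(p.pos)  # still (0.0, 0.0)
    trust_the_client_update(p, (100.0, 100.0), dt=0.016)      # accepted blindly
    print(p.pos)  # (100.0, 100.0)

Multiply the first function by dozens of ticks per second and thousands of concurrent players, and the difference in the server bill is obvious.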
Also, aren’t many of the client-side anti-cheat solutions reused across several games? If you’re mainly checking that the player is running exactly the client you published, I imagine the development cost for anti-cheat is lower.
TLDR: Money. It’s always money.

Side-rant:
I rarely write Python code. One reason for that is the lack of type safety.
Whenever I’m automating something and try to use some 3rd-party Python library, it feels like there’s a good 50/50 chance that front and center in its API is some method that takes a dict of strings. What the fuck. I suspect there’s also something of a cultural difference between users of scripting languages and users of backend languages.
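For what it's worth, this is roughly the pattern I mean (hypothetical code, made-up names, not from any specific library), versus what I'd rather see:

    from dataclasses import dataclass

    # The "dict of strings" style: nothing tells you which keys exist,
    # which are required, or what the values are supposed to be.
    def create_job_untyped(config: dict) -> None:
        host = config["host"]
        retries = int(config.get("retries", "3"))
        print(f"connecting to {host}, retries={retries}")

    # The same surface with an explicit, type-checkable shape.
    @dataclass
    class JobConfig:
        host: str
        retries: int = 3

    def create_job_typed(config: JobConfig) -> None:
        print(f"connecting to {config.host}, retries={config.retries}")

    create_job_untyped({"host": "db.example.com", "retries": "5"})
    create_job_typed(JobConfig(host="db.example.com", retries=5))

The second version costs almost nothing extra to write, and a type checker or IDE can actually help you call it correctly.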
What you described sounds so much worse though holy shit.