As a resident of a vassal state, I'm locked in a bubble of naïve liberal media coverage that treats the Democrats as the left. I realised that despite how evident it is to everyone here that the Dems have become increasingly right-wing under Biden and Harris, I've never found a source that bothered to demonstrate it by compiling evidence.
Of course, search engines are completely useless at this, just shoveling endless trash from British and US corporate media.
They've never been left.
The closest they've ever come was when they kinda supported some leftist policies (as long as black people didn't get any) to save capitalism and deflate the workers' movement.
That was under FDR.