Is the United States of America over?
-
With the stark division between left and right (Democrats/Republicans, progressives/conservatives, whatever you want to call them), is the "United" States over? I might be pessimistic, but it seems that the disconnect between the citizens is too great to fix. I don't think I'm alone in thinking this, but what do others think?
Capitalist democracies all over the Westernised world are transforming. I don't say collapsing, but transforming. We've all been force-fed a steady diet of fake freedoms, culture wars and bullshit nationalism pretty much since the end of WW2, spreading uncertainty and fear. It's all coming to a head now because populations are panicking (by design) and turning to fascist 'strong man' populist leaders like Trump, Farage, Meloni, Orban etc., who are all financed and propped up by our next set of leaders: the billionaires, who will operate a worldwide network of oligarchical fascism whilst the herd are distracted by manufactured outrage at trans people existing, at what an actor's politics are, and at how the immigrants are simultaneously taking all the jobs and draining us dry on the dole.
-
the disconnect between the citizens is too great to fix.
The citizens on both sides have yet to understand that their conflict is not their biggest problem.
The conflict between the classes is much more severe.
Once this is out in the open, the end might come. Until then, it's a kind of intermediate state: no longer the former United States, but not anything new either.
yeah... I'm never going to accept someone who can support a convicted rapist and suspected child rapist.
fuck everyone who supports that bag of shit.