Steven TJ
New member

Do you think Western countries are becoming more socialist in the wake of the global financial crisis? Since the GFC hit, many Western countries have tried to nationalise banks, and governments have bought majority stakes in troubled banks. There has been talk of creating government-owned 'bad' banks to buy back toxic assets and bad debt. In recent elections, the US has demonstrated a swing towards pro-worker, left-wing politics. Obama has even been accused of being secretly socialist for promoting universal public healthcare. Isn't it funny how Western countries, once promoters of free-market economics and less regulation, are now doing the very things they were so against before - BIG government intervention, big spending, government-owned banks, reverse privatisation, etc.? Isn't this making them more like those Communist countries?
 
They're not.

There is almost no organised left in most Western countries anymore. They are nationalising banks not for left-wing moral reasons, but as a business decision.

In fact, in the recent European elections, it was the parties on the far right that made the most significant gains. It's scary.
 
They are becoming socialist because mixed capitalist-socialist nations are the best-performing in the world. It doesn't really have much to do with the financial crisis, although the US is starting to adopt that model because pure capitalism doesn't work and has failed them miserably.
 