For general discussion and debate. Possible talking point: Are we witnessing the end of American capitalism?
First we're going to bail out banks and brokerage firms. Next it will be our failing auto industry. Then, if Obama becomes president, we're going to nationalize health care.
For almost thirty years, a conservative revolution led by President Reagan ushered in a new era of capitalism in our nation, creating wealth and a standard of living never seen before. Now it looks like that's all coming to an end, and a new, FDR-style socialism is replacing it, likely making it impossible for baby boomers to ever retire while dooming their children to a far less desirable lifestyle.
Do you believe this is the case? If so, what's the real cause, who's to blame, and what can be done before America becomes France?