Reality Shows Trump Fiction in Showing What Businessmen Are Like

May 16th, 2013 11:54 AM
With the entertainment industry pumping out a different crop of reality shows every season, a new phenomenon has emerged. Business, success, and making money are suddenly portrayed positively. Reality shows depict business in a favorable light because, in reality, average Americans don't see business and money as inherently bad. That's a complete turn-around from how…