It's official. Healthcare has become the largest employer in the United States, with more workers than any other industry. But you probably already knew that.
As Derek Thompson writes in ...