For years Americans have been told to trust “the market” to solve all our problems. It certainly leads to great wealth and contentment. For some. But we live in a world where we can look at other countries without such market devotion and see that they’re happier.
I don’t think America is, nor do I want it to be, as skeptical of business as, say, the standard in Europe. But we can’t keep people satisfied anymore by saying the market will eventually provide. The market doesn’t give a shit if there’s no profit.
There’s no profit in clean air and water, better education, expansive health care, strong public infrastructure, or a safety net. There’s no incentive for the market to do any of that. But there’s a societal need. So we will act and do it ourselves.
So the market is going to be there, business will grow, and people will make a metric ton of money. But the goddamn market is also going to help finance the stuff we need that it won’t provide on its own. The question isn’t “what’s best for business” but what’s good for society.
The counterargument will boil down to “liberty.” But sorry: the freedom to die because pharma likes its profit margins, when the option exists to simply get people the medicine they need to fucking live, is going to make that sloganeering a hollow battle cry.