The central paradox of the inexorable advance of artificial intelligence and automation is that some of the most important ways humans will add value in the future are activities that we undervalue now: social interactions, creativity, and caregiving.
But how do we change the culture to be able to monetize those contributions? For centuries, caring for children and aging parents has been compensated minimally, if at all. The recent expansion of the service economy has created many caregiver jobs, but these tend to be low paid. Our economy doesn’t reflect how important the human touch is, much less that it will be even more so in the future.
Recent articles about Zora, the robot caregiver that (I can’t bring myself to say “who”) is being tested in French nursing homes as a cure for loneliness, described how some elderly residents were developing emotional attachments to it. But it should be painfully obvious how far short Zora falls.
The futurist CK Kerley likes to say “Humanity is the killer app!” And she’s right.
But that doesn’t change the central economic dilemma around caregiving and social interaction. How do we compensate the most valued activities in society?
Over the weekend, I finished reading Kai-Fu Lee’s terrific book, AI Superpowers: China, Silicon Valley, and the New World Order. He not only gives a clear description of the growth of new technologies and their socio-economic implications, but he also adds a human perspective.
Lee also points out, rightly, that the biggest danger in AI is its impact on labor markets and social systems, potentially creating dystopian scenarios of social unrest and destruction. He offers some insights into how to offset some of the resulting issues. His goal is no less than the creation of a system that provides for everyone, using AI generated wealth to build a “more compassionate, loving, and ultimately human” society.
Utopian? Perhaps. But he has laid out the right questions and goals and is thinking creatively about how to get there.
He’s skeptical of the universal basic income idea – basically, that everyone would get a certain amount of money to meet their basic needs – that is popular among many in the tech community. UBI would be exorbitantly expensive if it were truly universal, which is a big “if” for practical, political, and ethical reasons. More important, in his view, UBI might soothe the consciences of the mega-rich of the technology world, but it fails to recognize the human need for purpose.
Instead of UBI, Lee proposes a social investment stipend. In his conception, the government would pay this salary to “those who invest their time and energy in those activities that promote a kind, compassionate, and creative society” involving care work, community service, and education.
“Providing a stipend in exchange for participation in prosocial activities reinforces a clear message: It took efforts from people all across society to help us reach this point of economic abundance. We are now collectively using that abundance to recommit ourselves to one another, reinforcing the bonds of compassion and love that make us human,” he wrote.
Like UBI, such a stipend could be financed by gradually phasing in higher taxes on tech companies making astronomical profits and generating huge productivity gains.
Coincidentally, I was just reading an article in a recent issue of The Economist that used the term “abnormal profits.” Though the magazine was talking about the negative consequences of industries dominated by just a few giant companies with pricing power, that term could apply here as well.
Lee makes another important point: that there is actually a connection between the exponential, scalable growth of tech companies and the economic reality of human caregiving and the other “prosocial” activities he describes: “When someone builds a great company around human care work, they cannot digitally replicate their services and blast them out across the globe. Instead, the business must be built piece by piece, worker by worker.” In other words, it’s not the kind of business a tech VC investor would “waste” their time on.
Lee sees human care companies as another important part of the future of work, and a role for a new kind of multiple-bottom-line investing for impact. In this kind of company, the creation of meaningful jobs – and in turn, the positive impact on society and civility – is part of the return on investment being measured: conversation partners for the elderly, coaches for youth sports, and oral history gatherers, for example.
Lee doesn’t go into the weeds on how a tax on “abnormally profitable” IT companies would work. But it occurs to me that tax policy might cut both ways: higher taxes on tech giants dominating their markets, on the one hand, and lower taxes on companies that meet clear, consistent impact investing standards, on the other.
Lots of food for thought.
#futureofwork #ai #automation #ubi #socialinvestment #impactinvesting #zora #robot #caregiver #policy
This is the fifth installment of my new weekly LinkedIn series, “Around My Mind” – a regular walk through the ideas, events, people, and places that kick my synapses into action, sparking sometimes surprising or counter-intuitive connections.
Click the blue button on the top right hand on this LinkedIn page to subscribe to “Around My Mind” and get notifications of new posts. Please don’t be shy about sharing, leaving comments or dropping me a private note with your own reactions.