Meta-ing the Metaverse

As Facebook decides it wants to rule the metaverse and people finally begin to recognize dangers inherent to social media and the like, it is time to step back for a really big-picture look at what we are dealing with. As a psychiatrist and futurist, I expressed concern decades back. And while it is good that people are now asking whether curbs may be needed, the really hard and ultimately important questions are rarely being addressed. It is essential that we confront them. Even if we face these questions head on, there may be nothing we can do. But if we look at them clearly, at least we have a chance.

What I will suggest may sound sensationalistic, but people who are familiar with my thinking know that sensationalism is the last thing I would have interest in. On my list of truly existential dangers we face as a species—which includes weapons of mass destruction, climate change, and pandemic—dangers that relate to misuse of information technologies sit at the top. Part of the reason is the dangers themselves. The larger reason is that I’m not sure there is anything ultimately we can do about them. It is important that we at least do what is possible.

The most important questions we confront with social media and the like are different from what now tends to receive attention. The issue is not just disinformation and fake news, though that is part of it. It is also something more basic than how algorithmic amplification of our collected information exacerbates conflict and generates silos of the like-minded, though that too is certainly of concern. It has more to do with ourselves.

The More Fundamental Concern

We gain insight into this more fundamental concern in the mechanisms through which social media and the like become addictive. Addiction works by providing artificial substitutes for fulfillment. We get feelings of excitement, pleasure, or meaning without the vulnerability required for the real thing. Digital media are becoming hugely profitable through the increasingly sophisticated selling of pseudo-significance. We are creating ever more effective digital designer drugs. Current suggestions that in the future we could stimulate pleasure centers in the brain directly through digital means highlight both the mechanisms and the dangers. That, in effect, is what we are already doing.

This would be a major concern at any time, but it presents a particularly troubling circumstance in ours. I’ve written extensively about how today, societally, we confront a Crisis of Purpose. Stories and values that before worked to guide us are no longer serving their historical functions. More and more often today, people feel disoriented and rudderless. In this context, pseudo-significance becomes especially appealing. And when it is not just of a personal sort, but permeates our consensus reality, as is increasingly the case with our digital world, this effect becomes particularly pronounced. We see this today with Facebook and the like. And the addition of augmented reality that would accompany the idea of a metaverse only promises to amplify the effects.

What Can We Do?

So what do we do? Interventions we hear suggested certainly have a place. We can do a better job of filtering disinformation and incendiary content. We could also limit the collection of information. And we could put curbs on the algorithmic amplification of data. Each of these actions would provide some benefit.

There is also something else we could do that could have great effect, though it would require steps we currently would not consider. A major reason that Facebook and the like are becoming ever more sophisticated digital designer drugs is that they are advertising driven. A lot would change if we put major limits on this business model. Given that this would mean huge decreases in profitability, it would likely take the specter of cataclysmic consequences becoming inescapable—as we are beginning to recognize today with climate change—for such action to become acceptable. That said, I suspect that we will see really substantive progress only with the implementation of this kind of constraint.

Even this, however, will not ultimately be enough. Machine learning algorithms don’t need ill intent, or even a simple desire to maximize profit, to have destructive effects. Instruct an algorithm to attract the maximum number of eyeballs (which is what people most often want them to do) and content that is ever more addictive and divisive becomes the natural result. Addiction is the best way to assure attention, and divisive content is particularly addictive. Over the long term, content that actually benefits us stands little chance in this context.

Where the Solution Must Ultimately Lie

Which brings us to the only real solution. The antidote, too, is necessarily ourselves. At our best, we are capable of choice. One of the inherent consequences of a world defined by addictive dynamics is that our capacity for real choice is undermined. It is a capacity we must reclaim. I chose long ago to give at most limited attention to things like social media, and for a very simple reason. I value my life. And I think one of the most important things I can do is to not let my life be coopted by artificial substitutes. I choose to have digital content in my life only when I see that it will provide clear benefit.

Importantly, even this answer is more complicated than it might appear. The task when I work on addiction issues with someone in therapy is ultimately to help them learn to “just say no.” But getting beyond the pseudo-significance a drug provides and saying no only becomes possible once the person has found sufficient significance in their life that living a full life becomes of clearly greater importance. To say no societally when digital media fail to benefit us, we must have done a good enough job of confronting our time’s Crisis of Purpose that we have a real reason for doing so. In the end, the solutions must come from engaging a new, more ultimately compelling kind of guiding story (what Creative Systems Theory describes with the concept of Cultural Maturity). I suspect this is the only way that we can make the needed, more courageous kinds of choices, both personally and together as a species.
