
Rethinking Engagement: Challenging the Financial Model of Social Media Platforms

Published on Jan 27, 2022

Editor’s Note: Social media has long been accused of contributing to mental health issues, but this call to action cuts to the heart of the matter: engagement with the platform has become the most important metric, and the algorithms that best produce results tap into users’ negativity bias. Thomas Casey makes the case for a financial model that would be more beneficial to users of these platforms and to society overall.

Keywords: social media, mental health, platform providers, recommendation engines, digital economy, Facebook

1. If the Product Is Free . . . You Are the Product

Years ago, I spent several hours online following the insipid content and strange accounts recommended by a social media platform. In a moment of clarity, I realized I was being manipulated into spending an inordinate amount of time there, and I recognized the slippery slope that is the social media engagement model. I cut the cord by canceling my account.

Engagement is the currency of social media platforms, and driving engagement involves recommending content and users of interest. Recommender systems use data, computing power, and machine learning to get a user’s attention and keep them engaged. The better a platform’s algorithms are at keeping users engaged, the more the platform can tout its value to advertisers. In 2020, the global market for digital advertising and marketing was estimated at US$350B, and it is expected to more than double by 2026, to US$768B (Global Industry Analysts, 2021). This is in large part a result of increased engagement. According to a Pew Research Center report from March 2021, 85% of Americans go online daily, and nearly a third of adults are online “almost constantly” (Perrin & Atske, 2021).

2. Using Negativity Bias to Drive Engagement

Negativity bias is the tendency for negative stimuli to elicit more neural activity than equally intense positive stimuli (Ito et al., 1998). It is rooted in our evolutionary need to prioritize things that are broken or that put us at risk of harm. This survival mechanism manifests as a stronger attraction to negative things than to positive ones. Combined with availability bias (overestimating the importance of whatever is most prevalent in our minds) and confirmation bias (seeking out justifications for our existing beliefs), negativity bias helps explain why people online tend to focus on negative content rather than neutral or positive content.

Regardless of whether this is an explicit objective, any effective recommender system designed to drive engagement will ultimately unlock the inherent biological triggers of negativity bias. The impacts are only now starting to be fully understood, but they suggest a cycle with very real, detrimental effects:

  1. Exposing end users to content that reinforces unhealthy predispositions and encourages the development of harmful ideas that would otherwise not have manifested;

  2. Encouraging people to bypass the typical inquiry-based learning process by which ideas and content are independently explored, instead relinquishing the supply of information to the recommender system; whereby,

  3. Increasingly personalized feedback based on personal behavior is delivered without context, unlocking innate negative triggers more effectively and ultimately increasing the overall desire to stay engaged;

  4. Repeat the cycle . . .
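The dynamics of this cycle can be sketched in a toy simulation. The sketch below is purely illustrative and makes several labeled assumptions: content items are reduced to a single hypothetical “negativity” score, users are modeled as engaging more often with more negative items (a stand-in for negativity bias), and the recommender is a simple epsilon-greedy click-chaser rather than any real platform’s algorithm. Even this minimal setup drifts toward serving the most negative content, because that is what maximizes observed engagement.

```python
import random

random.seed(42)

# Hypothetical content pool: each item is represented only by a
# "negativity" score in [0, 1]. This is an assumption for illustration.
ITEMS = [i / 9 for i in range(10)]

def engagement_prob(negativity):
    # Assumed user model: more negative content is more engaging,
    # a crude stand-in for negativity bias (steps 1-3 of the cycle).
    return 0.2 + 0.6 * negativity

# Observed feedback the recommender learns from.
clicks = {n: 0 for n in ITEMS}
shows = {n: 0 for n in ITEMS}

def recommend(epsilon=0.1):
    # Epsilon-greedy: mostly exploit the item with the best observed
    # click-through rate; occasionally explore at random.
    if random.random() < epsilon:
        return random.choice(ITEMS)
    return max(ITEMS, key=lambda n: clicks[n] / shows[n] if shows[n] else 0.5)

history = []
for _ in range(5000):
    item = recommend()
    shows[item] += 1
    if random.random() < engagement_prob(item):
        clicks[item] += 1
    history.append(item)

early = sum(history[:500]) / 500
late = sum(history[-500:]) / 500
print(f"mean negativity, first 500 recommendations: {early:.2f}")
print(f"mean negativity, last 500 recommendations:  {late:.2f}")
```

The recommender never "intends" to serve negative content; it simply converges on whatever the feedback signal rewards, which is the point of the cycle above.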

As reported in an article in The Wall Street Journal (Wells et al., 2021), internal research at Facebook found that “the tendency to share only the best moments, a pressure to look perfect, and an addictive product, can send teens spiraling toward eating disorders, an unhealthy sense of their own bodies, and depression.” The internal research further found that “the Explore page, which serves users photos and videos curated by an algorithm, can send users deep into content that can be harmful.” This scenario outlines the precarious intersection point whereby the same content that drives heavy, if not addictive, engagement is also the content that is most likely to drive negative mental health outcomes. Furthermore, these negative outcomes are not exclusive to teenage cohorts (Hou et al., 2019).

Despite promises to the contrary, it seems that platform providers are finding it very difficult to control these negative outcomes. Attempts to introduce content controls face accusations of bias in arbitrating what is and what is not suitable. In addition to these hurdles, the financial motivation to maintain the status quo is immense.

3. Leading the Effort to Address the Challenge

Many foods, drugs, procedures, and beliefs once deemed beneficial were later banned when proven to be harmful. These efforts were not easy, and most took decades even after the detrimental effects were recognized. With this in mind, today we need a conscious and deliberate effort to address what may be the greatest mental health issue of the 21st century. In addressing the challenge of engagement-based recommendations, the following constituents all have a role to play:

  • Social Media Platforms—define and advance a new market currency such that platforms can generate value by promoting the best of us rather than enabling the worst of us.

  • Advertisers—understand the implications of your advertising dollars and hold platform providers accountable in helping to support a new financial model or in forgoing these platforms altogether.

  • Politicians—focus on what is best for your constituents without picking a side or weaponizing the discussion.

  • Academics—make the scientific case against the current paradigm in an unbiased and verifiable way while leading the effort to rethink go-forward options.

  • Individuals—hold all the above accountable while taking personal responsibility to limit the influence of algorithms by restricting engagement.

Ultimately, it is not clear whether social media platforms can be made safe for end users. It may be that these negative aspects are a fundamental feature of the technology, the same way there is no such thing as a healthy cigarette. If this turns out to be so, then we will need new platform models that create value for users through engagement, rather than extracting value from engagement in and of itself.

4. The Future Is Now

People often quote George Orwell’s 1984 as a cautionary tale of the future, but it seems that Aldous Huxley is the more relevant prognosticator of our present. In Brave New World, Huxley depicted a society given so much that its people are reduced to passivity and egoism, one in which truth drowns in a sea of irrelevance. Our species is unique in its ability to recognize what we are doing to ourselves, and it is imperative that we make engagement-at-any-cost recommender systems a thing of the past, or submit to becoming passive participants in a world defined by algorithms with an indirect yet inherent goal of exposing the worst of ourselves.

Disclosure Statement

Tom Casey has no financial or non-financial disclosures to share for this article.

References

Global Industry Analysts. (2021, June). Digital advertising and marketing - Global market trajectory & analytics.

Hou, Y., Xiong, D., Jiang, T., Song, L., & Wang, Q. (2019). Social media addiction: Its impact, mediation, and intervention. Cyberpsychology: Journal of Psychosocial Research on Cyberspace, 13(1), Article 4.

Ito, T. A., Larsen, J. T., Smith, N. K., & Cacioppo, J. T. (1998). Negative information weighs more heavily on the brain: The negativity bias in evaluative categorizations. Journal of Personality and Social Psychology, 75(4), 887–900.

Perrin, A., & Atske, S. (2021, March 26). About three-in-ten U.S. adults say they are “almost constantly” online. Pew Research Center.

Wells, G., Horwitz, J., & Seetharaman, D. (2021, September 14). Facebook knows Instagram is toxic for teen girls, company documents show. The Wall Street Journal.

©2022 Tom Casey. This article is licensed under a Creative Commons Attribution (CC BY 4.0) International license, except where otherwise indicated with respect to particular material included in the article.
