The Societal Impact of the Broken Social Business Model

Facebook was in the news a few times this past week: a junior data scientist’s farewell email to her Facebook colleagues was published by Buzzfeed, celebrities boycotted Facebook and Instagram for a full day to protest hate speech, and the Netflix documentary ‘The Social Dilemma’ premiered, to name a few. Each story shares a common thread, and together they speak to themes that are bubbling up in our society.

TFD helps media companies use Facebook and other social media platforms effectively to reach more users, drive traffic, sell subscriptions, and collect email addresses for newsletters. We also work with the Facebook Journalism Project to educate publishers around the globe on best practices for audience engagement and monetization. We know how powerful social media can be for brands, and we leverage our knowledge of Facebook’s algorithms every day to do exactly that. We don’t believe social media is inherently bad, but there are serious problems with how it is implemented and how it exploits users, and we feel an ethical obligation to discuss them.

There’s a lot to unpack here, but taking inspiration from the week’s headlines and The Social Dilemma, we have some thoughts on the state of social media, our participation, and the role of the publisher.

Human beings only evolved into societal groups in the last 5,000 years, during which norms have been established about how we communicate, interact, and depend on one another. Mainstream social media has been a part of our lives for about 17 years, roughly 0.34% (17 ÷ 5,000) of the time we have existed in formalized societal groups. In that time, entire industries have spawned out of the globalization of human connection. My job didn’t even exist when I was in high school, a mere 10 years ago. Hundreds of thousands of other jobs have emerged as a result of social media coming into our lives, but there have been downsides, escalating in recent years with the twofold increase in the reach of disinformation campaigns spanning 72 countries, the use of social media user profile data to influence US elections, and the role Facebook played in the genocide of Rohingya Muslims in Myanmar. The Buzzfeed piece describes “blatant attempts by foreign national governments to abuse the platform on vast scales to mislead their own citizenry, and cause international news on multiple occasions.”

The Social Dilemma rightfully points out that when you’re not paying for the product, you are the product. Social media platforms harvest billions of user engagement data points both on-platform and off-platform, find patterns among user interests, and package that data for advertisers to target potential customers. This is accomplished using complex algorithms that ingest user engagement data (clicks, likes, reactions, shares, saves, websites visited, time spent with the post) and on-site engagement data collected by website pixels (page metadata, micro-interactions, and ecommerce events), and serve it all up on a platter to marketers.
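To make the pixel piece of that pipeline concrete, here is a minimal sketch of the kind of on-site events a publisher’s article page might send back to an ad platform. It mirrors the Facebook Pixel’s standard-event calls, but the pixel ID, content names, and subscription value below are placeholders invented for illustration, not real account data.

    // Hypothetical sketch (TypeScript): on-site engagement events a publisher's
    // page might report to an ad platform's pixel. The calls mirror the Facebook
    // Pixel's standard-event API; the ID and values are placeholders.
    declare function fbq(
      command: 'init' | 'track',
      eventOrId: string,
      params?: Record<string, unknown>
    ): void;

    fbq('init', 'YOUR_PIXEL_ID');   // placeholder pixel ID
    fbq('track', 'PageView');       // fired on every article view

    // Page metadata the platform can fold into a reader's interest profile
    fbq('track', 'ViewContent', {
      content_name: 'example-article-slug',
      content_category: 'local-news',
    });

    // An ecommerce event: a completed digital subscription
    fbq('track', 'Subscribe', {
      value: 5.99,
      currency: 'USD',
    });

Multiply events like these across every page view and every reader, and you get the billions of data points described above.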

The documentary also outlines how social media websites and apps were never designed to cultivate the ‘best user experience’, but rather to keep the user engaged as long as possible to generate maximum profit for and from advertisers. Using persuasive-technology principles, platforms serve as many advertising impressions as possible to increase their profits each quarter, profits which, it should be noted, have dramatically increased as a result of the pandemic. Studies have shown how these dark design techniques are affecting us both psychologically and physiologically, creating dopamine rewards for positive social media interactions (likes) and resulting in widespread social media addiction.

It’s been documented that the content that drives the most engagement on social media is content that incites outrage, and this monetization of outrage has contributed to the rampant increase in polarization in our society and around the world. The Buzzfeed piece cites examples of eight countries and one US political figure that were subjected to “inauthentic activity” or pervasive bot campaigns aimed at spreading misinformation to influence political outcomes. More concerning was learning that a single junior employee was in charge of escalating these threats up the ranks at Facebook and was, by her own admission, making decisions that would impact entire governments.

We’ve experienced these themes in the US as well, with hate crimes on the rise, drastic increases in membership of anti-LGBTQ and white nationalist hate groups, and dangerous conspiracy theories being spread by ‘super spreader’ pages that take advantage of the engagement-based algorithm. In fact, an internal Facebook report from 2018 showed that 64% of people who joined extremist groups on Facebook did so because the algorithms suggested them. The report actually said, “our algorithms exploit the human brain’s attraction to divisiveness.” Scary, right?

There’s something else that should be mentioned: 26 words tucked into the Communications Decency Act (CDA) of 1996 (7 years before MySpace launched). Section 230 of the CDA says that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” (47 U.S.C. § 230). This means that content curation websites (i.e. social media platforms) essentially became newsstands, not publishers, and are therefore not legally liable for the information or disinformation published on them. Think of it this way: if a book or its publisher is accused of libel, Barnes & Noble is not liable for selling it. Facebook is Barnes & Noble. If I decide to defame someone on Facebook, I am personally liable; Facebook is not. Section 230 also allows website owners to moderate legal content without opening themselves to litigation. This is where arguments about free speech collide with the deplatforming of hate groups, which has been the subject of much partisan debate.

Ultimately, the platform bears no responsibility for what content is posted, and the business model is based entirely on continued engagement. So you can see how this could culminate in serious societal polarization that, left unchecked, could lead to rampant democratic decline.

So what do we do, delete our social media accounts? No.

Here are a few TFD-approved ideas for building a more community-centric audience:

  • With the rise in Watch Party app extensions, why not start a movie club? Create a mini-newsletter outlining what movie you’ll be watching and a fun snack to make ahead of time, and your audience can watch it together. Discussion about the movie could happen in a Facebook Group created for that purpose. If something meme-able happens during the Watch Party, or in the group, make an ecommerce play for revenue: printed tees or a mug with the quote or meme. Take inspiration from the Barstool Sports revenue model (niche communities love merch).

  • People are looking for ways to be creative; we gave up on nurturing sourdough starters what feels like a century ago. Run a user-generated-content competition tied to your latest issue. Ask your readers to use a hashtag like #ExploringCityAtHome to show how they’re reading your next issue in these “uncertain times”. Partner with a boutique hotel and give the winners a contact-free staycation to escape from their day-to-day. Check with your legal team to ensure you’re complying with sweepstakes and contest regulations.

  • Give your audience something positive to look forward to. Put uplifting content in front of them on social media or by email, on a schedule they can rely on. Some Good News debuted on YouTube in late March and quickly amassed 72 million views and 2.58 million subscribers in 6 weeks. The @goodnews_movement on Instagram takes user-submitted uplifting stories and spreads them online. This format would be easy to adapt for your own community: ask your audience to DM you uplifting things that have been happening to them, then post them on your social channels every Sunday morning, bringing joy to your audience and spreading positive news from their community.

  • Try to help your community foster new habits that encourage time away from computers. Start a city-wide book club where everyone reads the same book. Pick a demographic to target and start with a few books that would interest them specifically. Leverage platforms that already exist to foster community engagement: Goodreads has book clubs, and Amazon is due to release its own soon. Create a mini-newsletter (or podcast) to keep these readers up to date on thoughts about the book. Partner with your local libraries to ensure they have copies in stock, and use it as a chance to educate your community on eBooks that can be checked out with a library card. Create Zoom meetings or Live events where readers can come talk about the book and connect with one another. Maybe the books could cover your city’s history and events, feature local authors, or draw awareness to your locally owned bookshops.

We also owe it to our communities to provide resources that help them critically evaluate the media they engage with. Educate your communities about their voting rights and polling station locations, and encourage your readers to register to vote. If you’re so inclined, take a stand on political issues in your community. Local journalism has long been a trusted source of reliable information, and we should ensure it remains that way. Provide your readers with information they can trust, escape with, rely on, and seek comfort in. We’re all going to need it.

The American Press Institute has published 9 Tips for Election Misinformation; please consider reading it.

If you’d like a better understanding of all the data that Facebook is using about you for ad targeting, visit https://www.facebook.com/ds/preferences/.
