Facebook and the "two-feed solution": Social media giant can be both a "platform" and a publisher

Mark Zuckerberg's company can resolve its current dilemma. But only if it faces the truth about its hypocrisy

Published September 1, 2018 10:00AM (EDT)

(Salon/Benjamin Wheelock)

Let me first state that I put Mark Zuckerberg's pronouncements on content in the same bucket as ExxonMobil on climate change and Donald Trump on draining the swamp. They are meant to feign concern, do as little as possible, and get back to the mission at hand – making lots and lots of money.

We now have Zuckerberg awkwardly defending the right of Holocaust deniers to post because, he says, he understands their intent and their ignorance of the truth. When CNN reporter Oliver Darcy asked John Hegeman, the vice president in charge of Facebook's News Feed, about combating misinformation and the likes of Alex Jones' Infowars, Hegeman replied, "I guess just for being false, that doesn't violate the community standards." His solution, prior to the recent removal of select Infowars content, was to down-rank such content in Facebook's News Feed.

And there’s the rub. There is always a new algorithm, or the promise of artificial intelligence, to maintain a we-are-a-platform posture, which at its core is a myth.

The origin story of this content calamity does not begin with a monkey in an African rainforest, but with two little-discussed laws that have allowed Facebook to claim it is a “platform,” free from the legal constraints governing traditional publishers. Long before Facebook, Twitter, Instagram or YouTube existed, Congress passed the Communications Decency Act (CDA) in 1996, meant primarily as a response to internet pornography. Section 230 states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The law effectively immunized providers from tort liability, such as defamation.

In 1998 Congress passed the Digital Millennium Copyright Act (DMCA), a copyright law governing online digital media, intended to protect websites that ingest large quantities of user-generated content (UGC). The law established "safe harbor" protections from copyright infringement claims, provided sites made modest efforts to protect content and removed infringing material upon notification. Silicon Valley argued that platforms were more like internet service providers (ISPs) than "publishers" such as newspapers, which are liable for defamation and copyright infringement. Using a telephone analogy, they argued that no one sues AT&T for slanderous remarks spoken over a phone line.

Without traditional legal constraints, the floodgates were open. From day one, these "platforms" functioned more and more like publishers – curating content and, as complaints grew, adding more moderators and fact-checkers. All the while, the platforms fought to protect the legal immunity that fueled their unprecedented growth and profits. This singular posture now threatens the free speech Silicon Valley claims to protect, while exposing indefensible moderation protocols and secretive algorithms.

I am reminded of Israel’s quandary as described by Middle East experts. Hard-liners who cling to a one-state solution will demographically induce apartheid and ultimately lose the Jewish state they fight so hard to protect. Facebook, by holding on to its platform-only position while attempting to moderate billions of posts a day, will ultimately invite a congressional solution forcing it to become a publisher, or as Congress likes to call it, a media company. I would argue that Facebook is a platform and a publisher – and should embrace the original intent of both.

The "Two-Feed solution"

Your Feed, the platform side, would contain content you create and content from sources you proactively follow. Facebook Feed, the publisher side, would be content Facebook sends your way, which it moderates, fact-checks, subjectively ranks or associates with advertising. The original intent of the DMCA and CDA would be respected, clarified and enforced. Platform content would be truly hands-off, to maximize freedom of expression, and publisher content would be accountable, ending Facebook's profiting from the distribution and implied endorsement of toxic content.

To achieve this goal, we would need to begin by clarifying what a publisher does.

A traditional publisher like the New York Times can generate content with paid staff, commissioned work, reviewed submissions from freelancers or licensed third-party news from agencies like the Associated Press (AP). Writers may work with a variety of editors, fact-checkers and legal staff to polish a piece. A third-party source like AP maintains its own editorial staff and accepts responsibility for the content it supplies. Once a story, video or graphic is approved by senior editors, a publisher's second core function kicks in: overseeing headline selection and curating placement of the work – on page 1, on page 23, in the digital product only, or not making the final cut at all. Placement and curation can be as important in publishing as the story itself. Publishers with advertising accept or reject ads and associate those ads with specific sections or content. Publishers with subscription paywalls decide which content is free and which is paid.

If the subject of a Times story feels libeled, they sue the publisher. If the story was supplied by AP, they still sue the Times, which then sues AP. This chain keeps everyone on their toes and is the bedrock of media accountability.

To fully understand how topsy-turvy the content distribution chain has become in the Silicon Valley era, one need only look at fact-checkers. Facebook fact-checks content after it has been posted to the News Feed. Even if the content is flagged and removed, the damage is already done. A publisher reviews content before distribution, because of its exposure to libel and copyright claims. Even though Facebook did not create the original story, it curates and ranks the content, fact-checks, moderates and associates advertising with it. It makes money on toxic content and, by "feeding" that content, provides an indirect endorsement, all while maintaining a supposedly neutral "platform" status. It is no wonder trust in media is at an all-time low.

In the two-feed solution, "Your Feed" will contain only content you post or content from a source you have deliberately chosen to follow – and only content that source created, to reduce virality. Facebook will feed the content chronologically. If I visit your page and see Infowars in Your Feed, I'll know you're researching fanatics or you're a jerk. It's on you. Facebook will function as a true "platform" and have no responsibility for fact-checking or moderating primary posts by users, except when content is flagged for illegal activity.
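Here is a rough sketch, in Python, of what that platform-side feed might look like. The field names and the "follows" set are my own illustration, not Facebook's actual data model: the point is simply that selection is mechanical – followed sources only, original content only, newest first – with no ranking, moderation or advertising.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str            # account that posted the content
    original_author: str   # account that originally created it
    created_at: datetime
    body: str

def your_feed(viewer_follows: set[str], posts: list[Post]) -> list[Post]:
    """Platform-side feed: only original posts from sources the viewer
    deliberately follows, in reverse-chronological order."""
    eligible = [
        p for p in posts
        if p.author in viewer_follows        # a source you chose to follow
        and p.author == p.original_author    # content that source itself created
    ]
    return sorted(eligible, key=lambda p: p.created_at, reverse=True)
```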

Facebook Feed will contain content Facebook chooses to push in your direction, along with any advertisements it associates with that content. The company can then allocate its moderation and fact-checking resources to content it publishes in its own feed. Trust me, all the gobbledygook about community standards will be greatly reduced, and when Facebook is on the hook legally, Infowars will be dropped from its feed like a hot potato.

When Russian fake accounts and false advertisements were first exposed, advertisers panicked and Silicon Valley took a blunt instrument to moderation in what is now called the Adpocalypse – forcing hundreds of legitimate publishers and independent news sources out of business when traffic dried up practically overnight. Everyone may be happy about censoring Infowars, but David Greene, staff attorney for the Electronic Frontier Foundation, notes in the Washington Post that Alex Jones was not the only casualty.

“Moroccan atheists, women discussing online harassment, ads featuring crucifixes, black and Muslim activists reposting racist messages they received, trans models, drag performers, indigenous women, childbirth images, photos of breast-feeding” have all been censored, Greene wrote, along with evidence of war crimes and police brutality. My proposed solution would protect free speech by reducing pressure on Facebook to remove content it no longer delivers in its feed.

Then there are those pesky algorithms, used to "down-rank" Infowars and choose the content in your newsfeed. An algorithm that ranks content is editing. Down-ranking Infowars is curation, an editorial decision made as a publisher: a human being makes a subjective judgment about the content. For clarity, let's call it "algo-editing": a subjective editorial decision made by humans but executed repetitively by a computer.
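To make that concrete, here is a hypothetical sketch of algo-editing. The sources, weights and scoring are illustrative inventions, not Facebook's actual algorithm; what matters is that the editorial judgment lives in the hand-picked weights, and the machine merely applies it at scale.

```python
# The down-ranking decision is human and subjective; the code only repeats it.
EDITORIAL_WEIGHTS = {
    "infowars.com": 0.1,   # a human decided to down-rank this source
    "apnews.com": 1.0,
}

def rank_score(source: str, engagement: float) -> float:
    # Apply the editorial weight to every post, millions of times a day.
    return engagement * EDITORIAL_WEIGHTS.get(source, 1.0)
```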

Algo-editing is distinct from neutral categorization using metadata or taxonomies. Think of the Dewey decimal system in a library. The library makes no judgment and sets no priority; it simply categorizes books for easy access. Placing content chronologically into sports, music and business verticals, or geo-gating content by region or language, would fall on the platform side of this equation for a site like YouTube.
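By contrast, here is a sketch of neutral categorization. The taxonomy and fields are again my own illustration: content is filed into a vertical by its own metadata and listed chronologically, and no source is promoted or demoted.

```python
# Like a library's Dewey decimal system: file by metadata, order by time.
TAXONOMY = {"sports", "music", "business"}

def categorize(posts: list[dict]) -> dict[str, list[dict]]:
    shelves = {topic: [] for topic in TAXONOMY}
    for post in posts:
        topic = post.get("topic")
        if topic in shelves:
            shelves[topic].append(post)
    for topic in shelves:
        shelves[topic].sort(key=lambda p: p["created_at"])  # chronological, not ranked
    return shelves
```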

With less censorship on the platform side, users will need more information before being exposed to content. A content rating system like those used for movies and TV (G, PG, R, MA), paired with data on nudity, language and violence and with stronger parental controls, would help, particularly with video content. Illegal content like beheadings and child pornography would still be illegal and quickly taken down when flagged.
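A small sketch of how such rating metadata and a parental control might interact on the platform side; the labels and threshold logic are assumptions for illustration, not an existing Facebook feature.

```python
# Illustrative rating ladder and a simple parental-control filter.
RATING_ORDER = ["G", "PG", "R", "MA"]  # mildest to most restricted

def allowed(post_rating: str, viewer_max_rating: str) -> bool:
    """Show a post only if its rating does not exceed the viewer's setting."""
    return RATING_ORDER.index(post_rating) <= RATING_ORDER.index(viewer_max_rating)

# A viewer whose limit is "PG" would not see an "R"-rated video.
assert allowed("PG", "PG") is True
assert allowed("R", "PG") is False
```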


There are other weak points in the ecosystem, from registration to advertising to sharing and search. The operative principle remains the same: more accountability for all participants in the supply chain. If you share content, you are a mini-publisher. OK, you're not the bad guy pulling the trigger in the bank, but you are driving the car to the bank – you are an accomplice of sorts. Perhaps users should be required to share content with more context than a simple "Like." Advertisers certainly continue to hide behind programmatic ad services and feign ignorance of where their ads appear. Search algorithms are making value judgments about what content you see – but those discussions are for another day.

Recently, as head of content at Pond5, a stock-licensing company with over 40 million media assets and 50,000 contributors, I oversaw a soup-to-nuts review of content ingest, curation and licensing policies to better protect creators. Our team identified stronger digital-identity checks of contributors as the single best way to improve security and protect copyright. Yes, Facebook needs to protect anonymity for some contributors, especially under repressive regimes, but no part of the process is more out of control than contributor identity. And again, that laxity was profit-driven, meant to spur growth. The New York Times reported that Facebook removed 585 million fake accounts in the first quarter of 2018. It's still just too easy to scam.

Facebook wants us to believe it can moderate and fact-check billions of often long, complex content submissions each day. I argue that is impossible to achieve. Facebook and Silicon Valley are asking us to ignore publishing rules that have served our media ecosystem for hundreds of years. I argue that is the root cause of distrust in media.

If you fact check, you’re a publisher. If you rank and curate content, you’re a publisher. If you associate advertising with specific content, you’re a publisher.

The Two-Feed Solution would allow Facebook to act as the neutral platform it claims to be and allocate moderation resources to content it chooses to stand behind. The Two-Feed Solution would level the playing field for publishers ravaged by Silicon Valley platforms without any skin in the game and reboot a sense of accountability that could help rebuild public trust in media.



By Rick Gell

Rick Gell is a board member of the Digital Media Licensing Association and was previously head of content at Pond5.


