How Unbundling and ‘Literacy Friction’ Might Influence the (Facebook) Oversight Board’s Decision on Trump

This was originally provided as a public comment to the Facebook Oversight Board to inform its decision on President Trump’s Facebook account; that document can be found here. It is being shared here for ease of reading, and because of the media literacy connection between ‘literacy friction’ and the recently published brief on ‘contextualization engines’.

It does not advocate for Trump either being platformed or deplatformed from Facebook, but instead seeks to broaden the understanding of what Facebook is and to broaden the options available to Facebook and the Oversight Board.

The questions that have been posed to the Oversight Board represent a false dichotomy when there are better means to balance the benefits and harms of such a momentous decision. Alternatives to Facebook’s current actions of “indefinitely prohibiting Donald J. Trump’s access to posting content” may provide a superior approach to balancing the voice, safety, and dignity of the relevant stakeholders. This is of particular importance as banning a political leader may cause more off-platform harm if it leads to an increased sense of victimization and encourages movement to platforms that are more overtly fomenting violence[1].

Therefore the Oversight Board should unbundle the Facebook account to determine what functions, both explicit and implicit, are appropriate to continue to provide to those who may have violated community standards — and particularly to determine what functions should be provided to violative political leaders.

Facebook posting[2] alone can be unbundled into a significant number of explicit functions including:

  1. Providing free storage for new and existing content.
  2. Enabling free broadcasting to a large audience.
  3. Allowing one to pay to advertise and broadcast across Facebook properties.
  4. Enabling one to target this advertising in a variety of ways.
  5. Providing free analytics for understanding how page posts perform.

However, there are another set of implicit functions that Facebook also provides around posting including:

  1. Allowing frictionless submission of arbitrary amounts and types of content — instantaneously and without review.
  2. Enabling near frictionless access to content through links and search.
  3. Enabling near frictionless re-sharing of content.
  4. Recommending content such as videos.
  5. Enabling the frictionless creation of unmoderated conversational spaces in the comment section of posts.
  6. Associating content with the Facebook brand[3].

This analysis leads us to the key questions:

Could one enable content posting — the storage of content within a Facebook property for others to access — while limiting many or all of these other explicit and implicit functions?[4][5]
Moreover, could one address other problematic platform properties through a careful choice of functions?

We believe the answer is yes. This is particularly true when there is a limited number of accounts, such as those of political leaders. Examples of such restrictions that might have been applied to Trump include:

  • Disabling all broadcasting and recommendations of posts.
  • Enforcing a 7-day delay before a submitted post is shown on the platform.
  • Limiting a user to 3 posts per week.
  • Requiring manual expert review of posts for violations before they are published.[6]
  • Disabling comments, reactions, and resharing for posts.

This is just a short, illustrative list of simple options that might dramatically limit the platform-based harm caused by a user; it does not even touch on ads, which might be eliminated or restricted.
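To make the unbundling concrete, the restrictions listed above can be thought of as a per-account profile of feature flags and thresholds, checked when a post is submitted. This is a hypothetical sketch; every name, field, and threshold here is illustrative, not a description of Facebook's actual systems.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical sketch: an unbundled restriction profile for one account.
# All names and thresholds are illustrative assumptions.

@dataclass
class RestrictionProfile:
    allow_broadcast: bool = True       # posts appear in followers' feeds
    allow_recommendation: bool = True  # posts surfaced by recommendation systems
    allow_comments: bool = True
    allow_resharing: bool = True
    post_delay: timedelta = timedelta(0)     # hold period before a post goes live
    posts_per_week: Optional[int] = None     # None = unlimited
    requires_manual_review: bool = False

def can_publish(profile: RestrictionProfile,
                posts_this_week: int,
                submitted_at: datetime,
                now: datetime,
                review_passed: bool) -> bool:
    """Return True if a submitted post may go live under this profile."""
    if profile.posts_per_week is not None and posts_this_week >= profile.posts_per_week:
        return False  # weekly quota exhausted
    if now - submitted_at < profile.post_delay:
        return False  # still inside the mandated delay window
    if profile.requires_manual_review and not review_passed:
        return False  # awaiting or failed expert review
    return True

# The example restrictions from the list above, combined:
restricted = RestrictionProfile(
    allow_broadcast=False,
    allow_recommendation=False,
    allow_comments=False,
    allow_resharing=False,
    post_delay=timedelta(days=7),
    posts_per_week=3,
    requires_manual_review=True,
)
```

The point of the sketch is that each function is an independent switch: the Board need not choose between a full account and no account when a profile like `restricted` sits in between.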

Finally, there is an unbundling option that addresses the core function of frictionless access which motivates bans, while also improving key societal literacies:

Literacy friction: Requiring those aiming to access violative content to demonstrate relevant literacies or engage with material teaching those literacies.[7]

These educational experiences might involve watching videos and answering comprehension quizzes on topics related to each category of violative content or user. They could support key literacies (civic, media, emotional, etc.) by contextualizing relevant topics (e.g. how US elections work; how to recognize disinformation; how to handle anger healthily)[8], analogous to what the best journalists aim to do.
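Mechanically, literacy friction can be pictured as a gate per content category: a viewer who has passed that category's literacy check within a validity window (footnote [7] suggests something like 3 months) sees the content; otherwise they are routed to the educational module first. The function names, the 90-day window, and the category strings below are all illustrative assumptions.

```python
from datetime import datetime, timedelta
from typing import Dict

# Hypothetical sketch of a "literacy friction" gate. Access to content flagged
# in a category requires a still-valid literacy check for that category.

VALIDITY = timedelta(days=90)  # illustrative re-verification interval (see [7])

def may_access(content_category: str,
               passed_checks: Dict[str, datetime],
               now: datetime) -> bool:
    """True if the viewer's literacy check for this category is still valid."""
    passed_at = passed_checks.get(content_category)
    return passed_at is not None and now - passed_at <= VALIDITY

def gate(content_category: str,
         passed_checks: Dict[str, datetime],
         now: datetime) -> str:
    # Rather than blocking outright, redirect to the educational experience;
    # after the video and comprehension quiz, the viewer can retry.
    if may_access(content_category, passed_checks, now):
        return "show_content"
    return "show_literacy_module"
```

The design choice worth noting is that the gate never deletes or hides content permanently: it converts a ban into a detour through the relevant literacy material.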

Unbundling enables Facebook and the Oversight Board to move past false binaries, more safely address violative behaviors, and even improve core societal literacies.

[1] The long term impacts of deplatforming are unclear, especially for those with huge audiences.
[2] One can further unbundle other aspects of a Facebook account, including components crucial to many, such as the ability to send direct messages. While those capabilities may be outside the scope of this case, they have extremely significant impacts on the ordinary Facebook user.
[3] Facebook’s consistent and iconic visual design is instantly recognizable in screenshots, which can then be deployed to other media, from Twitter to cable news, and may lend legitimacy to a message, particularly for a verified account. This could be a significant concern, and it may be challenging to address without unintended negative consequences.
[4] This could be a significant engineering burden for the company to implement, depending on their current architecture, but this should not unduly influence board recommendations which may help guide future architectural decisions. That said, while some of these recommendations may provide value, it may be important to evaluate if the societal benefits of implementing them are higher than that of other changes that could be made with a similar level of resourcing.
[5] It is worth noting that Facebook did appear to act in a more nuanced way than e.g. Twitter in this regard, by doing at least some unbundling—by continuing to allow the storage and access of old posts.
[6] Perhaps even with some monetary cost per post to cover the costs of review.
[7] For example, verifying competence in a literacy domain every 3 months to retain access.
[8] There are significant challenges around coverage and languages; we believe those are resolvable.

Written by Aviv Ovadya (The Thoughtful Technology Project), who you can find on Twitter @metaviv, via this mailing list, or on email.

Founder of the Thoughtful Technology Project & GMF non-res fellow. Prev Tow fellow & Chief Technologist @ Center for Social Media Responsibility. av@aviv.me