The council is presented as an independent body with international scope; Facebook will be required to comply with its decisions, which are to be rendered transparently.
Content moderation on social networks is regularly debated, and Facebook's is of course not immune to criticism. It must be said that, with 2 billion users encouraged to post regularly, moderation represents an enormous challenge. To address it, the company has spent several years establishing an Oversight Board. Users can now appeal to it if they believe the social network has unfairly moderated one of their posts.
Watchwords: independence, competence, accessibility, and transparency
From now on, if Facebook or Instagram censors a post, the affected user will be able to challenge the decision before this supervisory body. The council has 40 members, among them Nobel Peace Prize laureate Tawakkol Karman, former Danish Prime Minister Helle Thorning-Schmidt, university professors, lawyers, and others. Each member is appointed for three years. Financially, to ensure the entity's sustainability and independence, Facebook has allocated $130 million to an irrevocable trust fund.
On the page dedicated to it, the council's mission is stated as being "to promote freedom of expression by making reasoned and independent decisions regarding the content published on Facebook and Instagram, as well as by issuing recommendations on the relevant Facebook content rules." Four key principles are put forward: independence, competence, accessibility, and transparency.
Its executive director, Thomas Hughes, told reporters yesterday: "Our goal is to build an institution that does not just respond to one movement or follow a specific news cycle, but that protects human rights and freedom of expression over the long term."
Cases reviewed individually
The council operates as follows: members take turns sitting on a committee that selects cases by majority vote. A panel of five members, at least one of whom must come from the country where the disputed content was published, then has the difficult task of determining whether that content respects "the rules and values of Facebook."
Furthermore, before making a decision, the panelists will invite Facebook users to weigh in and discuss the case in question via a dedicated space; those who wish can be notified of each new case under review. Helle Thorning-Schmidt also specifies that every verdict will be published and archived on the council's website, in the interest of transparency.
Of course, board members are aware that they will not be able to handle every request. Jamal Greene, Columbia law professor and co-chair of the board, explains: "We won't be able to hear all appeals, simply because the volume submitted will be too high, but we want our decisions to be influential and have an impact beyond the single case."
Review of third-party posts coming within a few months
Currently, users cannot flag third-party content that they would like the council to review. This feature will nevertheless become available "in the coming months."
As it stands, then, Facebook only allows the author of a removed post to ask the board to reassess that decision. In some cases, the procedure may qualify for an expedited review; however, this option is suspended until the US election on November 3.
For a user to appeal to the board, their content must first have been moderated by Facebook or Instagram; they then have 15 days to file an appeal. If the appeal is accepted, the board has 90 days to render its verdict.
For its part, Facebook has promised not to interfere with the decisions taken by the supervisory body; moreover, the board will be free to accept or reject the cases Facebook itself submits, through the voting procedure mentioned above.
Facebook CEO Mark Zuckerberg said in an interview with The Information that he was "optimistic" about the role of the board and felt "that it is very important to create more independent governance"; he added that, "assuming the model works as intended, I hope to either expand its role or add other forms of formal governance to other aspects of our content policies."