Episode #27 – Embracing Our Freedom of Expression to Talk Censorship

In Episode #27, Tom and Stu discuss the concepts of freedom of expression and censorship. Now more than ever, we must be cautious of overreach by governments and technology companies as they look to combat the rise in misinformation amid worldwide protests, civil unrest and the COVID-19 pandemic. Our freedom of speech and expression is a basic human right; at the same time, it is clear that technology companies aren’t doing enough to protect individuals and society at large. As more and more pressure is placed on these institutions to take responsibility for moderating the content on their platforms, we find ourselves walking a tightrope: a democratic society must allow for open debate while also protecting vulnerable communities and promoting equality for all.

 

Key Message: To protect our basic human right to freedom of expression, we must ensure TRANSPARENCY, CONSISTENCY, CLARITY & PRECISION are embraced in the censorship policies of both government & technology institutions.

 

Some of the questions and topics we cover throughout the episode:

  • What is freedom of expression?
  • What is censorship?
  • What isn’t meant by freedom of expression?
  • Free speech vs hate speech
  • Why freedom of expression is fundamental to democracy
  • The United Nations framework for protecting human rights
  • Facebook Moderators 
  • Difference between Government & Technology company obligations
 

“Freedom of speech is a principle that supports the freedom of an individual or a community to articulate their opinions and ideas without fear of retaliation, censorship, or legal sanction. The term ‘freedom of expression’ is sometimes used synonymously but includes any act of seeking, receiving, and imparting information or ideas, regardless of the medium used.” – Wikipedia

 

The First Amendment forbids government censorship of expression; it does not prevent a private institution from deciding what to present on its website or app (i.e., this is why YouTube or Netflix is able to decide what to remove from its site – although, as we’ve seen, public opinion can often sway these decisions, which is why further work needs to be done to ensure transparency, consistency, clarity & precision).

 


Source: https://xkcd.com/1357/

“Freedom of speech and expression, therefore, may not be recognized as being absolute, and common limitations or boundaries to freedom of speech relate to libel, slander, obscenity, pornography, sedition, incitement, fighting words, classified information, copyright violation, trade secrets, food labeling, non-disclosure agreements, the right to privacy, dignity, the right to be forgotten, public security, and perjury. Justifications for such include the harm principle, proposed by John Stuart Mill in On Liberty, which suggests that: ‘the only purpose for which power can be rightfully exercised over any member of a civilized community, against his will, is to prevent harm to others.’” – Wikipedia

 

“The University is not engaged in making ideas safe for students. It is engaged in making students safe for ideas. Thus it permits the freest expression of views before students, trusting to their good sense in passing judgment on these views.” – University of California President Clark Kerr (1961) 

 

“We can never be sure that the opinion we are endeavoring to stifle is a false opinion; and if we were sure, stifling it would be an evil still.” – John Stuart Mill, On Liberty, 1859

 

Hate Speech: speech that causes genuine harms, such as inciting violence or discrimination against the vulnerable, or silencing the marginalized.

 

  • “Under international human rights law, the limitation of hate speech seems to demand a reconciliation of two sets of values: democratic society’s requirements to allow open debate and individual autonomy and development with the also compelling obligation to prevent attacks on vulnerable communities and ensure the equal and non-discriminatory participation of all individuals in public life.”
  • Rules should be subject to public comment and regular legislative or administrative processes.
  • The absence of restriction does not mean the absence of action.
  • Governments have been increasing the pressure on companies to serve as the adjudicators of hate speech. The process of adoption should also be subject to rigorous rule of law standards, with adequate opportunity for public input and hearings and evaluation of alternatives and of the impact on human rights.
  • Problematically, an upload filter requirement “would enable the blocking of content without any form of due process even before it is published, reversing the well-established presumption that States, not individuals, bear the burden of justifying restrictions on freedom of expression.”
  • Companies do not have the obligations of Governments, but their impact is of a sort that requires them to assess the same kind of questions about protecting their users’ right to freedom of expression.
  • They should draw on internal and independent human rights expertise, including “meaningful consultation with potentially affected groups and other relevant stakeholders” (principle 18). They should regularly evaluate the effectiveness of their approaches to human rights harms (principle 20).
  • The largest companies should bear the burden of these resources and share their knowledge and tools widely, as open source, to ensure that smaller companies, and smaller markets, have access to such technology.

Source: Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression

 

 

Questions to consider:

  • If daily exposure to fringe views and conspiracy theories eventually changes the way the consumer of that media feels about those topics, should we allow free expression of all ideas, or does the harm principle become more important?
  • While it is important to recognize the need for freedom of expression, as there is no such thing as objectionable truth, there is also a fine line to walk between what is good for society and what is merely a hindrance.
  • But then who gets to decide what is good for society?
  • With the use of social media as it stands, are we censoring ourselves almost as much as the algorithm is censoring us, based on its own internal biases?
 

Facebook Content Moderators

  • While social media companies employ moderators, it’s hard to stay objective when you’re only seeing flagged content, day in and day out. It’s also a job that has largely been outsourced, either overseas or to people within American society who are considered low-skill, despite the fact that they’re making very nuanced decisions that affect the mindset of society at large. These technology companies haven’t figured out how to train the moderators and provide enough societal context for all the content to make sense. Should it even be their job to do so? “many, many of these decisions involve making subjective judgments. There is no policy that can account for every imaginable variation.”
  • “My hope, if I want to be an optimist about all of this – it’s that by devolving some of the power that these tech companies have back to the people, that these systems will be more accountable to their everyday users, that they’ll sort of seem legitimate and that we can bring some semblance of democracy to what, you know, will always remain private companies.”
  • “And so at a certain point, Facebook has to ask itself, why do these rules exist, and why is it that the rules right now seem to be doing a better job of protecting Alex Jones’ right to speak his mind than they do the victim of a shooting?”

– Casey Newton, For Facebook Content Moderators, Traumatizing Material Is A Job Hazard
 

Fun Facts:

  • “In fact, according to CrowdTangle, a data-analytics firm owned by Facebook, content from conservative news organizations dominates Facebook and often outperforms content from straightforward news organizations. Additionally, over the last month on Facebook, Trump has captured 91% of the total interactions on content posted by the US presidential candidates, according to CrowdTangle. Biden has captured only 9%.” – “Trump says right-wing voices are being censored. The data says something else”
 

Further Reading & Watching:

For Facebook Content Moderators, Traumatizing Material Is A Job Hazard

Manufacturing Consent: Noam Chomsky and the Media – Feature, Documentary  

Challenges to Freedom of Expression in the Next Decade

One-pager on “incitement to hatred” 

Policy Recommendations: Internet Freedom 

Freedom of Expression vs. Hate Speech, Fake and Misleading News 

15.4 Censorship and Freedom of Speech – Understanding Media and Culture 

Freedom of expression: Films, fiction and censorship

 

 

Special Note:

The Special Rapporteurs and Working Groups are part of what is known as the Special Procedures of the Human Rights Council. Special Procedures, the largest body of independent experts in the UN Human Rights system, is the general name of the Council’s independent fact-finding and monitoring mechanisms that address either specific country situations or thematic issues in all parts of the world. Special Procedures experts work on a voluntary basis; they are not UN staff and do not receive a salary for their work. They are independent from any government or organisation and serve in their individual capacity.

 

 

Like us on Facebook and Instagram @tomstuandyou or send us an email at hi@tomstuandyou.com with any questions, comments, thoughts, or queries you have on the topics we’ve discussed or any suggestions of topics you’d like to hear us address. 

 

Thanks so much for your support! 
