Reaction Paper #1

One factor that makes a huge difference in the culture and toxicity of online communities is the level of anonymity they provide. After all, if no one can trace what you say back to you, you can say whatever you want without consequence – at least in theory. As Anil Dash points out in the article “If your website’s full of assholes, it’s your fault”, there is a difference between the anonymity provided by publishing under a pseudonym within a community and true anonymity.¹ A prime example of this is Tumblr: users have the ability to send each other “asks”, and they can choose whether or not to let the receiver see the name of the account that sent one. Anonymous asks have developed a negative reputation among the community, because people are more likely to send cruel asks to others when the messages can’t be traced back to them. 4chan is another community that is infamous both for the level of anonymity it provides its users and for generally being a very negative space.

Another factor in the level of tolerance an online community appears to have for hurtful behavior is the systems it has for dealing with that behavior. On Discord, for example, every server is run solely by the people who created it, plus a select few others to whom the creators give moderation powers. The server can easily become an unpleasant place to be under any of the following circumstances: the moderators are rarely active; there are very few of them compared to the overall number of people in the server; or they are lenient on people who break the rules, or even actively support them and their toxic behavior. Even on sites like YouTube and Facebook, which have corporations behind them and huge teams employed to monitor for content that breaks their Terms of Service, there is so much content posted that keeping up with it is a monumental, if not impossible, task.

On top of that, there is a mental and emotional cost for the people who take on such roles. Back in February, Casey Newton wrote an article for The Verge about the experiences of some of the people who review reported content on Facebook. Their job is a balance of “striving to purge Facebook of its worst content while protecting the maximum amount of legitimate (if uncomfortable) speech,”² and they are evaluated based on their efficiency and accuracy. The article goes over many sickening aspects of the situation, such as how employees are allotted very little break time because of the sheer number of posts there are to review. Exposure to horrific posts, like videos of people being killed, causes many of the workers to develop PTSD or other mental illnesses, while consistent exposure to offensive humor and conspiracy theories bleeds into their personal lives and, seemingly antithetical to the purpose of the space, creates a working atmosphere where these things are tolerated. Most of the former employees interviewed for the article quit the job within a year.³

The main impression I took away from this article is how impossible it is to regulate online communities on a large scale. This is why I do my best to regulate which segments of online communities I participate in. For example, I used to be in a few different Discord servers, but some of them occasionally posted political content from sources that I deeply distrusted, and some of the most prominent members in those servers would repeatedly say casually bigoted things. When I realized that I couldn’t trust the mods of those servers to keep things on topic and crack down on the bigotry, I quietly left them, and have since been much more active in the couple I feel comfortable in.

The readings I’ve done for this class, particularly the article by Jennifer Peepas for Coral, have only further reinforced that I made the right decision. She tailored her blog, Captain Awkward, to be a community where visitors are assured the opportunity to “have a reasonably civil and constructive discussion” centered “around the idea that there is power in speaking up… It was always about the words”.⁴ To me, this is the biggest part of being a digital citizen: regulating your own online experience to get the most out of it. Use the tools at your disposal, like blocking people and filtering tags, so you can have a good time in the communities you’re interested in.

Works Cited

Dash, Anil. “If Your Website’s Full of Assholes, It’s Your Fault.” Anil Dash, Anil Dash, 20 July 2011, http://anildash.com/2011/07/20/if_your_websites_full_of_assholes_its_your_fault-2/

Newton, Casey. “The Secret Lives of Facebook Moderators in America.” The Verge, Vox Media, 25 Feb. 2019, https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona

Peepas, Jennifer. “Why Captain Awkward Wanted A Different Kind Of Comments.” The Coral Project Guides, Vox Media, 26 July 2017, https://guides.coralproject.net/captain-awkward/

  1. Dash 2011
  2. Newton 2019
  3. Newton 2019
  4. Peepas 2017

One thought on “Reaction Paper #1”

  1. >how impossible it is to regulate online communities on a large scale.

    We completely agree. Technology can only help up to a point.

    Thanks for using our piece in your paper!

    Andrew Losowsky
    Head of The Coral Project
