Reflection on Research Talk, Part 2/3 - The Infrastructure of Hate: How Online Networks Are Manipulated and Exploited by Right-Wing Extremists

By: Paola Molina 2/10/2020

The Alt-Right Uses YouTubers’ Marketing Strategies to Gain Fame; They Use YouTube’s Algorithms as Intended and Get Rewarded

Dr. Julia DeCook talks further about the vulnerability of these networks. She notes that it is very easy for these groups to manipulate online spaces. One example she gives is that they game the algorithms of companies like Google and YouTube so that their propaganda surfaces in people’s feeds. We see this in action with “let’s play” videos: she says it does not take long for viewers navigating this kind of content to start seeing alt-right videos, because one of the alt-right’s main goals is to recruit young men. Young men make up a large part of the gaming community, which is why these groups are most likely to manipulate the algorithm so that their hateful content shows up there. Moreover, DeCook outlines another reason they have been so successful at building audiences for their extreme ideology: they use the mobilization strategies typically employed by internet influencers to get clicks and views. Those strategies include using lots of hashtags, writing provocative headlines, posting constantly, and replying to comments, among others.

This fact stunned me, and I looked for more research on how these YouTubers’ techniques could possibly help White Supremacists. The answer was that they are one of the main ways, if not the main way, White Supremacists spread their message today. YouTube is one of the biggest social media sites in existence right now, so it is logical, given the example of the martinlutherking[dot]org website, that they would take such a popular tool and bend it to their will. Emma Grey Ellis of Wired explains, in her article “The Alt-Right Are Savvy Internet Users. Stop Letting Them Surprise You,” the steps these groups take to spread their propaganda on YouTube. She explains that their rise to fame is not that different from that of YouTube stars: “While the commentators—who range in ideology from mainstream libertarian to openly white nationalist—certainly owe a debt to Glenn Beck and Sean Hannity in their tone and style, their strategy is all Pewdiepie and Jeffree Star and Team Ten. Which is to say, they play to YouTube’s algorithms, just like anyone who is trying to become a star” (Ellis, 2018). She uses strong language to describe why they keep having a platform, and it is because YouTube ultimately rewards them. Since they are doing what other YouTubers are doing, “Far-right YouTube is ultimately just YouTube…Regardless of whether you’re spewing hate or showing off your Target hauls, YouTube’s reward systems work based on behavior, indiscriminately of content—something no amount of fact-checking will fix” (Ellis, 2018). Drawing on a Data & Society study, Alternative Influence: Broadcasting the Reactionary Right on YouTube by Rebecca Lewis, Ellis outlines how these alt-right figures use YouTube’s algorithm exactly as it is intended:


“They establish their authenticity like any vlogger: Look directly into a camera you set up in an intimate location like your bedroom, address your audience like friends, and weave big ideas together with anecdotes from your own life. Once your cred is established with your audience, you can use that clout to call your detractors out as “fake” and expect, not just to be believed, but also to gain countercultural appeal… It’s also how far-right YouTubers convince their viewers to disregard the mainstream media…YouTube’s Alternative Influence Network has also learned to sell their ideas like other YouTubers sell products: They post testimonials about how much good far-right politics has done for them, they use their own celebrity to give their ideas some glamor, hijack keywords to make sure they come up in your searches, and engineer small scandals to make themselves newsworthy” (Ellis, 2018).

This is what makes them grow, attract attention, and get more views. YouTube then takes notice of those numbers, not of the content, and keeps rewarding those alt-right YouTubers by giving them further chances to promote themselves and have their content seen more often. This makes it all the more frustrating, because these algorithms add fuel to the fire: they make it easier for these people to come out of the dark corners of the internet and spread their message widely without consequences. Other social media sites they use are Reddit and 4chan, where, as Daniels explains, “Algorithms speed up the spread of White supremacist ideology, as when memes like “Pepe the Frog” travel from 4chan or Reddit to mainstream news sites. And algorithms, aided by cable news networks, amplify and systematically move White supremacist talking points into the mainstream of political discourse” (Daniels, 2018). Just as on YouTube, by latching onto something popular and manipulating it (memes, in the case Daniels describes), these groups successfully get people talking. The next thing we know, mainstream news outlets are covering the Pepe symbol, which keeps the public informed but also further broadcasts the hateful ideology behind it.

I agree with Daniels that “We have to recognize that the algorithms of search engines and social media platforms facilitated these hate crimes,” because these algorithms serve up on a silver platter exactly what these alt-right extremists want: they keep reaffirming racist and bigoted beliefs by supplying the information these people are so desperately trying to confirm. I also agree with Ellis that social media sites are, in a way, complicit in how White Supremacists keep holding a spot on the internet: “Without YouTube’s help, it’s going to be difficult to stop people with bad ideas from using a platform exactly the way they’re supposed to… So it’s time to stop being surprised when the far-right is good at the internet. It’s time to expect to see them trending, and hold platforms accountable when they do” (Ellis, 2018). It was fascinating to read about how these groups use the same influencer techniques to get to fame. It is disheartening to see this happen, but it is a reality, and we have to find solutions. As Dr. DeCook mentioned before, it is really difficult to ban them because they will eventually find a way around it, but having read about the various ways they exploit tools like social media sites, I think one part of the solution is for these sites to understand how they embolden White Supremacists and do something about it. They have to look at the content they are rewarding and recognize that this presence on their platforms is part of what fuels the horrific hate crimes we are seeing more and more often. They need to realize that clicks and views are not valuable when what generates them is the negative and hateful rhetoric behind so much of the hateful messaging we have seen lately. So we should hold these companies accountable for smoothing the path for White Supremacists to gain fame, build cult-like followings, and promote messages that hurt people.

The Numbers and Cases That Show Right-Wing Extremism Is a Rising Threat

To understand how the alt-right is affecting the public, we have to look at some facts. According to the ADL (Anti-Defamation League), “In terms of lethal violence, 2018 was dominated by right-wing extremism. Every one of the 50 murders documented by the COE [ADL’s Center on Extremism] was committed by a person or persons with ties to right-wing extremism, although in one incident the perpetrator had switched from white supremacist to radical Islamist beliefs prior to committing the murder” (ADL). The ADL’s study, “Murder and Extremism in the United States in 2018”, presents statistics linking White Supremacy to a rising threat of deadly violence. Let us not forget the many recent senseless shootings committed by these individuals, such as the 2018 attack in which a shooter killed 11 people at a Pittsburgh synagogue out of anti-Jewish hate. In 2017, Heather Heyer was murdered by a White Supremacist in Charlottesville when he drove into a crowd of protestors who had gathered against the hate groups marching in that city. These are just a few examples, but they are cause for concern: alt-right speech is clearly no longer just words or “free speech” – it has turned deadly for innocent people.

Another example is how we let racist and bigoted alt-right media outlets like InfoWars spread their extremist ideology. Until recently, you could not touch such organizations because that would mean limiting their free speech. But consider how these media organizations have caused continuous harm to other people with their extreme propaganda. Alex Jones of InfoWars has spread conspiracy theories claiming the Sandy Hook shooting never happened and that the parents were paid actors. He is the most famous example, but other organizations and followers regurgitate this conspiracy theory as well. This has been a horrific and painful experience for the parents of the Sandy Hook victims: not only do they have to keep telling these individuals that what they are saying is false, but they also have to constantly relive the pain they went through and fight such conspiracy theories over and over again. Svrluga describes the horrific ordeal these individuals put the families through: “Even before the funerals were over, grieving families became targets, with people accusing them of being actors paid to play a role. Over time, many assumed they would eventually be left alone, but theories flourished in anonymous online forums and on social media, and Pozner [the parent of one of the Sandy Hook shooting victims] received death threats” (Svrluga). One example of this tiresome battle is the case of Lenny Pozner, father of six-year-old Noah Pozner, who was killed in the shooting. Mr. Pozner sued James H. Fetzer and Mike Palecek because they claimed he had faked his son’s death certificate. The jury awarded Mr. Pozner damages in the defamation case he brought for the harm being caused (Svrluga). Unfortunately, he is not the only parent to face constant harassment, but this case shows that what these alt-right participants do has negative consequences that hurt people – at that point it is not “freedom of speech” but a danger to public safety.

In the last part of this blog, we will see how the political climate plays a role and what the possible solutions could be. Part 3 will publish on Wednesday, February 12, 2020.

Bibliography

Daniels, Jessie. “The Algorithmic Rise of the ‘Alt-Right.’” Contexts, vol. 17, no. 1, Feb. 2018, pp. 60–65, doi:10.1177/1536504218766547.

DeCook, Julia. “The Infrastructure of Hate: How Online Networks Are Manipulated and Exploited by the Right-Wing Extremist.” Research Talk Conference, 16 Oct. 2019, School of Communication – Loyola University Chicago, Chicago, IL.

“Documenting Hate: Charlottesville.” Frontline and ProPublica, https://www.pbs.org/wgbh/frontline/film/documenting-hate-charlottesville/. Accessed 16 Oct. 2019.

“Murder and Extremism in the United States in 2018.” Anti-Defamation League, https://www.adl.org/murder-and-extremism-2018. Accessed 24 Oct. 2019.

Svrluga, Susan. “Jury Awards $450,000 to Father of Sandy Hook Victim in Defamation Case.” Washington Post, https://www.washingtonpost.com/education/2019/10/16/jury-awards-father-sandy-hook-victim-defamation-case/. Accessed 22 Oct. 2019.

Ellis, Emma Grey. “The Alt-Right Are Savvy Internet Users. Stop Letting Them Surprise You.” Wired, https://www.wired.com/story/alt-right-youtube-savvy-data-and-society/. Accessed 25 Oct. 2019.

Thomson, Keith. “White Supremacist Site MartinLutherKing.Org Marks 12th Anniversary.” HuffPost, 16 Jan. 2011, https://www.huffpost.com/entry/white-supremacist-site-ma_b_809755. Accessed 28 Oct. 2019.