## Colophon

title:: Out of the Rabbit Hole: From Radicalization to Amplification of Far-Right Content Online
type:: [[literature-note]]
tags:: [[youtube]] [[radicalisation]] [[algorithm]]
url:: https://cyber.fsi.stanford.edu/events/out-rabbit-hole-radicalization-amplification-far-right-content-online
file::
status:: [[brewing]]

## Notes

- These are some rough notes I captured while listening in to a webinar featuring Becca Lewis.
- I don't believe Chatham House rules were invoked, so I think it is OK to publish these.
- Don't quote anything here, as it is a combination of what was said and my own interpretation.

From the webinar's registration page:

> Conventional wisdom suggests that conspiracy theories and far-right propaganda thrive mainly at the end of algorithmic rabbit holes, in the deep, dark corners of the internet. This presentation will show that the opposite is true by explaining how in fact, harmful ideas gain traction through the charisma and popularity of internet celebrities in mainstream social media contexts. Through her extensive research on far-right YouTubers, Becca Lewis argues that instead of merely focusing our responses on the threat of algorithmic rabbit holes, we must also understand the power of amplification through thriving alternative media systems on- and offline.

Is the YouTube algorithm radicalising people to the far right?

- Journalistic consensus: yes. Cited Tufekci, Roose.
- Academics: split. O'Callaghan 2015, Ribeiro 2019. Recent scholarship rejects this (missed the citations :()

#### Look Beyond the Algorithm

- Lewis has argued that we need to look beyond the algorithm.
- If a user became a fan of a YouTuber, they would subscribe/follow.
- Creators appeared across each other's channels to be introduced to new audiences.

#### Importance of Framing and Metaphors

- The framing of the YouTube algorithm as radicalising is incorrect, and causes us to miss many aspects (highlighted below in the six assumptions).
- Use of the rabbit hole metaphor:
	- Metaphors shape how we think (Lakoff and Johnson 1980).
	- Alice in Wonderland origins. The metaphor is flexible: it also refers to LSD trips and The Matrix.
	- Internet metaphors: information superhighways. New metaphors focused on depth in the mid 2000s: deep web, dark web.
	- This conflation led to belief in the existence of a subsurface internet where morally wrong activities took place. But good actors can operate on the dark web (e.g. activists in authoritarian regimes) and bad actors on the surface web (various supremacists on social media platforms).

#### Challenging rabbit hole metaphors

###### Assumption 1: White supremacist and male supremacist discourse is fringe and extreme

- Berger's definition of extremism does not reference the mainstream, but in-group/out-group dynamics. (missed quote :()
- Describes great replacement theory and four quotes: the NZ Christchurch shooter, Jason Kessler (Unite the Right), Charlie Kirk (Turning Point USA), Tucker Carlson.
- These four actors are very different, yet researchers tend to draw lines between them to classify them that don't exist in reality.

###### Assumption 2: Alternative media is distinctive and segmented from mainstream media

- The term 'mainstream media' has lost meaning. It is used along four dimensions, none of them standard (Lewis addressed shortcomings with each of these):
	- Political orientation?
	- Institutional affiliation?
	- Size of audience?
	- Profit?
- A more productive way is through dynamics (I missed some bits here):
	- Individual creators
	- Media orgs
	- Audiences
	- YouTube/platforms

###### Assumption 3: Online political media operates on a left-center-right spectrum

- There are challenges over what counts as far right/radical right/extreme right.
- Criticises the Ad Fontes media chart:
	- False equivalence between left and right
	- Values the 'center' as newsworthy
- Notes that creators don't follow left/right neatly (look up nozzle - right and left mix).
- Lewis has moved away from 'far right' etc. to 'white/male supremacist'.
- Researchers are looking at alternative frames such as alt-light, anti-woke, etc.
###### Assumption 4: Viewers move over time and space but ideas and channels stay fixed

- The alt-right effectively doesn't exist now. Creators distanced themselves from the term after the "PR disaster" that was Charlottesville, but they moved to QAnon etc.; anti-SJW moved to anti-CRT.
- Some individual channels gain, some lose. New names emerge on Twitch/TikTok/IG.
- Individual politics also change.
- This poses methodological issues, as researchers use older lists as proxies.
- Ideas don't stay fixed either. YouTubers pick up and amplify existing ideas.

###### Assumption 5: Algorithms act upon humans and not in response to us

- There is a back-and-forth process.
- The hypodermic needle model had been discarded but has made a comeback.
- YouTube creators are aware of how certain aspects of the algorithm work and try to shape their videos accordingly.

###### Way forward

- Shift many of the categories we use as conceptual tools into objects of research.
- *Refers to an upcoming Alice E. Marwick paper that questions what it means to be radicalised.*