

There is no place on the internet quite as contentious as the Intellectual Dark Web (IDW). First defined by Bari Weiss in her seminal 2018 New York Times article, the IDW is the convergence of various media personalities and intellectuals engaging in conversations about topics often disregarded by the mainstream media. When examining the dissemination of hate speech and harmful discourse on the internet, and via the Intellectual Dark Web specifically, responsibility for content moderation should not rest with a single actor within the system. Rather, this responsibility should be shared between users, platforms, and regulatory institutions.
Key to understanding the complexities of the Intellectual Dark Web is the context in which it has risen to prominence. Weiss (2018) recognises that a key goal of the IDW is to protect “free speech”, yet fails to outline what constitutes free speech within the community. Ben Shapiro once defined the IDW as “a network of friends who are willing to talk about things we disagree about” (Shapiro, 2020). This definition offers some insight into the limits and impacts of speech within this context. Borrowing from the concepts set out by Riemer and Peter (2021), “free speech” on the IDW amounts to finding a space where content is available to the masses, regardless of its potential for harm. They also suggest that this content should not be subject to corporate or governmental censorship.
The Intellectual Dark Web capitalises on the ideas that founded the early internet in order to position its movement as foundational to the protection of free speech. In doing so, the IDW has shifted internet users’ understanding of where the line between free speech and harmful speech lies. Viewed against the historical landscape of the internet, the existence of the IDW is unsurprising. Early visions of the internet held freedom, transparency, and the open flow of communication as central to its operation (O’Hara & Hall, 2018). Essential to these foundations was an absence of structural governance, on the assumption that spontaneous order would emerge naturally in keeping with liberal ideals (Popiel, 2018).
What distinguishes the IDW from other online communities is that it has no single, tangible place of organisation. Rather, the IDW is conceptualised as a ‘movement’ whose existence is predicated on the principles that dictated the internet’s development (Weiss, 2018; Parks, 2020). The introduction of Web 2.0 and its user-friendly platforms reinforced the ideals of collaborative content generation and freedom of expression (Riemer & Peter, 2021). This strengthened the ability of online communities to form meaningful connections with the content they consume, in turn altering the media people seek out.

Examining how Jordan Peterson and Ben Shapiro articulate their desire for freedom of speech highlights how the key interests of the IDW are in tension with the current values of a ‘safe internet’ in the wake of platformisation. Despite the potential for diversity within the IDW, its key figures are strikingly similar, being predominantly white and male (Parks, 2020). The diversity of the IDW lies most prominently in its members’ political alignments (Weiss, 2018). Figures such as Ben Shapiro and Joe Rogan stand as conservative mouthpieces within the IDW, yet liberal centrists such as Jordan Peterson maintain a degree of dominance within the field (Weiss, 2018; Farrell, 2018).
Issues of diversity and coordination in online communities are important to recognise, as they have the potential to influence the content an individual creates and, in turn, to shape the views of its consumers. The rise of ‘social bubbles’ driven by algorithmic user-sorting on digital platforms (Riemer & Peter, 2021) can place consumers in a microcosm of ideology and information access. The impacts of such digital landscapes have become evident in how the broader internet conceptualises the IDW; specifically, it has become recognised as a “gateway” to harmful, far-right extremist ideology (Lamoureux, 2019). Shapiro’s content highlights this phenomenon: he rose to popularity through videos depicting him “destroying” the opponents he debates, speaking rapidly and often over his adversaries (Shapiro, 2021). By framing their content as intellectual and nuanced (K. N. C. & A. M., 2019), individuals such as Shapiro have enhanced the ability of algorithmic user-sorting on platforms to generate a social bubble that associates right-wing ideology with intelligence.
Tensions between varying institutional interests have made moderation of the Intellectual Dark Web an ambiguous space to navigate. As the IDW is renowned for its reluctance to moderate the content of its most notable figures, external regulation of that content comes from two key sources: the social media platforms which host them, and government regulation. Economic interests have been regarded as central to the perseverance of the Intellectual Dark Web, and protecting those interests is equated with the fight to protect modern free speech. Within this framework, the rise of the Intellectual Dark Web, and specifically its reliance on platforms such as YouTube, can be read as a response to the platformisation of the internet (Oliva, 2020).
Corporate regulation of content on digital platforms lies precariously between protecting the interests of users and protecting the platform’s own economic profitability (Martin, 2019). Maintaining the original values of an open, free internet (O’Hara & Hall, 2018), Section 230 of the United States’ Communications Decency Act has outlined the responsibilities of platforms in moderating content since 1996. Because it was developed prior to the advent of the current digital platform landscape, the application of Section 230 to contemporary platforms potentially changes the implications it holds. Specifically, with the rise of algorithmic user and content sorting (Riemer & Peter, 2021), the implication that a platform is not responsible for the content it hosts can mean that harmful or illegal content is, at times, not moderated effectively (Oliva, 2020). Due to the shortfalls of Section 230 in governing contemporary digital landscapes, social media companies have begun to recognise a degree of corporate responsibility in regulating the distribution of harmful content on their servers (Oliva, 2020). By adopting terms of service agreements, platforms have outlined what content is appropriate to distribute.

The ambiguities of responsibility regarding content moderation create a tension often explored by leaders of the IDW. Section 230 was designed by American lawmakers to minimise the discouragement of free speech on the internet (Gillespie, 2018). It has therefore become a reasonable deduction within IDW communities that the moderation of ‘harmful’ content threatens the ability to freely express a political opinion, eroding the rights of these users (Parks, 2020). This conflict of ideas has played out in varying degrees as the IDW has become more widespread. The boycott of platforms such as Patreon by IDW users, in response to increasingly restrictive terms of service, demonstrates just how influential these tensions are for the audiences consuming such content (Sommer, 2018). Whilst Section 230 imposes no responsibility on platforms to moderate problematic content on their servers, self-regulation by platforms remains vague and is often undertaken to protect economic interests (Popiel, 2018). Moderating problematic content has therefore become a task of finding ways in which measures taken by both corporate and governmental institutions can work effectively to maintain the original ideals of the internet, with little economic impact.
Examining the Intellectual Dark Web through the lens of the political economy of communication reveals the extent to which problematic content on the internet has become a more prevalent issue than ever. As competing conceptions of the internet collide with competing interests, a collaborative effort between users, platforms, and regulatory institutions may allow for meaningful discourse on how harmful content can best be moderated in future digital economies.
References
Ernest, S. K. [@Scottkernest]. (2020, February 10). Jordan Peterson.. and most of the “intellectual dark web” tend to be gateways [Tweet]. Twitter. https://twitter.com/scottkernest/status/1226573739573334023
Farrell, H. (2018, May 10). The “Intellectual Dark Web,” explained: what Jordan Peterson has in
common with the alt-right. Vox.
https://www.vox.com/the-big-idea/2018/5/10/17338290/intellectual-dark-web-
rogan-peterson-harris-times-weiss
Hasan, M. [@Mehdirhasan]. (2020, May 12). The president calls for the firing of a prominent journalist he doesn’t like [Tweet]. Twitter. https://twitter.com/mehdirhasan/status/1259950420421738499
K. N. C., & A. M. (2019, March 28). Inside the mind of Ben Shapiro, a radical conservative. The Economist. https://www.economist.com/open-future/2019/03/28/inside-the-mind-of-ben-shapiro-a-radical-conservative
Lamoureux, M. (2019, August 29). YouTube Commenters Shift From ‘Intellectual Dark Web’ Fans to the Far-Right, Study Shows. Vice News. https://www.vice.com/en/article/pa7pvb/what-79-million-youtube-comments-can-tell-us-about-far-right-radicalization
Martin, F. (2019). The Business of News Sharing. In F. Martin & T. Dwyer (Eds.), Sharing News Online: Commentary Cultures and Social Media News Ecologies (pp. 91-127). Springer International Publishing, Cham.
O’Hara, K., & Hall, W. (2018, December 7). Four Internets: The Geopolitics of Digital Governance (CIGI Paper No. 206). Centre for International Governance Innovation. https://www.cigionline.org/publications/four-internets-geopolitics-digital-governance/
Oliva, T. D. (2020). Content Moderation Technologies: Applying Human Rights Standards to Protect Freedom of Expression. Human Rights Law Review, 20(4), 607-640. https://doi.org/10.1093/hrlr/ngaa032
Parks, G. (2020). Considering the Purpose of “An Alternative Sense-Making Collective”: A Rhetorical Analysis of the Intellectual Dark Web. Southern Communication Journal, 85(3), 178-190. https://doi.org/10.1080/1041794X.2020.1765006
Popiel, P. (2018). The Tech Lobby: Tracing the Contours of New Media Elite Lobbying Power.
Communication, Culture and Critique, 11(4), 566-585. https://doi.org/10.1093/ccc/tcy027
Riemer, K., & Peter, S. (2021). Algorithmic audiencing: Why we need to rethink free speech on social media. Journal of Information Technology, 36(4), 409-426. https://doi.org/10.1177/0268396221101335
Rubin, D. [The Rubin Report]. (2018). What is The Intellectual Dark Web? | DIRECT MESSAGE | Rubin Report [Video]. YouTube. https://www.youtube.com/watch?v=n5HN-KT9rj0&ab_channel=TheRubinReport
Shapiro, B. [Ben Shapiro]. (2021, March 27). Ben Shapiro DESTROYS Megan Rapinoe and the gender pay gap [Video]. YouTube. https://www.youtube.com/watch?v=6RDZw_xSvDg&ab_channel=BenShapiro
Sommer, W. (2018, December 18). Stars of ‘Intellectual Dark Web’ Scramble to Save Their Cash Cows. The Daily Beast. https://www.thedailybeast.com/stars-of-intellectual-dark-web-scramble-to-save-their-cash-cows
Gillespie, T. (2018). Regulation of and by Platforms. In J. E. Burgess (Ed.), The SAGE Handbook of Social Media. SAGE Publications, London.
Twitter. (n.d.). The Twitter Rules. https://help.twitter.com/en/rules-and-policies/twitter-rules
Weiss, B. (2018, May 8). Meet the Renegades of the Intellectual Dark Web. The New York Times.
https://www.nytimes.com/2018/05/08/opinion/intellectual-dark-web.html