Rationalist community

The rationalist community is a 21st-century movement that formed around a group of internet blogs, including LessWrong and Astral Codex Ten (formerly known as Slate Star Codex). The movement gained prominence in the San Francisco Bay Area. Its adherents claim to use rationality to avoid cognitive biases. Common interests include transhumanism, statistics, effective altruism, and mitigating existential risk from artificial general intelligence.

Beliefs

Rationalists are concerned with applying Bayesian inference to understand the world as it really is, avoiding cognitive biases, emotionality, or political correctness.[1][2][3][4] Writing for The New Atlantis, Tara Burton describes rationalist culture as having a "technocratic focus on ameliorating the human condition through hyper-utilitarian goals",[5] with the "distinctly liberal optimism... that defines so much of Silicon Valley ideology — that intelligent people, using the right epistemic tools, can think better, and save the world by doing so".[6]

Early rationalist blogs LessWrong and Slate Star Codex attracted a STEM-focused audience interested in self-improvement and suspicious both of the humanities as a discipline and of the ways human emotion clouds rational judgment.[7] The movement became intertwined with the founder culture of Silicon Valley and its faith in the power of intelligent capitalists and technocrats to create widespread prosperity.[8][9]

Bloomberg Businessweek journalist Ellen Huet writes that the rationalist movement "valorizes extremes: seeking rational truth above all else, donating the most money and doing the utmost good for the most important reason. This way of thinking can lend an attractive clarity, but it can also provide cover for destructive or despicable behavior".[10] Huet also notes that the borders of the community are blurry,[11] and that members who have drifted from its orthodoxies often self-describe as "post-rationalist" or "EA-adjacent".[12]

One of the main interests of the rationalist community is combating existential risk posed by the emergence of an artificial superintelligence.[13][14] Many members of the rationalist community believe only a small number of people, including themselves, have the knowledge and skill required to prevent human extinction.[15][16][17]

History

The rationalist community emerged in the 2000s on various blogs on the Internet, including Overcoming Bias, LessWrong, and Slate Star Codex.[18][19][20]

Eliezer Yudkowsky, who created LessWrong and is regarded as a major figure within the movement, serially published the Harry Potter fanfiction Harry Potter and the Methods of Rationality from 2010 to 2015, drawing readers to LessWrong and the rationalist community.[21][22] The fanfiction was highly popular and remains well regarded within the rationalist community;[23][24] in a 2013 LessWrong survey, a quarter of respondents reported having found the site through it.[25]

In the 2010s, the rationalist community emerged as a major force in Silicon Valley, with many rationalists working for large technology companies.[26][27] Billionaires Elon Musk and Peter Thiel, as well as Ethereum creator Vitalik Buterin, have donated to rationalist-associated institutions.[28][29]

Despite the online origins of the movement, the community is active and close-knit offline, especially in the San Francisco Bay Area, where many rationalists live in intentional communities and engage in polyamorous relationships with other rationalists.[30][31][32] Bay Area organizations associated with the rationalist community include the Center for Applied Rationality, which teaches the techniques of rationality espoused by rationalists, and the Machine Intelligence Research Institute, which conducts research on AI safety.[33][34][35]

Criticism

According to Ellen Huet, writing in Bloomberg Businessweek in 2023, "Several current and former members of the community say its dynamics can be 'cult-like'".[36] Journalist Allegra Rosenberg describes adherents who became "disillusioned with that whole scene, because it's a little culty, it's a little dogmatic."[37] Émile Torres describes TESCREALism, which includes rationalists, as "operat[ing] like a cult."[38]

Huet also reports the accounts of eight women alleging sexual misconduct, which they describe as pervasive in the rationalist community.[39]

Writing in The New Yorker, Gideon Lewis-Kraus argues that rationalists "have given safe harbor to some genuinely egregious ideas," such as scientific racism and neoreactionary views, and that "the rationalists' general willingness to pursue orderly exchanges on objectionable topics, often with monstrous people, remains not only a point of pride but a constitutive part of the subculture's self-understanding."[40]

Offshoots and overlapping movements

Effective altruism and transhumanism

The rationalist community has a large overlap with effective altruism[41][42] and transhumanism.[43] Critics such as computer scientist Timnit Gebru and philosopher Émile P. Torres additionally link rationalists with other philosophies they collectively name TESCREAL: transhumanism, extropianism, singularitarianism, cosmism, rationalism, effective altruism, and longtermism.[44]

Postrationalists

The postrationalists are a loose group of one-time rationalists who became disillusioned with the rationalist community, which they came to perceive as culty[45] and insufficiently humanistic.[5] The term is also used as a hedge by people in the community who have drifted from its orthodoxy.[12] The community also goes by the acronym TPOT, which stands for "this part of Twitter".[46][47]

Zizians

The Zizians are a splinter group from the rationalist community with an ideological emphasis on veganism and anarchism; they became widely known in 2025 after being suspected of involvement in four killings.[48] The Zizians formed around the Bay Area rationalist community but became disillusioned with its organizations and leaders, whom they accused of anti-transgender discrimination, of misusing donor funds to pay off a sexual misconduct accuser, and of failing to value animal welfare in plans for human-friendly AI.[49]

References

  1. ^ Huet 2023, a community of people who call themselves rationalists and aim to keep their thinking unbiased, even when the conclusions are scary.
  2. ^ Burton 2023, Central to the rationalist worldview was the idea that nothing — not social niceties, not fear of political incorrectness, certainly not unwarranted emotion — could, or should, get between human beings and their ability to apprehend the world as it really is.
  3. ^ Frank, Sam (January 2015). "Come With Us If You Want to Live". Harper's Magazine. Archived from the original on 2025-02-02. Retrieved 2025-04-02. "Bayesian" has a special status in the rationalist community because it's the least imperfect way to think
  4. ^ Metz 2021, The Rationalists saw themselves as people who applied scientific thought to almost any topic. This often involved "Bayesian reasoning," a way of using statistics and probability to inform beliefs.
  5. ^ a b Burton 2023, To them, rationality culture's technocratic focus on ameliorating the human condition through hyper-utilitarian goals ... had come at the expense of taking seriously the less quantifiable elements of a well-lived human life.
  6. ^ Burton 2023, You might call it the postrationalist turn... The chipper, distinctly liberal optimism of rationalist culture that defines so much of Silicon Valley ideology — that intelligent people, using the right epistemic tools, can think better, and save the world by doing so — is giving way, not to pessimism, exactly, but to a kind of techno-apocalypticism.
  7. ^ Burton 2023, Both LessWrong and the similarly-focused Slate Star Codex... attracted not just passive readers but enthusiastic commenters, who were drawn to the promise of individual self-improvement as well as the potential to discuss philosophy, science, and technology with people as uncompromisingly devoted to the truth as they believed they were. These commenters — a mixture of the traditionally educated and autodidacts, generally STEM-focused and with a higher-than-average share of people who identified as being on the autism spectrum — tended to be suspicious not just of humanities as a discipline, but of all the ways in which human emotional response clouded practical judgment.
  8. ^ Burton 2023, Rationalist culture — and its cultural shibboleths and obsessions — became inextricably intertwined with the founder culture of Silicon Valley as a whole, with its faith in intelligent creators who could figure out the tech, mental and physical alike, that could get us out of the mess of being human.
  9. ^ Frank 2015, Thiel and Vassar and Yudkowsky, for all their far-out rhetoric, take it on faith that corporate capitalism, unchecked just a little longer, will bring about this era of widespread abundance.
  10. ^ Huet 2023, The underlying ideology valorizes extremes: seeking rational truth above all else, donating the most money and doing the utmost good for the most important reason. This way of thinking can lend an attractive clarity, but it can also provide cover for destructive or despicable behavior.
  11. ^ Huet 2023, The borders of any community this pedantic can be difficult to define. Some rationalists don't consider themselves effective altruists, and vice versa.
  12. ^ a b Huet 2023, Many people who've drifted slightly from a particular orthodoxy hedge their precise beliefs with terms such as "post-rationalist" or "EA-adjacent".
  13. ^ Huet 2023, Since the early 2000s, Yudkowsky has argued that hostile artificial intelligence could destroy humanity within decades. This driving belief has made him an intellectual godfather in a community of people who call themselves rationalists.
  14. ^ Burton 2023, They focused on big-picture, global-level issues, most notably and controversially Yudkowsky's pet concern: the "x-risk" ("x" for existential) that we will inadvertently create unfriendly artificial intelligence that will wipe out human life altogether.
  15. ^ Huet 2023, Within the group, there was an unspoken sense of being the chosen people smart enough to see the truth and save the world, of being "cosmically significant".
  16. ^ Frank 2015, I asked him about the rationalist community. Were they really going to save the world? From what? "Imagine there is a set of skills," he said. "There is a myth that they are possessed by the whole population, and there is a cynical myth that they're possessed by 10 percent of the population. They've actually been wiped out in all but about one person in three thousand."
  17. ^ Burton 2023, For many, rationality culture had at least initially offered a thrilling sense of purpose: a chance to be part of a group of brilliant, committed young heroes capable of working together to save all humanity.
  18. ^ "Rationalist Movement – LessWrong". www.lesswrong.com. Archived from the original on 2023-06-17. Retrieved 2023-06-19.
  19. ^ Metz, Cade (2021-02-13). "Silicon Valley's Safe Space". The New York Times. ISSN 0362-4331. Archived from the original on 2021-04-20. Retrieved 2023-06-19.
  20. ^ The Rationalist's Guide to the Galaxy: Superintelligent AI and the Geeks Who Are Trying to Save Humanity's Future. Orion. 13 June 2019. ISBN 9781474608800. Archived from the original on 18 May 2023. Retrieved 23 June 2023.
  21. ^ Whelan, David (March 2, 2015). "The Harry Potter Fan Fiction Author Who Wants to Make Everyone a Little More Rational". Vice. Retrieved 11 April 2025.
  22. ^ Burton 2023, In his Harry Potter and the Methods of Rationality — perhaps old-school rationalists' most effective recruiting text — Eliezer Yudkowsky is clear that part of the appeal of rationality is the promise of self-overcoming, of becoming more than merely human.
  23. ^ Frank 2015, The next year, Yudkowsky began publishing Harry Potter and the Methods of Rationality at fanfiction.net. The Harry Potter category is the site's most popular, with almost 700,000 stories; of these, HPMoR is the most reviewed and the second-most favorited.
  24. ^ Koebler, Jason (20 November 2023). "New OpenAI CEO Was a Character in a Harry Potter Fanfic That's Wildly Popular With Effective Altruists". 404 Media. Retrieved 11 April 2025.
  25. ^ Frank 2015, Of the 1,636 people who responded to a 2013 survey of Less Wrong's readers, one quarter had found the site thanks to HPMoR, and many more had read the book.
  26. ^ Tiku, Nitasha (2022-11-17). "The do-gooder movement that shielded Sam Bankman-Fried from scrutiny". The Washington Post. Retrieved 2022-11-25.
  27. ^ Sargeant, Alexi (3 January 2018). "Simulating Religion". Plough. Retrieved 22 February 2024.
  28. ^ Huet 2023, The movement's leaders have received support from some of the richest and most powerful people in tech, including Elon Musk, Peter Thiel and Ethereum creator Vitalik Buterin.
  29. ^ Burton 2023, Investor Peter Thiel gave over $1 million to Yudkowsky's Machine Intelligence Research Institute. Elon Musk met his now-ex Grimes when the two bonded on Twitter over a rationalist meme.
  30. ^ Huet 2023, Joseph moved to the Bay Area .... There, she realised the social scene that seemed so sprawling online was far more tight-knit in person. Many rationalists and effective altruists, who call themselves EAs, worked together, invested in one another's companies, lived in communal houses and socialised mainly with each other, sometimes in a web of polyamorous relationships.
  31. ^ Burton 2023, There were commune-style rationalist group houses and polyamorous rationalist group houses devoted to modeling rational principles of good living.
  32. ^ Metz 2021, The Rationalists held regular meet-ups around the world, from Silicon Valley to Amsterdam to Australia. Some lived in group houses. Some practiced polyamory.
  33. ^ Frank 2015, Whereas MIRI aims to ensure human-friendly artificial intelligence, an associated program, the Center for Applied Rationality, helps humans optimize their own minds, in accordance with Bayes's Theorem.
  34. ^ Metz 2021, Because the Rationalists believed A.I. could end up destroying the world — a not entirely novel fear to anyone who has seen science fiction movies — they wanted to guard against it. Many worked for and donated money to MIRI, an organization created by Mr. Yudkowsky whose stated mission was "A.I. safety".
  35. ^ Ratliff 2025, One was an alumni gathering for a nonprofit called the Center for Applied Rationality. The Bay Area group ran workshops dedicated to "developing clear thinking for the sake of humanity's future," as they put it.... CFAR was itself an outgrowth of another organization, the Machine Intelligence Research Institute, devoted to the technical endeavor of creating artificial intelligence that wouldn't destroy the world.
  36. ^ Huet 2023, Several current and former members of the community say its dynamics can be "cult-like".
  37. ^ Shugerman 2024, Members of the TPOT community are often referred to as "post-rationalists" — former adherents who became "disillusioned with that whole scene, because it's a little culty, it's a little dogmatic," said journalist Allegra Rosenberg, who wrote about the subculture for Dirt.
  38. ^ Torres, Emile (23 August 2023). "'Before It's Too Late, Buddy'". Truthdig. Retrieved 20 February 2025. The threats that I've received, the worries expressed by Knutsson, and the fact that TESCREALists themselves feel the need to hide their identities further bolsters my claim that this movement is dangerous. It operates like a cult, has "charismatic" leaders like Yudkowsky and Bostrom, and appears to be increasingly at ease with extreme rhetoric about how to stop the AGI apocalypse.
  39. ^ Huet 2023, Eight women in these spaces allege pervasive sexual misconduct, including abuse and harassment, that they say has frequently been downplayed.
  40. ^ Lewis-Kraus, Gideon (2020-07-09). "Slate Star Codex and Silicon Valley's War Against the Media". The New Yorker. Archived from the original on 2025-02-28. Retrieved 2025-04-05.
  41. ^ Metz 2021, Many Rationalists embraced "effective altruism," an effort to remake charity by calculating how many people would benefit from a given donation.
  42. ^ Huet, Ellen (2023-03-07). "The Real-Life Consequences of Silicon Valley's AI Obsession". Bloomberg Businessweek. Archived from the original on 2025-03-01. Retrieved 2025-04-08. These distinct but overlapping groups developed in online forums.
  43. ^ Burton, Tara Isabella (Spring 2023). "Rational Magic". The New Atlantis. Retrieved 2025-04-02. There were rationalist sister movements: the transhumanists, who believed in hacking and improving the "wetware" of the human body; and the effective altruists, who posited that the best way to make the world a better place is to abandon cheap sentiment entirely.
  44. ^ "The Wide Angle: Understanding TESCREAL — Silicon Valley's Rightward Turn". May 2023. Archived from the original on 2023-06-06. Retrieved 2023-06-06.
  45. ^ Shugerman, Emily (2024-12-10). "This one internet subculture explains murder suspect Luigi Mangione's odd politics". The San Francisco Standard. Retrieved 2025-04-02. former adherents who became "disillusioned with that whole scene, because it's a little culty, it's a little dogmatic"
  46. ^ Burton 2023, the postrationalists — also known by the jokey endonym "this part of Twitter," or TPOT.
  47. ^ Shugerman 2024, Members of the TPOT community are often referred to as "post-rationalists".
  48. ^ Ratliff, Evan (February 21, 2025). "The Delirious, Violent, Impossible True Story of the Zizians". Wired. Archived from the original on February 26, 2025. Retrieved February 26, 2025.
  49. ^ Ratliff 2025, They alleged that MIRI had "paid out blackmail (using donor funds)" to quash sexual misconduct accusations and that CFAR's leader "discriminates against trans women." ... expressed outrage that MIRI's efforts to create human-friendly AI didn't seem to include other animals in the equation.