{{Short description|Multiple philosophies used to advocate for AGI}}

{{Use American English|date=July 2024}}
{{Use mdy dates|date=July 2024}}

'''TESCREAL''' is an [[acronym]] [[neologism]] proposed by computer scientist [[Timnit Gebru]] and philosopher [[Émile P. Torres]] that stands for "[[transhumanism]], [[extropianism]], [[singularitarianism]], [[Hugo de Garis#Cosmism|cosmism]], [[rationalism]], [[effective altruism]], and [[longtermism]]".<ref name=":0">{{Cite journal |last1=Gebru |first1=Timnit |author-link=Timnit Gebru |last2=Torres |first2=Émile P. |author-link2=Émile P. Torres |date=April 14, 2024 |title=The TESCREAL bundle: Eugenics and the promise of utopia through artificial general intelligence |url=https://firstmonday.org/ojs/index.php/fm/article/view/13636 |journal=[[First Monday (journal)|First Monday]] |volume=29 |issue=4 |doi=10.5210/fm.v29i4.13636 |issn=1396-0466 |doi-access=free |access-date=June 27, 2024 |archive-date=July 1, 2024 |archive-url=https://web.archive.org/web/20240701070850/https://firstmonday.org/ojs/index.php/fm/article/view/13636 |url-status=live }}</ref><ref>{{cite book |last=Thomas |first=Alexander |chapter=Systemic Dehumanization |date=2024 |title=The Politics and Ethics of Transhumanism |pages=159–194 |jstor=jj.14284473.9 |series=Techno-Human Evolution and Advanced Capitalism |edition=1 |publisher=Bristol University Press |doi=10.2307/jj.14284473.9 |isbn=978-1-5292-3964-5}}</ref> Gebru and Torres argue that these ideologies should be treated as an "interconnected and overlapping" group with shared origins.<ref name=":0" /> They say this is a movement that allows its proponents to use the threat of [[human extinction]] to justify expensive or detrimental projects. They consider it pervasive in social and academic circles in [[Silicon Valley]] centered around [[artificial intelligence]].<ref name="dig1">{{cite web |last=Torres |first=Émile P |author-link=Émile P. Torres |date=June 15, 2023 |title=The Acronym Behind Our Wildest AI Dreams and Nightmares |url=https://www.truthdig.com/articles/the-acronym-behind-our-wildest-ai-dreams-and-nightmares/ |access-date=October 1, 2023 |website=[[TruthDig]] |publisher= |quote=}}</ref> As such, the acronym is sometimes used to criticize a perceived belief system associated with [[Big Tech]].<ref name="dig1" /><ref name="was1">{{cite web |last=Troy |first=Dave |date=May 1, 2023 |title=The Wide Angle: Understanding TESCREAL — the Weird Ideologies Behind Silicon Valley's Rightward Turn |url=https://washingtonspectator.org/understanding-tescreal-silicon-valleys-rightward-turn/ |access-date=October 1, 2023 |website=[[The Washington Spectator]] |publisher= |quote= |archive-date=June 6, 2023 |archive-url=https://web.archive.org/web/20230606223259/https://washingtonspectator.org/understanding-tescreal-silicon-valleys-rightward-turn/ |url-status=live }}</ref><ref name="financial1">{{cite web |last=Ahuja |first=Anjana |author-link=Anjana Ahuja |date=May 10, 2023 |title=We need to examine the beliefs of today's tech luminaries |url=https://www.ft.com/content/edc30352-05fb-4fd8-a503-20b50ce014ab |url-access=subscription |url-status=live |archive-url=https://archive.today/20231211051528/https://www.ft.com/content/edc30352-05fb-4fd8-a503-20b50ce014ab |archive-date=December 11, 2023 |access-date=October 1, 2023 |website=[[Financial Times]] |publisher= |quote=}}</ref>
{{multiple image
| image1 = Timnit Gebru crop.jpg
| image2 = Émile P. Torres (cropped).png
| footer = Computer scientist [[Timnit Gebru]] and philosopher [[Émile P. Torres]] coined the acronym "TESCREAL" in 2023.
}}

== Origin ==
Gebru and Torres coined "TESCREAL" in 2023, first using it in a draft of a paper titled "The TESCREAL bundle: Eugenics and the promise of utopia through artificial general intelligence".<ref name=":0" /><ref name="was1" /> ''[[First Monday (journal)|First Monday]]'' published the paper in April 2024, though Torres and Gebru popularized the term elsewhere before the paper's publication. According to Gebru and Torres, transhumanism, extropianism, singularitarianism, (modern) cosmism, rationalism, effective altruism, and longtermism are a "bundle" of "interconnected and overlapping ideologies" that emerged from 20th-century [[eugenics]], with shared progenitors.<ref name=":0" /> They use the term "TESCREAList" to refer to people who subscribe to, or appear to endorse, most or all of the ideologies captured in the acronym.<ref name=":0" /><ref name="dig1" />

== Analysis ==
According to critics of these philosophies, TESCREAL describes overlapping movements endorsed by prominent people in the tech industry to provide intellectual backing to pursue and prioritize projects including [[artificial general intelligence]] (AGI), [[life extension]], and [[space colonization]].<ref name=":0" /><ref name="was1" /><ref name=":3" /> Science fiction author [[Charles Stross]], using the example of space colonization, argued that the ideologies allow billionaires to pursue massive personal projects driven by a right-wing interpretation of science fiction by arguing that not to pursue such projects poses an existential risk to society.<ref name=":9" /> Gebru and Torres write that, using the threat of extinction, TESCREALists can justify "attempts to build unscoped systems which are inherently unsafe".<ref name=":0" /> Media scholar [[Ethan Zuckerman]] argues that by only considering goals that are valuable to the TESCREAL movement, futuristic projects with more immediate drawbacks, such as racial inequity, algorithmic bias, and environmental degradation, can be justified.<ref name=":1" /> Speaking at [[Radio New Zealand]], politics writer Danyl McLauchlan said that many of these philosophies may have started off with good intentions but might have been pushed "to a point of ridiculousness."<ref name=":13" />

Philosopher Yogi Hale Hendlin has argued that by both ignoring the human causes of societal problems and over-engineering solutions, TESCREALists ignore the context in which many problems arise.<ref>{{Cite journal |last=Hendlin |first=Yogi Hale |date=April 1, 2024 |title=Semiocide as Negation: Review of Michael Marder's Dump Philosophy |journal=[[Biosemiotics (journal)|Biosemiotics]] |language=en |volume=17 |issue=1 |pages=233–255 |doi=10.1007/s12304-024-09558-x |issn=1875-1342 |doi-access=free }}</ref> Camille Sojit Pejcha wrote in ''[[Document Journal]]'' that TESCREAL is a tool for tech elites to concentrate power.<ref name=":3" /> In ''[[The Washington Spectator]]'', Dave Troy called TESCREAL an "[[Consequentialism|ends justifies the means]]" movement that is antithetical to "democratic, inclusive, fair, patient, and just governance".<ref name="was1" /> [[Gil Duran]] wrote that "TESCREAL", "authoritarian technocracy", and "techno-optimism" were phrases used in early 2024 to describe a new ideology emerging in the tech industry.<ref>{{Cite magazine |last=Duran |first=Gil |author-link=Gil Duran |date=February 12, 2024 |title=The Tech Plutocrats Dreaming of a Right-Wing San Francisco |url=https://newrepublic.com/article/178675/garry-tan-tech-san-francisco |access-date=August 4, 2024 |magazine=[[The New Republic]] |issn=0028-6583 | archive-url= https://web.archive.org/web/20240226001226/https://newrepublic.com/article/178675/garry-tan-tech-san-francisco | archive-date=February 26, 2024 |url-status=live}}</ref>


Gebru, Torres, and others have likened TESCREAL to a secular religion due to its parallels to Christian theology and [[eschatology]].<ref name=":0" /><ref name="dig1" /><ref name="hebdo">{{cite web |last1=Redaud |first1=Lorraine |date=August 2, 2024 |title=TESCREAL, l'idéologie futuriste qui se répand chez les élites de la Silicon Valley |url=https://charliehebdo.fr/2024/08/societe/tech/tescreal-lideologie-futuriste-qui-se-repand-chez-les-elites-de-la-silicon-valley/ |url-status= |archive-url= |archive-date= |access-date=August 7, 2024 |website=[[Charlie Hebdo]] |publisher= |quote=}}</ref><ref name=":9" /><ref name=":8">{{Cite news |last=Piccard |first=Alexandre |date=November 30, 2023 |title=The Sam Altman saga shows that AI doomers have lost a battle |url=https://www.lemonde.fr/en/opinion/article/2023/11/30/the-sam-altman-affair-has-shown-that-artificial-intelligence-doomers-have-lost-a-battle_6301213_23.html |access-date=June 30, 2024 |work=[[Le Monde]] |language=en |archive-date=July 1, 2024 |archive-url=https://web.archive.org/web/20240701165012/https://www.lemonde.fr/en/opinion/article/2023/11/30/the-sam-altman-affair-has-shown-that-artificial-intelligence-doomers-have-lost-a-battle_6301213_23.html |url-status=live |url-access=subscription}}</ref> Writers in [[Current Affairs (magazine)|''Current Affairs'']] compared these philosophies and the ensuing [[techno-optimism]] to "any other monomaniacal faith... in which doubters are seen as enemies and beliefs are accepted without evidence". They argue pursuing TESCREAL would prevent an actual equitable shared future.<ref name=":11">{{Cite news |last1=Bhalla |first1=Jag |last2=Robinson |first2=Nathan J. |author-link2=Nathan J. Robinson |date=October 20, 2023 |title='Techno-Optimism' is Not Something You Should Believe In |url=https://www.currentaffairs.org/news/2023/10/techno-optimism-is-not-something-you-should-believe-in |access-date=July 2, 2024 |work=[[Current Affairs (magazine)|Current Affairs]] |language=en |issn=2471-2647}}</ref>

=== Artificial General Intelligence (AGI) ===
Much of the discourse about [[existential risk from artificial general intelligence|existential risk from AGI]] occurs among supporters of the TESCREAL ideologies.<ref name=":1" /><ref name=":2">{{Cite journal |last=Helfrich |first=Gina |date=March 11, 2024 |title=The harms of terminology: why we should reject so-called "frontier AI" |journal=AI Ethics |language=en |doi=10.1007/s43681-024-00438-1 |issn=2730-5961 |doi-access=free }}</ref><ref name=":14">{{Cite magazine |last=Heaven |first=Will Douglas |date=July 10, 2024 |title=What is AI? |url=https://www.technologyreview.com/2024/07/10/1094475/what-is-artificial-intelligence-ai-definitive-guide/ |access-date=August 4, 2024 |magazine=[[MIT Technology Review]] |language=en |archive-url=https://web.archive.org/web/20240717021554/https://www.technologyreview.com/2024/07/10/1094475/what-is-artificial-intelligence-ai-definitive-guide/ |archive-date=July 17, 2024 |url-status=live}}</ref> TESCREALists are either considered "AI accelerationists", who consider AI the only way to pursue a utopian future where problems are solved, or "AI [[Doomer|doomers]]", who consider [[AI alignment|AI likely to be unaligned]] to human survival and likely to cause human extinction.<ref name=":1" /><ref name=":8" /> Despite the risk, many doomers consider the development of AGI inevitable and argue that only by developing and aligning AGI first can existential risk be averted.<ref name=":6" /><ref name=":14" />


Gebru has likened the conflict between accelerationists and doomers to a "secular religion selling AGI enabled utopia and apocalypse".<ref name=":8" /> Torres and Gebru argue that both groups use hypothetical AI-driven apocalypses and utopian futures to justify unlimited research, development, and deregulation of technology. By considering only far-reaching future consequences, creating hype for unproven technology, and fear-mongering, Torres and Gebru allege TESCREALists distract from the impacts of technology that may adversely affect society, disproportionately harm minorities through [[algorithmic bias]], and have a detrimental [[environmental impacts of artificial intelligence|impact on the environment]].<ref name="financial1" /><ref name="hebdo" /><ref name=":14" />

===Pharmaceuticals===
Neşe Devenot has used the TESCREAL acronym to refer to "global financial and tech elites" who promote new uses of [[psychedelic drugs]] as [[mental health]] treatments, not because they want to help people, but so that they can make money on the sale of these pharmaceuticals as part of a plan to increase inequality.<ref name="devenot">{{Cite journal |last=Devenot |first=Neşe |date=2023-12-29 |title=TESCREAL hallucinations: Psychedelic and AI hype as inequality engines |url=https://akjournals.com/view/journals/2054/7/S1/article-p22.xml |journal=Journal of Psychedelic Studies |volume=7 |issue=S1 |pages=22–39 |doi=10.1556/2054.2023.00292 |issn=2559-9283 | quote=Counterfactual efforts to improve mental health by increasing inequality are widespread in the psychedelics industry. These efforts have been propelled by an elitist worldview that is widely-held in Silicon Valley. The backbone of this worldview is the TESCREAL bundle of ideologies, ... While others have noted similarities between the earlier SSRI hype and the ongoing hype for psychedelic medications, the rhetoric of psychedelic hype is tinged with utopian and magico-religious aspirations that have no parallel in the discourse surrounding SSRIs or other antidepressants. I argue that this utopian discourse provides insight into the ways that global financial and tech elites are instrumentalizing psychedelics as one tool in a broader world-building project that justifies increasing material inequality. |doi-access=free }}</ref>


=== Claimed bias against minorities ===
Gebru and Torres claim that TESCREAL ideologies directly originate from 20th-century [[eugenics]]<ref name=":0" /> and that the bundle of ideologies advocates a [[New eugenics|second wave of new eugenics]].<ref name=":0" /><ref>{{Cite web |last=Torres |first=Émile P. |author-link=Émile P. Torres |date=November 9, 2023 |title=Effective Altruism Is a Welter of Lies, Hypocrisy, and Eugenic Fantasies |url=https://www.truthdig.com/articles/effective-altruism-is-a-welter-of-fraud-lies-exploitation-and-eugenic-fantasies/ |access-date=June 30, 2024 |website=[[Truthdig]] |language=en-US}}</ref> Others have similarly argued that the TESCREAL ideologies developed from earlier philosophies that were used to justify mass murder and genocide.<ref name=":3" /><ref name=":6">{{Cite web |last=Van Rensburg |first=Wessel |date=June 7, 2024 |title=AI and the quest for utopia |url=https://www.vryeweekblad.com/en/opinions-and-debate/2024-06-07-ai-and-the-quest-for-utopia/ |access-date=June 30, 2024 |website=[[Vrye Weekblad]] |language= |archive-date=June 30, 2024 |archive-url=https://web.archive.org/web/20240630025036/https://www.vryeweekblad.com/en/opinions-and-debate/2024-06-07-ai-and-the-quest-for-utopia/ |url-status=live }}</ref> Some prominent figures who have contributed to TESCREAL ideologies have been alleged to be racist and sexist.<ref name=":2" /><ref name=":5">{{Cite news |last1=Wilson |first1=Jason |last2=Winston |first2=Ali |date=June 16, 2024 |title=Sam Bankman-Fried funded a group with racist ties. FTX wants its $5m back |url=https://www.theguardian.com/technology/article/2024/jun/16/sam-bankman-fried-ftx-eugenics-scientific-racism |access-date=June 29, 2024 |work=[[The Guardian]] |language=en-GB |issn=0261-3077 |archive-date=July 1, 2024 |archive-url=https://web.archive.org/web/20240701164920/https://www.theguardian.com/technology/article/2024/jun/16/sam-bankman-fried-ftx-eugenics-scientific-racism |url-status=live }}</ref><ref name=":12">{{Cite web |last=Brownell |first=Claire |date=November 27, 2023 |title=Doom, Inc.: The well-funded global movement that wants you to fear AI |url=https://thelogic.co/news/special-report/doom-inc-the-well-funded-global-movement-that-wants-you-to-fear-ai/ |url-access=subscription |access-date=July 2, 2024 |website=[[The Logic]] |language=en-US}}</ref> McLauchlan has said that, while "some people in these groups want to genetically engineer superintelligent humans, or replace the entire species with a superior form of intelligence" others "like the effective altruists, for example, most of them are just in it to help very poor people ... they are kind of shocked ... that they've been lumped into this malevolent ... eugenics conspiracy".<ref name=":13" />

=== Criticism and debate ===
Writing in ''Asterisk'', a magazine related to effective altruism, Ozy Brennan criticized Gebru's and Torres's grouping of different philosophies as if they were a "monolithic" movement. Brennan argues Torres has misunderstood these different philosophies, and has taken [[Thought experiment|philosophical thought experiments]] out of context.<ref name=":15">{{Cite web |last=Brennan |first=Ozy |date=June 2024 |title=The "TESCREAL" Bungle |url=https://asteriskmag.com/issues/06/the-tescreal-bungle |url-status=live |archive-url=https://web.archive.org/web/20240612223338/https://asteriskmag.com/issues/06/the-tescreal-bungle |archive-date=June 12, 2024 |access-date=June 18, 2024 |website=[[Centre for Effective Altruism|Asterisk]]}}</ref> James Pethokoukis, of the [[American Enterprise Institute]], disagrees with criticizing proponents of TESCREAL. He argues that the tech billionaires criticized in a [[Scientific American|''Scientific American'']] article for allegedly espousing TESCREAL have significantly advanced society.<ref>{{Cite web |last=Pethokoukis |first=James |date=January 6, 2024 |title=Billionaires Dreaming Of a Sci-Fi Future Is a Good Thing |url=https://www.aei.org/articles/billionaires-dreaming-of-a-sci-fi-future-is-a-good-thing/#:~:text=In%20other%20words%2C%20as%20Stross,%2C%20extropianism%2C%20singularitarianism%2C%20cosmism%2C |url-status=live |archive-url=https://web.archive.org/web/20240627224640/https://www.aei.org/articles/billionaires-dreaming-of-a-sci-fi-future-is-a-good-thing/#:~:text=In%20other%20words%2C%20as%20Stross,%2C%20extropianism%2C%20singularitarianism%2C%20cosmism%2C |archive-date=June 27, 2024 |access-date=July 1, 2024 |website=[[American Enterprise Institute]]}}</ref> McLauchlan has noted that critics of the TESCREAL bundle have objected to what they see as disparate and sometimes conflicting ideologies being grouped together, but opines that TESCREAL is a good way to describe and consolidate many of the "grand bizarre ideologies in Silicon Valley".<ref name=":13" /> Eli Sennesh and [[James Hughes (sociologist)|James Hughes]], publishing in the blog for the [[Transhumanism|transhumanist]] [[Institute for Ethics and Emerging Technologies]], have argued that TESCREAL is a left-wing [[conspiracy theory]] that unnecessarily groups disparate philosophies together without understanding the [[mutually exclusive]] tenets in each.<ref>{{Cite web |last1=Sennesh |first1=Eli |last2=Hughes |first2=James |date=June 12, 2023 |title=Conspiracy Theories, Left Futurism, and the Attack on TESCREAL |url=https://medium.com/institute-for-ethics-and-emerging-technologies/conspiracy-theories-left-futurism-and-the-attack-on-tescreal-456972fe02aa |publisher=Institute for Ethics and Emerging Technologies |via=[[Medium (website)|Medium]]}}</ref>

According to Torres, "If advanced technologies continue to be developed at the current rate, a global-scale catastrophe is almost certainly a matter of when rather than if." Torres believes that "perhaps the only way to actually attain a state of ‘existential security’ is to slow down or completely halt further technological innovation", and criticized the longtermist view that technology, although dangerous, is essential for human civilization to achieve its full potential.<ref>{{Cite web |last=P Torres |first=Émile |date=October 19, 2021 |editor-last=Dresser |editor-first=Sam |title=Why longtermism is the world's most dangerous secular credo |url=https://aeon.co/essays/why-longtermism-is-the-worlds-most-dangerous-secular-credo |access-date=2024-07-20 |website=Aeon |language=en}}</ref><ref name=":15" /> Brennan contends that Torres's proposal to slow or halt technological development represents a more extreme position than TESCREAL ideologies, preventing many improvements in quality of life, healthcare, and poverty reduction that technological progress enables.<ref name=":15" />


== Alleged TESCREALists ==
Venture capitalist [[Marc Andreessen|Marc Andreessen]] has self-identified as a TESCREAList.<ref name=":13">{{Cite interview |last=McLauchlan |first=Danyl |interviewer=[[Susie Ferguson]] |title=Danyl McLauchlan: Silicon Valley's cult of tech utopianism |url=https://www.rnz.co.nz/national/programmes/saturday/audio/2018945704/danyl-mclauchlan-silicon-valley-s-cult-of-tech-utopianism |access-date=July 6, 2024 |publisher=[[Radio New Zealand]] |date=July 6, 2024}}</ref><ref name=":1" /> He published the "[[Techno-Optimist Manifesto]]" in October 2023, which Jag Bhalla and [[Nathan J. Robinson]] have called a "perfect example" of the TESCREAL ideologies.<ref name=":11" /> In the document, he argues that more advanced artificial intelligence could save countless future potential lives, and that those working to slow or prevent its development should be condemned as murderers.<ref name=":3" /><ref name=":1" />

[[Elon Musk]] has been described as sympathetic to some TESCREAL ideologies.<ref name="financial1" /><ref name="hebdo" /><ref name=":5" /><ref name="devenot" /> In August 2022, Musk tweeted that [[William MacAskill]]'s longtermist book ''[[What We Owe the Future]]'' was a "close match for my philosophy".<ref>{{Cite news |last=Kulish |first=Nicholas |author-link=Nicholas Kulish |date=October 8, 2022 |title=How a Scottish Moral Philosopher Got Elon Musk's Number |url=https://www.nytimes.com/2022/10/08/business/effective-altruism-elon-musk.html |access-date=July 2, 2024 |work=[[The New York Times]] |language=en-US |issn=0362-4331}}</ref> Some writers believe Musk's [[Neuralink]] pursues TESCREAList goals.<ref name="financial1" /><ref name=":7">{{Cite web |last=Kandimalla |first=Sriskandha |date=June 5, 2024 |title=The dark side of techno-utopian dreams: Ethical and practical pitfalls |url=https://newuniversity.org/2024/06/05/the-dark-side-of-techno-utopian-dreams-ethical-and-practical-pitfalls/ |access-date=June 30, 2024 |website=[[New University (newspaper)|New University]] |language=en-US |archive-date=June 30, 2024 |archive-url=https://web.archive.org/web/20240630020527/https://newuniversity.org/2024/06/05/the-dark-side-of-techno-utopian-dreams-ethical-and-practical-pitfalls/ |url-status=live }}</ref> Some AI experts have complained about the focus of Musk's [[XAI (company)|XAI]] company on existential risk, arguing that it and other AI companies have ties to TESCREAL movements.<ref name=":10">{{Cite web |last=Goldman |first=Sharon |date=July 24, 2023 |title=Doomer AI advisor joins Musk's xAI, the 4th top research lab focused on AI apocalypse |url=https://venturebeat.com/ai/doomer-advisor-joins-musks-xai-the-4th-top-research-lab-focused-on-ai-apocalypse/ |access-date=June 29, 2024 |website=[[VentureBeat]] |language=en-US |archive-date=June 29, 2024 |archive-url=https://web.archive.org/web/20240629215134/https://venturebeat.com/ai/doomer-advisor-joins-musks-xai-the-4th-top-research-lab-focused-on-ai-apocalypse/ |url-status=live }}</ref><ref name=":4">{{Cite web |last=Torres |first=Émile P. |author-link=Émile P. Torres |date=June 11, 2023 |title=AI and the threat of "human extinction": What are the tech-bros worried about? It's not you and me |url=https://www.salon.com/2023/06/11/ai-and-the-of-human-extinction-what-are-the-tech-bros-worried-about-its-not-you-and-me/ |access-date=June 29, 2024 |website=[[Salon.com|Salon]] |language=en |archive-date=June 30, 2024 |archive-url=https://web.archive.org/web/20240630140825/https://www.salon.com/2023/06/11/ai-and-the-of-human-extinction-what-are-the-tech-bros-worried-about-its-not-you-and-me/ |url-status=live }}</ref> Dave Troy believes Musk's [[Natalism|natalist]] views originate from TESCREAL ideals.<ref name="was1" />


It has also been suggested that [[Peter Thiel]] is sympathetic to TESCREAL ideas.<ref name="financial1" /><ref name="hebdo" /><ref name=":16">{{Cite web |last=Svetkey |first=Benjamin |date=2024-08-07 |title="F*** These Trump-Loving Techies": Hollywood Takes on Silicon Valley in an Epic Presidential Brawl |url=https://www.hollywoodreporter.com/news/politics-news/election-2024-hollywood-silicon-valley-1235967050/ |access-date=2024-08-10 |website=[[The Hollywood Reporter]] |language=en-US}}</ref> Benjamin Svetkey wrote in ''[[The Hollywood Reporter]]'' that Thiel and other Silicon Valley CEOs who support the [[Donald Trump 2024 presidential campaign]] are pushing for policies that would shut down "regulators whose outdated restrictions on things like [[human experimentation]] are slowing down progress toward a [[Technotopia|technotopian]] paradise".<ref name=":16" />


[[Sam Altman]] and much of the [[OpenAI]] board have been described as supporting TESCREAL movements, especially in the context of his attempted [[Removal of Sam Altman from OpenAI|firing]] in 2023.<ref name="hebdo" /><ref>{{Cite web |last1=Melton |first1=Monica |last2=Mok |first2=Aaron |date=November 23, 2023 |title='Black Twitter' asks 'What if Sam Altman were a Black woman?' in the wake of ouster |url=https://www.businessinsider.com/black-twitter-asks-what-if-sam-altman-was-black-woman-2023-11 |url-status=live |archive-url=https://web.archive.org/web/20240303172326/https://www.businessinsider.com/black-twitter-asks-what-if-sam-altman-was-black-woman-2023-11 |archive-date=March 3, 2024 |access-date=June 29, 2024 |website=[[Business Insider]] |language=en-US}}</ref><ref name=":8"/> Gebru and Torres have urged Altman not to pursue TESCREAL ideals.<ref name="business1">{{cite web |last1=Russell |first1=Melia |last2=Black |first2=Julia |date=April 27, 2023 |title=He's played chess with Peter Thiel, sparred with Elon Musk and once, supposedly, stopped a plane crash: Inside Sam Altman's world, where truth is stranger than fiction |url=https://www.businessinsider.com/sam-altman-openai-chatgpt-worldcoin-helion-future-tech-2023-4?op=1 |url-status=live |archive-url=https://web.archive.org/web/20231011023337/https://www.businessinsider.com/sam-altman-openai-chatgpt-worldcoin-helion-future-tech-2023-4?op=1 |archive-date=October 11, 2023 |access-date=October 1, 2023 |website=[[Business Insider]] |publisher= |quote=}}</ref> Lorraine Redaud, writing in [[Charlie Hebdo|''Charlie Hebdo'']], described Sam Altman and multiple other Silicon Valley executives as supporting TESCREAL ideals.<ref name="hebdo" />


Self-identified transhumanists [[Nick Bostrom]] and [[Eliezer Yudkowsky]], both influential in discussions of existential risk from AI,<ref name=":12" /> have also been described as leaders of the TESCREAL movement.<ref name="financial1" /><ref name=":2" /><ref name=":12" /> Redaud said Bostrom supported some ideals "in line with the TESCREALists movement".<ref name="hebdo" />


[[Sam Bankman-Fried]], former CEO of the [[FTX]] cryptocurrency exchange, was a prominent and self-identified member of the effective altruist community.<ref>{{Cite magazine |last=Wenar |first=Leif |author-link=Leif Wenar |date=March 27, 2024 |title=The Deaths of Effective Altruism |url=https://www.wired.com/story/deaths-of-effective-altruism/ |access-date=July 2, 2024 |magazine=[[Wired (magazine)|Wired]] |language=en-US |issn=1059-1028}}</ref> According to ''[[The Guardian]]'', since [[Bankruptcy of FTX|FTX's collapse]], administrators of the bankruptcy estate have been trying to recoup about $5 million that they allege was transferred to a nonprofit to help secure the purchase of a historic hotel that has been repurposed for conferences and workshops associated with longtermism, rationalism, and effective altruism. The property hosted liberal eugenicists and other speakers the ''Guardian'' said had racist and misogynistic histories.{{ r | ":5"| p=1 | q=The revelations cast new light on so-called 'Tescreal' intellectual movements – an umbrella term for a cluster of movements including EA and rationalism that exercise broad influence in Silicon Valley, and have the ear of the likes of Sam Altman, Marc Andreessen and Elon Musk. It also raises questions about the extent to which people within that movement continue to benefit from Bankman-Fried’s fraud, the largest in US history.}}


Longtermist and effective altruist [[William MacAskill]], who frequently collaborated with Bankman-Fried to coordinate philanthropic initiatives, has been described as a TESCREAList.<ref name=":0" /><ref name="was1" /><ref name=":1" /><ref name="devenot" />


== See also ==
* [[Effective accelerationism]]
* [[Utilitarianism]]
* [[The Californian Ideology]]


==References==

Revision as of 04:54, 28 August 2024

TESCREAL is an acronym neologism proposed by computer scientist Timnit Gebru and philosopher Émile P. Torres that stands for "transhumanism, extropianism, singularitarianism, cosmism, rationalism, effective altruism, and longtermism".[1][2] Gebru and Torres argue that these ideologies should be treated as an "interconnected and overlapping" group with shared origins.[1] They say this is a movement that allows its proponents to use the threat of human extinction to justify expensive or detrimental projects. They consider it pervasive in social and academic circles in Silicon Valley centered around artificial intelligence.[3] As such, the acronym is sometimes used to criticize a perceived belief system associated with Big Tech.[3][4][5]

Origin

Gebru and Torres coined "TESCREAL" in 2023, first using it in a draft of a paper titled "The TESCREAL bundle: Eugenics and the promise of utopia through artificial general intelligence".[1][4] First Monday published the paper in April 2024, though Torres and Gebru popularized the term elsewhere before the paper's publication. According to Gebru and Torres, transhumanism, extropianism, singularitarianism, (modern) cosmism, rationalism, effective altruism, and longtermism are a "bundle" of "interconnected and overlapping ideologies" that emerged from 20th-century eugenics, with shared progenitors.[1] They use the term "TESCREAList" to refer to people who subscribe to, or appear to endorse, most or all of the ideologies captured in the acronym.[1][3]

Analysis

According to critics of these philosophies, TESCREAL describes overlapping movements endorsed by prominent people in the tech industry to provide intellectual backing to pursue and prioritize projects including artificial general intelligence (AGI), life extension, and space colonization.[1][4][6] Science fiction author Charles Stross, using the example of space colonization, argued that the ideologies allow billionaires to pursue massive personal projects driven by a right-wing interpretation of science fiction by arguing that not to pursue such projects poses an existential risk to society.[7] Gebru and Torres write that, using the threat of extinction, TESCREALists can justify "attempts to build unscoped systems which are inherently unsafe".[1] Media scholar Ethan Zuckerman argues that by only considering goals that are valuable to the TESCREAL movement, futuristic projects with more immediate drawbacks, such as racial inequity, algorithmic bias, and environmental degradation, can be justified.[8] Speaking at Radio New Zealand, politics writer Danyl McLauchlan said that many of these philosophies may have started off with good intentions but might have been pushed "to a point of ridiculousness."[9]

Philosopher Yogi Hale Hendlin has argued that by both ignoring the human causes of societal problems and over-engineering solutions, TESCREALists ignore the context in which many problems arise.[10] Camille Sojit Pejcha wrote in Document Journal that TESCREAL is a tool for tech elites to concentrate power.[6] In The Washington Spectator, Dave Troy called TESCREAL an "ends justifies the means" movement that is antithetical to "democratic, inclusive, fair, patient, and just governance".[4] Gil Duran wrote that "TESCREAL", "authoritarian technocracy", and "techno-optimism" were phrases used in early 2024 to describe a new ideology emerging in the tech industry.[11]

Gebru, Torres, and others have likened TESCREAL to a secular religion due to its parallels to Christian theology and eschatology.[1][3][12][7][13] Writers in Current Affairs compared these philosophies and the ensuing techno-optimism to "any other monomaniacal faith... in which doubters are seen as enemies and beliefs are accepted without evidence". They argue pursuing TESCREAL would prevent an actual equitable shared future.[14]

Artificial general intelligence (AGI)

Much of the discourse about existential risk from AGI occurs among supporters of the TESCREAL ideologies.[8][15][16] TESCREALists are generally characterized either as "AI accelerationists", who see AI as the only way to achieve a utopian future in which humanity's problems are solved, or as "AI doomers", who believe AI is likely to be unaligned with human survival and to cause human extinction.[8][13] Despite this risk, many doomers consider the development of AGI inevitable and argue that existential risk can be averted only by developing and aligning AGI first.[17][16]

Gebru has likened the conflict between accelerationists and doomers to a "secular religion selling AGI enabled utopia and apocalypse".[13] Torres and Gebru argue that both groups use hypothetical AI-driven apocalypses and utopian futures to justify unlimited research, development, and deregulation of technology. They allege that, by considering only far-reaching future consequences, creating hype for unproven technology, and fear-mongering, TESCREALists distract from the present-day impacts of technology, which may adversely affect society, disproportionately harm minorities through algorithmic bias, and damage the environment.[5][12][16]

Pharmaceuticals

Neşe Devenot has used the TESCREAL acronym to refer to "global financial and tech elites" who, she argues, promote new uses of psychedelic drugs as mental health treatments not to help people but to profit from the sale of these pharmaceuticals, as part of a broader world-building project that justifies increasing material inequality.[18]

Claimed bias against minorities

Gebru and Torres claim that TESCREAL ideologies directly originate from 20th-century eugenics[1] and that the bundle of ideologies advocates a second wave of new eugenics.[1][19] Others have similarly argued that the TESCREAL ideologies developed from earlier philosophies that were used to justify mass murder and genocide.[6][17] Some prominent figures who have contributed to TESCREAL ideologies have been alleged to be racist and sexist.[15][20][21] McLauchlan has said that while "some people in these groups want to genetically engineer superintelligent humans, or replace the entire species with a superior form of intelligence", others, "like the effective altruists, for example, most of them are just in it to help very poor people ... they are kind of shocked ... that they've been lumped into this malevolent ... eugenics conspiracy".[9]

Criticism and debate

Writing in Asterisk, a magazine associated with effective altruism, Ozy Brennan criticized Gebru and Torres's grouping of different philosophies as if they were a "monolithic" movement. Brennan argues that Torres has misunderstood these philosophies and taken philosophical thought experiments out of context.[22] James Pethokoukis, of the American Enterprise Institute, disagrees with criticizing proponents of TESCREAL, arguing that the tech billionaires criticized in a Scientific American article for allegedly espousing TESCREAL have significantly advanced society.[23] McLauchlan has noted that critics of the TESCREAL bundle object to what they see as disparate and sometimes conflicting ideologies being grouped together, but opines that TESCREAL is a good way to describe and consolidate many of the "grand bizarre ideologies in Silicon Valley".[9] Eli Sennesh and James Hughes, writing in the blog of the transhumanist Institute for Ethics and Emerging Technologies, have argued that TESCREAL is a left-wing conspiracy theory that unnecessarily groups disparate philosophies together without understanding the mutually exclusive tenets of each.[24]

According to Torres, "If advanced technologies continue to be developed at the current rate, a global-scale catastrophe is almost certainly a matter of when rather than if." Torres believes that "perhaps the only way to actually attain a state of 'existential security' is to slow down or completely halt further technological innovation", and has criticized the longtermist view that technology, although dangerous, is essential for human civilization to achieve its full potential.[25][22] Brennan contends that Torres's proposal to slow or halt technological development is more extreme than the TESCREAL ideologies themselves, as it would forgo many of the improvements in quality of life, healthcare, and poverty reduction that technological progress enables.[22]

Alleged TESCREALists

Venture capitalist Marc Andreessen has self-identified as a TESCREAList.[9][8] In October 2023, he published the "Techno-Optimist Manifesto", which Jag Bhalla and Nathan J. Robinson have called a "perfect example" of the TESCREAL ideologies.[14] In the document, Andreessen argues that more advanced artificial intelligence could save countless potential future lives, and that those working to slow or prevent its development should be condemned as murderers.[6][8]

Elon Musk has been described as sympathetic to some TESCREAL ideologies.[5][12][20][18] In August 2022, Musk tweeted that William MacAskill's longtermist book What We Owe the Future was a "close match for my philosophy".[26] Some writers believe Musk's Neuralink pursues TESCREAList goals.[5][27] Some AI experts have criticized the focus of Musk's AI company xAI on existential risk, arguing that it and other AI companies have ties to TESCREAL movements.[28][29] Dave Troy believes Musk's natalist views originate from TESCREAL ideals.[4]

It has also been suggested that Peter Thiel is sympathetic to TESCREAL ideas.[5][12][30] Benjamin Svetkey wrote in The Hollywood Reporter that Thiel and other Silicon Valley CEOs who support the Donald Trump 2024 presidential campaign are pushing for policies that would shut down "regulators whose outdated restrictions on things like human experimentation are slowing down progress toward a technotopian paradise".[30]

Sam Altman and much of the OpenAI board have been described as supporting TESCREAL movements, especially in the context of Altman's brief ouster in 2023.[12][31][13] Gebru and Torres have urged Altman not to pursue TESCREAL ideals.[32] Lorraine Redaud, writing in Charlie Hebdo, described Altman and several other Silicon Valley executives as supporting TESCREAL ideals.[12]

Self-identified transhumanists Nick Bostrom and Eliezer Yudkowsky, both influential in discussions of existential risk from AI,[21] have also been described as leaders of the TESCREAL movement.[5][15][21] Redaud said Bostrom supported some ideals "in line with the TESCREALists movement".[12]

Sam Bankman-Fried, former CEO of the FTX cryptocurrency exchange, was a prominent and self-identified member of the effective altruist community.[33] According to The Guardian, since FTX's collapse, administrators of the bankruptcy estate have been trying to recoup about $5 million that they allege was transferred to a nonprofit to help secure the purchase of a historic hotel, since repurposed for conferences and workshops associated with longtermism, rationalism, and effective altruism. The property has hosted liberal eugenicists and other speakers who, The Guardian said, had racist and misogynistic histories.[20]: 1

Longtermist and effective altruist William MacAskill, who frequently collaborated with Bankman-Fried to coordinate philanthropic initiatives, has been described as a TESCREAList.[1][4][8][18]

References

  1. ^ a b c d e f g h i j k Gebru, Timnit; Torres, Émile P. (April 14, 2024). "The TESCREAL bundle: Eugenics and the promise of utopia through artificial general intelligence". First Monday. 29 (4). doi:10.5210/fm.v29i4.13636. ISSN 1396-0466. Archived from the original on July 1, 2024. Retrieved June 27, 2024.
  2. ^ Thomas, Alexander (2024). "Systemic Dehumanization". The Politics and Ethics of Transhumanism. Techno-Human Evolution and Advanced Capitalism (1 ed.). Bristol University Press. pp. 159–194. doi:10.2307/jj.14284473.9. ISBN 978-1-5292-3964-5. JSTOR jj.14284473.9.
  3. ^ a b c d Torres, Émile P (June 15, 2023). "The Acronym Behind Our Wildest AI Dreams and Nightmares". TruthDig. Retrieved October 1, 2023.
  4. ^ a b c d e f Troy, Dave (May 1, 2023). "The Wide Angle: Understanding TESCREAL — the Weird Ideologies Behind Silicon Valley's Rightward Turn". The Washington Spectator. Archived from the original on June 6, 2023. Retrieved October 1, 2023.
  5. ^ a b c d e f Ahuja, Anjana (May 10, 2023). "We need to examine the beliefs of today's tech luminaries". Financial Times. Archived from the original on December 11, 2023. Retrieved October 1, 2023.
  6. ^ a b c d Pejcha, Camille Sojit (May 23, 2024). "Techno-futurists are selling an interplanetary paradise for the posthuman generation—they just forgot about the rest of us". Document Journal. Archived from the original on June 29, 2024. Retrieved June 29, 2024.
  7. ^ a b Stross, Charles (December 20, 2023). "Tech Billionaires Need to Stop Trying to Make the Science Fiction They Grew Up on Real". Scientific American. Archived from the original on June 26, 2024. Retrieved June 27, 2024.
  8. ^ a b c d e f Zuckerman, Ethan (January 16, 2024). "Two warring visions of AI". Prospect. Archived from the original on July 1, 2024. Retrieved June 29, 2024.
  9. ^ a b c d McLauchlan, Danyl (July 6, 2024). "Danyl McLauchlan: Silicon Valley's cult of tech utopianism" (Interview). Interviewed by Susie Ferguson. Radio New Zealand. Retrieved July 6, 2024.
  10. ^ Hendlin, Yogi Hale (April 1, 2024). "Semiocide as Negation: Review of Michael Marder's Dump Philosophy". Biosemiotics. 17 (1): 233–255. doi:10.1007/s12304-024-09558-x. ISSN 1875-1342.
  11. ^ Duran, Gil (February 12, 2024). "The Tech Plutocrats Dreaming of a Right-Wing San Francisco". The New Republic. ISSN 0028-6583. Archived from the original on February 26, 2024. Retrieved August 4, 2024.
  12. ^ a b c d e f g Redaud, Lorraine (August 2, 2024). "TESCREAL, l'idéologie futuriste qui se répand chez les élites de la Silicon Valley". Charlie Hebdo. Retrieved August 7, 2024.
  13. ^ a b c d Piccard, Alexandre (November 30, 2023). "The Sam Altman saga shows that AI doomers have lost a battle". Le Monde. Archived from the original on July 1, 2024. Retrieved June 30, 2024.
  14. ^ a b Bhalla, Jag; Robinson, Nathan J. (October 20, 2023). "'Techno-Optimism' is Not Something You Should Believe In". Current Affairs. ISSN 2471-2647. Retrieved July 2, 2024.
  15. ^ a b c Helfrich, Gina (March 11, 2024). "The harms of terminology: why we should reject so-called "frontier AI"". AI Ethics. doi:10.1007/s43681-024-00438-1. ISSN 2730-5961.
  16. ^ a b c Heaven, Will Douglas (July 10, 2024). "What is AI?". MIT Technology Review. Archived from the original on July 17, 2024. Retrieved August 4, 2024.
  17. ^ a b Van Rensburg, Wessel (June 7, 2024). "AI and the quest for utopia". Vrye Weekblad. Archived from the original on June 30, 2024. Retrieved June 30, 2024.
  18. ^ a b c Devenot, Neşe (December 29, 2023). "TESCREAL hallucinations: Psychedelic and AI hype as inequality engines". Journal of Psychedelic Studies. 7 (S1): 22–39. doi:10.1556/2054.2023.00292. ISSN 2559-9283. Counterfactual efforts to improve mental health by increasing inequality are widespread in the psychedelics industry. These efforts have been propelled by an elitist worldview that is widely-held in Silicon Valley. The backbone of this worldview is the TESCREAL bundle of ideologies, ... While others have noted similarities between the earlier SSRI hype and the ongoing hype for psychedelic medications, the rhetoric of psychedelic hype is tinged with utopian and magico-religious aspirations that have no parallel in the discourse surrounding SSRIs or other antidepressants. I argue that this utopian discourse provides insight into the ways that global financial and tech elites are instrumentalizing psychedelics as one tool in a broader world-building project that justifies increasing material inequality.
  19. ^ Torres, Émile P. (November 9, 2023). "Effective Altruism Is a Welter of Lies, Hypocrisy, and Eugenic Fantasies". Truthdig. Retrieved June 30, 2024.
  20. ^ a b c Wilson, Jason; Winston, Ali (June 16, 2024). "Sam Bankman-Fried funded a group with racist ties. FTX wants its $5m back". The Guardian. ISSN 0261-3077. Archived from the original on July 1, 2024. Retrieved June 29, 2024.
  21. ^ a b c Brownell, Claire (November 27, 2023). "Doom, Inc.: The well-funded global movement that wants you to fear AI". The Logic. Retrieved July 2, 2024.
  22. ^ a b c Brennan, Ozy (June 2024). "The "TESCREAL" Bungle". Asterisk. Archived from the original on June 12, 2024. Retrieved June 18, 2024.
  23. ^ Pethokoukis, James (January 6, 2024). "Billionaires Dreaming Of a Sci-Fi Future Is a Good Thing". American Enterprise Institute. Archived from the original on June 27, 2024. Retrieved July 1, 2024.
  24. ^ Sennesh, Eli; Hughes, James (June 12, 2023). "Conspiracy Theories, Left Futurism, and the Attack on TESCREAL". Institute for Ethics and Emerging Technologies – via Medium.
  25. ^ Torres, Émile P. (October 19, 2021). Dresser, Sam (ed.). "Why longtermism is the world's most dangerous secular credo". Aeon. Retrieved July 20, 2024.
  26. ^ Kulish, Nicholas (October 8, 2022). "How a Scottish Moral Philosopher Got Elon Musk's Number". The New York Times. ISSN 0362-4331. Retrieved July 2, 2024.
  27. ^ Kandimalla, Sriskandha (June 5, 2024). "The dark side of techno-utopian dreams: Ethical and practical pitfalls". New University. Archived from the original on June 30, 2024. Retrieved June 30, 2024.
  28. ^ Goldman, Sharon (July 24, 2023). "Doomer AI advisor joins Musk's xAI, the 4th top research lab focused on AI apocalypse". VentureBeat. Archived from the original on June 29, 2024. Retrieved June 29, 2024.
  29. ^ Torres, Émile P. (June 11, 2023). "AI and the threat of "human extinction": What are the tech-bros worried about? It's not you and me". Salon. Archived from the original on June 30, 2024. Retrieved June 29, 2024.
  30. ^ a b Svetkey, Benjamin (August 7, 2024). ""F*** These Trump-Loving Techies": Hollywood Takes on Silicon Valley in an Epic Presidential Brawl". The Hollywood Reporter. Retrieved August 10, 2024.
  31. ^ Melton, Monica; Mok, Aaron (November 23, 2023). "'Black Twitter' asks 'What if Sam Altman were a Black woman?' in the wake of ouster". Business Insider. Archived from the original on March 3, 2024. Retrieved June 29, 2024.
  32. ^ Russell, Melia; Black, Julia (April 27, 2023). "He's played chess with Peter Thiel, sparred with Elon Musk and once, supposedly, stopped a plane crash: Inside Sam Altman's world, where truth is stranger than fiction". Business Insider. Archived from the original on October 11, 2023. Retrieved October 1, 2023.
  33. ^ Wenar, Leif (March 27, 2024). "The Deaths of Effective Altruism". Wired. ISSN 1059-1028. Retrieved July 2, 2024.