The book received positive reviews from critics. ''[[The Wall Street Journal]]''<nowiki/>'s David A. Shaywitz emphasized the frequent challenges of applying algorithms to real-world problems, describing the book as "a nuanced and captivating exploration of this white-hot topic."<ref name="shaywitz">{{ cite news| url=https://www.wsj.com/articles/the-alignment-problem-review-when-machines-miss-the-point-11603659140 | title='The Alignment Problem' Review: When Machines Miss the Point | publisher=[[The Wall Street Journal]] | first=David | last=Shaywitz | date=October 25, 2020 | access-date=December 5, 2021}}</ref> ''[[Publishers Weekly]]'' praised the book for its writing and extensive research.<ref>{{Cite web|title=Nonfiction Book Review: The Alignment Problem: Machine Learning and Human Values by Brian Christian. Norton, $27.95 (356p) ISBN 978-0-393-63582-9|url=https://www.publishersweekly.com/978-0-393-63582-9|access-date=2022-01-20|website=PublishersWeekly.com|language=en}}</ref>
 
''[[Kirkus Reviews]]'' gave the book a positive review, calling it "technically rich but accessible", and "an intriguing exploration of AI."<ref>{{Cite book|url=https://www.kirkusreviews.com/book-reviews/brian-christian/alignment-problem/|title=THE ALIGNMENT PROBLEM {{!}} Kirkus Reviews|language=en}}</ref> Writing for ''[[Nature (journal)|Nature]]'', Virginia Dignum gave the book a positive review, favorably comparing it to [[Kate Crawford]]'s ''[[Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence]]''.<ref>{{Cite journal|last=Dignum|first=Virginia|date=2021-05-26|title=AI — the people and places that make, use and manage it|journal=Nature|language=en|volume=593|issue=7860|pages=499–500|doi=10.1038/d41586-021-01397-x|bibcode=2021Natur.593..499D|s2cid=235216649|doi-access=free}}</ref>
 
In 2021, journalist [[Ezra Klein]] interviewed Christian on his podcast, ''The Ezra Klein Show'', and wrote in ''[[The New York Times]]'', "''The Alignment Problem'' is the best book on the key technical and moral questions of A.I. that I've read."<ref name="klein">{{ cite news| url=https://www.nytimes.com/2021/06/04/opinion/ezra-klein-podcast-brian-christian.html | title=If 'All Models Are Wrong,' Why Do We Give Them So Much Power? | work=[[The New York Times]] | first=Ezra | last=Klein | date=June 4, 2021 | access-date=December 5, 2021}}</ref> Later that year, the book was listed in a ''[[Fast Company]]'' feature, "5 books that inspired Microsoft CEO [[Satya Nadella]] this year".<ref name="nadella">{{ cite news| url=https://www.fastcompany.com/90696770/microsoft-satya-nadella-book-recommendations | title=5 books that inspired Microsoft CEO Satya Nadella this year | publisher=[[Fast Company]] | date=November 15, 2021 | access-date=December 5, 2021}}</ref>