Deceitful media: Artificial intelligence and social life after the Turing Test

Simone Natale
2021-01-01

Abstract

Artificial intelligence (AI) is often discussed as something extraordinary, a dream, or a nightmare, that raises metaphysical questions about human life. Yet far from being a distant technology of the future, the true power of AI lies in its subtle revolution of ordinary life. From voice assistants like Siri to natural language processors, AI technologies draw on cultural biases and modern psychology to fit the specific ways users perceive and navigate the external world, thereby projecting the illusion of intelligence. Integrating media studies, science and technology studies, and social psychology, Deceitful Media examines the rise of artificial intelligence throughout history and exposes the very human fallacies behind this technology. Focusing specifically on communicative AIs, Natale argues that what we call "AI" is not a form of intelligence but rather a reflection of the human user. Using the term "banal deception," he reveals that deception forms the basis of all human-computer interactions rooted in AI technologies, as technologies like voice assistants exploit the dynamics of projection and stereotyping to align with our existing habits and social conventions. By exploiting the human instinct to connect, AI reveals our collective vulnerabilities to deception, showing that what machines are primarily changing is not other technology but ourselves as humans. Deceitful Media illustrates how AI continues a tradition of technologies that mobilize our liability to deception, and shows that only by better understanding our vulnerabilities to deception can we become more sophisticated consumers of interactive media.
Year: 2021
Publisher: Oxford University Press
Edition: 1
Pages: 208
ISBN: 9780190080372
URL: https://global.oup.com/academic/product/deceitful-media-9780190080372?cc=us&lang=en&
Keywords: Artificial Intelligence, Deception, Voice assistants, Chatbots, Human-machine communication, Turing Test, Media history, Media theory, Computer-mediated communication, Human-Computer Interaction
Author: Simone Natale
Files in this record:
File: Natale_Introduction_Author draft.pdf
Access: Open access
File type: Postprint (author's final version)
Size: 390.49 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/1768312
Citations
  • Scopus: 115