Are you for real?: decoding realistic AI-generated faces from neural activity

Michoel L. Moshel, Amanda K. Robinson, Thomas A. Carlson, Tijl Grootswagers

Research output: Contribution to journal › Article › peer-review

13 Citations (Scopus)

Abstract

Can we trust our eyes? Until recently, we rarely had to question whether what we see is indeed what exists, but this is changing. Artificial neural networks can now generate realistic images that challenge our perception of what is real. This new reality can have significant implications for cybersecurity, counterfeiting, fake news, and border security. We investigated how the human brain encodes and interprets realistic artificially generated images using behaviour and brain imaging. We found that we could reliably decode AI-generated faces from people's neural activity. However, while people performed near chance at the group level when classifying real and realistic fake faces, participants tended to interchange the labels, classifying real faces as realistic fakes and vice versa. Understanding this difference between brain and behavioural responses may be key in determining the 'real' in our new reality. Stimuli, code, and data for this study can be found at https://osf.io/n2z73/.
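For readers unfamiliar with neural decoding, the sketch below illustrates the general idea behind "decoding faces from neural activity": a classifier is trained and cross-validated on multichannel neural responses to predict the stimulus category at each time point. This is a minimal, illustrative example only; it assumes EEG-like epoched data, scikit-learn, and an LDA classifier, and the data shapes and variable names are invented here rather than taken from the authors' pipeline (their actual code is available at https://osf.io/n2z73/).

# Minimal sketch of a time-resolved neural decoding analysis (illustrative only).
# Assumes simulated EEG-like data; not the authors' pipeline (see https://osf.io/n2z73/).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Simulated epochs: trials x channels x time points
n_trials, n_channels, n_times = 200, 64, 50
X = rng.standard_normal((n_trials, n_channels, n_times))
y = rng.integers(0, 2, n_trials)  # 0 = real face, 1 = AI-generated face (hypothetical labels)

# Train and cross-validate a classifier separately at each time point
clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
accuracy = np.array([
    cross_val_score(clf, X[:, :, t], y, cv=5).mean()
    for t in range(n_times)
])

# Above-chance accuracy (> 0.5) at a given latency indicates that the neural
# pattern at that time point carries information about real vs. generated faces.
print(f"peak decoding accuracy: {accuracy.max():.2f}")

With real data, accuracy reliably above the 0.5 chance level would be the decoding result referred to in the abstract; with the random data simulated here, accuracy should hover around chance.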
Original language: English
Article number: 108079
Number of pages: 12
Journal: Vision Research
Volume: 19
DOIs
Publication status: Published - 2022
