“…we have these incredibly predatory institutions being created, whether it is white supremacy on one hand or Facebook on the other. It is kind of a surrealist moment. Everything is like Magritte’s Ceci n’est pas une pipe. Nothing is what it seems.” — Trevor Paglen (interview in The Guardian, 25/11/17)
For his new Curve commission, American artist Trevor Paglen invites us to take a critical look at how artificial intelligence networks are taught to ‘perceive’ and ‘see’ the world by engineers who provide them with vast training sets of images and words.
The commission comprises a vast mosaic of approximately 30,000 of these images across 100 categories, charting ImageNet’s labelling of the world and moving from uncontroversial nouns such as ‘cloud’, ‘anchovy’ and ‘apple’ to terms that make problematic and prejudiced visual judgements, such as ‘schemer’, ‘traitor’ and ‘anomaly’.
The ImageNet dataset, archived and sorted into categories by humans, is widely used for training AI networks. When deployed in AI systems, these categories suggest a world in which machines will be able to pass judgement on humankind.
This exclusive publication features a commissioned text by the academic, writer and curator Sarah Cook, and an interview with the artist by Barbican curator Alona Pardo.
Published on the occasion of the exhibition Trevor Paglen: From Apple to Anomaly at the Barbican, London (26 Sep 2019 – 16 Feb 2020).