The two presentations examine how generative AI systems are haunted by the visual pasts they absorb, from war imagery amplified across social platforms to personal photos scraped into training datasets at incomprehensible scales. Marloes Geboers traces how platforms and algorithmic amplification shape what gets counted and recycled into synthetic imaginaries, while Gabriel Pereira tinkers with the automated pipelines that produce AI slop, surfacing the layers of mediation buried in them. Together, we ask what critical and creative possibilities emerge when we confront, rather than look away from, the spectral afterlives of contemporary algorithmic visual culture.
Event details of PEPTalk #29: What haunts Generative AI?
Date
12 May 2026
Time
13:00–14:00
Location
Online via Zoom

From Slop to Slop (Gabriel Pereira)

This contribution presents an ongoing experiment with generating short AI videos entirely on a local machine. Starting from images sourced from training datasets, the workflow chains together small, locally-run models (image description, story generation, text-to-video, voice narration, and music) to produce 40-second video outputs. The system was developed through "vibe coding," an iterative process of building with AI that shaped both the technical system and the creative inquiry.
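The chained workflow described above can be sketched as a simple orchestration of stages. This is a minimal, hypothetical illustration: the function names and return values are stand-ins invented here, not the project's actual code; in practice each stage would invoke a locally run model (an image captioner, a small language model, a text-to-video model, text-to-speech, and music generation).

```python
# Hypothetical sketch of a chained local-model pipeline.
# Each stage is a stand-in for a locally run model; the names and
# outputs here are illustrative assumptions, not the actual system.

def describe_image(image_path: str) -> str:
    # Stand-in for a local image-captioning model.
    return f"a found photograph ({image_path})"

def generate_story(description: str) -> str:
    # Stand-in for a small local language model turning the caption
    # into a short narrative.
    return f"Once, {description} began to move."

def synthesize_video(story: str, seconds: int = 40) -> dict:
    # Stand-in for a local text-to-video model producing a
    # 40-second clip from the story.
    return {"story": story, "duration_s": seconds}

def add_narration_and_music(video: dict, story: str) -> dict:
    # Stand-in for local text-to-speech and music-generation stages.
    video.update(narration=story, music="ambient-loop")
    return video

def run_pipeline(image_path: str) -> dict:
    # Chain the stages: image -> description -> story -> narrated video.
    description = describe_image(image_path)
    story = generate_story(description)
    video = synthesize_video(story)
    return add_narration_and_music(video, story)

clip = run_pipeline("dataset/photo_0001.jpg")
print(clip["duration_s"])  # 40
```

The point of the sketch is the shape of the system, not any single model: each stage consumes the previous stage's output, so the haunting described below compounds as source images are remediated step by step into narrated slop.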

The resulting videos are sloppy, uncanny, and narratively strange — squarely within the genre of "AI slop." Rather than dismissing slop from the outside, this project approaches it from within, asking what becomes visible when you actually tinker with the infrastructures behind automated content factories. Two lines of inquiry on the politics of algorithmic culture emerge. First, how GenAI remains haunted by its source material: people's photos scraped and remediated into training data, given a strange afterlife through automated storytelling. This reflects a new moment for digital visual culture: where earlier critical dataset studies could grapple with bounded corpora like ImageNet, current GenAI datasets operate at scales that resist human comprehension. Second, how these workflows connect to a longer history of automated artistic processes, from Warhol's Factory to Sollfrank's net.art generator, where labor is delegated to systems as a critical reflection on hegemonic mass media infrastructures.

Crucially, the malleability of these processes reveals openings: if datasets are the result of all the waste of our digital lives, could slop be a radical form of reclaiming it?

'Likely war' (Marloes Geboers)

The image on the right emerges from the synthetic translation of war images circulating and amplified on social media (original images, top row). In the atmosphere of the canvas, one could easily catch the vibe of Gotham City (or whatever movie the viewer ‘feels’). The dark-hooded figures are repeated versions of a figure in one of the underlying synthetic images. The synthesized image absorbs the brutal immediacy of war and reconfigures it into a sci-fi aesthetic of looming danger. What re-emerges is a “likely likeness of war”. 

There is extensive scholarly attention to how past images haunt synthetic imaginaries; what is less often made explicit is the role of algorithmic amplification and its platformed synergies with earlier image classification. Machine vision translates sensory experience into discrete units (Stiegler) that can be counted and recombined. "What gets counted, counts" (a much-circulated quote probably not by Einstein) is also true for how war is inscribed in computational registers of seeing. Categories and weightings are co-constructed by the preferences of business models and networked publics. In this way, the canvas's imagination of war as a threat to be handled by faceless 'knights', personifying both hero and villain, not only absorbs and recycles visual pasts but also reflects and amplifies accumulated preferences co-constructed by platform environments that require war to be mediated in ways that are digestible (through cinematic distancing) and non-disruptive (sustaining attention).

My work aims to foreground earlier grammatisation and collective amplification across social platforms in the discussion of haunting in generative AI contexts. Given the malleability of these processes, as pointed out by Gabriel, I ask: (how) can we observe "past data grammars", and can we push back?

This PEPTalk is moderated by Aybüke Özgün.