Exercises in Meta-cohesion
In 2019, OpenAI released GPT-2, a language model capable of generating whole paragraphs of text at a time. GPT-2’s output, stripped of inhibition and ego, offers delightful linguistic surprises run after run.
Eventually though, the novelty wears off. When it does, we’re left to wonder: how do we make this statistical trick, an assembly of words no longer contingent on an author's intention, mean something to us?
In Exercises in Meta-cohesion, my mechanical co-writer (GPT-2) and I tell a fictional tale of characters whose connections to each other, fragile as they are, build a society out of selves. Underneath this surface, we tell our own tale of human and machine working together through formulas, improv, and endless material to put words artfully together.
Exercises in Meta-cohesion uses GPT-2 by OpenAI, with the help of gpt-2-simple by Max Woolf. All custom tuning datasets were created using the Python Reddit API Wrapper (PRAW).
Visit the project website
Read the full paper
The first lines of each portrait, prompted by a human and generated by GPT-2.
Dynamic web interface, showing both the surface layer and the layers underneath.
Continue reading at the project website.