One strange thing I've noticed about otherwise reasonably competent
text-generating neural nets is how their lists tend to go off the rails.
I noticed this first [with GPT-2](https://aiweirdness.com/post/185085792997/gpt-2-it-cant-resist-a-list). But it turns out GPT-3 is no exception.
Here's the largest model:
![How to make an Intimidating Sponge Cake. Last step is "Hide it all under a carpet." (full version in post)](/content/images/size/w795/2021/11/mary-berry-s-viennese-whirls---zoetrope---with-intimidating-sponge-cake-1.png)