One strange thing I've noticed about otherwise reasonably competent text-generating neural nets is how their lists tend to go off the rails. I first noticed this with GPT-2, but it turns out GPT-3 is no exception. Here's the largest model, GPT-3 DaVinci, finishing this list of ingredients you put in