I’ve done several experiments with GPT-2, a neural net that OpenAI trained on millions of pages from the internet. OpenAI has been releasing the model in stages, and the second-largest version, with 774 million parameters, was just recently released. I decided to put it through its paces. Last week’s experiment, where