I’ve done several experiments with GPT-2, a neural net that OpenAI trained on
millions of pages from the internet. OpenAI has been releasing the model in
stages, and the second-biggest version, 774M (774 million parameters), was just
recently released. I decided
to put it through its paces. Last week’s experiment, where
![Bigger than before! A larger neural net tries to write fanfiction](/content/images/size/w795/image/fetch/w_1200-c_limit-f_jpg-q_auto:good-fl_progressive:steep/https-3A-2F-2Fbucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com-2Fpublic-2Fimages-2Fa2c7c912-2468-4adf-b37c-57d1339a2153_489x489.png)