I’ve done several experiments with a text-generating neural network called
GPT-2. Trained at great expense by OpenAI (to the tune of tens of thousands of
dollars' worth of computing power), GPT-2 learned to imitate all kinds of text
from the internet. I’ve interacted with the basic model, discovering
*[Image: "Dungeon crawling or lucid dreaming?"]*
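
If you'd like to poke at the same base model yourself, here's a minimal sketch using the Hugging Face transformers library. The model name "gpt2", the prompt, and the sampling settings are my own assumptions for illustration, not a record of how these experiments were actually run.

```python
# A minimal sketch of sampling from the base GPT-2 model
# using the Hugging Face transformers library.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Any short prompt works; GPT-2 will continue it in whatever
# style the prompt suggests.
prompt = "I opened the dungeon door and"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a continuation (top-k sampling keeps the output varied
# without going completely off the rails).
outputs = model.generate(
    **inputs,
    max_length=60,
    do_sample=True,
    top_k=40,
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```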