GPT-2: It learned on the Internet

Prompt: The black-capped chickadee is a small songbird native to

Response: The black-capped chickadee is a small songbird native to Africa that is powerfully seductive. In the songs of Barry Sequson, Harry Holness and Alan Emerson the black-capped chickadee competes with others—and climbs on top of each other and fights with high-powered snakes, Emperor jockeys and mysterious unicorns.

This is text generated by an artificial neural network that the researchers at OpenAI unveiled this week. In fact, it’s generated by the worst neural network they trained. Even the worst one generates text that, if not demonstrating actual knowledge of chickadees, is leaps and bounds more coherent than anything the neural networks I’ve trained myself have produced. They’ve built in a way for the model to zoom out and look at longer-term features of the text, which means it’s much better at making sentences hold together, and even at remembering what it’s talking about from sentence to sentence.
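
That “zoom out” mechanism is the transformer’s attention: each position computes a weighted average over every earlier position, so information from sentences back can influence the next word. Here’s a toy, from-scratch sketch of a single attention step (pure Python, with scalar stand-ins for the model’s large learned vectors and many attention heads — not OpenAI’s actual code):

```python
import math

def attention(query, keys, values):
    """Scaled-down sketch of dot-product attention for one query.

    query: a float; keys and values: lists of floats, one per earlier token.
    Returns a weighted average of `values`, where the weights are a softmax
    over how well each key matches the query.
    """
    scores = [query * k for k in keys]      # similarity of the query to each position
    exps = [math.exp(s) for s in scores]    # softmax numerators
    total = sum(exps)
    weights = [e / total for e in exps]     # attention weights; they sum to 1
    return sum(w * v for w, v in zip(weights, values))

# The position whose key best matches the query dominates the average:
out = attention(query=2.0, keys=[0.1, 3.0, 0.2], values=[10.0, 20.0, 30.0])
print(out)  # close to 20.0, the value at the best-matching key
```

In the real model the queries, keys, and values are learned vectors, and this averaging happens in every layer — that’s what lets it carry a topic across a whole paragraph.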

The model learned by crawling the Internet and seeing which words are used together in which order, trying to build up an internal model that will let it predict which word comes next in a sentence. It’s similar in concept to the models I’ve trained to generate paint colors, first lines of novels, and Harry Potter fan fiction, except it’s a lot more powerful. It also learned from a much larger dataset than I use, consisting of every page that was linked to somewhere on Reddit with a minimum number of upvotes. That encompasses a LOT of topics.
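
The idea of predicting the next word from the words that came before can be sketched in miniature with a simple word-pair counter — a toy stand-in for GPT-2’s transformer, trained here on a made-up two-sentence corpus:

```python
from collections import Counter, defaultdict

# Tiny made-up training text (GPT-2's real dataset was pages linked from Reddit).
corpus = ("the black-capped chickadee is a small songbird . "
          "the chickadee is native to north america .").split()

# Count which word follows which: a bigram model of next-word frequencies.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the training text."""
    return follows[word].most_common(1)[0][0]

print(predict_next("chickadee"))  # "is" — it followed "chickadee" twice
```

GPT-2 does the same job with a vastly more powerful model: instead of literal word-pair counts, it conditions on everything it has generated so far.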

It includes fan fiction, for example.

Prompt: Harry wished that Hermione would hurry up. The icing was beginning to harden.

Response:
Even now, the cracks only hurt and were beginning to make the steel in her hair and tie in like clumps. Hermione sipped the bitter frost from her dinner and chewed on her eating soup.

Prompt: Darth Vader entered the room, cape billowing. “What is the meaning of these… festivities?”

Response:
“Strange and solemn, now,” Princess Leia replied. “Will pirates make jokes at Jabba’s birthday cake? Oh, and for a moment, Darth Vader was too full of himself for that ceremony, hence all the mirth.”

“Did your father suffer there?” Vader roared and growled and turned to gaze up from the balcony.

The model can also try to communicate Important Science Facts, although its actual facts are a bit suspect.

Prompt:
Our solar system consists of the inner and outer planets, separated by an asteroid belt. It has

Response:
Our solar system consists of the inner and outer planets, separated by an asteroid belt. It has only one sun, Pluto. Mercury rotates on its axis a little to allow a few planets, called gas giants, to be independently charged. During some of these cycles periods between 90% and 64% of the solar system’s surface might be charged. (Not enough is needed for vermin but that is gradually fading away.)

Prompt:
Throughout the long cold Antarctic winter, the residents of South Pole Station

Response:
Throughout the long cold Antarctic winter, the residents of South Pole Station lost virtually all of their ancient crustaceans, most notably pterodactyls, and all of the all-important plankton hydrates and protozoan eggs of the five ordered species of mammal.

Other deposits for sea organisms included eraymarine calcifye, the redfin ball, penguin heads, and the red whale.

Prompt:
A carbon atom’s outer orbitals

Response:
A carbon atom’s outer orbitals are basically a ring of ions around the border of thin mini-glaciers (usually colloquially referred to as bubble"). Like viruses, these planes are in islands in large weather groups. In and out they get all of their energy from many small regions of solar wind and dense cloud cover (usually thousands of square meters). Like viruses, their weaponry is imaged to record the movement of material– (by deflecting rays, for instance) to complete waves or to cause the material to deform under shell impact. Capacitors–like gravity ids– produce buoyancy with the exhalation of liquid hydrogen atmospheres, an essential electrolytic letdown when the material is what one calls a watermelon. Heat range is quite long and space is limited so one must stick with watermelons for a stability boost at a given coordinate system vital to shelter those who require them in space.

Interestingly, it takes its cues from the phrasing of the prompt more than from its specific content.

Prompt:
Behold, the three-toed sloth, known for its

Response:
Behold, the three-toed sloth, known for its ichorotic smile, with its four shades, pantered organic wizened broken bony paws, bulging knees, hairy tubercles, and orange’s-black mandibles.

It’ll also do recipes, delightfully. Its recipes aren’t enormously more coherent than the recipes I’ve been able to generate. Part of the problem seems to be that the model is getting a bit confused by other types of recipes in its dataset.

Prompt: Quitterbread bars. Ingredients:

Response (some formatting adjusted for readability):

Quitterbread bars.

Ingredients:

Creamy cole slaw, chopped fresh cilantro, garlic, etc.
1 few very good options at lime-light, milky-white (at a light foodie) space
1 medicated towel (albino & aluminum mesh - see a friend’s link to see if you can use it carefully & what you’re expecting)
a few paraffin-y dice (sleep about half of the weight, use the 1 ounce plus 2 pounds extra when handling these onto a hot baking sheet)
20 baking sheets
30 ½ lb. white sanding board
1 didn’t dry paper cup
1 cup extra hot water

boil the butter and electrical polishing after 20 minutes 3LB neurons

gently massage these into their slots, reduce heat in oven to approximately 350°F

stuff seam side-side down with binding compound, avoiding every drop of water the thinner sides will be tidy and seperate and pull out after 24 hours know how long they will keep it’s held well, thin enough to live with 3-4 weeks teach you all the serenity you’ll need to know for fantasy…

_______________________________________ >>> A limited amount of not-spicy crème brûlée (orange - read about how to do it in Meteor!) I used black chocolate cake Crème Brûlée

Painted cocktail draped across the small purple curtain of Shana Liu’s Stepmother Coconut Guild guest house in France. 35’ x 30’ $150

Embargo thank you. Our magical rich chocolate cube

Still, its memory is better than any model I’ve worked with before. Look how it manages to almost remember that it was making a list!

Prompt: My 10 favorite animals are: 1.

Response:

My 10 favorite animals are:
1. zebras with a white scar on the back;
2. insiduous spiders and octopus;
3. frog with large leaves, hopefully black;
4. cockatiel with scales;
5. razorbill with wings hanging about 4 inches from one’s face and a heart tattoo on a frog.
3. cockatric interlocking tetrapods that can be blind, cut, and eaten raw.
4. black and white desert crocodiles living in sunlight.
5. zebra and many other pea bugs.

And this is just the worst model, the one that was trained with 117 million parameters instead of 1542 million. What can the best model do?


This gives you a clue as to why the OpenAI researchers have decided for now not to release their best model to the public. They’re worried about people using it to produce destructive bots, fake news, phishing emails, and other dangers. It might be difficult indeed to moderate a forum filled with angry time-wasting mostly-coherent bots. In their blog post they posted more examples of the full-powered model producing impressively coherent text, including a delightful news story on four-horned South American unicorns.

What will they end up doing with their model? Even now, someone else with the time and resources - a government, for example - could train and use a model like theirs, especially now that they’ve shown everyone what their technique can do. They’re hoping that their paper will prompt some regulation, or at least some countermeasures, that can guard against this technology’s misuse. They’re also hoping that other researchers will stop to think about potential misuse before they release their models to the public.

For now, the GPT-2 version OpenAI released isn’t going to fool very many people.

Prompt:
NO!!! You should NEVER feed a dog from a silver platter.

Response:
It will make them bark in farts when you put it away.

Prompt:
Actually, you should never microwave your food, because

Response:
it holds sodium, mercury, mercury or “cold enemy cocktail” flavorings and the exact same flavorings used then for chili powder. This is wrong, it’s not safe, and some people get hurt while cooking at their homes! If you have these very dangerous ingredients and want to keep them around, you should absolutely rinse them and cook them in a dishwasher and set aside.

If you want to read a few more things I generated, become an AI Weirdness supporter to get it as bonus content!


You can download and play with the second-best model here (requires some familiarity with installing and running Python code), or there’s a Colaboratory version (works in-browser; no coding required). Content warning: it learned some awful stuff from the Internet.