So much of current AI-generated stuff is derivative sludge that I'm enjoying the pockets of weirdness wherever I find them. One of my favorite things right now: DALL-E 3's attempts to label things in the images it generates.

Here I asked "Please generate a cross section of a gourmet chocolate with each layer labeled."

A single chocolate candy shown in cross section. The cross section reveals that the candy contains at least 5 distinct layers, one of which contains whole almonds, and another of which seems to contain entire individual additional candies. Layers have labels such as Nouks, Cnuts, Frufist, Ganachte, Conol, Cestel, Ganclut, and Caramel. Also the blank ground beneath the candy is labeled Dust.

I was thinking, like, three layers tops. But for some reason DALL-E 3 generates chocolates of epic proportions. Judging by the size of the almond, I think this chocolate is at least 6 cm on a side, yielding an estimated total weight of over 200 grams (about half a pound).
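For the curious, that estimate checks out as a quick back-of-envelope calculation. A sketch, where the 6 cm side length and the chocolate density are my own assumptions:

```python
# Back-of-envelope weight check (assumed values: a 6 cm cube of
# chocolate, density ~1.3 g/cm^3, which is typical for solid chocolate).
side_cm = 6
density_g_per_cm3 = 1.3

volume_cm3 = side_cm ** 3                    # 6^3 = 216 cm^3
weight_g = volume_cm3 * density_g_per_cm3    # ~281 g
weight_lb = weight_g / 453.6                 # grams per pound

print(f"~{weight_g:.0f} g, or ~{weight_lb:.2f} lb")
```

Even allowing for air pockets and lighter fillings, that comfortably clears 200 grams.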

ChatGPT may be to blame, since it was the interface that passed my prompt to DALL-E 3, and it suggested "various layers like ganache, caramel, nuts, or fruit fillings". DALL-E 3 seems to have taken that as an instruction to include them all.

Here's another.

A cross section of a chocolate candy showing at least 9 distinct layers. Some layers seem to contain complete other chocolates. Another (labeled choccolaate) seems to be maybe nougat. One layer looks suspiciously like raw muscle with chunks of bone (it's labeled lotceate), and the one below it also looks weirdly organic and fibrous (it's labeled somhintae coocolaate).

ChatGPT had only suggested "chocolate coating" and "a creamy filling", so there goes that hypothesis about DALL-E trying to include everything on ChatGPT's list. I'm afraid to ask what's in the layer labeled "Lotceate". Or the one labeled "Shocolafablte".

Bonus content: Another of my favorite chocolates, plus a labeled chocolate box.
