Search or fabrication?

I recently started experimenting with Bing's new ChatGPT-powered chat tab. This is the first thing I asked it for:

[Image: Bing chat's response, with red boxes marking the factual errors]

I've put red boxes around the factual errors. What is notable is that these are not just slight typos or errors of context - those items never appeared anywhere on my blog; they are pure fabrications. Asking for further details or clarification didn't help: when I asked it for more paint colors, 40% of them had never appeared on my blog, and most of its color descriptions were also wrong.

And when I tried to point out a mistake, it doubled down, generating more imaginary facts to fit its original position.

(I did generate some paint colors using HSV encoding, but none of the colors it named were generated by that neural net.)

These are not cherry-picked examples; I don't think it generated a single response to me that didn't contain at least one made-up fact, and most contained several. In the example below, it got the gist of my most recent blog post right, but made up every single one of its examples.

Bing chat is not a search engine; it's only playing the role of one. It's trained to predict internet text, and is filling in the search engine's lines in a hypothetical transcript between a user and a chatbot. It's drawing on all sorts of dialog examples from the internet, which is why it so frequently slips into internet-argument mode: in those examples, a person called out for an incorrect fact will usually double down and back up their position with more data.
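To make that concrete, here's a minimal sketch of what "filling in a search engine's lines" means. It uses the small open GPT-2 model via Hugging Face's transformers library purely for illustration; Bing's actual model and prompt format are not public, so both are stand-ins here:

```python
# Illustrative sketch only: GPT-2 stands in for Bing's (much larger) model,
# and this transcript-style prompt is hypothetical, not Bing's actual one.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# The model's only job is to continue this text plausibly. Nothing here
# retrieves documents or checks whether the "search engine's" lines are true.
prompt = (
    "The following is a transcript between a user and a helpful search engine.\n"
    "User: What are some AI-generated paint color names?\n"
    "Search engine:"
)

result = generator(prompt, max_new_tokens=60, do_sample=True)
print(result[0]["generated_text"])
# The continuation may look like search results, but it is just the
# statistically likely next text - retrieved from nowhere.
```

The point of the sketch is that text generation and fact retrieval are different operations: a plausible-sounding continuation is the model's entire objective, and truth never enters into it.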

So when it mentioned a nonexistent AI Weirdness post on Battlestar Galactica and I challenged it, it invented a nonexistent separate AI Weirdness newsletter and finally a completely fabricated excerpt from it.

[Image: note the mouseover text for reference 1]

Ironically, the reference it gives for its fabricated Battlestar Galactica episode summary is actually my post on Galactica, a similar chat-based "knowledge engine" with a tendency to make up facts. Asked about my Galactica post specifically, Bing gets the gist correct but makes up the specifics.

I find it outrageous that large tech companies are marketing chatbots like these as search engines. And people have been fooled - into asking authors and librarians for nonexistent references, or into contacting a Signal number that turned out to belong to someone named Dave.

If a search engine will find what you're asking for whether or not it exists, it's worse than useless.

I do have a real newsletter, which is this blog in email newsletter form. There are also bonus posts for supporters, and you can read one here, in which I ask Bing chat for a list of 50 AI-generated paint colors from my blog.