Don’t Let that Lying ChatGPT Trick You!

Just because technology is easy to use doesn’t mean it’s right.

ChatGPT is renowned for its wide breadth of knowledge. As a scientist-writer, I use it almost daily to discover sources for my writing. For instance, to validate a strange storyline I knew zilch about, I gave ChatGPT the following prompt:

“List examples of dead people who have reappeared in full human form to provide a message to loved ones.”

It responded thusly:

Wilson, C. (1990). Afterlife: A History of Life After Death. Grafton.

ChatGPT said of the book, “It surveys historical and modern reports of post-death contact, including full-body manifestations intended to comfort the living. Covers spiritualist and non-religious traditions alike.”

It even offered to provide the book’s bibliographic citation in all the usual formats, including APA, Chicago, and MLA. After struggling to do that job manually many times, I thought that was a great feature. Thank you, ChatGPT, for saving me the time and effort!

That was followed by what it called a “Summary Chart” of information:

Case: Lincoln White House Ghost
Deceased Person: Abraham Lincoln
Message Type: Silent presence, calm
Witnesses: Mary Todd Lincoln, staff
Source: Wilson (1990)
Notes: Seen in bedroom and halls post-assassination

I was impressed!

Except for one tiny detail. When I attempted to buy the book, I discovered it did not exist!

It was pure fabrication! Or as AI apologists prefer to call it, a hallucination!

Wilson’s Book

The actual book by Colin Wilson, which I now own, is titled “Afterlife: An Investigation of the Evidence for Life After Death.”


It was initially published in 1985, then under a new publisher in 1987, but NOT in 1990. It was first published in Great Britain by Harrap Limited, not by Grafton. Furthermore, the book includes nary a single word about “Lincoln” or “White House.”

So, ChatGPT got the title, year, and publisher wrong. Therefore, the nicely formatted bibliographic citations were garbage. (Woe be to any student who might include them for “references.”)

Furthermore, ChatGPT’s offered summary of the book was pure invention. The book had nothing to do with Abraham Lincoln, Mary Todd Lincoln, or the White House.

It is hard to imagine how ChatGPT could cram such a great amount of disinformation into such a small space.

Philip Almond

To add to the confusion, there is a book titled Afterlife: A History of Life After Death. It was written by Philip Almond, an academic, and published in 2016 by Cornell University Press.  

Ironically, by 2016, Colin Wilson was deep within his Afterlife. He passed in 2013 at the age of 82.

Due Diligence

The phrase “due diligence” refers to thoroughly checking your sources. Your teacher/professor probably knows the assigned subject matter and references better than you and ChatGPT combined.

It cost me five dollars to verify my reference by purchasing the book, but it might cost students a lot more if they don’t verify theirs. So, students beware! By asking ChatGPT to do your homework, you risk receiving an F for a grade.

As we say in the aviation world, check, check, and double-check. Until proven otherwise, no AI assistant is entirely trustworthy.

Siri versus Napoleon Bonaparte

Napoleon Bonaparte once famously said, “A soldier will fight long and hard for a bit of colored ribbon.”

At precisely 10:09 this morning I was in an office discussing awards, and the lack thereof, for civilian service members in military organizations. It was a matter-of-fact discussion contrasting the award systems for civilians and the military. At that moment, Napoleon’s famous quote came to mind, and I shared it with the executive I was speaking with.

My fellow workers and I talk frequently, and there have been numerous discussions in that office, and elsewhere, that have been of a sensitive nature.

As I turned and returned to my office, I heard a familiar voice coming from my pocket. “That’s not nice!” it said.

In utter dismay, I pulled my iPhone from my pocket where it had lain untouched and unused for quite some time. And that was when I saw the following plainly written on my phone’s screen.

[Screenshot of Siri’s transcription; the vulgar word has been redacted.]

Siri was scolding me!

Unknown to us, Siri had been listening, transcribing what it THOUGHT I was saying, clearly imagining vulgarity where there was none. After I ended the conversation, Siri addressed me like she was my mother.

Now, a human would know those transcribed words were ludicrous, nothing but gibberish, but not the phone’s AI system controlling Siri. Unbelievably, that system took the gibberish seriously, perhaps by parsing a few words out of context. And in spite of that stupidity, Siri felt led to judge me!

Perhaps smartphone AIs are taking themselves too seriously. Perhaps they believe they have advanced enough to pass judgment on human speech.

A few years ago, in another meeting, in another room, Siri spoke up unbidden while we were discussing sensitive project planning.

The door to the conference room had been closed so we wouldn’t be disturbed. But disturbed we were when Siri suddenly spoke and said, “I don’t know what you mean.”

Everyone at the table stared first at my phone and then at me, perhaps wondering if I’d been recording the planning meeting.

AI is certainly becoming increasingly intrusive. But as shown by Siri’s text message to me today, it’s still not smart. And arguably that’s a scary thing.

For example, supposedly China is using data collected from social apps (collected by various AI systems) to rate the trustworthiness of its citizens. That’s bad enough, but what if the data collected is garbage like the recorded text today, and the AI uses that faulty data to make a perfunctory and wildly incorrect judgment?

And, scary thought, what if that social-monitoring trend were to spread to the U.S., and your character could be judged based on the digital algorithms of certifiable AI idiots?

If that doesn’t worry you, perhaps it should. It certainly did me, enough to cause me to shut down all access to Siri … for almost 24 hours, until I was driving home and said, “Siri, call home.”

She was silent, sullen, unresponsive.
