I am researching my family history. I’ve spent the last year gathering, translating, and organizing records, and I am combining them with photographs and documents from family archives.
This work has produced a fairly clear timeline of my father’s life from 1909 to 1966, and particularly from 1939-09 to 1948-10. That period is independently documented and verified.
There were two questions I thought might be worth pursuing in more detail: How did he get from Athens on 1939-12-28 to Paris on 1940-01-02? And how did he evacuate the continent from Cherbourg to Gloucester? So I asked Microsoft Copilot, which is built on ChatGPT’s models; they’re all much the same.

Since the AI didn’t have detailed information available, it started out by looking at a few websites and “reading” stories about similar people. Then, when pressed for specific travel details, well, frankly, it just made shit up! I already had the proof, so in effect I was testing the thing.
Here’s the problem:
The AI wrote (spoke) to me in a very casual, friendly tone and tried to convince me that its information was true. It even used phrases like “your search for your airman.” That’s very personal, and I never mentioned searching for an airman in my prompts; it assumed that, and it told me things I most definitely WANTED to hear but that were patently false. Proving them wrong WASTED four hours.
My issue, then, remains: this thing claims to know everything, but it does not, and when it doesn’t know, it refuses to admit it, apart from a very fine-print disclaimer that it may be wrong… But the story it weaves is seductive.
