A courts reporter wrote about a few trials. Then an AI decided he was actually the culprit.

September 23, 2024

(Image: A pair of glasses next to a cell phone with ChatGPT prompts.)

(Nieman Lab) – For one German reporter, the statistical underpinnings of a large language model warped his many bylines into a lengthy rap sheet.

When German journalist Martin Bernklau typed his name and location into Microsoft’s Copilot to see how the chatbot would characterize his articles, the answers horrified him. Copilot asserted that Bernklau was an escapee from a psychiatric institution, a convicted child abuser, and a conman preying on widowers. Bernklau had served for years as a courts reporter, and the AI chatbot had falsely blamed him for the very crimes whose trials he had covered.