Although artificial intelligence (AI) is being applied across many fields, the technology is far from infallible, and errors are not uncommon. Recently, in the US state of Utah, an AI program used by police made a mistake that resulted in an officer being described as a frog.
According to a January 5 report from the Associated Press, the Heber City Police Department in Utah has been using two AI programs, Draft One and Code Four, since December of last year. The programs automatically generate police reports from footage captured by officers' body cameras, with the aim of reducing paperwork and giving officers more time in the field.
However, a report generated by the Draft One program mistakenly described an officer as a frog.
Officer Rick Keel of the department told KSTU TV that the error occurred because the AI program picked up a video playing in the background of the footage, which happened to be “The Princess and the Frog,” a Disney animated film released in 2009. It was then, he said, that the department realized how important it is to review and correct AI-generated reports.
Keel said a major advantage of the software is the time it saves, since writing a report typically takes one to two hours.
He said, “I can save about 6 to 8 hours per week now. I’m not very tech-savvy, so it’s very convenient to use.”
He added that the department will continue using the AI programs but will strengthen oversight of their output.
Draft One was released by Axon in 2024; it uses OpenAI’s GPT-4 language model to convert law enforcement body-camera recordings into draft report text.
AI blunders like this are not uncommon, and they can even cause real losses.
According to a report by the British Broadcasting Corporation (BBC), in 2022 Air Canada’s AI chatbot gave a passenger, Jake Moffatt, inaccurate information about ticket discounts, which led to legal action against the airline.
Although Air Canada acknowledged that the chatbot’s response contradicted company policy, it refused to give Moffatt the lower fare.
However, British Columbia’s Civil Resolution Tribunal ruled that Air Canada was responsible for all the information on its website, including the chatbot’s responses, and ordered it to pay Moffatt CA$812.02 in damages and tribunal fees.
The tribunal’s decision stated, “It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”
