The King County Prosecuting Attorney’s Office in Seattle has informed local law enforcement agencies that it will not accept police reports drafted with artificial intelligence, citing concerns about potential errors. A memo from the office pointed to one example: a report that referenced an officer who was not present at the scene, raising questions about the accountability and accuracy of such reports. The memo was distributed to members of the King County Police Chiefs’ & Sheriff’s Association and highlighted the implications of relying on AI-generated reports in legal cases.

Chief Deputy Daniel J. Clark outlined the office’s decision not to accept reports produced with AI assistance, referencing products such as OpenAI’s ChatGPT and Axon’s Draft One. The concern is that these tools may introduce errors into police reports that go unnoticed during review. Clark noted the potential consequences for legal cases, communities, and officers if inaccurate reports are certified, as well as the difficulty of establishing oversight and accountability in such situations.

The memo also addressed the use of AI in law enforcement more broadly, calling for responsible innovation and rigorous testing of these products. While acknowledging that AI can save officers time and improve efficiency, the office emphasized the importance of accuracy and compliance with laws governing the dissemination of information in the criminal justice system. The memo referenced ongoing national discussions about AI in law enforcement and the possibility that future developments could address these concerns.

In response to the memo, Axon issued a statement describing the safeguards built into Draft One, including the requirement that human officers edit, review, and approve the narrative reports it generates. The company stressed its commitment to responsible innovation, emphasizing human decision-making at critical moments and ongoing collaboration with stakeholders to gather feedback on the use of AI in policing and the justice system. The statement aimed to address the concerns raised by the King County Prosecuting Attorney’s Office and to reinforce confidence in the product’s reliability.

The decision by the King County Prosecuting Attorney’s Office reflects broader questions about the use of AI in law enforcement. The memo underscores the importance of accountability, accuracy, and legal compliance in the creation of police narratives. While open to the potential benefits of AI in the future, the office remains cautious about current products and the risk that errors could harm legal cases, communities, and officers. Continued dialogue among law enforcement agencies, prosecutors, and other stakeholders will be needed to address these concerns and ensure responsible innovation in policing.
