The evolving landscape of artificial intelligence is increasingly intersecting with established societal institutions, and the judicial system is no exception. A recent case in Arizona has brought this convergence into sharp focus, demonstrating how AI can offer novel, albeit debated, avenues for participation in legal proceedings, even for those who are no longer alive.
A Digital Voice in Court
In a poignant application of technology, the family of Chris Pelkey, a man killed in a 2021 road rage incident, used artificial intelligence to let him address the court at the sentencing of the man who killed him. As reported by the BBC, relatives compiled a collection of videos, audio recordings, and photographs of Pelkey. This material was then processed by neural networks to reconstruct his likeness and voice, creating a digital persona capable of delivering a statement.
The family felt it was crucial for Pelkey himself, in a manner of speaking, to have a presence and a voice at this critical juncture of the legal process. His sister, Stacy Wailes, explained that the AI-generated message was crafted to reflect Pelkey’s inherently forgiving nature, emphasizing his kindness and strong faith.
The Court’s Response and Expert Opinions
During the court hearing, the virtual representation of Chris Pelkey conveyed a message of forgiveness towards Gabriel Horcasitas, the man responsible for his death. The statement even suggested that under different circumstances, the two might have become friends. Arizona state court judge Todd Lang acknowledged the sincerity of this unique address and subsequently sentenced Horcasitas to 10.5 years in prison.
The use of AI in this context did not come as a complete surprise to some legal experts. Retired federal judge Paul Grimm commented that artificial intelligence is already being employed within Arizona’s legal framework, for instance, to streamline certain judicial decision-making processes. He indicated that such technological applications could be considered acceptable under specific guidelines.
However, the development has also sparked ethical debate. Derek Leben, a professor specializing in business ethics, voiced concerns regarding the potential for AI to misrepresent the actual intentions or sentiments of a deceased individual. He questioned the reliability of a digital recreation in truly echoing a person’s authentic will.
A Tool for Closure
Stacy Wailes defended her family’s decision, asserting that they approached the use of AI with ethical consideration. She likened the technology to a tool, such as a hammer, which can be wielded for constructive or destructive purposes. “For us,” she stated, “this became an opportunity to give Chris the last word.” This perspective highlights a desire for closure and a unique way to honor the victim’s perceived character in the face of tragedy.