Friday, June 20, 2025
" "

Top 5 This Week

" "

Related Posts

OPINION: AI Allowed as Legitimate Court Testimony Sets a Dangerous Precedent


Artificial Intelligence can write essays, emails, and basically optimize your entire life at this point, but did
you know it can also testify in a court of law? That’s right.
A judge in Maricopa County Superior Court in Arizona allowed a video, crafted with AI technology from a script written by the victim's sister, to play in court as the Victim Impact Statement. While a Victim Impact Statement being made into a "film" of sorts is already questionable, what really makes this appalling is that the victim in this case is dead.
Christopher Pelkey was murdered on November 11, 2021, during a road rage altercation in Chandler, Arizona. After nearly four years and two trials, the shooter was convicted of manslaughter, plus one count of endangerment. During the final sentencing for the murder on May 1, 2025, Judge Todd Lang allowed the AI video created by Pelkey's sister, Stacey Wales, to be admitted as testimony.
It is legal in the state of Arizona to provide impact statements in any format, including digitally–something that already raises my brow. Shouldn't a video, presentation or voice note be given by the actual victim, or at least the defendant?
To be clear, Judge Lang admitted that the video ultimately swayed his decision to dole out the maximum sentence of 10.5 years to Gabriel Horcasitas, Pelkey's killer. "I love the A.I…thank you for that," he said, and went on to applaud Pelkey's family members for "allow[ing] Chris to speak from his heart as you saw today."
But Chris wasn't speaking from his heart. He wasn't speaking at all. It was an AI avatar reading a script written by Wales to sway the court's opinion and its perception of her brother and of what he "would have wanted" if he were still alive!
How someone who passed the Bar is allowed to make such a bizarre and unscrupulous legal decision is beyond me. That someone who put in years (and likely a ton of debt) to earn a law degree would abandon the eons-long tradition of reading, writing, critical thinking and doing things "by the book" that being a lawyer requires is not only baffling, but scary!
If a judge or jury can now decide the fate of someone's life by responding emotionally to the "vibes" of an artificially fabricated video, then what does that mean for more complex cases like domestic violence or police brutality?
You don't even have to ask the "color" of the judge or the defendants, which brings me to my second concern. If AI Victim Impact Statements become the law of the land, then how will Black and Brown defendants or victims get fair verdicts? Black people aren't believed in real life when tragedy strikes. Our innocence is always questioned, even in death. (Remember Trayvon Martin and Mike Brown?)
Would Black and Brown people be allowed to submit similar statements in a court of law? And if they were, would there suddenly be a price on this software? Every time Black people try to progress–whether by getting a college degree or buying a house–the financial goalpost always moves!
Not only is Judge Lang's emotional decision to allow AI testimony in the courtroom crazy, it's also unjust. Plagiarism has always been a problem with AI, and when someone's personal freedom is at stake, that problem only gets worse. Now we're talking about letting plagiarism creep into the delicate process of making life-or-death decisions. It's sickening.

My faith in humanity is somewhat restored by the comment section of the YouTube video of the Christopher Pelkey avatar testimonial. Commenters were not gentle in their responses. Some users said Judge Lang should be disbarred for such a move, and others questioned his sanity for allowing AI, and not evidence, to seal his sentencing decision.
The people are still sane . . . for now.
