Howard Fischer
Capitol Media Services
The way Alexander Kolodin sees it, a well-crafted “deep fake” video or audio has the capacity to swing an election.
So he is proposing a path for candidates to get a quick ruling from a court, allowing them to try to convince voters, court order in hand, that what they are seeing really isn’t them.
But the proposal by the Scottsdale lawmaker would not allow a judge to order a deep fake of a candidate to be removed from the internet or wherever it is posted. Still, Kolodin said, it provides candidates with some avenue of relief.
“This solves something that is going to be a real problem in very short order,” he said. And it’s all due to changes in technology and the use of artificial intelligence, which have made audio, and in some cases video, virtually indistinguishable from reality.
“It’s something that could really cause a lot of disruption if it happens prior to an election,” he said. “So there needs to be something to try to address it.”
That something is laid out in his HB2394.
To get that declaration that something is a “deep fake,” a candidate would first have to prove to a judge that the “digital impersonation” was published without his or her consent. And then it would require a showing that the intended audience was not informed that it did not depict an actual event or statement, and that it was “not otherwise obvious” that it was a fake.
As Kolodin crafted it, a judge would be required to rule within two court days whether the image was real.