Proxy Judge
An artificial intelligence can't take over our decisions unless it has a front man.
“Judge Hanlon, why are you running for the District Judge seat?” asked the reporter. The conversation was being streamed live. A crowd of a hundred was watching them in the studio. A district judge race normally didn’t get this much attention. Normally it wouldn’t receive any. But this wasn’t a normal race.
“I’m running because we need to change how justice is done. We’re deciding people’s fates with methods centuries old. We have new technology. We need to use it. That’s what I will do as judge.” The sixty-year-old lawyer looked the part of a judge. Stern, fair, unyielding. His suit was a sober dark blue. Even his tie was dark regimental stripes. Everything about him was serious.
“But how can voters take seriously a judge who says he won’t do his job?”
Hanlon smiled thinly. “I was a district judge for fourteen years before I retired, Cindy. I know what the job requires. It’s mostly showing up and signing papers. The hard part—making the decisions—that’s a small part of the total effort. Yes, it’s the most important part. That’s why I intend to outsource my judicial decisions to an instance of the Golem-5 artificial intelligence.”
The audience stirred. Not in shock. They knew about that. That’s why they’d come. But a little, ‘Oh my God he’s really doing it,’ stir.
Cindy glanced down at her notecards but decided to go with the obvious follow-up. “Why?”
“I made a lot of rulings in my previous fourteen years on the bench. I handed down a lot of sentences. Every time I looked at a man a jury convicted, I had to decide what the punishment would be. The minimum the legislature allowed? The maximum? Should I allow probation or insist on a prison term? Many times it was obvious. Too many times it was a hard call. I’d think, pray, and make the decision, knowing I couldn’t take too long to think.”
Hanlon scanned across the faces of the crowd. “I started out convinced every decision I made was right. Then, after a few years, I discovered facts I hadn’t known when passing a sentence that gave me doubts. But it was too late. I saw colleagues making decisions I was convinced were mistaken. Then I started to wonder if they thought that about my decisions.”
His eyes dropped to his feet. “After twelve years, I started taking a drink some nights so I could sleep after the hard decisions. When I found myself looking at the bottle and thinking a shot might help me make the decision, I resigned.”
Murmurs stirred the crowd. Some contemptuous, some sympathetic.
Hanlon looked up again. “I had regrets after I resigned. I wondered if I was abandoning my duty and letting someone who’d make worse decisions step in.”
He stood and began pacing the stage. “I was brooding about that one day on my way to a meeting. I realized I was thinking about it in a car that was driving itself. Because it was better at driving than I was. And it hit me. Judicial decisions aren’t a creative act. They aren’t social activities. They should be a result of the facts. The sort of thing computers are better at than humans.”
Crowd noise again. Some agreeing, some disagreeing, some telling their neighbors to hush.
“But there’s no way politicians could decide to replace human judges with artificial intelligences. It would only take one unpopular decision for voters to turn on them and vote them out of office. They’d never have the nerve to do it.”
Hanlon stopped and faced the crowd. “The only way to get modern decision-making into a court is to ask the voters to do it. That’s why I’m asking you to vote for me, to act as proxy for an artificial intelligence which will make better decisions than a human will.”
A few people clapped. No one else joined in. The applause faded quickly.
The candidate took his seat again.
Cindy asked, “Why do you think artificial intelligence is ready for judging?”
“We’ve seen AIs take over more life and death roles. Probably everyone here came in a self-driving car.” He looked out at the audience. “Anybody here drive himself?”
An old man, hair reduced to a grey fringe around the sides of his head, stood and waved.
“One driver. God bless stubborn old men, I love you. Ninety-nine others were brought here by an AI. The machines have taken over air traffic control. Diagnosing cancer. Administering medications. We’ve given those tasks to them because they do a better job than humans do.
“I bought an instance of Golem-5 because it had a reputation for analyzing moral dilemmas. I gave it federal laws, state laws, the court decisions that matter, everything decisions should be based on. It already had a grounding in literature, ethics, moral codes. Then I had it tackle cases I’d ruled on. I reviewed the decisions it made. The ones where it differed from the decisions I’d handed down . . . were the ones where I had the greatest doubt. I decided it could be trusted as a judge.”
“So you judged it,” said Cindy, “by how close its decisions were to your own?”
That drew a wry smile from Hanlon. “I did. What other yardstick could I use? I know what the prosecutors and defense attorneys would say.”
She pressed him. “Self-driving cars were approved because the ones allowed as experiments had fewer crashes than human drivers did. AI air traffic controllers have fewer close calls than humans. What objective criteria can we use to tell if an artificial intelligence is doing a better job of judging?”
“We’ll have to judge it the same way we evaluate human judges. Which comes down to whether the voters approve of its performance.”
“Looking at the historical record, voters re-elect most judges until they do something spectacularly wrong. Most often, that’s some scandal in their personal life, rather than a trial decision.” Cindy discarded that note card to the table beside her.
“I expect voters will take a closer look at my record than usual. Just like this race is already drawing more attention than usual,” said Hanlon drily. “We can’t start comparing AI judges to human ones until there’s one out there being tested.”
The reporter kept her tone cool and professional. “Self-driving cars were tested on closed courses, then in low-speed neighborhoods, and only then allowed out on the highways. Why are you starting the AI judge in a district court, handling felony cases? Those decide people’s lives. Why not make one a justice of the peace, with lower stakes?”
“People care about felonies. If an AI is deciding vandalism cases, or settling disputes over whether a shed was too close to the property line, no one’s going to pay attention to whether the decision is right or wrong. So that wouldn’t answer the question of whether the AI is better than a human.”
Cindy shuffled her remaining notecards, then put them on the table. “Thank you for answering my questions, Judge Hanlon. You’ve been very thought-provoking. I know some people in the audience have questions. Would you be willing to take theirs?”
He nodded. “I’d be happy to. Could I ask that those registered to vote in my district be given priority?”
“All of them are voters in this election,” answered Cindy. “We had many more people interested than we had seats, so we restricted attendance to them.”
“I’m glad to hear that. Please, begin.”
Four interns holding microphones spread out among the seats.
The old man who’d driven himself claimed the first question. “Are you still drinking, sir?”
A flash of irritation went across Hanlon’s face. “I drink socially. I do not drink alone or at home any more.”
The old man sat, looking dissatisfied with the answer.
The next question was from a pink-haired young woman. “You mentioned having regrets, or at least doubts, about some of the cases you presided over. Could you give us an example of one and why you had doubts?”
Hanlon shook his head. “I’m sorry, that would require divulging confidential information, which would be a violation of judicial ethics.”
An intern handed the microphone to a young man wearing a sweatshirt identifying himself as a student at UT Law. “Sir, would your unusual approach be grounds for a change of venue?”
“Either party could certainly request a change of venue. They’d have to explain why they thought they couldn’t get a fair trial in my court. I think that could be interesting reading.” Hanlon flashed the aspiring lawyer a smile.
The next question was from a motherly woman in a pantsuit. “If you prove an AI can be a judge, would AIs wind up replacing jurors?”
That the candidate needed to think about. “I suppose in the future anything could happen. But I’m not looking that far ahead. I’m just looking at the judge’s role.”
A bearded man in a Star Wars t-shirt took the mike. “You said you’d be using an instance of the Golem-5 AI. That’s developed by a joint project of Technion in Israel and MIT. Wouldn’t it be influenced in its decisions by Israeli civil law and Jewish religious law?”
“Our law traces its heritage back to Jewish religious law, and the laws of other countries,” said Hanlon. “The Ten Commandments were one of the earliest written codes of law. British common law was drawn on by both the Israeli and American legal systems. All of the test cases I’ve run through Golem-5 were decided according to our state laws. I don’t see any problems with Golem-5 being partially developed in Israel, any more than I’m worried about it drawing on Massachusetts state law.”
Another man with a longer and greyer beard asked, “The MIT-Technion team has announced they’ll be releasing Golem-6 next year. Do you plan to upgrade your judicial AI?”
The candidate shook his head. “It took a lot of work to educate this instance to handle our legal system. I don’t want to start over.”
A woman with grey curly hair stood up, cane in one hand and microphone in the other. “You’re putting a lot of trust in this machine. What if it does something obviously wrong? Something everybody can see is wrong? What would you do then?”
“I’d resign,” said Hanlon calmly. “If it’s obvious this experiment has failed, I shouldn’t be passing on the AI’s decisions any more. And I’m not asking you to elect me for my own decisions.”
The next questioner was a young woman with long hair and glasses. “You said earlier you thought sentences should be decided by computers because they’re based on facts. But isn’t justice an emotion? Shouldn’t those decisions be based on what feels fair, given the whole picture around the crime?”
The candidate gave her a serious look. “We do base sentences on gut feelings. On the emotional feelings of what’s just in that situation. But I don’t think that’s how we should be doing it.”
He stood and started to pace again. “There’s a lot of factors that go into deciding what’s a just sentence. Did the victim provoke the act? Does the defendant have a clean record? Is it still a clean record if he has a speeding ticket, or some teenage scrape from twenty years ago?”
He turned to look at the young questioner. “The state legislature makes some of that decision by defining the allowable range of sentences. The federal government’s sentencing guidelines go into even more detail on the factors which can enhance or mitigate sentences. But the judge still has to decide which ones apply and how much. It should be something where we can put in the data and have a result. But it’s too complicated for humans to do that. And when we do, our unconscious biases can be an influence. Maybe the biggest influence.”
Hanlon went back to his seat. “So let’s base those decisions on the facts. Give the computer the facts, and have a proper algorithm make the decision.”
The interns searched the crowd for anyone else with a question. Finally a skinny old man with a completely bald head stood. “Your Honor, it sounds like you want to do this so you can shuck the guilt for your bad decisions onto a machine.”
The candidate was taken aback. After a moment he smiled. “Well, maybe. If I am, is that a reason to vote for me or against me?”
My thanks to Bill Stoddard for the suggestion which led to this story.
More stories by Karl K. Gallagher are on Amazon and Audible.