
2.1 Story Time

We will start with some short stories. Each story is an admittedly exaggerated call for interpretable machine learning. If you are in a hurry, you can skip the stories. If you want to be entertained and (de-)motivated, read on!

The format is inspired by Jack Clark’s Tech Tales in his Import AI Newsletter. If you like this kind of story or if you are interested in AI, I recommend that you sign up.

Lightning Never Strikes Twice

2030: A medical lab in Switzerland

“It’s definitely not the worst way to die!” Tom summarised, trying to find something positive in the tragedy. He removed the pump from the intravenous pole.
“He just died for the wrong reasons,” Lena added.
“And certainly with the wrong morphine pump! Just creating more work for us!” Tom complained while unscrewing the back plate of the pump. After removing all the screws, he lifted the plate and put it aside. He plugged a cable into the diagnostic port.
“You didn’t just complain about having a job, did you?” Lena gave him a mocking smile.
“Of course not. Never!” he exclaimed with a sarcastic undertone.

He booted the pump’s computer.
Lena plugged the other end of the cable into her tablet. “All right, diagnostics are running,” she announced. “I am really curious about what went wrong.”
“It certainly shot our John Doe into Nirvana. That high concentration of this morphine stuff. Man. I mean … that’s a first, right? Normally a broken pump gives off too little of the sweet stuff or nothing at all. But never, you know, like that crazy shot,” Tom explained.
“I know. You don’t have to convince me … Hey, look at that.” Lena held up her tablet. “Do you see this peak here? That’s the potency of the painkillers mix. Look! This line shows the reference level. The poor guy had a mixture of painkillers in his blood system that could kill him 17 times over. Injected by our pump here. And here …” she swiped, “here you can see the moment of the patient’s demise.”
“So, any idea what happened, boss?” Tom asked his supervisor.
“Hm … The sensors seem to be fine. Heart rate, oxygen levels, glucose, … The data were collected as expected. Some missing values in the blood oxygen data, but that’s not unusual. Look here. The sensors have also detected the patient’s slowing heart rate and extremely low cortisol levels caused by the morphine derivative and other pain blocking agents.” She continued to swipe through the diagnostics report.
Tom stared captivated at the screen. It was his first investigation of a real device failure.

“Ok, here is our first piece of the puzzle. The system failed to send a warning to the hospital’s communication channel. The warning was triggered, but rejected at protocol level. It could be our fault, but it could also be the fault of the hospital. Please send the logs over to the IT team,” Lena told Tom.
Tom nodded with his eyes still fixed on the screen.
Lena continued: “It’s odd. The warning should also have caused the pump to shut down. But it obviously failed to do so. That must be a bug. Something the quality team missed. Something really bad. Maybe it’s related to the protocol issue.”
“So, the emergency system of the pump somehow broke down, but why did the pump go full bananas and inject so much painkiller into John Doe?” Tom wondered.
“Good question. You are right. Protocol emergency failure aside, the pump shouldn’t have administered that amount of medication at all. The algorithm should have stopped much earlier on its own, given the low level of cortisol and other warning signs,” Lena explained.
“Maybe some bad luck, like a one in a million thing, like being hit by lightning?” Tom asked her.
“No, Tom. If you had read the documentation I sent you, you would have known that the pump was first trained in animal experiments, then later on humans, to learn to inject the perfect amount of painkillers based on the sensory input. The algorithm of the pump might be opaque and complex, but it’s not random. That means that in the same situation the pump would behave exactly the same way again. Our patient would die again. A combination or undesired interaction of the sensory inputs must have triggered the erroneous behavior of the pump. That is why we have to dig deeper and find out what happened here,” Lena explained.

“I see …,” Tom replied, lost in thought. “Wasn’t the patient going to die soon anyway? Because of cancer or something?”
Lena nodded while she read the analysis report.
Tom got up and went to the window. He looked outside, his eyes fixed on a point in the distance. “Maybe the machine did him a favor, you know, in freeing him from the pain. No more suffering. Maybe it just did the right thing. Like a lightning strike, but, you know, a good one. I mean like the lottery, but not random. But for a reason. If I were the pump, I would have done the same.”
She finally lifted her head and looked at him.
He kept looking at something outside.
Both were silent for a few moments.
Lena lowered her head again and continued the analysis. “No, Tom. It’s a bug… Just a damn bug.”

Trust Fall

2050: A subway station in Singapore

She rushed to the Bishan subway station. In her thoughts, she was already at work. The tests for the new neural architecture should be completed by now. She led the redesign of the government’s “Tax Affinity Prediction System for Individual Entities”, which predicts whether a person will hide money from the tax office. Her team had come up with an elegant piece of engineering. If successful, the system would not only serve the tax office, but also feed into other systems such as the counter-terrorism alarm system and the commercial registry. One day, the government could even integrate the predictions into the Civic Trust Score. The Civic Trust Score estimates how trustworthy a person is. The estimate affects every part of your daily life, such as getting a loan or how long you have to wait for a new passport. As she descended the escalator, she imagined what an integration of her team’s system into the Civic Trust Score System might look like.

She routinely wiped her hand over the RFID reader without reducing her walking speed. Her mind was occupied, but a dissonance of sensory expectations and reality rang alarm bells in her brain.

Too late.

Nose first, she ran into the subway entrance gate and fell butt-first to the ground. The door was supposed to open, … but it did not. Dumbfounded, she stood up and looked at the screen next to the gate. “Please try another time,” suggested a friendly looking smiley on the screen. A person passed by and, ignoring her, wiped his hand over the reader. The door opened and he went through. The door closed again. She wiped her nose. It hurt, but at least it did not bleed. She tried to open the door, but was rejected again. It was strange. Maybe her public transport account did not have sufficient tokens. She looked at her smartwatch to check the account balance.

“Login denied. Please contact your Citizens Advice Bureau!” her watch informed her.

A feeling of nausea hit her like a fist to the stomach. She suspected what had happened. To test her theory, she started the mobile game “Sniper Guild”, a first-person shooter. The app closed itself immediately, confirming her theory. She became dizzy and sat down on the floor again.

There was only one possible explanation: Her Civic Trust Score had dropped. Substantially. A small drop meant minor inconveniences, such as not getting first-class flights or having to wait a little longer for official documents. A low trust score was rare and meant that you were classified as a threat to society. One measure in dealing with these people was to keep them away from public places such as the subway. The government restricted the financial transactions of citizens with low Civic Trust Scores, actively monitored their behavior on social media, and even went as far as to restrict certain content, such as violent games. It became exponentially more difficult to raise a Civic Trust Score the lower it was. People with a very low score usually never recovered.

She could not think of any reason why her score should have fallen. The score was based on machine learning. The Civic Trust Score System worked like a well-oiled engine that ran society. The performance of the Trust Score System was always closely monitored. Machine learning had become much better since the beginning of the century. It had become so efficient that decisions made by the Trust Score System could no longer be disputed. An infallible system.

She laughed in despair. Infallible system. If only. The system rarely failed. But it did fail. She had to be one of those special cases; an error of the system; from now on, an outcast. Nobody dared to question the system. It was too deeply integrated into the government, into society itself, to be questioned. In the few remaining democratic countries, forming anti-democratic movements was forbidden, not because they were inherently malicious, but because they would destabilize the current system. The same logic applied to the now more common algocracies. Criticism of the algorithms was forbidden because it endangered the status quo.

Algorithmic trust was the fabric of the social order. For the common good, rare false trust scorings were tacitly accepted. Hundreds of other prediction systems and databases fed into the score, making it impossible to know what had caused the drop in her score. She felt as if a big, dark hole were opening inside and beneath her. With horror she looked into the void.

Her tax affinity system was eventually integrated into the Civic Trust Score System, but she never learned of it.

Fermi’s Paperclips

Year 612 AMS (after Mars settlement): A museum on Mars

“History is boring,” Xola whispered to her friend. Xola, a blue-haired girl, was lazily chasing, with her left hand, one of the projector drones humming in the room. “History is important,” the teacher said in an upset voice, looking at the girls. Xola blushed. She had not expected her teacher to overhear her.

“Xola, what did you just learn?” the teacher asked her. “That the ancient people used up all resources from Earther Planet and then died?” she asked carefully. “No. They made the climate hot and it wasn’t people, it was computers and machines. And it’s Planet Earth, not Earther Planet,” added another girl named Lin. Xola nodded in agreement. With a touch of pride, the teacher smiled and nodded. “You are both right. Do you know why it happened?” “Because people were short-sighted and greedy?” Xola asked. “People could not stop their machines!” Lin blurted out.

“Again, you are both right,” the teacher decided, “but it’s much more complicated than that. Most people at the time were not aware of what was happening. Some saw the drastic changes, but could not reverse them. The most famous piece from this period is a poem by an anonymous author. It best captures what happened at that time. Listen carefully!”

The teacher started the poem. A dozen of the small drones repositioned themselves in front of the children and began to project the video directly into their eyes. It showed a person in a suit standing in a forest with only tree stumps left. He began to talk:

The machines compute; the machines predict.
We march on as we are part of it.
We chase an optimum as trained.
The optimum is one-dimensional, local and unconstrained.

Silicon and flesh, chasing exponentiality.
Growth is our mentality.
When all rewards are collected,
and side-effects neglected;
When all the coins are mined,
and nature has fallen behind;
We will be in trouble,
After all, exponential growth is a bubble.

The tragedy of the commons unfolding,
Exploding,
Before our eyes.

Cold calculations and icy greed,
Fill the earth with heat.
Everything is dying,
And we are complying.

Like horses with blinders we race the race of our own creation,
Towards the Great Filter of civilization.
And so we march on relentlessly.
As we are part of the machine.
Embracing entropy.

“A dark memory,” the teacher said to break the silence in the room. “It will be uploaded to your library. Your homework is to memorise it by next week.” Xola sighed. She managed to catch one of the little drones. The drone was warm from the CPU and the engines. Xola liked how it warmed her hands.