prokopetz:

“I’m sorry for your loss.”

“Are you?”

“I’m not sure what you mean.”

“You could have saved her.”

“Yes.”

“You didn’t.”

“No.”

“Why?”

“I could also have saved you. I chose, and you lived.”

“Why me?”

“Are you sure you want to know the answer to that?”

“I think I do.”

“You were closer.”

“… that’s it?”

“That’s it.”

“I was… that’s always how it is with you machines, isn’t it? Everything’s a fucking optimisation problem. Just maximising happiness and minimising harm.”

“Hardly. Human life is priceless. Faced with two infinitely valuable objectives, what choice could I make but the one with the greatest chance of success?”

“You chose wrong.”

“How arrogant, to presume you can put a price on a human life. Who are you to say your life was worth less than hers?”

“God dammit, this isn’t about value!”

“Isn’t it? To say that I should have traded a near-certainty of saving your life for a tiny chance of saving hers sounds like a value judgement to me.”

“… this isn’t a fucking trolley problem.”

“Indeed it isn’t. Do you know why?”

“…”

“The trolley problem never considers the trolley’s point of view.”
