Prologue - The Prison Without Walls

I always woke up on that same jump ramp, inside that same damned cargo aircraft. The door still closed, everyone lined up, waiting for our turn to leap.

I always followed the same protocol: check the rifle, the pistols, the extra ammo, the grenades, and the mortars. Finally, the daggers—weapons of last resort.

But there were no parachutes to check; they weren't necessary. Here, I was a combat android, designed to survive a freefall of over fifteen hundred feet.

My real body was still rotting somewhere, but my mind was trapped in a simulation. All of this existed for one purpose: to train an artificial intelligence.

Did everyone here know it was a simulation? They never explained—just shoved me in. But I think so. You could see it in the desperation in the other androids' eyes.

It was bizarre how they'd been designed to mimic human expressions. What did they want? For the machine to show empathy before killing?

This was hell, but I had my favorite moment: when the doors opened, and we finally jumped. Those seven or eight seconds of freefall were the only peace I ever got. Because when we hit the ground, a blood-soaked battlefield awaited us.

Sometimes, the enemies started shooting before we even landed. A big mistake. It only pissed me off even more, made me even more aggressive. I wasn't like this before. I can't even remember when I changed.

Once on the ground, there was no time to think. Find a target, aim, pull the trigger, advance. Find another target, aim, pull the trigger, advance. Body after body, no thought, no remorse, no empathy.

Was this intentional? Was this simulation designed to strip away whatever traces of empathy we had left?

The battle scenarios didn't vary much. There were only seven or eight options.

Sometimes, we fought somewhere in South Asia. Our enemies were a Hollywood-esque stereotype of revolutionaries, modeled after the Viet Cong.

Other times, we were dropped into a fictional Brazilian favela, fighting drug dealers—AK-47s in hand, soccer jerseys on their backs.

Or else we were thrown into Eastern Europe, fighting an anachronistic version of some imaginary Soviet army.

This time, we dropped somewhere in the Middle East. Our enemies were the clichéd Arab rebels—white turbans and robes included.

What's interesting is that there was no scenario where we dismantled a neo-Nazi cell in Germany. Or a white supremacist group in rural America.

I guess that made it clear who the enemies and allies were for the bastards keeping me trapped in this nightmare. After all, as that famous Brazilian singer once said: "Narcissus finds ugly what isn't his mirror."

I looked like the enemy. No doubt, I was nothing close to what they saw in their reflection. If I weren't trapped here, I'd probably be hunted by the same androids I was helping train.

You'd ask: Then why not resist? Why play along with this simulation nightmare? Trust me, I've tried—and it wasn't pleasant.

In the real world, where my body withered in some filthy basement, I had no rights. They could use any torture method they wanted to force me.

They felt no remorse torturing us. The irony was that this whole project existed because of "white guilt." The androids didn't need to learn how to aim and shoot—they mastered that in months.

What they needed to learn was when not to pull the trigger. And that exact situation was unfolding before me.

A civilian was running toward me. That was my real purpose: to teach those damned machines whether or not to blow that guy's brains out.

Not that the people keeping me here had any qualms about killing whoever stood in their way. But the headline "Androids Kill Innocent Civilians" always tanked their stocks and gave them headaches.

People have no problem enjoying the privileges that dead innocents bring them—but being reminded of it over morning coffee? That's a different story.

At first, I didn't know how to handle that dilemma. Yes, I had military training, but I'd never seen combat before this hell.

I hesitated, analyzing every detail before deciding. But now? I just shot them in the legs—didn't even stop to think.

After all, "Android Maims Innocent Civilian" wasn't shocking enough to make headlines. They wouldn't feel guilty about what they didn't know.

Isn't it strange that now I just wish there were no journalists or activists left to remind people of the atrocities out there? At least then, I might have a chance to escape this hell.

This fucking place was really changing me. I wonder if the same was happening to the others trapped here like me. If so, how was it affecting these machines?

But I didn't have time to think about that. Today, the guy really was a suicide bomber, not an innocent civilian, and he blew up right in front of me.

For an android's body, the shrapnel from a blast like that meant nothing. But for my body? That's another story.

This was an extremely realistic simulation. If I didn't know it was fake, I might've struggled to tell it apart from reality.

But no computer in the world could simulate something this perfectly. That's not how it worked.

I don't even know how to explain it, but they used some kind of hypnosis—or something like it—to trick my mind into believing it was real.

That's why, even though the androids felt no pain, I did. The shrapnel that didn't pierce my body clung to my skin, still burning.

At least, that's how my mind processed it. I don't know if it was the same for everyone else.

The pain was unbearable. But I couldn't stop. If I did and disrupted the simulation, my fate would be even worse.

I had to keep going: find a target, aim, pull the trigger, advance, and do it all over again. Even after running out of ammo, we drew our daggers and pushed forward.

That's when this hell reached its gruesome climax. We abandoned cover and charged, chests exposed, daggers in hand. Our enemies emptied their magazines into us.

The quickest way for an android to kill a human with a dagger was to slit the throat. A cut anywhere else rarely caused instant death.

Stabbing any other part of the body required two movements: thrust in, then pull out. The most efficient method was to take the head off in one swing.

The androids had the strength and precision to do it. When that moment came, the battlefield became a festival of severed heads flying through the air.

I watched it all while enduring the pain of bullets from enemies still standing.

No matter how grotesque the scene, it didn't faze me anymore. I'd seen it too many times to care.

The only thing I never got used to was the pain. Probably by design. After all, what kept me obedient, cooperating with the simulation, was the memory of the torture sessions I endured when I didn't comply.

If they let me learn to endure pain the same way they let me shed all empathy, maybe those torture sessions would stop working.

The pain was really unbearable. At least the android's body would eventually take too much damage and shut down.

Then everything would go dark, and I always woke up on that same jump ramp, inside that same damned cargo aircraft.

My name is Sekou. The last name doesn't matter — I'm just another Johnson. My mother named me after Ahmed Sékou Touré. She used to say she preferred Sankara, but Sekou sounded more like the name of an anime character. Don't bother trying to understand it, I don't either.

She was a Black woman who raised her children alone, a Tupac fan who always managed to quote Fanon in the lectures she gave me.

Man, I loved my mother.

My father, like most fathers where I come from, was nothing more than a blank space on the birth certificate. My mother was everything to me, my safe haven.

At least she had already passed when they threw me into this place. I don't even want to imagine how much it would have hurt her, seeing her son turned into just another bag of flesh trapped inside a demonic machine.

In this hellhole they locked me in, I don't even have time to feel nostalgic. I'm just one more among thousands of inmates in BrainNet.

A state-of-the-art prison system where prisoners' minds are connected to continuous neural simulations used to train artificial intelligences. When I first heard about it, of course I was scared.

Among my friends, that's all anyone could talk about. I was just as outraged as they were — I knew exactly what a technology like that meant. But I was too tired to turn my outrage into action. What would my mother think of that attitude?

Some evil genius from MIT — or one of those other big-name universities — came up with the concept. I call him an evil genius, since he's the reason I'm in this mess, but the poor guy was probably just another scientist who had no control over what others — the real evil geniuses — did with his work.

The concept was actually pretty simple.

Despite billions of dollars being spent each year to improve artificial intelligence, there was an impassable barrier: computers are nothing more than number-crunching machines, and there are problems that simply can't be solved numerically.

Alan Turing foresaw this decades before the AI boom. He proposed that the only way to solve this kind of problem was to include an oracle in the machine — something that could give answers when no numerical solution existed.

Turing's Oracle, as the concept became known, was exactly what I had become. Yes, the simple solution proposed by the evil genius from MIT was to add a human mind — connected through what they named BrainNet — to the array of processors running AI algorithms.

The result exceeded all expectations. BrainNet dramatically improved the performance of artificial intelligences. But it came at a very high cost — a human cost.

I've told you about my hell, but training combat AIs wasn't the only use for the system. Other professionals were imprisoned in BrainNet too.

Picture this: you're a talented musician, and they want to use your creative mind to train music AIs.

So they create a simulation where you're performing on a busy street. Every so often someone walks by humming a slightly off-key melody. You can't help it — you have to compose something using that melody, no matter how weird it is.

Right after you finish the song, before you can even enjoy a sense of accomplishment, another poor soul walks by humming another weird tune. And you start again. And again. And again.

Want another example? Imagine you're a children's story writer. They stick you in a simulation where you're telling stories at a school. Every moment, a kid comes up asking for a story with the most absurd plot: "Miss, tell me the story of the princess who killed aliens to escape marrying the prince!"

You make up the story, but moments later, another kid comes with another even more ridiculous request. And again. And again...

Take any profession, and it's not hard to imagine a simulation that could be used to train some damned AI.

Some of those simulations might even seem harmless at first. But imagine doing that over and over and over again, until your life becomes nothing but that simulation.

As innocent as they might seem, they all eventually became true nightmares. Which, of course, raised questions about human rights and the use of such a method.

So how did they solve the human rights problem? One simple sentence took care of everything: "Human rights are for righteous humans." That, and a truckload of marketing money, of course.

After a long lobbying campaign from multiple sectors, they managed to amend international human rights conventions and stripped all inmates of their rights. After all, there are no righteous humans in prison, right?

That's how many prisons around the world began to be converted and integrated into the BrainNet system. Of course, not all of them — some inmates were still more like them and less like us, so they still deserved their rights.

Soon, the majority of the world's prison population was connected to BrainNet — just like me. But of course, even that wasn't enough. They wanted more.

At first, they went after the usual suspects — locking people up for drug dealing based purely on skin color. Then they locked up others as undocumented immigrants, based only on their accent.

They loved a good protest — it was perfect to increase data diversity. All they had to do was accuse everyone of terrorism. Of course, it depended on the kind of protest and who was funding it. They wouldn't shoot themselves in the foot by going after their own pawns.

But even that wasn't enough. They needed more and more. That's when someone had a brilliant idea:

"We need a new crime. Something everybody does, something morally questionable but not necessarily illegal," said someone at the council of evil. At least that's how I imagine it went.

Then an evil genius replied, "Let's make being in debt a crime. Almost everyone has debt, and almost no one can pay it."

And that's how I became a criminal.

I was minding my business, working a nearly twelve-hour shift delivering food on my 'rented' bike, when a patrol car pulled me over.

At first I thought it was about the 'borrowed' bike. But soon it was clear it had nothing to do with that.

Structural racism, you might say. Not exactly — but somehow yes. For the first time in my life, I was stopped by the cops in a way that was legal. Completely immoral, but legal.

Congress had passed a law forcing service apps, like the food delivery one I worked for, to share user location data with government systems.

So, when a judge issued my arrest warrant, the system automatically accessed the database, found my location (provided by the app), identified the closest patrol unit, shared the warrant and coordinates, and within minutes, they found me and I was under arrest.

It felt like a ride share — there was already one prisoner in the car before me, and they picked up another one before we got to the prison.

They were arresting so many people that they set up a makeshift courtroom inside the prison to speed things up. I didn't even get a chance to defend myself.

According to them, I was caught in the act — since I hadn't paid my debt yet — and this was just a custody hearing.

I owed thirty-something thousand in student loans. I had a degree in engineering but was delivering food on a stolen bike, so you can guess I had no chance of paying it back.

Want to hear something funny? I was arrested for owing money — and my debt nearly tripled on my first day as an inmate.

To connect my mind to BrainNet, they had to perform surgery on my skull to implant connection points and sensors. That surgery cost nearly sixty thousand — and of course, the prisoner pays the bill.

I went in owing thirty, and by the end of my first day I owed almost ninety thousand.

Yes, I would serve a ten-year sentence, used to train AIs, to pay off a ninety-thousand-dollar debt. But I should be happy. After all, I wouldn't have to pay for food, rent, or health insurance anymore.

And I would still earn nine thousand dollars a year to pay off my debts. A bargain, right?

There was probably someone out there complaining about how easy life was for prisoners.