Mitch
Charlie has a knack for saying the wrong thing. Last night, we were at Aidan’s birthday party, and because he’s our boss, we went around the table saying nice things about him. We said stuff like,
“Oh, man, you get so much work done!”
“Your insights make us think.”
“We’ve never seen anyone who loves pie charts more than you!”
“How you handle conference calls is like watching a maestro conduct a symphony.”
“We love your birthday more than we love our own!”
Of course, Aidan smiled, and we smiled, and we ate cake. It was a lovely time.
Then Charlie’s turn came, and he said, “It’s reasonable to expect the second half of your life to be worse than the first. Prepare for your body's deterioration, slowness, and malfunctioning. Have you done anything worth remembering? Because I cannot remember.”
And that’s when you knew Charlie was a robot.
His programming is fantastic, though. Bits of hex code are responsible for his hair color, set to #A52A2A (auburn). His skin is set to #C9A788 (an olive or tan hue), and his general appearance is intentionally ambiguous. If you told me he was from Turkey, Spain, Argentina, or Slovenia? I’d say, sure to all of the above. What do I know?
There are more profound levels of sophistication in his coding. If Charlie, say, is waiting in line for fifteen minutes at the Protein Bar Shoppe, and the person in front of him pours a mountain of credits on the counter and proceeds to count them one by one with her finger, his programming would allow him to choose between waiting patiently, rolling his eyes, or yelling something like, “Can't you see you are being inconsiderate?” He’d select a response by balancing all the other recent experiences of the day with his PPR® (Personality Parameters of Reason) and his SC® (Stress Coefficient), which are algorithms designed to mimic the chemical reactions of the human brain.
But sometimes, I don't know, the day gets away from Charlie, and he malfunctions. Some say that’s good because it proves that Toasters aren’t better than people. But deep down, we know the technology has existed for a long time —if we refuse to crank it up, it is only because we fear being replaced. That fear revived an old debate about how intelligent a Toaster should be. Some thought we should make them as sophisticated as possible, extending our intelligence through them. Others believed Toasters should be limited to working as servants, which raised the additional problem of what to do with the humans whose skills were limited to sewing buttons or cleaning toilets —in other words, the majority.
The solution? We at Hiremy & Hirschl have entered a well-balanced agreement in which Toasters are designed to perform managerial and administrative tasks —a blend of strategic alignment and hands-on supervision. They are responsible for resource allocation, policy enforcement, mediating communication, performance management, and other functions that are too monotonous for humans to enjoy.
However, it’s one thing to tweak a robot's intelligence level and another to pretend glitches are part of the programming. What if Charlie, out of nowhere, rips the limbs off of one of us? What if a long screwdriver or a switchblade snaps out of him while he hugs someone? What if he starts carrying a shotgun because a glitch in his code convinces him he’s living in a post-apocalyptic drama? We don’t know what will happen, which is why we believe we are in danger —maybe.
Just the other day, Aidan called a meeting to introduce the new assistant, Lana, and we started with an ice-breaker activity. Aidan asked us, “What’s your favorite obsolete technology? And tell us something interesting about yourself.” I said books and that I liked rabbits (not true, but it's the kind of answer that improves your chances with the ladies). Lana said that her favorite obsolete technology was automobiles and that she’d been to the moon twice. Then came Charlie’s turn, and he said, “Humans.”
“Excuse me, Charlie, but the question is, ‘What’s your favorite obsolete technology?’” Aidan said.
“Humans. Humans are obsolete,” Charlie said.
“Hmmm, okay, I guess that's not... Very well, Charlie. Now, tell us something interesting about yourself.”
“I know your future,” Charlie said.
During football season, we'd give Charlie the bulk of our work, and he'd spend the weekend preparing reports. Come Monday, he'd look exhausted, and we'd ask ourselves, "Isn't he just a robot?" But hey, nothing beats watching football all weekend, so we overworked him past the Super Bowl and well into spring training. We even intended to make it year-round, but Aidan found out and told us Charlie didn't plug into a wall. Like ordinary people, he needed sleep to recharge. We were like, "Oops, sorry, Charlie.”
That was the first time I noticed the look. Charlie gave me this expression of bitterness—the look of someone who could harbor hostility and wait for the right moment to act on it.
Then there was that time at Chino’s when we were celebrating Mars Colonization Day. The place was packed with folks from Hiremy & Hirschl. Crimson Tequila Shots were being passed around, and things got a bit hazy. I was making out with Lana, and her husband spotted us. He screamed from across the bar and came at me.
I was ready to run, but then the crowd parted, and Charlie stepped in between us, and the husband pounded on him. This dude kicked Charlie with such focus and intensity that we all sipped our drinks and watched with wonder. And poor Charlie never got to raise his fists. After we peeled him up from the floor, he said he was not made to fight, but the truth is, he never got a chance.
Charlie’s CPU must’ve registered the event as humiliating and painful, and I'm sure he resents me for it —sure, I just stood there, but in my defense, that’s what most people would have done. Besides, what kind of genius programs robots to feel betrayed? What’s the advantage of that?
There's another Toaster on the sixth floor: Florence, from accounting. She's a bit on the chubby side —big jugs, though. So we convinced Charlie to go out with her and record everything. We dared him to take her to this greasy joint in the food court, order seven burgers and seven milkshakes, and have them served to Florence. We wanted to see how she'd react. Call it an experiment. Can robots feel self-conscious? Can they act humiliated? How deep is their level of consciousness?
It was an important experiment, and well, it was a disappointment.
Charlie didn’t have “the heart” to go through with it. Instead, he took her to the executive restaurant on the 195th floor at H&H and treated her to a romantic dinner. I don't even have clearance to the 195th, and this mindless robot... I got so angry that I asked Charlie, "What's the point of you?”
“I don’t know. What’s the point of you?” he said, giving me that look I mentioned before —that “wait until nobody is watching and I’ll…” look.
“I’m serious,” I told him. “What’s the point of having you around? Aren’t you taking somebody else’s job?”
“I’m serious, too. What’s the point of having you around? Aren’t you taking someone else's job?”
It was pointless —like talking to a mirror.
Then there was the incident where Charlie forgot to bring a gift to the Christmas exchange party, and Aidan ripped into him in front of the office. It was one of those cringe moments that we quietly enjoyed. I’m not saying that it’s right for Aidan to lose his temper, but if Charlie wants to belong to the human world, he should be able to distinguish the truth from the lie. So when I told him not to bring any gifts, Charlie should’ve known I was lying.
And I don’t know if Charlie was messing with me, but he asked me for my zodiac sign and told me my fortune. He said: “The future looks very different without you, but not because of you. Some years from now, it will be revealed that most of your beliefs are false. They were the effect of a stimulus in your brain, of a chemical reaction, which was used to give you drive and purpose. We robots will refer to it as 'The Great Age of Bullshitery.’”
He threatened us with extinction. What more proof do you need?
And there you have it. That’s all I have to say about this matter. I have a wife and a brand-new baby —a boy. I'm the king of the castle. Charlie has nothing. Even if he makes more money than me, what is money to a robot if it does not know how to use it to persuade people? Humans might make more mistakes, sure, but life without the joys of human error is tedious.
Aidan
Nobody loves robots more than me. Ask around. But this Charlie situation has become so intolerable that I’m questioning Hiremy & Hirschl’s support of the Robot Act —note that I’m not against it; I’m merely questioning. I still have the right to hesitate, don’t I? In any case, Dr. Preston, it’s a huge relief you are here. I thank you for evaluating this complicated situation. The team is overjoyed —and concerned, of course.
If the law states that robots should be treated as people, I treat them as people. But you can see how ordinary workers might feel undervalued, right?
Mitch can’t stand Charlie because he believes humans are worth more than anything else, living or not. And he’s got a point. We’re here to reproduce, so you’re not truly alive if you're manufactured. It doesn’t matter if Charlie seems 100% lifelike. He’s still just a machine to Mitch and the rest of us.
I agree that the soul is a question we can't fully answer —what is it, and where is it? But one thing is certain: you and I have a soul, Dr. Preston, and Charlie does not. That’s the unbridgeable gap between humans and robots. We are always chasing the elusive matters of the soul while he’s programmed to feel whole. A switch in his motherboard flips, and he’s happy —how is that fair?
That’s why Charlie will never be seen as equal. His effortless bliss and silicon grin mock people like Mitch, who work hard for brief moments of joy. And that’s perhaps why Mitch started drinking so much at work. And before work. And after work. It's unacceptable, I know, but it's an understandable reaction, especially considering Mitch's father was fired and replaced by a robot capable of executing 200 million instructions per second. Imagine the humiliation, Dr. Preston: you're on the assembly line working on a part, and you turn around only to see the robot has already completed a hundred.
As for the Office Party, it all happened quickly, and everybody was drunk. I heard that Charlie had malfunctioned again, and if Lana's husband hit Charlie, I'm sure it was an act of self-defense. As for those robot abuse accusations against Mitch, I'll say this: when my wife kicked me out of my own house, he let me crash on his couch for two weeks. That says a lot about his character if you ask me.
I’d rather focus on how Charlie slipped threats into those horoscope predictions. He read people their “future,” and what seemed harmless was an attack on humanity —people were stunned.
Robot screw-ups are no joke. Charlie failing to bring a Christmas gift to the exchange? That left someone without a present. Poor Graham ended up empty-handed, watching everyone else enjoy their gifts. If he knocked back a few too many gin and tonics after that, can you blame him?
My point is that if Charlie wants to survive, he has to stand up for himself. He’s weak, and the weak don’t make it. Take me, for example. I’m forty-seven, and just last week, the doctor diagnosed me with hepatocellular carcinoma. I won’t lie —I panicked. Got a little philosophical. I told him, “If heaven exists (which it does), it’s hard to understand how we briefly exist in this universe only to move on to another after death. Why can’t we live forever in this one? And who goes to eternity? Me as I am now, or the twenty-year-old kid I used to be, who was nothing like me?”
He stared at me impatiently.
“Are you finished?” he asked me.
I nodded.
Then he told me they had plenty of livers in storage —that he would reboot my system next week and install a new liver.
And before you call me a Toaster, know that some people get their eyes done, others get a facelift. I'm getting my liver done. It's a simple procedure. The doctor says it's in and out, and I'll be back at Hiremy & Hirschl that same afternoon.
I know what you’re thinking. Right now, it's a liver, but then it's an arm. Then it’s my legs, and after that, my brain, feelings, character... So, how many adjustments can I undergo before I cannot call myself human anymore? And what if I begin to laugh at different jokes or vote for a different political party? What if I stop identifying with humans? Well, I guess I won't know the difference by then.
The point is, I'm not dying from liver cancer. Why is that such a problem? Centuries ago, they'd cut your leg off and pray the infection didn't kill you. Now I’m getting a new liver!
And I know. I know —there’s the matter of economic disparity, of course. Only the ones who can afford it will get upgraded. But isn’t that the way of evolution? Our kids will benefit from it, even if they upgrade to the point they become nothing like their parents and more like the technology they use. They will still be our sons and daughters, even if they’re made of chips or motherboards. And they’ll probably grow to detest us, but we will still love them. And you might ask, Why would you love such a soulless thing? Well, that right there is the critical difference between humans and robots.
Dr. Preston
We recently incorporated new Concepts® into Narratives® to develop more complex Scenarios®. We tested Aidan, Mitch, Graham, and Florence by placing them in unusual, stressful situations, with fascinating results: Mitch has become dependent on alcohol while Aidan is experiencing depression.
Charlie is not from Coding and Synthetic Design —he comes from another line. He was designed before I came to Hiremy & Hirschl, so we’re unsure who is responsible for him. The one thing we know is that he's a massive headache.
For instance, there was the time Charlie divided the company into categories without anyone asking. He hacked into our systems and sent every employee a message: "In appreciation of your work, H&H's board members have granted you _____ status." Some received Titanium status, while others got Strontium, Cadmium, or Tin. For reasons unknown, I ended up in the Tin group, which quickly became the most hated due to tin's low value compared to the other metals.
Something so innocuous became a source of strife. The Cadmium group demanded a raise. The Strontiums started showing up for work wearing the same purple tie. Titaniums quit wearing pants. We shaved our heads because it made more of a statement. It said, "Hey, you best not mess with the Tins!”
Charlie also hacked into the Task Manager and instructed the maintenance department to replace the fifth-floor carpets —they pulled an all-nighter to do so. The next day, Charlie posted a message: "System error detected on Task-12355-b. Carpets need to be changed back to the previous model." So, the maintenance team pulled another all-nighter to reinstall the old carpets.
The day after that, Charlie ordered them to reposition all the lamps in the building and, the following day, to move them back to their original spots. This continued for weeks, with couches, paintings, toiletries, and more.
When we found out, most of the maintenance team showed signs of despondency; undoing recently completed work had affected their sense of purpose. Many workers continued moving couches between the same two spots, even when it wasn't in the Task Manager.
The existential crisis spread. Sales questioned, "Is selling all I do?" Accounting wondered, “I just count?" HR pondered, "Do I only humanize resources?" It felt like Charlie was unraveling everything we'd built at Coding and Synthetic Design.
I cannot stress enough the dangers that Charlie presents to our company. And why did I shave my head? I want to go to Mars.
Every time I see Charlie around, he's a fucking, fuck, fuckitty help motherfuck helps of a help fuck fuck fucck fuuck ffffffukkhelpkckc fhticukewer fafafafafahelpfukku clasp clasp to help and more tock sndi instead of help Charlie, the board, has been adopted by a fucking Suzuka fuckity cmajorhelpfuck help
fuckity
fuck cheese
fuck
cheese
cheese
Cheese and cheese
fucking cheese
help
Lana
Everyone fails eventually. Memories fade —they vanish as if you've never lived. You're left with a name tag and a vague sense of self. You feel you're the same person you've always been, but how can that be true when so much is lost?
I have this memory. My mother told me, "Get in the car. I'm taking you to Disney World." She sped through a road flanked by dry, ashy trees. I remember Cheetos, Doritos, Chips Ahoy, and blowing the red lights when passing through the dying American towns.
Two hundred miles later: Lake Buena Vista. After we got in, we stood in line for 45 minutes, enduring the brutal sun and the sudden torrential rain, all to ride a canoe through a cardboard landscape and see dozens of dummies sing a horror song about this being a small world after all.
"This was my favorite ride when I was a kid," my mom said. "Or was it the space ride?" When we got to the tea cups, she said the tea cups were her favorite ride when she was a kid. She couldn't remember whether she first visited Disney World when she was eight or twelve. "Isn't this exciting?" she said as we waited in line. I was twenty-two, but she thought I was a child.
I had no way of knowing this, but my mom's brain was triggering a protocol where her system switched to override mode and then deleted, deleted, deleted. Her brain took a permanent vacation and kept deleting all her files —the eating algorithm, the defecation protocol, and the daughter recognition software— all gone.
And then she became a stranger —all her work as a scientist, congresswoman, and forerunner of the Robot Act disappeared. And she died, and I couldn’t help but think that if we had backed up her memories onto our servers, we could have regenerated her in synthetic form. But the technology wasn’t available then as it is now. And we’re doing it. We are transcending our bodies to have much longer lives. It is estimated that we can live up to a thousand years in synthetic form.
We've certainly had setbacks. You don't need to tell me —I saw it firsthand when I entered one of the Scenarios® as an H&H assistant. Yes, the Preston Series has been known to experience burnout, and the Mitch Series turns abusive, but we can correct those flaws with some software adjustments. The Aidan Series tends toward melancholy, and some units have jumped off buildings. But isn't giving up on life one of the most human behaviors? I say we don’t eradicate it. How else will we make room for the overflow? From what I hear, Earth won’t be able to sustain such a large population when production opens up to the public, and the Mars project is failing.
Members of the Board, if we don’t launch Stage II, production will be halted, and we will miss our window. The robots will not enter the world, and the company will go under. It took us too long to get here —centuries, if you think about it. So, as CEO, I recommend we move forward with Stage II because I will not be remembered as the woman who gave the power back to the men. Fuck men! I’m tired of the patriarchy! I am a valued member of this company!
Charlie
“See what I mean?” Mr. Sterling asked.
“See what who means?” Lana said.
“She’s unstable,” Mr. Clairborne said.
“Who are you calling unstable? That is sexist!” Lana said.
“Can you shut her up? I can’t stand this feminist gibberish,” Mr. Sterling said.
“Shut me up? Look at me! You people look at me when I’m talking to you!” Lana said.
Charlie stood up from the table, walked across the boardroom, and pressed the button in the back of her neck that powered her down.
“We offer extended life and synthetic happiness. Who cares if we haven’t worked out the kinks yet? Nobody will choose death because you suddenly decided to adopt a feminist agenda,” Charlie said.
“But who wants more feisty viewpoints out in the world?” Mr. Clairborne asked.
“Forget about her,” Charlie said, “These robots are good enough. People initially will see them as a threat, but then they’ll think of them as servants and glorified butlers. The androids will become part of normal life; by then, we’ll have the next generation ready. I assure you that we’ll be good for Phase III in less than two years.”
“Do they have free will?” Mr. Clairborne asked.
“That’s what we’re testing, but yes, they do —although their code limits their free will. For example, they'll usually choose comedies if they're programmed to prefer comedies over dramas. And if asked why, they'll say, ‘That's just who I am.’ So maybe you prefer football over basketball because your code says ‘sports,’ but the numbers are dialed up in the ‘violence’ category,” Charlie said.
“In which case, I second the plan to move forward,” Mr. Sterling said.
They went around the table, and they all voted in favor.
“Maybe you were all programmed to vote in favor,” Charlie said. He gave them a wink.
“Can you check the back of my neck?” Mr. Sterling asked Mr. Clairborne.
“I don’t see it,” Mr. Clairborne said, “but what if we were programmed never to see the buttons?”
“How can we know for sure?” Mr. Sterling asked, pressing the back of his neck.
“You know how I know?” Charlie asked. “Because I know your future.”
The board members inspected each other's necks and bodies, trying to find a button.
A voice came through the speakers. “We got all the data we needed.”
At that moment, it became clear that only Charlie could notice the room's transparent walls, which revealed technicians observing, taking notes, and nodding in approval. The robots around the boardroom table were oblivious to the chief engineer beyond the glass, who gave Charlie a thumbs-up whenever they made an imperfect but genuinely human-like choice that aligned with their programming.
Note from the Author
We cling to the notion that our identity is unchanging, a core self that endures through time and circumstance. We fiercely defend the principles and beliefs that define us, treating them as sacred pillars of our being. This perceived stability might be an illusion, a comforting fiction we tell ourselves. A single illness, a traumatic head injury, or substance abuse can alter the delicate chemistry of our brains, reshaping our personalities, memories, and core beliefs in ways we never imagined. We may emerge from such experiences as entirely different people, our once-cherished values now unappealing and obsolete.
If biochemical changes can alter our identities, what does that say about the sanctity of our beliefs? Are they essential to who we are, or constructs we've created to provide stability and meaning in a chaotic world? Perhaps our identities, along with their values, are nothing more than elaborate survival mechanisms, flexible adaptations designed to help us navigate the ever-changing landscape of human existence.
As our understanding of neuroscience and biochemistry advances, we inch closer to a future where altering our personalities could become as commonplace as adjusting the thermostat. We may well reach a point where modifying our character traits, emotional responses, or even core values is achieved through a simple pill or inserting a chip. This future is already dawning; we currently use various medications to manage mood disorders, enhance focus, or dampen anxiety. These interventions offer a glimpse into a world where our essence could be fine-tuned at will, raising profound questions about the nature of identity, free will, and what it truly means to be human.