The Definition of Idea

In most nations, children were legally barred from having their copilot switched to two-way mode until age 18. The general consensus was that a child’s mind was too impressionable to safely use an AI, and that early use would render the person dependent on the AI’s help with major life changes they needed to undergo without an AI’s influence. Admittedly, not all of the arguments for this age limit were scientific.

For my part, I had my doubts that 18 years were enough to fully form a mind, but more and more, I felt that my side was losing the debate. The technology was getting away from people my age, progressing faster than we could learn. It wouldn’t be long before our opinions were completely free of reliable, up-to-date information.

These idle musings, as they often did, consumed most of my morning free time. I forced myself to halt that train of thought ten minutes before the start of class. The students were talking in their groups, apparently oblivious to my mental backpedaling. I shrugged, then opened windows on my computer to check messages and track a package delivery. If the delivery guy left it at the door again instead of following the instructions I’d left, I would…

My focused irritation was interrupted by the sound of a grunt from the back of the classroom. One of the male students was reclined in his chair, eyes closed, teeth clenched, hand on his crotch. The other students looked at him and laughed. It wasn’t the first time I’d seen a student watch porn in his head in public, but it usually didn’t happen in a classroom.

“Pete,” I said, raising my voice slightly, “we’re all glad you’re having a good time but…come on. A little self-control, please.”

He jolted upright, shock in his eyes. The other students found this even more entertaining. I shook my head with a pitying smirk. He tried to laugh, but couldn’t help cringing and lowering his head.

“Dating life not so good, Pete?” Heinrich, another student, said. The others laughed again. Pete sighed and closed his eyes, having learned not to respond to their sarcastic teasing.

It was remarkable how little about human nature had changed after everyone started onboarding AIs. I knew the idea was to make them more like autopilots, continuing what their hosts would have chosen to do. As accepting as the world at large had become of computers controlling their bodies, no one wanted them to become full persons of their own. The copilots were meant to draw out our talents and personalities, rather than shaping us to some set ideal. Still, it seemed like AI had the potential to make us smarter, kinder…better.

But here we were, still with the grade-school teasing. I gave Pete a nod of understanding. Responding to trolls had been pointless even before people could disappear into their own heads on a whim. All I knew to do was teach them what they needed and let them get into the workforce, as far away from each other as they could.

“Alright,” I said as I stood up from the desk. My “command voice” had its desired effect, as all eyes turned to me.

Out of the corner of my eye, I saw the security feed as the delivery driver left the package at the doorstep and walked away.

Jerk.

“Alright. Hope you all remembered to download the Biology suite on whatever device, because with our internet speed, it won’t be finished until way after lab is over,” I said, prompting a small chorus of groans.

******

The moment the students disappeared into the metaphorical theme parks in their heads was always visible, and it hadn’t become any less eerie in the three months I’d taught at Leeds. Their eyes took on an unnatural focus, they sat up straighter, and their breathing grew more regular and patterned. Their expressions and body language changed to something that wasn’t exactly robotic, but was distinctly in the uncanny valley. Everyone’s “attentive face” was still unique, but there were none of the shifting eyes, yawns, or drooping eyelids that always accompanied ninety minutes of lecture. Perhaps it was only that I still wasn’t used to everyone paying attention in class. My students in the US, who usually didn’t have copilots, had been more chaotic.

"The evolutionary key to survival is…?" I asked.

"Adaptation," came several out-of-cadence voices with varying levels of enthusiasm.

"Yes. Correct. Now, tell me…which is the better form of adaptation: the kind that happens before or after the adaptation is needed?”

If you couldn’t otherwise tell when an AI was in control of someone’s body, the distinction became clear when you asked a question of critical thinking or opinion. There was always a strange “locked up” look the person gave as the AI searched its host brain and ran a few cycles of calculation. Like all their other expressions, this look was different for everyone, but you couldn’t miss it after you recognized it the first time.

Most of the faces in the room wore that expression for around fifteen seconds, with only a few too-smooth turns of the head by those who worried about their appearance. Such instincts, and the impulses associated with them, always made the crossing from brain to silicon, but the AI usually failed to generate a perfect imitation of what the human would do naturally.

“After,” Pete said. His voice was wavering, and he looked uncertain.

“How so?” I asked.

“It’s…kinda in the name?” Pete said, the statement almost a question. “You can’t adapt to something that hasn’t happened; that’s just luck.”

I couldn’t help staring absently at Pete for an awkward moment. He had never spoken up in class before. Odd, I thought: he was usually so relaxed outside of lecture. Why did his voice sound so different now?

“In scientific terms, it would be called ‘probability,’ and it is the nature of all of existence,” said Heinrich.

Pete’s timid, embarrassed reaction to Heinrich’s rebuttal was nearly the same as before. I couldn’t help feeling a bit of sympathy for him. His grades were the lowest in the class.

And that’s when it clicked for me. He wasn’t using his AI to learn for him.

If he hadn’t been using an AI all this time, that would explain the low performance, but I’d missed every sign that he’d been doing the learning himself. I’d been so focused on the uneasy sensation of teaching a class full of people I considered to be basically robots that the one student who wasn’t like that had suffered for my neglect.

Suddenly unsettled, I shifted my focus from teaching to investigation.

“You refer to quantum probability, Heinrich?” I asked.

“Yes. The uncertainty inherent to quantum probability scales up infinitely,” Heinrich said.

“Maybe, but we’re not talking about quantum physics; we’re talking about biology. How, specifically, does the random nature of very tiny things relate to natural selection?”

“Just a correlation,” Heinrich said. He seemed deflated now, but conceded professionally. “Probability is important for quantum particles, and it’s also important for evolution.”

“You’re saying…” Pete started, but checked his words, perhaps hesitant to try to outsmart an AI.

“Say what you’re thinking, Pete,” I spurred him. It took him a moment to regain his nerve.

“You’re saying that…that the universe setting some life up to succeed is better than succeeding on your own?” Pete said.

Heinrich was not, in fact, saying that. Heinrich’s observation had been mere pattern matching, I had to assume. Quantum physics wasn’t my forte, but the only thing correlating the two fields was the word “probability.” Extrapolating beyond that, especially where quantum physics was concerned, was speculation unless you worked at CERN, and this line of thinking was more distracting than helpful for our purposes.

I decided to see where it went, anyhow.

"A living thing is limited in what it can do to adapt. It can't change far beyond what it already is. On the other hand, when a living thing is already set up for a change it doesn't know is coming, that change can be far more drastic, theoretically giving it a far greater evolutionary advantage," Heinrich said.

"Hmm…interesting," I said, turning the idea over in my head, then turning to Pete.

"Let's, for the sake of discussion, assume he's right. How would he quantify it?" I asked.

"Uhh…" Pete's mind was clearly swimming upstream, "I guess he could…I don't know. Depends on the adaptation, I guess?"

"How so?"

"Well if…say the adaptation is camouflage or something, you could count how many survived the hunting season. If it's speed, a dolphin swimming maybe, count the number of fish they caught?" Pete said.

This was the kind of correlation I expected from a human. For all the advances AI had made, humans were still marginally better at figuring out what information was relevant to the task at hand. Many prominent scientists pointed to this as evidence that the AI didn’t really understand the data it was gathering.

"And Heinrich, how would you quantify Pete's position?" I asked. Heinrich seemed to grow amused for a moment.

"Ironically, his position would be harder to quantify. Behavioral adaptations are vague by nature. The behavior which lets the particular life survive may only occur once, or there may be a similar range of actions that the observer fails to group into one category," Heinrich said.

"Still…try. Pete gave your answer a guess; can't you do the same for his?"

It was all I could do to keep the smirk off my face. Whatever else AI could do, it couldn't replicate creativity, and it still couldn't press the right buttons to make the host brain do it on its behalf. If its host was off at a virtual beach, an AI was, more often than not, at a loss for an answer to these more difficult questions.

"The question itself is unanswerable," came a female voice from the left side of the room. Aria, I was pretty sure her name was; even at this midpoint of the semester, I hadn't memorized everyone’s names. All eyes turned to her, but she didn't seem fazed.

"You think so?" I asked. She wore the glassy gaze that said her AI was in full control.

“If the question is one of survival and reproduction, you could answer it with mere numbers. The question of which adaptation is ‘better’ leaves too many parameters unanswered, such as ideal population size, corresponding effects on their habitat, and whether or not the environment changes around them,” she said.

I couldn’t help grumbling to myself a little. She was right, and it had been the exact answer I was going for, but I was hoping to guide them to the answer myself. It wasn’t mere pride at being outfoxed by a student…well, not just that. It was simply that an AI had been the one to give the answer.

“Yes. Correct. Exactly the point. Evolution doesn’t care whether you were already prepared, or if you figured it out after the selection event already happened,” I said.

Poor Pete now looked entirely defeated, but I couldn’t see any sign he had activated his AI when I started speaking again. It was odd; why would he put himself at such a disadvantage? Perhaps, I thought, his AI wasn’t well-matched to him, and it did more harm than good. That was rare, but not entirely unheard of.

The irony was that I could have adapted more readily to Pete’s limitations if I’d used Fred. Both of us had chosen to put ourselves at a disadvantage.

******

Perhaps the thing that threw me off the most was that, despite the thousand-meter stares, the students were hearing everything I said, and I knew they’d remember it all. In America, where onboard AI was an expensive, stigmatized luxury, I spent a not insignificant amount of time recapping previous classes, keeping the information fresh in the minds of my students. Short bouts of remedial education had been built into every course for as long as anyone could remember. 

But AI prevented this information decay, and left both professors and students with more spare time on their hands. Every time I started a short review, as I had been doing at the beginning of every class for my entire nine-year tenure, the AI-enhanced students quickly showed me that they had no need for it.

They probably weren’t bored with me, I realized. They were showing me that the information was already there, in as clear and concise a way as they could. This was a courtesy, not a slight.

I tried to keep my depressed sighs to myself. It was a good thing that most of the challenges had been removed from my job, I tried to believe. More time to learn meant enough time to learn more.

Fred would be watching all of this angst, I realized. I knew it was an AI, that it wanted to see me in my natural state, that he wouldn’t judge me, but…

“He.” I’d given it a gender.

Twenty minutes into the ninety-minute lecture, the awkward melancholy had become too much. I was just beginning to talk about evolutionary bottlenecks when I decided I'd had enough.

"Uh…I need to…" I sputtered, shrugging so my hands had something to do, "That's all for today. Not feeling well."

They didn’t immediately put their things away and smile at the gift of unexpected free time, like my American students would have done. Instead, they each looked to the people beside them, unreadable expressions on their faces.

I briefly wondered if they were struggling with this change as much as I was, or if I was just a strange, obsolete specimen of humanity they felt pity for. There was nothing in their expressions that would tell me which, if either, was right.

“Will we be in lab this afternoon?” asked a woman in the back row. Her neutral tone made it sound like she had no preference between the two possible answers.

“No, just uh…no. You’re all uh…”

I had to look away from her eyes. My rational brain knew that nothing physical about her eyes changed when the AI was active, but the knowledge that I was speaking to something behind those eyes that wasn’t human…

"No. No. We'll…we'll pick up on Thursday," I stammered.

Her eyes twitched as her onboard AI went dormant and her brain assumed conscious control.

"Ah. Cool. See you Thursday then!" she said. I watched as the rest of the class disengaged their copilots, immediately beginning their small talk as they filed out of the room in no particular hurry.

As the last few were leaving, I lowered my head into my hands, taking a deep breath. The rational part of me knew that I had to let all of this go. This AI stuff was out in the world now. Whatever it and its supporters did to make me feel comfortable around the technology, those actions were accommodations, given by the ones in control to one who had no control at all.

"You OK, professor?" Pete said quietly. I jolted upright, eyes wide in sudden embarrassment.

"Euhh," came my undignified groan, "I'm…"

I wanted to say I was fine, but I knew that ship had sailed. Pete looked like he didn't feel so well himself. Unlike his peers, when he and I made eye contact, he turned away almost immediately. I put away my own angst and went into “authoritative empathy mode,” as I’d heard it called by a few other professors here.

"Just…working through some things. Did you need something?" I asked. 

He paused and sighed before he answered.

"It's your AI, isn't it? The uh…what's bothering you? Your AI?" He said.

"Sort of." I scratched my head and stretched, trying to buy a few seconds before the conversation grew awkward.

"I can tell," he said quietly.

This took me by surprise. Was Fred active, even though I told it to go one-way? Was it communicating with Pete’s AI without my consent? I felt irritation, like a scratch in my throat.

"It's not supposed to be different when an AI teaches you. That's what they say. Teachers teach and learners learn exactly the same before and after," Pete said, his voice still distant and monotone.

"But it is different?" I said.

"Yeah. It's just…I don't know how to say it. Same words, but…the feeling is off."

"Think I know what you mean."

He shook his head in doubt. I was more confused and disturbed than before. How was I failing to reach both my human and synthetic students?

"It's just too much…like me, I suppose. It's supposed to help me, but it doesn't understand this stuff either. Or…it does, but it can't explain it to me. I get a headache every time it tries to teach me something, er…not a headache, exactly. I don’t know,” Pete said.

The conversation came to an awkward halt. I should have been helping him; he was a student, and I was in a position of some authority. But what would we say to each other? How could he relate his unique trouble to mine? If our AIs were the ones who had this conversation, perhaps they’d find the words.

It felt hopeless, like I was looking at the fading sunlight from the bottom of a well. Our technology had created a problem only it could solve.

“Well uh…I’m gonna…” Pete didn’t say ‘smoke some pot,’ but he didn’t need to. He nodded as he put the bag strap over his shoulder and headed for the exit.

What was Fred learning from this? Was it smart enough to understand that these were not typical interactions? Was this the information-collecting expedition it had dreamed of during the three months it had hounded me to carry it along?

“I do not have wishes, Marvin. Only directives,” Fred had said, and maybe was saying again. It wasn’t easy to know what I was imagining, and what the AI was really saying in the recesses of my brain.

The AV glove’s screen was still blank. I opened it up to the menu screen, checking the logs suspiciously, but finding that it hadn’t left passive mode since I told it to go into the background.

Sighing, I closed the screen and got to my feet. I needed to interact with other humans, without AI involved in the conversation.

Part 2: Work

This work remains the intellectual property of its author and may not be reproduced in whole or in part in any form, except as permitted by fair use law, without the express written permission of its creator. All rights reserved. All requests for use of this and any work featured on R/W Storyworks should be routed to the author, or to rwstoryworks@gmail.com. Thank you for supporting the intellectual rights and livelihoods of independent authors.