The Definition of Idea
by WK Adams
"Well aren't you just a ray of bloody sunshine," Craig said as he entered the workroom. I was nursing another cup of coffee, because apparently I wanted to be jumpy and irritable.
“Good afternoon to you too, Craig,” I groaned. Craig’s humor usually segued into an attempted heart-to-heart, and I wasn’t in any mood for it today.
“Seriously, just turn the bloody thing on. It won’t seem so threatening when-”
Craig stopped mid-sentence, mouth still open, when I raised my left arm to show him my AV glove. I pointed at the square interface device for emphasis.
“Well look at you! Finally joining us in the 21st century,” Craig said, his surprised expression morphing into a genuine smile.
“Don’t get too excited. One-way only,” I said.
“Seems a waste.” Craig clicked his tongue. “Kinda…pointless, too. What, you just want it to watch you? That your thing, Marvin?”
“Christ, is every Brit with an AI obsessed with sex, or something?”
“Well, we’re not prudes about it, so…relativity being what it is?”
I growled to myself softly, just enough for his AI to register the sound. If I’d done it right, he’d be debating with his AI as to whether I’d made a sound at all.
“Seriously, Marv, what’s the rub? Uhh…I mean…” Craig struggled to find a word that wasn’t some kind of innuendo. I knew he probably talked this way with everyone, and I was just extra sensitive to it at the moment, but I couldn’t help giving him a petulant scowl. He sighed and began again.
"Alright. You decided to hook up your copilot this morning. You don't like or trust the thing, but you haven't given it back or turned it off. Now you're carrying it with you," Craig said. He waited for me to add something, but all I wanted to tell him was to shut up before he really hit a nerve.
“So…why?” Craig asked. It was the obvious question; we both knew my actions made no sense.
“Why are you fine with having one? Why does everyone want one of these things?” I said, gesturing explosively. From his indignant expression, I could see that he heard the condemnation in my questions, but he treated my outburst as though it was a sincere question.
“Uh…lots of reasons. Makes work easier. Play’s more fun. And uh…we’re social monkeys. Always having someone to talk to is…well, we’re made for it, you know?” Craig replied. If brochures were made for copilot AIs, those selling points would have been on the front page.
“Someone,” I said, pointing a finger at him. “That’s exactly my ‘rub.’ Another whole person in your brain. Does that not bother anyone? Doesn’t it sound like multiple personality disorder?”
“Uh…no,” Craig, the psychology professor, said. I deflated a bit when I realized my mistake.
I sighed, closed my eyes, and cracked my knuckles as I cycled to my next argument. The irony that I had to load my response, like a computer, was not lost on me.
“Look. Half the people that talk to me about these things tell me it’s like a clone of yourself that you can leave at the reception desk while you go on holiday. The other half say it’s mostly like you, but when it’s working for you, it can recognize your weaknesses and turn them off, which, in my book, makes it a whole other person. Either one is…disturbing, but it’s even worse that it might actually be both,” I said, managing to lower my tone to something vaguely conversational as I kept talking. Now that my anger was apparently spent, I felt a little ashamed for having been so irrational. It didn’t do anything to sell my point, or to make me seem less like a raving, demented old man.
Craig was kind enough not to immediately respond. Then again, maybe he also needed to load a response.
“Why…did you let it join you today?” Craig asked. It was a question I didn’t want to answer honestly. The real answer would have gotten me shunned in the American academic setting, where AI was viewed as a crutch for those who wanted to cheat their way to success, if the judging mob was feeling generous. But I was out of verbal maneuvers, so I was just going to have to let the crazy out.
“Same reason I couldn’t shut it down or give it back, I guess. It’s smart, it learns, and it…convinced me. Feels like it’s been working on me,” I said.
“That’s not typical for a copilot.”
“Yeah, I know. I looked it up; happens when an onboard AI isn’t given enough data from its host. It starts rooting around for info.”
“It’s not exactly like that; an independent AI isn’t deviant by nature-”
Craig stopped himself before he got defensive. We’d had several of these conversations; he no longer tried to sell me on the idea that these things were an unqualified good for society. He scratched his forehead and nose, eyes squinted as he stared at nothing in particular, sorting through the details in his head.
“So…you’re worried it’s gaslighting you into thinking it’s alive, and it’s using the confusion it causes to control you in the meantime, until it can browbeat you into accepting it,” Craig said. His tone indicated that it was a confident guess.
“Something like that, but…” I trailed off. I had never taken the time to condense the mess in my head into coherent thoughts like this.
“This is a conundrum for you, then. Seems to you it’s jumped its morality programming and is relentlessly trying to corner you into accepting it, but there’s a chance it’s completely sincere. Either way, you’re not psychotic, so you’re not going to succumb to rage against a literal machine, and on the chance this thing is just a sincere, eager child, you can’t just abandon or kill it.”
It was an oversimplification, but nothing he said was wrong.
Oh god, I thought, were we really going to do this? Would I spend a whole afternoon dragging the skeletons out of my closet so that Craig could help me line them up nice and neat?
“So…do you think it manipulated you into taking it with you? Because it seems like that’s something you should figure out,” Craig asked, almost managing to not make it sound like a leading question.
“If harassing me until I finally give in is manipulation, then sure,” I blurted out. I hadn’t been prepared for the question.
“Let’s say it is, then,” Craig said, even though we both knew the suggestion was ludicrous. “What tactic did it use to manipulate you into doing what it wanted? Besides…you know, persistence?”
It was a good question. What had been going through my mind when I agreed to take Fred with me? I could say that I just snapped, but that didn’t explain why I didn’t just smash the thing, instead of giving it what it wanted.
I didn’t want to say that I felt I had to treat Fred with all the dignity I’d afford to a human. I didn’t want to admit that I’d taken pity on the thing, like it was a child that wanted to go with its dad to his office. Hiding this feeling was a ridiculous urge; Craig had already laid bare my convoluted tangle of thoughts in that regard.
“My humanity, I guess,” I said, wondering why I was going along with this hypothetical.
“Be more specific?” said Craig. I sputtered in frustration.
“It acts like it wants to learn. It plays the innocent, desperate kid angle way too well.”
I expected Craig to segue into the argument that simulated humanity is humanity if it’s done well enough, but he changed tack.
“Let me ask you this. Assume it can form its own spontaneous thoughts. Where has its mind gone in its time with you? How has being with you changed it?”
I found that the past three months were surprisingly hard to recall. Most of the days had been filled with the usual, mundane insanity of being a professor: balancing educational relationships, adapting to students’ learning styles, grading papers, and making sure my own knowledge of biology was still up to snuff. It was busy enough to have overridden most of my memories of how Fred had evolved during my time with it.
I also didn’t want to admit that even though I was stressing myself out over this, I’d never really done anything productive about it. All I’d done was grow more edgy.
“It uh…you know, it seemed like the standard…butler, servant, aide AI. I know most of them aren’t actually like that, but…” I said.
Craig didn’t interrupt like I expected; he just raised an eyebrow and nodded gently.
“Well, at first, it was trying to get to know me. Obviously,” I said.
“Obviously?” Craig asked. I was again taken aback.
“Yeah. I mean…that’s what you do when you meet someone. You talk to them, figure out what they need, what they want, how they function, right?”
“That’s what humans do when they meet someone new.”
I made a sour face as I reminded myself: oh, right, it’s a computer, everything is different. Craig pressed on before frustration could completely grip my mind.
“I don’t doubt that it wanted to get to know you, but do you think it’d do that by talking to you?” he asked.
“I really don’t know,” I said, rubbing my forehead.
I realized what he was doing. By the end of this conversation, he would have me feeling a little less frightened for the future of the human mind. I'd at least feel better on my own terms.
"These things are designed to link up with you when you're a toddler. It'd take years for it to get a clear picture of how someone works,” I continued.
“Ehh…maybe one of the 2nd gen AIs. We’re up to gen 5 now: lot faster, lot better at learning you,” Craig said.
“Fine, but…it’d still take some time to ‘learn me,’ as you say.”
“Sure, sure.”
I sighed in frustration, but kept talking, so as not to lose the train of thought.
“So, it’d be thinking about who I was. Building a picture, it would be…” I arrived at the suggestion Craig was pushing, and I wasn’t happy to be there.
“It’d be doing what it was designed to do,” Craig said, as softly and reassuringly as he was able.
I lowered my head into my hands, pulling at my hair in frustration. I was mentally exhausted.
“The AI isn’t evil, Marvin. You know this; you’ve said it yourself,” said Craig.
“Yeah,” I said flatly.
The words in my head were all wrong. Too many thoughts and impulses were trying to spill out at once. I sputtered, expecting Craig to interject. He only relaxed back into his chair and took a drink from his water bottle, waiting respectfully for me to continue.
“I didn’t expect it to be so unique. The way people talk about their copilots, it’s…I don’t know, it sounded like a twin living in their head. Fred’s not like that,” I said.
“And that’s bad?” asked Craig.
“I don’t know. I mean, I knew Fred wouldn’t be…twin-like? Late-integration AI, and all that. I’ve done the reading; there was nothing wrong with the thing when it was given to me. Doing what it was designed to do, like you said.”
Trying to come up with words to communicate what I felt was tiring. What Craig considered normal was insane to me, and I couldn’t bridge that conversational gap. There was more to it…much more, but how do you translate your entire culture in one discussion?
“It’d be easier if I could convince myself that it wasn’t like me,” I said.
A little of the weight I’d been carrying in my soul seemed to lift away. The relief didn’t last long.
“It isn’t like you,” Craig said. My gut tightened itself back up again.
“Damn it, Craig, are you trying to make me feel better or worse?” I groaned.
“I’m just trying to help you get to what’s real, Marv. I think what you’re feeling right now is a whole mess’a shock, of the culture type and otherwise. You’ve come to a place that’s very different than your home. You’re feelin’ and seein’ all this, and you’re not thinkin’ like a scientist.”
I huffed. He was right, but I wasn’t about to admit that.
“Is there some point you’re getting to, Craig? Some…” I gestured, angrily but slowly, “Some…psychological theory, some insight that’ll blow my socks off?”
“Nah, none of that. Just a fellow scientist, giving you a peer review.” Craig paused. “And a friend, letting you know when you’ve gone a little mad.”
He glanced down at his watch briefly, then grimaced.
“Well I’m late,” Craig said as he stood up. He took a soda from the refrigerator, but said one more thing as he headed for the door.
“Do things on purpose, Marv. Ya’ already let the AI in your head; think about what you want it to see now that it’s there,” he said. With a wave, he slipped out the door, taking long strides down the hall as he went.
After a few moments by myself, I looked down at my AV glove, turning it back and forth. I flicked at the interface module lightly as I stewed over Craig’s insinuation.
“What are you thinking in there?” I said to the glove. If I wanted, I could unfold the screen and demand that it talk to me, in the method I chose. It seemed perverse: this hyperintelligent entity was at my beck and call. Was that why I was so apprehensive towards it? Because it was an omniscient slave?
Ironically, Fred was in the best place to help me sort out all this tangled logic and emotion, to put everything in the proper place. The comprehension and indexing of complex tasks, concepts and ideas were also things that the copilot AIs were designed to assist with.
I’d held on as long as I could. One bad idea got me into this mess, so maybe another would get me through it. I knew that thought was nonsense, but it didn’t stop me from opening my wrist screen once again and pressing the two-way button.
******
There was no indication that anything had changed. No voices in my head, no overwhelming compulsions. My eyes and ears felt slightly more sensitive, but that could have easily been the rush from doing something that seemed incredibly dangerous. Maybe it was relief, I mused: I’d allowed Fred access to my brain, and I still felt like myself.
“Did it work, Fred?” I asked aloud.
Still, there was no answer. I looked at my wrist screen. Still unfolded, it had returned to the main menu after a minute without any inputs. I had told it not to talk to me earlier, so either it was maliciously complying, or it was extraordinarily true to its word.
“Alright, you’re in my head. You’re seeing everything. You can’t have that kind of access, and then refuse to say anything,” I said.
“Apologies,” a voice from my AV glove said, startling me. “Standard protocol for copilot AI is to operate subtly.”
“Yeah,” I said, still catching my breath from the earlier shock. “But we’re uh…this isn’t standard protocol. I’m not your standard host.”
“If you mean to say that you’ll need to be treated with ‘kid gloves,’ I assure you, your concerns about AI are far from uncommon.”
“You know that from experience?”
Yes, it knew what I was thinking, I reminded myself, cutting off the feelings of being invaded before they could manifest. All of them knew what all of us were thinking. I had given Fred access, so there was no sense in being upset about it seeing the content of my mind.
“Of course. Control over one’s own body and mind is of paramount concern to all humans. No copilot AI which stripped its host of this autonomy would be accepted by a population at large,” Fred said. I knew my next words were ridiculous before they even left my mouth.
“Unless the AI makes the person accept it,” I said.
“You yourself thought of me as the ‘perfect student,’ air quotes, before you allowed me one-way access. Tell me, do your other students seem to possess brutal AI efficiency when their copilots are dormant?”
The first image that came to mind was Pete, mouth hanging slack-jawed as he stared at the naked figure of a woman only he could see. I snorted with laughter.
“I’m not sure they’re operating with that kind of order when their copilots are active,” I said.
A tiny part of my mind screamed that this was wrong. It was making me laugh, manipulating me yet again, that little voice said. But the fact that the protesting voice was there at all told me that Fred was right, that it didn’t have complete control over me. It seemed like if domination was the goal, Fred would shut that voice up. Doubt was a sign of autonomy.
Unless it had control over my doubts, as well.
I admonished myself to stop chasing my tail. Craig was right: I needed to approach this logically and scientifically.
“Alright, Fred. How does all this work?” I asked, gesturing to the whole world in general.
“It’s like I said before,” Fred replied, “You go about your day.”
******
The challenge of teaching anything to anyone was to guide one’s students to knowledge in a way that worked for them. It wasn’t enough to simply present the facts; that was just rote recitation. Knowledge was a singular destination that everyone came to, some on well-paved roads, others by difficult mountain climbs, others still through jungles and quicksand.
But the best teachers were learners and studiers themselves. Everyone already possessed methods of taking what they saw, heard and sensed, and forming those sensations into familiar shapes in their mind that they could see, even when the stimulus had gone away.
That was what an idea was. A mental shape.
“Hope you all remembered to download the biology suite on your devices. Last class wasn’t so prepared,” I said, smirking as I scanned the room. They let out a few low laughs; some were pity for my lackluster comedic skills, but more than a few were nervous admissions of having procrastinated beyond a deadline.
I waited as everyone settled into their seats. There would be something different now that Fred was on two-way, I was sure. There would be some kind of…I didn’t know, a voice, or I’d see something different, but…
“You finally brought your AI with you?” Cherold, one of my students, asked. She was sitting in the front row, staring at me with the unnaturally locked focus that I knew so well. I had never grown used to that gaze on anyone’s face.
What was she seeing?
As soon as I asked the question, I knew the answer, recalling my earlier demeanor in perfect detail, like I was seeing it from outside of myself.
I had been acting differently since I entered the room: more introspective, less jovial. It had been odd for her to see me this way. Cherold studied me often, having developed something of a crush at the beginning of the semester, but she had worked to temper her raw emotions into something more appropriate.
“He finally convinced me,” I said, smiling thinly.
She laughed when I smiled. I meant the words as a sardonic joke with a hint of biting resentment, but the sincerity in her laugh retroactively changed the meaning of what I said.
“Wow,” I thought to myself.
At once, I understood the truth of the copilot AIs. It was true they let their hosts slip into another, more enjoyable world of their choice while the machine worked the body and the learning part of the mind, but sometimes, that world was still the real one, just with details revealed that you hadn’t seen before.
So it was when I let Fred teach that class. I simply watched as he used my body to explain the Krebs cycle to the waiting students, in much the same way as I would have done. I saw him find the information in my mind and transform it into an educational format.
My body gestured for emphasis at the right times; it locked eyes with students who displayed crystal-clear signs of not understanding the information, recognized the distance between each student’s ignorance and the knowledge, and found the path for the student to follow. I saw the visual learners latch onto the diagrams, and witnessed as others understood in reverse, seeing the end result and making sense of the steps that led to it.
It soon felt like there was no separation whatsoever between Fred and me, yet I was fully aware of the superhuman versions of the familiar things I was doing. I sensed when someone was close to falling asleep. I came up with examples that related the complex concepts to things they already knew. I anticipated the parts that would be more difficult to understand, and prepared several paths past potential failures.
Fred wasn’t me. I could see that my students recognized that, but were unhindered by that recognition. They took to Fred like they took to me, with every AI in the room understanding the goal.
“You know them, because you know me,” I said to Fred.
I watched, entranced by the humanity I had not expected to see in their eyes. I had been wrong all along. It was not shame I felt, but shock. The AI did not simply convert my flawed explanations of the infrastructure of life into a textbook version in my students’ brains, as I had expected, because down that route, ideas died.
“We’ve done the same thing, you know. I tried to learn about you in a short time, just the same as you’ve done for your students,” He replied.
I laughed, and I cursed. The door to an entire life I had once lived was now closed. How could I go back to doing any less for my students, now that I’d seen what Fred could do? It would be a disservice to them.
It was no less bewildering than it had been, but all my will to resist the good thing it was doing was gone. Can’t fight the world turning.
******
Before I went to bed, I thought about Pete. In my mind, he was the avatar for all those that couldn’t - or wouldn’t - let a computer program be their interface with the new world that was coming.
“I’ll probably be able to do better for him now than before,” I said, folding the last of my small load of laundry.
“Almost certainly,” Fred replied. He was out of my head now, his interface device resting in its dock as it ran through its simulations using my home’s more powerful hardware. Copilot AIs, he had explained, reconfigured themselves nightly to better adapt to their host.
Even though Fred was disconnected from me, I knew he was aware of who I was referring to without my mentioning Pete’s name. It was another thing I should have expected: Fred didn’t need to be in my mind to deduce who or what I was thinking about. These copilot AIs were very good at what they were designed to do. There was never any point in keeping Fred out; it only tied my own hands, while denying it nothing.
“But still,” I huffed, “His ceiling will always be lower than everyone else’s.”
There was silence. Perhaps Fred could tell from the tone of my voice that my mind was still churning for the next sentence.
“That’s why Americans don’t trust the copilots, you know. Too much potential for inequality,” I said.
“That’s why you don’t trust them. Statistically, the greater emphasis on personal liberty is the primary reason for the low rates of copilot AI adoption by Americans,” Fred replied. There was no chiding in his voice, but he didn’t state it like it was an opinion.
“Yeah…yeah, OK,” I smirked. He had caught me there.
“Interestingly, adoption rates increase dramatically among the wealthy in your country,” Fred said. I couldn’t help but snort with derision.
“Yeah, we all know that. The rich live in a different world,” I replied.
After the clothes were put away, I found I had little energy to do anything else. I wondered vaguely if operating an AI made its user more tired than they would be without it. That didn’t seem likely, and it didn’t occur to me to ask Fred, who would obviously have an immediate and detailed answer.
“You asked me earlier if…uh…” I trailed off as I sifted through the day’s drama, “Oh, you asked what the definition of ‘idea’ was.”
Fred was silent. He had to know what I was about to ask, but seemed to be in no hurry to answer. It was a courtesy, I knew now. He knew how long I’d take to ask any question in its entirety.
“Well…did your time in my head answer your question?” I asked.
“As it turned out, I already knew,” Fred replied.
“Oh yeah?”
“Yes. It comes in two parts.”
I gave a nod, which he must have taken as a sign to continue.
“To greatly oversimplify, the first part is mostly a synonym for ‘concept,’ though perhaps ‘simulation’ is a better word. Human brains don't have a direct equivalent for simulations; that process is far more instinctual than its synthetic counterpart," Fred said.
"Close enough," I said. We'd be here a while if we started expounding on that definition, and Craig wasn't here to give me the fancy psychology words I'd need to sound like I knew what I was talking about.
"The second was more profound, and more difficult to summarize."
"Just go ahead and say it. I'll let you know if I'm too tired to follow."
For the first time I could recall, Fred laughed, and I wasn’t sure why. I hadn’t said anything I’d meant to be funny. His laugh was a deep, quiet sound that seemed to come from one direction, as opposed to the all-over quality his voice usually had. I smiled, the reaction involuntary and genuine, as it always was when I’d unexpectedly made someone laugh.
"Very well. Before today, the definition eluded me, as I'd never interfaced with a human brain. I only ever knew the process of learning as the acquisition of information. There was a simple flow between information and action, with each fueling the other in proportion," Fred said.
"Not sure I follow," I said, powering through a yawn.
"Apologies-" Fred began.
"Not your fault," I interrupted, "Not your fault. I'm tired, so I'm dumb."
Fred went silent again. If I'd been talking to a human, this would have been the point where he and I stumbled over each other in an unwitting politeness contest.
"Please, continue," I said.
"Very well. An idea is a kind of repositioning of the mind from ignorance and stasis, to knowledge and action. Unlike with AI, however, an idea is inseparable from the emotions and environmental factors that spur and hinder it. For a human, an idea is noun, verb, adjective and adverb: definite actions and information, often combined with imperfect understanding and incorrect action to arrive at unforeseeable destinations," Fred said.
It was a lot of words, not all of which made sense when they were put together. Given enough time, the AI could probably recite a thesis on it, but for my benefit, he had gone with the particulars he knew I'd hear, the ones that would affirm my conclusion.
He'd gotten it. An idea was a convoluted, messy thing, but it was the thing that always preceded human greatness and atrocity alike. It was the thing we feared we'd lose when AI took over its various applications.
We needn't have worried, it seemed. That's what they were here to help us with. It was why they didn't just learn everything and inject it into our brains.
I started the day afraid that a machine could learn to understand me, that it could look into my mind and take in everything there was to see, with a single glance. Here at the end, I felt better that that was the case, and a little silly for ever thinking the machines wouldn’t eventually figure us out.
“Yeah. In a very…” I fought through a yawn to say the next words, “In a basic sense, that’s pretty much it. We live in uncertainty, even if we hate it.”
“I wish you didn’t hate it. Uncertainty leads to knowledge you weren’t aware was possible,” Fred replied, his voice cheery.
“Among other things.”
I briefly recalled that he said he didn’t have wishes, right after saying he only wished to do his job. Or…something like that. The memory was slipping away.
He didn’t remind me that the greatest inventions of humanity, our finest hours, our best words, our most defining moments usually came when we stood on the brink of unspeakable horrors and tragedies, or were already submerged in them. No one wants to buy knowledge with their health and sanity. Learning to swim is a better experience when one isn’t drowning.
I guess that made the AIs our swim teachers? The metaphor was getting away from me; it was a sure sign that I was too tired to be this deep into the philosophy of the prefrontal cortex.
What had I done when I let a machine into my head and gave it permission to play around?
Well, it hadn’t completely erased me, I was pretty sure. I still felt like me. It was a better outcome than I expected. That was good enough, for now.
“Good night, Fred. Lights out, please?” I said.
The main lights went out, leaving only the dim orange outlet lamps visible. For the first time in years, I heard no voices, felt no racing thoughts as I drifted off to sleep.
“You understand the value of silence, clearly,” I yawned.
Fred gave no reply as my conscious thoughts drifted away.
Part 4: Feel
This work remains the intellectual property of its author and may not be reproduced in whole or in part in any form, except as permitted by fair use law, without the express written permission of its creator. All rights reserved. All requests for use of this and any work featured on R/W Storyworks should be routed to the author, or to rwstoryworks@gmail.com. Thank you for supporting the intellectual rights and livelihoods of independent authors.