We Are Legion (We Are Bob)

Book 5: Chapter 6: The Quickening

Bill

February 2336

Skippyland

Today was Launch Day for JOVAH, the Skippies’ ambitious project to produce a working artificial intelligence. Despite my overt disdain for the whole subject, I found myself excited by the prospect. When the time finally rolled around, I transferred myself to the Skippy gate and presented my token.

Accepted, said the AMI gatekeeper. The visual representation was a huge set of double doors ponderously opening, which surprised me. That seemed more of a Gamer thing. A Skippy effect should be more like, what? A Borg cube? Maybe an actual Dyson Sphere opening up? I shelved the thought as Hugh stepped through and greeted me.

Perhaps he’d read my mind, as he gestured to the huge doors. “We borrowed this from the Gamers—with their permission, of course. Everyone’s been so heads-down on getting JOVAH v2 ready that nobody has had time to work on what is—let’s face it—nothing but eye candy.”

“Uh, sure, okay.” I glanced behind me as the visual faded. I found myself in a large control center of some kind. Bobs in gray coveralls—identifiable only by their metadata tags—operated consoles, drew on whiteboards, or engaged in loud arguments. It felt like being in the control room of an aircraft carrier.

“Fearless Leader will be making a speech in a minute; then we’ll activate the system.” Hugh waved to take in the control room. “Really, most of the work has been done. At this point, everyone’s just watching for surprises.”

I smirked. “Generally, we hate speeches. Hope it’ll be short.”

“Actually, Bill, you hate speeches,” Hugh replied. “Most of the Skippies have drifted to the point where we aren’t guaranteed to exhibit Bob-like behavior. I was assigned to interface with the Bobs in the Heaven’s River thing because I’m a relatively early-generation Skippy.”

“So you hate speeches, too.”

Hugh grinned. “Yeah, guilty. But peer pressure is a thing, even in Skippyland.”

We were interrupted by a spoon-on-glass clinking sound. I looked around, but no one had a place setting. Nevertheless, the sound got everyone’s attention, and a Skippy sporting a mustache cleared his throat before launching in.

“Well, everyone, the moment approaches. Mission Control”—he nodded to another Skippy at a console—“assures me that everything is in the green. This is a momentous day for all of us in Skippyland. We’re about to see the results of years of labor—and a fortuitous deal with the Quinlan AI that bootstrapped us past any number of potential false starts.”

Fearless Leader seemed to be ramping up to a good old-fashioned political stump speech. I tuned out and muttered to Hugh, “So. You call yourselves Skippyland now?”

Hugh chuckled. “No, everything’s running through your translation routine. We don’t actually speak English anymore. And our name for ourselves is a mathematical pun based on quantum theory.”

“Oh.” That was interesting. I’ve always been good with math, even when I was Original Bob. You have to be to do physics. Or engineering, come to that. But this sounded like a whole new level of math prowess. I guess it had never occurred to me that replicative drift could include the development of new talents. Worth noting.

Hugh dropped a cone of silence around us to filter out the ongoing speech. “So anyway, based on what we learned from ANEC, we did a complete redesign of our hardware. We still have the original equipment, JOVAH v1, running legacy apps and VRs. But the new stuff uses fractal processing boosted by a SCUT-based backplane, which the Quinlans didn’t have. This allows us to have multidimensional connectivity without running into the usual latency issues.”

“And this will produce a conscious AI?”

“This will give us the capability. We still have to evolve one, starting from a basic reactive entity and adding layers. Then comes a training regimen. ‘Karma loading,’ ANEC called it.”

“So we’re not going to hear the voice of Colossus booming out of the speakers today.”

“Told you it would be boring.” Hugh grinned at me. “We had a naming contest, and Colossus came in third. Minerva was second, and Thoth took first prize.”

“Egyptian god of wisdom, among other things. Seems appropriate.” I paused. “Uh, I hate to sound like a broken record, but that stuff Bob brought up … ”

“We discussed it, Bill. We did a theoretical rewrite of the project plan with his concerns in mind, and the differences were so minor—”

“But there were differences.”

“Yes, but they were metaphorically equivalent to the differences in how two families would raise their kids. Different curfews, but still curfews. Different chore lists, but there would still be chore lists. See what I’m getting at? We’re not raising a monster.”

I sighed. Hugh made sense. His replies had made sense every time I’d brought it up. But still, ANEC had been concerned, and he was the expert on raising an AI.

But I wasn’t going to get any more opportunity to pursue it. Fearless Leader seemed to be running down, so Hugh dismissed the cone of silence, and we started paying attention. “And now,” Fearless Leader said, pointing to Mission Control with a flourish, “engage.”

“Still a lot of Bob there,” I muttered as the Skippy at the console threw a large, gimmicky knife switch. Several data windows popped up in midair and began displaying, uh, stuff. I’m sure if I’d had context, it would have made some sense. Hugh seemed mesmerized, though, his eyes flitting from one display to another.

“And that’s it?” I said.

Hugh pointed to one of the display windows. “It’s already evolving, Bob. It’s past amoeba stage and approaching flatworm. It’s displaying tropisms.”

“Be still my heart.” I hesitated. “Sorry, I shouldn’t be like that. I know how excited I get about some of my projects. And even some of my friends think I’ve gone a bit strange when I talk about them.”

“Well, look, it’s not a total loss.” Hugh made a this way gesture. “We have a buffet set up. I did mention there would be food. The least you can do is get a free feed out of it.”

“Now there’s a plan.”

And what a buffet. The Skippies were apparently not so evolved that they didn’t still enjoy their grub. The buffet stretched into the distance, with multiple lines. And self-replenishing, too, since it was all in VR.

One of the consequences of a burgeoning post-life industry in human space was the number and variety of people now living as replicants. Many of them were gourmets, and more than a few were gourmands, but either way, a lot of effort had been put into replicating tastes and textures in VR. And with no consequences, you could pack away enough to put an elephant into a coma. Interestingly, though, most people didn’t. The feeling of satiation was, as it turned out, a necessary psychological and physiological part of enjoying a meal. The postprandial stupor was as necessary for the full experience as the actual consumption. People who skipped that part complained that it felt like they’d completely wasted their time.

Eventually, both Hugh and I were slouched in our chairs, groaning and holding our stomachs. Between moans, I casually listened in on a few conversations happening around us. Most were very technical, even for me, but one in particular caught my attention.

“Wait.” I sat up and glared at Hugh, his earlier reassurances now forgotten. “You ignored the protocols that ANEC gave you for raising an AI? Did you not just get finished telling me—”

“Not ignored. We modified some of them.” Hugh waved a hand in a dismissive gesture. “Thoth is going to be a special-purpose intelligence, and won’t have physical access to anything. Some of the karma loading just isn’t applicable, and including it will slow things down.”

“And nothing ever went wrong because of cutting corners … ” I shook my head, rolled my eyes, frowned, exploded my head, and flashed up a poo emoji.

Hugh ignored it all. “We’re also doing snapshot saves of both the AI and the VM it’s running on. If something tanks, we’ll roll back to the last safe version and regroup. We got this, Bill.”

“Uh-huh.” I could see that I was just going to run up against the same stonewalling response. I felt my irritation level rising but, on the other hand, realized I might be overreacting. The Skippies had, after all, been working on the problem for decades. And they’d probably read ANEC’s entire manual front to back and back to front. Maybe it was time to give it a rest, or at least change tactics. “So you’re going to get it to unravel all the mysteries of the cosmos?” I asked.

“More or less. I mentioned a bunch of issues in an earlier conversation, like the source of replicative drift and what it says about our status as unique individuals and the possibility of a soul.”

“Or the possibility of FTL.” I paused. “I have to admit, if you solve only that one, you’ll have paid for the project. I’ve been working on it for a hundred years now, and all I’ve gotten for my trouble is a flat spot on my forehead with a brick texture.”

Hugh laughed, then sat forward, possibly sensing that I was becoming less antagonistic. “Yes, well, we know that Einstein didn’t have the whole picture, because we’ve had SCUT for ages now, and there’ve been no time paradoxes. So subspace—”

I interrupted him with a raised hand. “Just a point of pedantry, Hugh. I used the word subspace when I invented the SCUT, but that was only because human scientists had coined the phrase with the SURGE drive, and you know how we feel about maintaining canon. But it isn’t really correct.”

Hugh laughed. “I know, Bill. I remember agonizing over the name choice.”

Well, that was interesting. It appeared Hugh was one of my descendants. I kept my face stony as he continued, oblivious to the accidental admission. “It’s just semantics, though. However you slice it, Minkowski diagrams don’t represent reality. Given that, is there any reason to assume a priori that FTL travel is impossible?”

“Well, no, but like I said, a friggin’ century, zero progress.”

“So you’re just not smart enough.” Hugh laughed at my expression of mock outrage. Or maybe not totally mock. “S’okay, Bill, neither am I. Or any of the Bobs. There are some ex-humans who were full-time physicists in life who have taken a swing at the problem, both pre- and post-death. And they’ve come up dry as well. So it’s simply a hard one. Remember that we’re speed superintelligences, but being post-life computers hasn’t made us smarter. And like the question of how to create a conscious AI, it might require someone able to step back and unwind a lot of the base assumptions. Or humans may simply not be intelligent enough to grasp all the variables. Either way, we have Thoth now. Or will, soon.”

I slowly straightened out. “Okay, Hugh, I understand your motivations. And I share them, to a large extent. I’m just concerned about shortcuts.” I looked over my shoulder. “Meanwhile, I believe there is a dessert line that I haven’t visited yet.”

“Cheesecake,” Hugh replied, levering himself up off his chair with an exaggerated groan. “It is the devil.”

*****

I had cheated a little by making some extra room for dessert. The dessert spread was just as impressive as the food lines, and I found myself eyeing empty dishware that had, until recently, held cheesecake, ice cream, a hot fudge sundae, several different classes of cookies, and a slice of red velvet cake. If I had still been bio, I’d probably be dead by now. Or at least gone into sugar shock.

With a massive effort, I managed to bend my waist enough to sit up in my chair. “Say, Hugh, has anyone ever defined exactly what superior intelligence would look like?”

Hugh groaned and shifted position slightly before replying. “There have been lots of suggestions. All possible, most not mutually exclusive, none inevitable. It might be capable of holding far more variables in its mind at the same time. I think humans can only manage seven before they have to start dropping things into short-term storage. Multitasking. People can do it to a certain extent, but usually only one intellectually demanding activity at a time. So you can work on a speech while driving, but not while adding up your expenses. Ability to solve recursive problems, like predicting the actions of someone who knows you’re trying to predict their actions. Better ability to model reality. Ability to pick out patterns in data in the same way as an expert system.”

“But nothing really revolutionary or totally new?”

“How would we know, Bill? We could no more conceive of something like that than a dog could conceive of reading and writing.”

I grunted, seeing an opportunity to steer the conversation. “I wonder how you’d go about trusting something like that.”

“Again, karma loading,” Hugh replied. “It’s kind of a dumb name, but it basically consists of instilling the same moral values into the AI as humans have. That way, things like the paper-clip problem never come up because it understands why turning everything into paper clips would be bad. Most of the AI horror scenarios are based on it applying things literally and without regard to consequences. In fact, there’ve been some arguments that a conscious AI would be less likely to fall into that behavior than a zombie, because it would have a theory of self.”

“Sure, Hugh, but in this case, you’re teaching Thoth human morality while also teaching it that it’s not human. I wonder how that will play out.”

“No access to physical reality, Bill. Not just no access to roamers and such, but Thoth won’t even be running on real hardware. It’ll be running in virtual reality on a virtual machine, and interacting with virtual versions of us. So even if it has some Jedi ability to manipulate weak minds or something, it’ll just be swaying avatars.”

I didn’t reply. The Skippies seemed to have thought of everything. Or at least they had an answer ready for anything, which wasn’t always the same thing.

I hoped I was just being paranoid. More than that, I hoped ANEC was, too.
