The strangeness of his new way of speech made insults of his own titles. It was like being tongue-tied. He felt streams of hot sweat breaking out on his skin from the effort of trying to frame his words properly; but when he put his hand to his forehead to brush the sweat away before it could run into his eyes he seemed dry to the touch, and he was not entirely sure he could feel himself at all.
He took a deep breath. "I am Francisco Pizarro!" he roared, letting the name burst desperately from him like water breaching a rotten dam.
The echo came back, deep, rumbling, mocking. Frantheethco. Peetharro.
That too. Even his own name, idiotically garbled.
"O great G.o.d!" he cried. "Saints and angels!
More garbled noises. Nothing would come out as it should. He had never known the arts of reading or writing; now it seemed that true speech itself was being taken from him. He began to wonder whether he had been right about this being heaven, supernal radiance or no. There was a curse on his tongue; a demon, perhaps, held it pinched in his claws. Was this hell, then? A very beautiful place, but hell nevertheless?
He shrugged. Heaven or hell, it made no difference. He was beginning to grow more calm, beginning to accept and take stock. He knew -- had learned, long ago -- that there was nothing to gain from raging against that which could not be helped, even less from panic in the face of the unknown. He was here, that was all there was to it -- wherever here was -- and he must find a place for himself, and not this place, floating here between nothing and nothing. He had been in hells before, small hells, hells on Earth. That barren isle called Gallo, where the sun cooked you in your own skin and there was nothing to eat but crabs that had the taste of dog-dung. And that dismal swamp at the mouth of the Rio Biru, where the rain fell in rivers and the trees reached down to cut you like swords. And the mountains he had crossed with his army, where the snow was so cold that it burned, and the air went into your throat like a dagger at every breath. He had come forth from those, and they had been worse than this. Here there was no pain and no danger; here there was only soothing light and a strange absence of all discomfort. He began to move forward. He was walking on air. Look, look, he thought, I am walking on air! Then he said it out loud. "I am walking on air," he announced, and laughed at the way the words emerged from him. "Santiago! Walking on air! But why not? I am Pizarro!" He shouted it with all his might, "Pizarro! Pizarro!" and waited for it to come back to him.
Peetharro. Peetharro.
He laughed. He kept on walking.
Tanner sat hunched forward in the vast sparkling sphere that was the ninth-floor imaging lab, watching the little figure at the distant center of the holotank strut and preen. Lew Richardson, crouching beside him with both hands thrust into the data gloves so that he could feed instructions to the permutation network, seemed almost not to be breathing -- seemed to be just one more part of the network, in fact.
But that was Richardson"s way, Tanner thought: total absorption in the task at hand. Tanner envied him that. They were very different sorts of men. Richardson lived for his programming and nothing but his programming. It was his grand pa.s.sion. Tanner had never quite been able to understand people who were driven by grand pa.s.sions. Richardson was like some throwback to an earlier age, an age when things had really mattered, an age when you were able to have some faith in the significance of your own endeavors.
"How do you like the armor?" Richardson asked. "The armor"s very fine, I think. We got it from old engravings. It has real flair."
"Just the thing for tropical climates," said Tanner. "A nice tin suit with matching helmet."
He coughed and shifted about irritably in his seat. The demonstration had been going on for half an hour without anything that seemed to be of any importance happening -- just the minuscule image of the bearded man in Spanish armor tramping back and forth across the glowing field -- and he was beginning to get impatient.
Richardson didn"t seem to notice the harshness in Tanner"s voice or the restlessness of his movements. He went on making small adjustments. He was a small man himself, neat and precise in dress and appearance, with faded blond hair and pale blue eyes and a thin, straight mouth. Tanner felt huge and shambling beside him. In theory Tanner had authority over Richardson"s research projects, but in fact he always had simply permitted Richardson to do as he pleased. This time, though, it might be necessary finally to rein him in a little.
This was the twelfth or thirteenth demonstration that Richardson had subjected him to since he had begun fooling around with this historical-simulation business. The others all had been disasters of one kind or another, and Tanner expected that this one would finish the same way. And basically Tanner was growing uneasy about the project that he once had given his stamp of approval to, so long ago. It was getting harder and harder to go on believing that all this work served any useful purpose. Why had it been allowed to absorb so much of Richardson's group's time and so much of the lab's research budget for so many months? What possible value was it going to have for anybody? What possible use?
It"s just a game, Tanner thought. One more desperate meaningless technological stunt, one more pointless pirouette in a meaningless ballet. The expenditure of vast resources on a display of ingenuity for ingenuity"s sake and nothing else: now there"s decadence for you.
The tiny image in the holotank suddenly began to lose color and definition.
"Uh-oh," Tanner said. "There it goes. Like all the others."
But Richardson shook his head. "This time it's different, Harry."
"You think?"
"We aren"t losing him. He"s simply moving around in there of his own volition, getting beyond our tracking parameters. Which means that we"ve achieved the high level of autonomy that we were shooting for."
"Volition, Lew? Autonomy?"
"You know that those are our goals."
"Yes, I know what our goals are supposed to be," said Tanner, with some annoyance. "I"m simply not convinced that a loss of focus is a proof that you"ve got volition."
"Here," Richardson said. "I"ll cut in the stochastic tracking program. He moves freely, we freely follow him." Into the computer ear in his lapel he said, "Give me a gain boost, will you?" He made a quick flicking gesture with his left middle finger to indicate the quant.i.tative level.
The little figure in ornate armor and pointed boots grew sharp again. Tanner could see fine details on the armor, the plumed helmet, the tapering shoulder-pieces, the joints at the elbows, the intricate pommel of his sword. He was marching from left to right in a steady hip-rolling way, like a man who was climbing the tallest mountain in the world and didn't mean to break his stride until he was across the summit. The fact that he was walking in what appeared to be mid-air seemed not to trouble him at all.
"There he is," Richardson said grandly. "We"ve got him back, all right? The conqueror of Peru, before your very eyes, in the flesh. So to speak."
Tanner nodded. Pizarro, yes, before his very eyes. And he had to admit that what he saw was impressive and even, somehow, moving. Something about the dogged way with which that small armored figure was moving across the gleaming pearly field of the holotank aroused a kind of sympathy in him. That little man was entirely imaginary, but he didn't seem to know that, or if he did he wasn't letting it stop him for a moment: he went plugging on, and on and on, as if he intended actually to get somewhere. Watching it, Tanner found himself oddly captivated, and surprised to discover that his interest in the entire project was beginning to rekindle.
"Can you make him any bigger?" he asked. "I want to see his face."
"I can make him big as life," Richardson said. "Bigger. Any size you like. Here."
He flicked a finger and the hologram of Pizarro expanded instantaneously to a height of about two meters. The Spaniard halted in mid-stride as though he might actually be aware of the imaging change.
That can"t be possible, Tanner thought. That isn"t a living consciousness out there. Or is it?
Pizarro stood poised easily in mid-air, glowering, shading his eyes as if staring into a dazzling glow. There were brilliant streaks of color in the air all around him, like an aurora. He was a tall, lean man in late middle age with a grizzled beard and a hard, angular face. His lips were thin, his nose was sharp, his eyes were cold, shrewd, keen. It seemed to Tanner that those eyes had come to rest on him, and he felt a chill.
My God, Tanner thought, he's real.
It had been a French program to begin with, something developed at the Centre Mondial de la Computation in Lyons about the year 2119. The French had some truly splendid minds working in software in those days. They worked up astounding programs, and then nobody did anything with them. That was their version of Century Twenty-Two Malaise.
The French programmers" idea was to use holograms of actual historical personages to dress up the son et lumiere tourist events at the great monuments of their national history. Not just preprogrammed robot mockups of the old Disneyland kind, which would stand around in front of Notre Dame or the Arc de Triomphe or the Eiffel Tower and deliver canned spiels, but apparent reincarnations of the genuine great ones, who could freely walk and talk and answer questions and make little quips. Imagine Louis XIV demonstrating the fountains of Versailles, they said, or Pica.s.so leading a tour of Paris museums, or Sartre sitting in his Left Bank cafe exchanging existential bons mots with pa.s.sersby! Napoleon! Joan of Arc! Alexandre Dumas! Perhaps the simulations could do even more than that: perhaps they could be designed so well that they would be able to extend and embellish the achievements of their original lifetimes with new accomplishments, a fresh spate of paintings and novels and works of philosophy and great architectural visions by vanished masters.
The concept was simple enough in essence. Write an intelligencing program that could absorb data, digest it, correlate it, and generate further programs based on what you had given it. No real difficulty there. Then start feeding your program with the collected written works -- if any -- of the person to be simulated: that would provide not only a general sense of his ideas and positions but also of his underlying pattern of approach to situations, his style of thinking -- for le style, after all, est l'homme même. If no collected works happened to be available, why, find works about the subject by his contemporaries, and use those. Next, toss in the totality of the historical record of the subject's deeds, including all significant subsequent scholarly analyses, making appropriate allowances for conflicts in interpretation -- indeed, taking advantage of such conflicts to generate a richer portrait, full of the ambiguities and contradictions that are the inescapable hallmarks of any human being. Now build in substrata of general cultural data of the proper period so that the subject has a loam of references and vocabulary out of which to create thoughts that are appropriate to his place in time and space. Stir. Et voila! Apply a little sophisticated imaging technology and you had a simulation capable of thinking and conversing and behaving as though it were the actual self after which it was patterned.
Of course, this would require a significant chunk of computer power. But that was no problem, in a world where 150-gigaflops networks were standard laboratory items and ten-year-olds carried pencil-sized computers with capacities far beyond the ponderous mainframes of their great-great-grandparents' day. No, there was no theoretical reason why the French project could not have succeeded. Once the Lyons programmers had worked out the basic intelligencing scheme that was needed to write the rest of the programs, it all should have followed smoothly enough.
Two things went wrong: one rooted in an excess of ambition that may have been a product of the peculiarly French personalities of the original programmers, and the other having to do with an abhorrence of failure typical of the major nations of the mid-twenty-second century, of which France was one.
The first was a fatal change of direction that the project underwent in its early phases. The King of Spain was coming to Paris on a visit of state; and the programmers decided that in his honor they would synthesize Don Quixote for him as their initial project. Though the intelligencing program had been designed to simulate only individuals who had actually existed, there seemed no inherent reason why a fictional character as well documented as Don Quixote could not be produced instead. There was Cervantes' lengthy novel; there was ample background data available on the milieu in which Don Quixote supposedly had lived; there was a vast library of critical analysis of the book and of the Don's distinctive and flamboyant personality. Why should bringing Don Quixote to life out of a computer be any different from simulating Louis XIV, say, or Moliere, or Cardinal Richelieu? True, they had all existed once, and the knight of La Mancha was a mere figment; but had Cervantes not provided far more detail about Don Quixote's mind and soul than was known of Richelieu, or Moliere, or Louis XIV?
Indeed he had. The Don -- like Oedipus, like Odysseus, like Othello, like David Copperfield -- had come to have a reality far more profound and tangible than that of most people who had indeed actually lived. Such characters as those had transcended their fictional origins. But not so far as the computer was concerned. It was able to produce a convincing fabrication of Don Quixote, all right -- a gaunt bizarre holographic figure that had all the right mannerisms, that ranted and raved in the expectable way, that referred knowledgeably to Dulcinea and Rosinante and Mambrino's helmet. The Spanish king was amused and impressed. But to the French the experiment was a failure. They had produced a Don Quixote who was hopelessly locked to the Spain of the late sixteenth century and to the book from which he had sprung. He had no capacity for independent life and thought -- no way to perceive the world that had brought him into being, or to comment on it, or to interact with it. There was nothing new or interesting about that. Any actor could dress up in armor and put on a scraggly beard and recite snatches of Cervantes. What had come forth from the computer, after three years of work, was no more than a predictable reprocessing of what had gone into it, sterile, stale.
Which led the Centre Mondial de la Computation to its next fatal step: abandoning the whole thing. Zut! and the project was cancelled without any further attempts. No simulated Picassos, no simulated Napoleons, no Joans of Arc. The Quixote event had soured everyone and no one had the heart to proceed with the work from there. Suddenly it had the taint of failure about it, and France -- like Germany, like Australia, like the Han Commercial Sphere, like Brazil, like any of the dynamic centers of the modern world -- had a horror of failure. Failure was something to be left to the backward nations or the decadent ones -- to the Islamic Socialist Union, say, or the Soviet People's Republic, or to that slumbering giant, the United States of America. So the historic-personage simulation scheme was put aside.
The French thought so little of it, as a matter of fact, that after letting it lie fallow for a few years they licensed it to a bunch of Americans, who had heard about it somehow and felt it might be amusing to play with.
"You may really have done it this time," Tanner said.
"Yes. I think we have. After all those false starts."
Tanner nodded. How often had he come into this room with hopes high, only to see some botch, some inanity, some depressing bungle? Richardson had always had an explanation. Sherlock Holmes hadn't worked because he was fictional: that was a necessary recheck of the French Quixote project, demonstrating that fictional characters didn't have the right sort of reality texture to take proper advantage of the program, not enough ambiguity, not enough contradiction. King Arthur had failed for the same reason. Julius Caesar? Too far in the past, maybe: unreliable data, bordering on fiction. Moses? Ditto. Einstein? Too complex, perhaps, for the project in its present level of development: they needed more experience first. Queen Elizabeth I? George Washington? Mozart? We're learning more each time, Richardson insisted after each failure. This isn't black magic we're doing, you know. We aren't necromancers, we're programmers, and we have to figure out how to give the program what it needs.
And now Pizarro?
"Why do you want to work with him?" Tanner had asked, five or six months earlier. "A ruthless medieval Spanish imperialist, is what I remember from school. A bloodthirsty despoiler of a great culture. A man without morals, honor, faith -- "
"You may be doing him an injustice," said Richardson. "He"s had a bad press for centuries. And there are things about him that fascinate me."
"Such as?"
"His drive. His courage. His absolute confidence. The other side of ruthlessness, the good side of it, is a total concentration on your task, an utter unwillingess to be stopped by any obstacle. Whether or not you approve of the things he accomplished, you have to admire a man who -- "
"All right," Tanner said, abruptly growing weary of the whole enterprise. "Do Pizarro. Whatever you want."
The months had passed. Richardson gave him vague progress reports, nothing to arouse much hope. But now Tanner stared at the tiny strutting figure in the holotank and the conviction began to grow in him that Richardson finally had figured out how to use the simulation program as it was meant to be used.
"So you"ve actually recreated him, you think? Someone who lived -- what, five hundred years ago?"
"He died in l54l," said Richardson.
"Almost six hundred, then."
"And he"s not like the others -- not simply a recreation of a great figure out of the past who can run through a set of pre-programmed speeches. What we"ve got here, if I"m right, is an artificially generated intelligence which can think for itself in modes other than the ones its programmers think in. Which has more information available to itself, in other words, than we"ve provided it with. That would be the real accomplishment. That"s the fundamental philosophical leap that we were going for when we first got involved with this project. To use the program to give us new programs that are capable of true autonomous thought -- a program that can think like Pizarro, instead of like Lew Richardson"s idea of some historian"s idea of how Pizarro might have thought."
"Yes," Tanner said.
"Which means we won"t just get back the expectable, the predictable. There"ll be surprises. There"s no way to learn anything, you know, except through surprises. The sudden combination of known components into something brand new. And that"s what I think we"ve managed to bring off here, at long last. Harry, it may be the biggest artificial-intelligence breakthrough ever achieved."
Tanner pondered that. Was it so? Had they truly done it?
And if they had -- Something new and troubling was beginning to occur to him, much later in the game than it should have. Tanner stared at the holographic figure floating in the center of the tank, that fierce old man with the harsh face and the cold, cruel eyes. He thought about what sort of man he must have been -- the man after whom this image had been modeled. A man who was willing to land in South America at age fifty or sixty or whatever he had been, an ignorant illiterate Spanish peasant wearing a suit of ill-fitting armor and waving a rusty sword, and set out to conquer a great empire of millions of people spreading over thousands of miles. Tanner wondered what sort of man would be capable of carrying out a thing like that. Now that man's eyes were staring into his own and it was a struggle to meet so implacable a gaze.
After a moment he looked away. His left leg began to quiver. He glanced uneasily at Richardson.
"Look at those eyes, Lew. Christ, they"re scary!"
"I know. I designed them myself, from the old prints."
"Do you think he"s seeing us right now? Can he do that?"
"All he is is software, Harry."
"He seemed to know it when you expanded the image."
Richardson shrugged. "He"s very good software. I tell you, he"s got autonomy, he"s got volition. He"s got an electronic mind, is what I"m saying. He may have perceived a transient voltage kick. But there are limits to his perceptions, all the same. I don"t think there"s any way that he can see anything that"s outside the holotank unless it"s fed to him in the form of data he can process, which hasn"t been done."
"You don"t think? You aren"t sure?"
"Harry. Please."
"This man conquered the entire enormous Incan empire with fifty soldiers, didn"t he?"
"In fact I believe it was more like a hundred and fifty."
"Fifty, a hundred fifty, what"s the difference? Who knows what you"ve actually got here? What if you did an even better job than you suspect?"
"What are you saying?"
"What I"m saying is, I"m uneasy all of a sudden. For a long time I didn"t think this project was going to produce anything at all. Suddenly I"m starting to think that maybe it"s going to produce more than we can handle. I don"t want any of your G.o.dd.a.m.ned simulations walking out of the tank and conquering us."
Richardson turned to him. His face was flushed, but he was grinning. "Harry, Harry! For G.o.d"s sake! Five minutes ago you didn"t think we had anything at all here except a tiny picture that wasn"t even in focus. Now you"ve gone so far the other way that you"re imagining the worst kind of -- "
"I see his eyes, Lew. I"m worried that his eyes see me."
"Those aren"t real eyes you"re looking at. What you see is nothing but a graphics program projected into a holotank. There"s no visual capacity there as you understand the concept. His eyes will see you only if I want them to. Right now they don"t."
"But you can make them see me?"
"I can make them see anything I want them to see. I created him, Harry."
"With volition. With autonomy."
"After all this time you start worrying now about these things?"
"It"s my neck on the line if something that you guys on the technical side make runs amok. This autonomy thing suddenly troubles me."
"I"m still the one with the data gloves," Richardson said. "I twitch my fingers and he dances. That"s not really Pizarro down there, remember. And that"s no Frankenstein monster either. It"s just a simulation. It"s just so much data, just a bunch of electromagnetic impulses that I can shut off with one movement of my pinkie."
"Do it, then."
"Shut him off? But I haven"t begun to show you -- "
"Shut him off, and then turn him on," Tanner said.
Richardson looked bothered. "If you say so, Harry."
He moved a finger. The image of Pizarro vanished from the holotank. Swirling gray mists moved in it for a moment, and then all was white wool. Tanner felt a quick jolt of guilt, as though he had just ordered the execution of the man in the medieval armor. Richardson gestured again, and color flashed across the tank, and then Pizarro reappeared.
"I just wanted to see how much autonomy your little guy really has," said Tanner. "Whether he was quick enough to head you off and escape into some other channel before you could cut his power."
"You really don"t understand how this works at all, do you, Harry?"
"I just wanted to see," said Tanner again, sullenly. After a moment"s silence he said, "Do you ever feel like G.o.d?"
"Like G.o.d?"
"You breathed life in. Life of a sort, anyway. But you breathed free will in, too. That"s what this experiment is all about, isn"t it? All your talk about volition and autonomy? You"re trying to recreate a human mind -- which means to create it all over again -- a mind that can think in its own special way, and come up with its own unique responses to situations, which will not necessarily be the responses that its programmers might antic.i.p.ate, in fact almost certainly will not be, and which not might be all that desirable or beneficial, either, and you simply have to allow for that risk, just as G.o.d, once he gave free will to mankind, knew that He was likely to see all manner of evil deeds being performed by His creations as they exercised that free will -- "
"Please, Harry -- "
"Listen, is it possible for me to talk with your Pizarro?"