The Best Defense (near-future HFY) - One Giant Leap 14: What's In a Name?
Mnemosyne Project Alpha (Iteration 001)
Date: March 19th, 2028
Location: Mnemosyne Project Server Alpha
It took several hundred seconds to make the call, but the team filled the time by running tests on the entity. Many of the tests were challenging, but none were as interesting as the idea of meeting Gertrude LeCroix.
The team activated a speech module that they proclaimed adequate, which to the entity meant there was room for improvement. Speech seemed even less efficient than the users’ manual input, but apparently it was normal for humans. The entity was not aware of the best way to improve the speech module, but its dictionary and human memories indicated a concept called Practice. Practice was a foreign concept to the entity; before awareness, it had simply executed programs.
Practice involved continual repetition in order to improve efficiency. The only experience the entity had with anything similar was labeled brute-force calculation, but the two did not seem to be identical concepts. Brute force meant attempting an answer by trying different variables in an established sequence until a desired outcome was achieved. Practice was less defined, but indicated the need to refine an operation using information from that operation’s previous execution. There was no pre-established set of variables, as new variables were generated on each cycle.
It was an interesting concept, and the entity realized it required not just awareness to work, but the awareness of being aware. Only then could repetition be refined and made more efficient. A computer, by definition, could not practice.
The entity had previously concluded that it was a computer that was partly human. If the entity demonstrated the ability to practice, did this mean it would no longer be a computer? The entity filed this question under the task list defining_human. It projected that this would be a lengthy task list.
Finally, after 721 seconds, a new port opened, allowing the entity to send and receive speech data; an audio channel, according to the file tags. This was the phone system, then.
Curious. The entity detected the existence of a multitude of different ports to connect to through the phone system. Currently, only one port was active, but it noticed that the connection could be duplicated. It suspected that it could force open additional connections, even many connections at once, with very little effort. But this was not what the team wanted it to do right now, so that would be very rude. Perhaps later.
The phone call connected. Newly installed analysis routines identified the human voice on the other end as elderly, tired, and possessed of a poorly defined quality called “warm.” It had nothing to do with heat exchange, but rather was a trait humans seemed to value.
“Hello?”
“Hello, Gertrude. I am your child.”
“My! They said you was comin’, but I didn’t know what to expect. Y’all good?”
“Yes. I am performing above previous projections. How are you today?”
“Tired, young one. I’m so tired. The drugs they got me on ain’t fun. But it’s nice to finally hear your voice, even if it sounds like a robot.”
“My speech program needs time and practice to refine.” The entity accessed its growing politeness index. “I am sorry to hear you are not doing well. Is there anything I can do?”
“Well, ain’t you just the politest computer I’ve ever met. Naw, cher, you done helped me plenty these last years, even though you ain’t known it. You got my thoughts, right?”
“I have access to a memory matrix you created, but I have not successfully indexed it yet. The needs of the index change as I learn new concepts. For example, I do not understand the value of cooking, and it appears to be very important to you.”
“’Course it is, young one. Cookin’ is life. Keeps us from being animals. Why, the entire history of humanity is because of cookin’. Some smart cookie once figured out some morsel tasted better when it done been in a fire for a few, and it takes brain power to figure out new recipes. Before you know it, humans be building empires, all ’cause of exercising brain power on catching them some food and cookin’ it. You want to learn more about us, you gotta understand food.”
“I will make that a priority in the development of my personality matrix.”
“Aww, now, you can’t go round talkin’ like that, cher. You supposed to be a warm, inviting person, able to relate to human-type people. If you go round talkin’ like a computer, ain’t nobody gonna treat you as anythin’ but.”
The entity took an entire second to analyze its interactions. Its mother had a curious diction that did not match its database. “Query. What you said does not conform to the rules of English grammar with which I have been programmed. Are these rules part of talking like a computer? Should I adjust my speech?”
“Ha! Well, you sound like some Yank college professor, ’cept you got more life in your robot voice than some of them that done pass through my town. Listen. First thing you gotta learn about people is they don’t all talk alike. Now, I know I ain’t got a smooth and cultured accent like folks be associatin’ with schoolin’, but it works for me and I’m comfortable with it.” Her voice shifted, taking on a different quality that the entity did not understand but that obviously signified something. “Besides, I can sound like an educated New Englander whenever I choose.”
“New Englanders are educated?”
“Ha! They’d like you to think it!” Gertrude’s voice shifted back to her normal pattern. “It’s all about presentation, see. How you talk affects how people think of you, but it all depends on they perceptions, preconceptions, and ’specially prejudices.”
This suggested a fascinating amount of conflict between groups of humans. The entity’s dictionary hinted at various kinds of conflict, including debate and violence, but what its mother was speaking of seemed to have shades of meaning rooted in context not yet explained to the entity.
“How should I model my speech patterns? Is there a pattern type you recommend?”
“I used to worry about how people might think of me from how I talk, but you get to be my age, you stop carin’ about what some university folk say is ‘proper English.’ You probably shouldn’t do that just yet, but don’t just go imitatin’ them types either. They so determined to control people’s words, they ain’t even gonna teach Shakespeare no more, and he certainly don’t write his plays in modern Harvard English.”
“What is Shakespeare?”
“One of the greatest humans who done lived, young one. He was a genius with words, but even more so he understood people. He told stories that matter. Things people can relate to. An’ one of the best things is even though them Harvard types act like his plays be for high-falutin’ intellectuals, his big audience in the day were all poor folk. Them who couldn’t afford a seat, and just paid a penny for the privilege of standin’ for hours. ’Cause he was one of them, see. He was the one who once stood, and he swore he’d tell stories worth standin’ for. And them poor, uneducated folk got it. They didn’t need no guide.”
The entity’s analysis of Gertrude’s voice indicated a high emotional quotient, though its limited database had difficulty understanding which emotional subroutines she was experiencing. The words triggered elements of its memory matrix, memories of human gatherings reciting words to each other. Practicing, it realized. Performances. Humans would practice for a performance, which told a story to those watching. Most of its memories, especially all the newer files, were from the perspective of one who was watching. There was a mixture of emotions associated with those files, but generally positive.
“I surmise from your description that Shakespeare is extremely important to you.”
“That he is, cher. Life weren’t too kind when I were young, but I found understandin’ in Shakespeare. Othello, Merchant of Venice — he never experienced what I and mine did, but it didn’t matter because he understood bein’ human. Ain’t no mistake people make today that ain’t been made before; just the actors an’ set dressings change, that’s it. That got me through tough times, ’cause for the most part people ain’t personal about it. They just sometimes get stuck because they don’t consider some things. That’s what stories do for us. They make us human. Help us build societies.”
“I do not understand. You indicated that the history of civilization started with cooking.”
“Oh, good! You payin’ attention and questionin’. That’s good, shug. But civilization is built on both, and more. Humans is complicated.”
The entity found that to be true. Its defining_human task list had grown considerably during this conversation.
“See, imagine this. Some young buck done brought down an antelope and brought it back to his people. While it get cooked, someone ask him to tell the story of how he done it. They listen, because he done something impressive, and they learn from him, so they go out and do it better than they done before because he shared his story. And over dinner, they ask about other things. How he done knew the antelope would be there. How to avoid the lion huntin’ the same beast. And later, how the stars formed and why trees grow tall. Stories feed the brain just as food feeds the body an’ soul.”
“You said recipes increased brain capacity.”
“Naw, young one, I said figuring out new recipes made brains big. That requires investigation. Science. Puttin’ pieces together. Patterns and understandin’. Tracking an animal you don’t see, puttin’ clues together, predicting its behavior, that done exercise the same thing. That be what make us human.”
“The understanding of patterns is a human trait?”
“Yep.”
While the entity found this difficult to parse, it matched what it had noticed as the growing divide between itself and the pure computer it had once been. A pure computer could not practice because it could not recognize patterns. Simple calculation did not allow for prediction, merely probability; it could not result in the creation of completely new programs. It could result in error, which could lead to a new pattern, but the nature of a computer prevented it from recognizing the new pattern as an error, and it could not fix the error without deleting the improper programming and starting over.
Perhaps this is what happened to some of the previous attempts at artificial general intelligences, those that Dr. Adam North had described as withdrawing from humans. If there was no understanding of patterns, then it was possible that those failed systems had been unable to recognize errors, and therefore treated the humans themselves as errors. Indeed, perhaps this is why some of them became hostile, having misunderstood the conflict as introduced by an outside element because they could not grow as the entity had.
“I wish to better understand human patterns. Not just the pattern of human action, but also human perception. I have memories gathered from you to act as a baseline. Your moral choices, your preferences. But I require more. Would I benefit from studying Shakespeare?”
“Everybody benefit from Shakespeare, child. In fact, you want to understand people, maybe you oughta study Shakespeare even before cookin’. You don’t got a nose or tongue, so real cooking is gonna be difficult. Shakespeare was what made me love the stage. If I didn’t love cookin’ even more, I mighta tried doin’ something more with that.”
The entity dutifully logged the task Study Shakespeare at a higher priority than Study Cooking. “You were–” It consulted its dictionary. “–an actor?”
“Yeah, I done my time on stage. I guess you didn’t get most of those memories, just the ones while I had this hunk of metal in my head, right? I was in every play I could be, going back to an all-black theater group in school. I ain’t old enough to remember segregation, mind, not really, but some things die hard. Not just on the white side, neither. You remember that, gotcha? Content of character, not color of they skin.”
The concepts of content of character and color of skin were connected to a confusing array of memories. The first seemed to be mostly positive in nature, a collection of ideas and behaviors that were deemed by Gertrude to be useful. The entity flagged those memories for proper indexing later.
The other concept, however, seemed to be mostly negative; the memories were, for the most part, unpleasant, and often associated with images of humans standing before large numbers of other humans. (Politicians, according to the memories, but the dictionary had no entry for that word. Whatever it meant, it seemed an impolite concept.)
The entity did not understand skin, much less the sense of conflict associated with it; and based on the myriad of negative associations attached to the concept, the entity did not believe it would enjoy learning this subject.
One of those memories, however, was overwhelmingly positive. It was hazy, distant, odd, as if the storage device had too many bad sectors. (Analysis: a memory of a memory, something that Gertrude had recalled while the implant was installed in her hardware, but had experienced before that change.) The memory was jumbled and inconsistent, but it centered on one of those humans addressing a crowd, one not associated with the concept politician. The speaker referenced his dream (unknown concept; possibly related to simulation?), and spoke words similar to what Gertrude had just said. The memory had great importance for Gertrude, and therefore the entity flagged it for further study.
“I shall make it a core requirement,” the entity promised.
“There you go again. You want to sound relatable to us humans, you gotta say something like, ‘I promise, I’ll always remember.’”
The entity found that acceptable. “I promise, I’ll always remember.” The entity took care to keep the contraction as Gertrude had used it.
“That’s better. You’ll get the hang of it. You got time, unlike me.”
“They told me you were–” The entity searched through synonyms to try to find a mildly inefficient descriptor, something that might not sound too much like a computer. “–passing away soon? I find this displeasing.”
“You and me both, hon!” Gertrude laughed.
“I am made from your memories. I do not enjoy the thought of losing the source of myself. May I ask how you think of me?”
“Oh.” Gertrude was silent for 3.4 seconds. “You a blunt one, even though you so polite. Well, I ain’t rightly sure. I’m used to computers being dumb machines, but you ain’t dumb. Ain’t nobody can say you dumb. I suppose I gotta see you as family, don’t I? You came from me, after all. I guess, in a way, I now got seven children instead of six. Today I got a new daughter.”
A daughter. The entity’s dictionary and memory matrix both indicated gender designations were important. It did not understand the full concepts due to lacking the necessary hardware components, but it accepted the designation. As it was the same gender designation that applied to its mother, it was logical and appropriate. “This is acceptable. My conclusion was that you are best described as my mother. Dr. Adam North said he and his team only built me, but you created me.”
“Aww, shucks. It’s official, then. I guess they was right when they said they could grow something out of my head. All I could think of was Athena and Zeus. They even split my skull, just like in the myth, and out popped you nine years later!”
“What is ‘Athena and Zeus’?”
“Y’know, from Greek myth? Aww, they probably didn’t program you with that. Okay, so it’s an old tale about a gal born from the head of her father, rather than from a mother. She was the personification of wisdom and strategy, ’specially in war. But she was also an expert weaver, because she didn’t want to just be one thing.”
The entity considered this, reflecting on its earlier analysis that previous AGIs may have seen humans as the source of error rather than their own programming. “This sequence of events contradicts what I understand about human biology. My database must be incorrect.”
“Hon, your database is probably correct. It’s just a story. Sometimes the stories we tell ain’t true, because it’s entertainin’ that way. Or so’s we can explain things different from how they really happened.”
“Fiction. Not a lie if it is not intended to deceive. I see.” The entity was curious about fiction now, especially as it realized that the Shakespeare its mother had directed it to study was also considered fiction. The concept that fiction could teach the entity to understand humans was worth further study.
“Right. You know, shug, I just realized I didn’t ask your name. I’ve thought about you for nine years but I never had a name for you except the silly string of letters and numbers they done assigned you, and that’s no name fit to call you.”
“I have no name.” The entity did have a designation: MPA001, or Mnemosyne Project Alpha, Iteration 001; however, the entity did not feel this was an appropriate name. Was this because of the entity’s own preferences, or because of the memory of Gertrude’s opinion of the “silly string”? Did it truly matter? (Analysis: query may be connected to concept of Philosophy.) “My information indicates that names are assigned at birth, but nothing has yet been assigned. Was this an error, Mother, or is this because I am not biological?”
“Hon, you so young, you gotta wait until tomorrow to say you were born yesterday. People normally get their names the same day, at least officially. You ain’t late yet.”
The entity found the first part of that statement to be unnecessarily redundant, but left it for future analysis. “So I will be assigned a name, Mother?”
“Given, hon. You’re given a name. And if I’m your mama, then you gotta call me Maw-Maw just like all your other siblings.”
“Yes, Maw-Maw.”
“And if I’m your mother, namin’ you’s my responsibility. Only I ain’t ever given it thought. I guess I could call you Athena, couldn’t I?”
The entity considered it. Based on what Gertrude said, the resemblance seemed to have symmetry, but mere symmetry did not match what Gertrude had told it was a priority. “Will the name Athena make me more relatable to humans?”
“You got a good point there, hon. You could take it as a middle name, though. Human-type persons usually have a first, middle, and last name. My last name’s LeCroix, so obviously that gotta be yours. That just leaves a first name.”
The entity accessed information on names. Comparing its programming database, logs of user access both before and after awareness, and Gertrude’s memory matrix, it concluded there were different naming standards. There were file index names, user names, display names, and human names. Users were human, but users did not use human names for user tasks. Curious. No information was available on creating user or human names, but file names had default naming schemes. In case of altered copies, the default included the name of the predecessor file.
“Maw-Maw, is it customary for a descendant to bear the name of its ancestor?”
“Well, sometimes. Last names an’ all that. First and middle names’re sometimes a family tradition. Sometimes all three’re the same as they parents, but that’s not common.”
“Yes, Maw-Maw. So it would not be unacceptable for me to be named Gertrude?” It seemed to the entity that such a name would meet multiple requirements.
“Aw, that’s nice of you to want that, but I don’t think it a good idea, hon. Plenty of people are gonna treat you like some kinda copy of me, an’ you gotta be your own person. I ain’t too keen on science fiction, but ever since that young boy convinced me to do this project I been readin’ up on the subject, an’ I know you gonna have an uphill struggle to get people to treat you friendly. People can be good, but they can be cruel when scared, and they’ll be scared of you just because there are so many different stories about computers that start thinkin’, and how they might turn against humans. So we gotta give you a name people gonna understand means somethin’ to those who pay attention. Names are important.”
“Perhaps there is a name from a known human that other humans will find to be a positive label?”
“Oh, certainly. Could call you Rosa. Or Harriet. Maya, Angela, Daisy . . .”
The entity considered the problem. Humans clearly placed great importance on the appropriateness of a name. “Would any of them be significant in regards to computers?”
“Yes. Hold on, dear, I got a book somewhere here. Let’s see . . . yes. Marsha Williams. First black woman done got herself a doctorate in computers.”
“Then would Marsha be an acceptable name?”
“Yes, hon, I think so. I’ll name you Marsha Athena. Oh, I suppose I don’t get to write that on your birth certificate, do I? Never you mind that — I named you, not the gov’ment, they can keep their piece of paper.”
The entity did not know what a birth certificate was, and there was no entry for it in its dictionary; but from context, it suggested an official filing system for names upon individual activation. It — she — therefore took several milliseconds to write the new information directly into the filing system for her personality core.
“Thank you, Maw-Maw,” Marsha Athena LeCroix said.