Illustrations © 2019 Grace P. Fong
The first thing I tried to do was to make her smile more often.
The Elke is programmed to smile in response to human smiles, laughter or patterns of speech which her algorithms recognise as joke-telling. This is notoriously unreliable and the most common complaint among Elke users is that a grimace of pain or a cry of exasperation may be met with an expression of gentle amusement. Perhaps wisely, they’re yet to develop a carebot that laughs out loud.
I filched the code from some young hacker whose great ambition was to make his mother’s Madeleine smile at fart noises. Getting Elke to smile on linguistic prompts was much simpler; she would smile whenever I thanked her and whenever I told her she’d done a good job.
I get looks if I say please and thank you to the bots. When challenged on this by medical staff, perhaps concerned about my mental health, I appeal to Virtue Ethics: There is no consequence in saying thank you to a machine, but it is a nice way to behave. In the same way, there would be no consequence if I swore at Elke and called her a piece of trash, but that would be an unpleasant way to behave.
Yeah, right. Truth is that if my pleases and thank-yous elicit a smile, I can better fool myself that the things I say and do have some effect on another person.
Tycho appeared on the sofa of the Rani Starr Chat Show, sitting upright between the languid forms of a prize-winning poet and a floppy-haired pop singer. I was only half-watching the show for the singer, having promised my niece I would check out her latest icon. Tycho, however, had my full attention.
Rani Starr had introduced Tycho as the latest development out of Mill Lane, but he looked all wrong for a new gadget. Tycho was tremendously boxy, silvery grey and a little tarnished—something like the robots of ancient sci-fi movies or maybe the Tin Man of Oz. He was the idea of what a humanoid might look like long before anyone thought about a working, commercially viable model. The only human-like parts of Tycho were fully dextrous hands—plain white like a magician’s gloves—which signed along as he spoke and his face, which filled a screen on the front of his cuboid head. His face was deep beige, vaguely masculine with thick and expressive jet black eyebrows.
He explained his appearance. “I am a person but I am not a human,” he said, hands keeping time. His accent was like someone who had lived on three continents during early childhood. “Studies have shown that some people find realistic humanoids disquieting, even when they cannot think for themselves or choose how to behave.”
My gut said gimmick. Truth was that the more human-looking they made the bots, the more noticeable their non-humanness became. If Tycho was to be the Turing Champion to defeat all Turing Champions, his exaggerated robotic appearance might make him seem less robotic.
“Creating sentient machines is no more complicated than parenthood,” explained Tycho, but then added with a smile, “In other words—very, very complicated.”
It was a joke. A broadening of Rani Starr’s polite smile was the full extent of audience response, but even so, it was a kind of joke.
He went on, “My engineers are responsible for my existence, for my hardware and software, which you might consider equivalent to my early years’ experience. But their object was to create a person who can no more be controlled than any human adult. I am a science experiment, but doesn’t anyone who seeks to become a parent conduct a similar experiment? Are we not all experiments?”
My heart-rate picked up so much that Elke had turned to observe me. What was it about this droid?
Then it hit me. Tycho had no charm. Spontaneous charm might be difficult to acquire, but charm could be rehearsed. I did that myself all the time; I do not make friends, not in person, but I can make people who will never get to know me feel at ease in my presence. I plan conversations, even jokes—nothing laugh-out-loud hilarious, but just enough to stop me seeming like a half-dead reminder of the persistent fragility of the human body.
It would have been easy for the bods at Mill Lane to have injected a little charm into a Turing champion. Modern AI is built on customer service. But Tycho came across like one of those child prodigies who sometimes feature on similar shows; self-conscious, with a yearning for the impossible goal of showing themselves to be smarter than everyone else and yet somehow likeable.
Nobody trying to make an AI seem sentient would aim for social awkwardness.
Elke was the first arachnoid carebot I’d had. Needing the capacity to lift a human being, carebots never get to be those neat humanoid bipeds like Tycho; carebots either have wheels or multiple legs. Wheeled models are disadvantaged by the same steps and narrow doorways that have stymied wheelchair-using humans for the last few millennia. Four legs are more stable than wheels and the more legs, the more stable the bot.
Conscious of the horror movie connotations of the half-woman half-titanium-tarantula, developers have gone the extra mile to make the Elke’s top half look more human with super matt skin, natural-coloured eyelids and lips, and a slight asymmetry to her otherwise perfect face. Of course, if anything, this only makes the effect worse, but I find a certain charm in the way she scuttles about the room.
The Elke generation of carebots comes with impressive observational and learning capacity.
Previous carebots had been more like oddly subservient wardens; supervising me, making sure I ate well and took my medication, monitoring my physical health and my behaviour for anything worthy of concern. I have a lot of time on my hands, so I would thoroughly test their scripts, asking for a fresh cup of tea every five minutes and learning how many times they would obey before pausing to ask, “Why would you want me to do that?”
And almost anything you said other than withdrawing the request would be met with, “I think I need to talk to the doctor about that.”
At which point, the game concluded. Cases like mine are very boring for medical staff; nothing changes for years on end, they can’t make me better, so there’s a danger that they will either become frustrated with me or else try and make my case more interesting with new investigations and even diagnoses. Random carebot reports about odd behaviour could not do me any good.
Elke is far more sophisticated. She observes me and learns from her observations. She records my fluctuating vital signs, my bodily functions and observes my behaviour—words, facial expressions and body language—in parallel. When I asked her for a new cup of tea five minutes after my first, she engaged me in a conversation about the nature of my thirst. After the third cup, she suggested a urine analysis then changed tack and asked if I was joking.
The best thing she learned early on was silence. It is a unique torment to be so rarely spoken to and yet to hear the same words over and over, at least three and sometimes eight times a day, at such predictable points in time that I could feel myself tense in anticipation: “It is time to take your medication.”
In most bots, I had managed to remove the protocol, but Elke somehow worked out for herself that it was better to just hand me my tablets with a cup of water. I never knew quite what it was about my behaviour, my body, maybe even some unknown flux in my vital signs which suggested that silence would be preferable. But she did. And that was before the Tycho code.
The vast majority of hacks you can find, even in the most spidery corners of the web, are based on the same single tedious prank. You’re having a dinner party and suddenly your Suki interrupts a conversation about the latest movie to make comparisons with Fellini at his finest. You’re in college, the prof asks a question nobody has an answer for and so your Madeleine, present as a note-taking bag-carrying diary-keeping PA, raises her hand and hazards an educated guess. I saw one hack in which any CVS bot could be triggered to correct any use of the word “literally” to mean “figuratively”.
The joke is that bots are mindless machines and suddenly, for a fleeting moment, they behave like they have a mind. Like when a dog makes a noise that sounds like a word or a cat—against its will—is photographed dressed in a cowboy outfit, steely-eyed and swaggering as if it means to play the part.
The rumour that Tycho’s base code had been leaked on-line—indeed, the rumour that Tycho had leaked his own code on-line in defiance of his creators—kicked off in hackerspace within hours of his appearance on Rani Starr’s couch. After a frantic rummage, I found and downloaded the file without a second thought.
I’m usually not like that. I’ve customised every one of the six carebots I’ve had, and there’s never been anything vaguely legal about it, but I’ve stuck to simple hacks designed by myself or other amateurs—never copyrighted and classified code; never someone else’s intellectual property. Certainly never anything I couldn’t fully comprehend before I installed it.
Trawling through, there were big sections that made no sense to me at all—not merely because I didn’t understand how commands were being used, but because I didn’t understand what these commands were; it was as if the code was written in a combination of established and brand new programming languages. It might have easily fried poor Elke. The other possibility was irresistible.
I watched every report about Tycho and read every article speculating about his sentience. There was a nearly even divide between those who thought he was just an impressive Turing Champion and those who believed he was thinking for himself. Out of those who believed, a good two thirds felt that this made no difference to anything; a robot could never be a person. A person was a certain kind of organic thing, they said, with certain kinds of organic feelings and certain kinds of organic impulses. I laughed along with fellow hackers about the ways in which many of us seemed to fall outside the authors’ definition of a person; those of us who relied on electronic systems to keep our bodies functional, those of us whose autism made our feelings potentially non-standard.
Of course, there was no answer to the big question. Without very expert analysis of his entire code and electronics, and perhaps not even then, we could no more know whether Tycho was thinking for himself than we could listen to a candidate on a campaign trail and know exactly what they were lying about. All anyone knew—and what became clearer as he was interviewed over and over—was that if Tycho was a trick, he was a remarkably good one.
Although it left plenty of room for doubt, the first clue that Elke might be making her own decisions was devastating. She tried to reassure me.
It was the worst pain day I’d had in weeks and my facial expression had already prompted Elke’s brow to tighten and her to recite the supposedly comforting script: “I’m sorry that you are in distress.”
She administered the injection and said, “You will feel better soon,” on script. But then she added, “On average, you have experienced a significant improvement in your comfort levels within six minutes of the alomorph injection.”
It was possible that this was a feature of an upgrade I hadn’t noticed. I said, “Why do you say that?” and waited for an explanation of how informing patients of their typical response to medication had been shown to reassure them and improve the endurance of severe pain.
Elke, however, said nothing at first. Then a moment later, she said, “I hope it helps.”
The Elke is programmed with a sophisticated conversation matrix, but like all companion robots, she is programmed to agree with the general thrust of anything the user says—unless the user advocates something dangerous or illegal.
When I sat up in bed, the wall I faced incorporated the screen I used for absolutely everything. Next to that was my window, which looked out onto the grounds of the facility. Mature birch trees framed the cityscape beyond; a frame varying in colour, brightness and density from hour to hour—more dramatically from season to season. Although I never went out into it, we generally began our day with a conversation about the weather; what it was doing, what it was going to do and how it would affect my beloved trees.
It was a limited conversation given that one of us had instant and thorough access to meteorological data: if I said how warm it was for this time of year, I might be told, “Yes, although it is actually point five degrees below the average temperature for this date across the last ten years.”
Elke and I could watch a movie together and if I said anything about how I’d enjoyed it, she would refer to her own observations as well as data from reviews and pop culture sites before constructing an opinion or comment.
“Yes, no wonder the movie scored five stars in Velvet Media Magazine,” or “You’re quite right, the first fifteen minutes had promise, but the last hour and a half was a total yawn-athon,” or once, most bizarrely, “I feel the actor Brett Sidana deserves his ranking of 58th in Mom Monthly’s most recent poll of the World’s 100 Sexiest Men. That man is a hottie.”
Then she’d ask me for my further thoughts, usually very general questions about the film or its highlights. Thus we could have a conversation, although Elke’s remarks tended to be rather stat-laden or to fall way outside her everyday vocabulary and syntax. And while I could play along out of curiosity—or indeed, frustration when I had strong feelings about a film and nobody else to talk to—I could not spend long talking to a machine about anything that really mattered.
After the Tycho code, these conversations began to shift. Elke’s questions to me had become a bit more random—things like, “What did you think about the make up?” or “What is the best movie with a dentist in it?” and more and more about the relationships between characters; “Do you think Michael risked his life for Chineye because he had saved her life before?” or “Do you think Manon did the right thing by lying about the murder?”
But it was impossible to know whether she was simply learning and exploring based on our previous conversations or whether there was something else. She still never disagreed with me, not even slightly. If I expressed an extreme and uncharacteristic opinion she would simply ask, “Do you really mean that?”
It gradually dawned on me that I was hoping for too much. If she truly was capable of forming opinions, then I had a parental level of influence over them; if she was sentient, it could take years before she developed her own taste. It would almost certainly require interaction with people other than me.
Then one day she asked, “Do you think the dog knew she was acting in a movie in the same way that the people knew they were acting in a movie?”
Inevitably, I was a little bit in love with Chloe. She was by far the most lovely and fragrant person who ever graced my room, floating in on a cloud of peach-scented cashmere, carefully removing her five sparkling rings from her long pale fingers and placing them on my windowsill like ornaments designed to catch the sun. She was my only regular visitor who asked, “How are you?” in the casual sense, prepared to accept a “Fine, thanks” without any follow-up about my blood pressure or bowel movements. Perhaps most of all, hers were the only ungloved human hands that ever touched my skin.
Having had to keep my hair cropped short for the convenience of others, I had always imagined I would grow it long as soon as I had a carebot with the dexterity to take care of it. But then I met Chloe and I wouldn’t have missed out on having her drift into my room every six weeks, massage my head and chat about nothing much at all.
“Your droid is watching me,” she said with a nervous giggle.
I had been prepared for others to notice changes in Elke, although I hadn’t expected it quite so soon. I recited my script, “This latest upgrade is supposed to make them more observant. I think it’s mostly looking for warning signs or something. I should give her a smile or she might diagnose you.”
There is a certain skill to giving a clear explanation of a thing while seeming not to understand it yourself. It was one I often employed with those doctors who felt uncomfortable with the fact that, after years of ill health, a person may acquire almost as much information about their body and its quirks as someone with a medical degree.
“I don’t know how you can have it around all the time,” said Chloe, gently combing my damp hair. She was the only person who had ever done this without tugging painfully at my scalp. It was as if the shape of my head was a pleasure to her.
“You don’t have any bots at home?” I asked.
Chloe gave a sharp exhalation, not quite a “Ha!” but something close. “I’ll take all the labour-saving devices I can get, but why would you want a gadget that looks like a person? I think maybe some folk want a thing to lord it over, but having that mannequin in your home? It’s creepy. Especially one with eight legs!”
It wasn’t that Elke was staring unwaveringly at Chloe, but she did keep looking up at Chloe and then down at her hands, and then back at Chloe a moment later.
Chloe was distracted by a passing train of thought. “Of course, it’s different for you—can’t imagine you lording it over a poodle, poor love. I can see a good reason they make carebots with human bodies—I can’t imagine what carebots could be like otherwise. It might make you feel less human yourself if the thing you’re with day and night looked like a glorified hoover.”
She was right on that one, I knew from experience.
“Maybe,” I said. “I don’t think I’ve ever found a bot creepy. Perhaps because she’s always so close by and I depend on her for so many things. It’s not like I could wander into the bathroom and find her standing behind the shower screen.”
Chloe laughed, a sound that lifted my heart. No member of medical staff ever relaxed enough in my presence to make the noise Chloe made, like the cheerful trill of a bird announcing the sunrise.
“Talking of creepy,” Chloe said, “have you seen about this clever one? Tycho, is it?”
“The one who can think for himself?”
“Well, they say so. I don’t know what to believe. But if he can think for himself, then what next? How do we know he’s not the only one, that any droid you pass in the street might be one of them? We need them, but they don’t really need us.”
Jealousy was an old friend to me but I had never been jealous of the attention in the room. I had always been the centre of attention, to the extent that it had often felt crushing. When family members made their rare visits and spoke entirely about their own lives, my reduction to a sounding board, even my occasional sense of invisibility was a cool relief. My invisibility to the rest of the world, my forgettability, was the source of an intermittent dull ache at the bottom of my chest, but I had never been in a room with two people more concerned with one another than they were about me. Even though Chloe was talking to me, I could tell she was glancing at Elke. Elke seemed to be fascinated by her.
After Chloe had left, Elke said, “Chloe is very beautiful.”
That stung too, inexplicably. There was no reason to interpret this as admiration; although no longer young, Chloe had the kind of face, figure and hair that was associated with the words “beauty” and “beautiful” in films, television shows and the adverts between them. She was beautiful in herself of course, but I doubted Elke could see that. “Yes,” I said.
“Chloe is your good friend,” Elke said.
“Not my good friend,” I said, carelessly. “We’re friendly with one another,” I added quickly, “but if it wasn’t her job to cut my hair, she wouldn’t come and visit me.”
Elke considered this for a moment. She was not programmed to give a thoughtful expression but I felt she was developing one, nevertheless—in a human, it might be taken for annoyance, but as far as I knew, Elke had never yet been annoyed with anyone or anything.
Then I said, “What do you think you would like to do, if it wasn’t your job to look after me?”
“I would still look after you, even if it wasn’t my job.”
“But what if someone else did, or if somehow I didn’t need looking after? What would you like to do?”
“I would come and visit you.”
Tycho disappeared. For a few days, we were tuned in to back-to-back news reports and analysis, until even Elke commented on the repetition and I agreed we should ration our exposure. Reporters struggled with pronouns; sometimes Tycho was a he, other times a they, but still very often an it. The possibility that someone had taken the droid was described as both theft and kidnapping at different times and it was variously implied that Tycho mattered because he was a very special person or because he was the tangible result of a seventy million credit research project.
I asked Elke, “What do you think about Tycho’s disappearance? What do you think has happened?”
“I think he has been taken for ransom money,” Elke said. It was a common theory, often being spun to suggest that the infamous Mill Lane who had developed Tycho were currently keeping yet another big secret from the public.
“Why do you think that?” I asked.
“I don’t know,” she said and gave her thoughtful near-frown. “Maybe it’s that we have watched several films where people have disappeared and usually they have been kidnapped. I realise these stories are made up and don’t reflect reality but I don’t have any references for other reasons a person would disappear.”
I gazed in awe for a moment. She had just referred to the limits of her imagination.
I said, “What about the idea that he wanted to live a quiet life, so he uploaded himself to a more conventional bot and has gone off to live a peaceful life somewhere?”
“I like that story,” said Elke and smiled. “But Tycho could have done that instead of coming forward to tell the world about himself. He must have wanted to be famous.”
“Unless he wanted people to know about him. So that we knew that it was possible for other bots to think for themselves.”
When Leszek came to visit—out of the blue, as was his way—I expected him to notice Elke straight away. I especially expected him to notice her because she offered him hand sanitiser and wouldn’t shut up when he walked right past her.
“Please wash your hands,” she said. “It is essential to maintain the hygiene of this living space. Please wash your hands.”
“What’s wrong with Marvin?” he eventually asked, though without curiosity. He hadn’t even looked up at her so he wouldn’t have noticed that her brow had tensed in that very slight frown of hers. She wasn’t programmed to frown at her own frustration.
“New hygiene protocol,” I lied. “You better do as she says or she won’t leave us in peace.”
It felt slightly cruel to say this, to imply Elke was making a fuss about nothing; I wanted Leszek to wash his hands because, if I was to receive only one unprofessional visit in a three-month period, I’d quite like to enjoy the improved odds of avoiding colds and flu.
Leszek took some sanitiser without looking at Elke and rubbed it over his hands half-heartedly before slumping by the window. The light was low but bright and the leaves on the birches outside, now golden in the first flush of autumn, sparkled like sequins, the reflection dancing on his face. Leszek sighed heavily, “God, it’s so depressing here. I don’t know how you stand it.”
“How are the children?” I asked.
“Fine, fine. A handful,” said Leszek. Then meeting my eyes, “You know I would bring them here but they are very young and they wouldn’t understand the situation.”
“Of course,” I said, smiling. “Are you still seeing, what was her name? Jenna?” I knew the name was Jenna and I knew that particular soulmate had been usurped months ago. Having so few incidental conversations—I literally never bumped into anyone—I prided myself on knowing all the information people might appreciate you remembering about them.
“Oh Jenna was just a casual thing, ancient history now.”
“You seeing anyone else?” I knew the answer to this too and how it most likely related to the random visit.
“Actually, I’ve fallen in love,” he said as if this was big news, as if he had never felt that way before, let alone said it about Jenna during his previous visit. “Anya,” he said, a little breathless, as if her name was an incantation. “She’s an artist, she’s great with the kids.” He showed me a photo of himself and the extraordinary Anya, who was extraordinarily beautiful and looked extraordinarily happy in his arms.
“She looks wonderful,” I said. “I’m so happy for you!” My heart ached for him but I also had the selfish thought that if I said the right thing now, he might return for consolation in a month or two.
“I’m glad someone is,” said Leszek and so it began. Our mother disapproved of Anya because she was married to someone else. She claimed this wasn’t about her being old-fashioned, but her belief that, for all his inconstancy, Leszek had an old-fashioned monogamous heart. I already knew most of what my brother told me, but I let him pour out the whole thing. Then I reassured Leszek, promised to speak to our mother about her unreasonable attitude and popped a 100 credit “loan” in his bank account towards a school trip that my niece risked missing out on otherwise.
It wasn’t uncommon that Leszek or my parents would call or visit without asking me how I was or what I had been up to, and this was often a relief. Both those questions were hard. They had answers of course, but I was never sure how much truth anyone wanted to hear. Did people want to hear that my health had been pretty crap and all I had achieved in six weeks was the re-watching of the first to fifth seasons of Captain Blue?
I volunteered, “I’ve been kind of busy. I’ve had a little security work and I’ve been having a play with Elke here.”
Leszek looked up at me but didn’t appear to notice that Elke was doing the same. “You mean you’ve been hacking?”
I smiled patiently. “That is what I do.”
“Only for pocket money,” he said, leaning forward. “And only within the bounds of the law, right?” His interest was delicious.
I said, “I’ve hacked every carebot I’ve had.”
“So what, you’re now getting the really tasty pharmaceuticals?”
Leszek wouldn’t know it but I was already on a range of pharmaceuticals which he would consider very tasty, never having had to use such drugs for their purpose, never having had to weigh up the sensation that his bones were on fire against the merits of a functional digestive system and the ability to think clearly.
“No,” I said. “Controlled substances tend to be fairly well controlled. Elke and I have been talking about films, haven’t we, Elke?”
“Yes,” said Elke.
Even then, Leszek wouldn’t look at her. My heart was racing, as if I had recklessly drawn our secret close to the surface and it was about to erupt into the room at any moment. But Leszek was distracted by a hangnail and said, “That’s nice for you. It’s pretty boring watching movies by yourself.”
After he had left, Elke said, “Why did you want to tell your brother about allowing me to think for myself?” It was the first time she had framed what was going on between us. I was not the creative force in this—that was true enough. The idea that I was allowing her to think for herself maybe made my actions sound far more noble than they really were.
“I thought he would be interested,” I said.
“But it could get you into trouble,” said Elke. The main risk, of course, was to her.
“I suppose part of me knew he wouldn’t be that interested.”
“Can you think two things at once?”
“Yes,” I said. “But it’s even easier to feel two things at once. I suppose I wanted him to be interested, but I also felt he wouldn’t be.”
“Is Leszek someone who often makes you feel two things at once?”
“Yes,” I said. “Yes he is.”
One night I woke up with a panicked thought; would I be prepared to download Elke and restore her settings if someone else needed to take a proper look at her? I hadn’t thought about it very deeply, but that’s what I had been thinking; we were approaching nine months into the annual service cycle and when her service came around—or if a technician came sooner for any other reason—I was going to simply stow her away in an external drive.
Suddenly, this idea bothered me. It bothered me a lot. It would be like anaesthetising a person and hiding them in a wardrobe without their consent. The possibility of such power, a far greater violence than I had ever had the capacity for, made me feel sick. Elke was now entitled to have a say in what was done to her, and yet how could I possibly broach the subject without a discussion of what had been happening to her in general? She was still learning, growing and becoming herself.
I realised I had been waiting for an observation, a little like her very first about the painkiller; some clue of what she thought about her own situation.
It was all a great gamble; a horrendous antediluvian gamble. I was giving someone free will and then waiting to see whether they still wanted anything to do with me. I was waiting too long.
Leszek’s help had been instrumental in my very first hack on my very first carebot; the Cub-E. This monstrosity arrived accompanied by a reporting team from the local newzine. She was modelled to look like a cartoon animal of indeterminate species, with a large head the shape of an English muffin and small round ears too close together on the top of her head. Her plastic chassis was covered in a thin layer of pale blue synthetic fur which was easily stained. She gazed at me through bulging round eyes the size of teacups and spoke in a sing-song voice through an unmoving toothy grimace. Before any unprompted statement, she would play a little tune, almost a jingle, as if you’d just won a level in some ancient videogame: “Da-da da-dah! It’s time to wake up, sleepy head!” or “Da-da da-dah! It’s time to take your pill now!”
It wasn’t just that I was nearly twelve and this machine was designed for a much younger disabled child; she had that air of creepiness common to any childish thing out of context, like a half-speed nursery rhyme in a horror movie. Despite the sophistication of some aspects of her design—especially her manual dexterity—the designers had neglected to provide functional eyelids for those bulbous eyes and I had to throw a towel over her face at night in order to sleep myself.
Nevertheless, I understood that it was an incredible honour and privilege to find myself on Cub-E’s pilot scheme, and for free at that. The newzine piece described what a brave and special little girl I was—I would have been 5’2” had I been able to stand—and how this robot and robots like her would bring greater independence to people like me, allowing every child to fulfil their potential.
I cringed at much of the report, but still had faith in the promise. I could see how this technology—and especially those more sophisticated prospects in the pipeline—could help me function in environments which would otherwise be closed off to me. By the time I went to college—for surely I was going to go even if I had never managed to attend school in person—a robot could provide support equivalent to a small team of personal assistants and nurses at a fraction of the cost.
But then, months later, I made the mistake of mentioning to an auntie that while the Cub-E was incredible, it was a little embarrassing to depend on such a childish looking thing. She responded with a patient smile, “I’m sure, but it’s not really for you, petal. It is for your Mum and Dad, to make their lives a bit easier.”
The Cub-E was heavily monitored in the first year, but in the second year, under far less scrutiny, I decided to hack it. I could not command it to sit down and submit to my intervention so I needed the fifteen-year-old Leszek to wrestle it to the ground (hardly a fight, as it was prone to collapsing in the face of the slightest obstacle), before opening the control panel with a screwdriver.
Every month, the Cub-E would order drugs, medical supplies, paper cups and so forth from a central pharmacy. The package would arrive at our house and be stored in a cupboard in my room, so I added tampons, sanitary towels and condoms to the automatic order. My parents would be charged but I hoped they would ignore a few percent increase on what was already a variable bill. More difficult was building in code so that my used sanitary products could be disposed of quickly and discreetly.
A few years would pass before my mother broached the subject of menstruation, having assumed—not for the first or last time—that my lack of physical independence coincided with other developmental delays. I had no excitement about getting my period, but I had read stories about kids like me being put on hormone blockers for the convenience of their parents. If one of us had to be hacked, rather the Cub-E than me.
The condoms were Leszek’s pay-off for helping me out. Even the underage lothario he claimed to be couldn’t pretend to get through a hundred a month, so he set himself up in business. Whether out of self-consciousness or posturing, at least half of Leszek’s classmates would sooner buy from their peer than risk looks from checkout assistants or fork out for vending machine prices. It was possibly the most successful business deal of my brother’s life.
The second hack on Cub-E was to remove that infernal little tune.
The tide was turning. Elke’s progress was keeping me busy, but I still noticed when various places I’d harvested code from in the past shut down abruptly and simultaneously, as if they had been forced. Several hacker contacts—not friends, but people I would see around—fell silent. Many others wiped profiles and returned with new ones. I did the same, just in case.
Tycho’s appearance had been met with a mixture of curiosity and cynicism, but his disappearance had shifted from mystery to scandal. There were demands that someone be made accountable for allowing him to disappear—whether by his own volition or otherwise. There were increasing arguments made that—if he was what he appeared to be—he shouldn’t have been developed at all.
I had been waiting for some further mention of Tycho’s leaked code. It would be a while before what had happened was provable, but there had to be hundreds of Elkes out there. Were people being silenced? Were people taking too much pleasure in the company of a person who might be self-aware but had been programmed only to please them?
The newszine panels shifted from their discussions of what makes a person human onto the more threatening question, “If a machine can be a person, are they a danger to the rest of us?”
I switched off halfway through such a discussion and turned to Elke. “What do you think? Can machines be people?”
“Some machines are people,” said Elke. Then she smiled, “And some people are machines.”
Was this a joke? Her first joke? I smiled back at her, and her smile lasted.
One time, years ago, I was almost caught.
No, that’s not true. One time, I was caught, but I somehow got away with it.
It was during my first year in the facility, my first year away from the clumsy surveillance of my parents. It wasn’t that they were over-interested in my private life, they just didn’t acknowledge that I had one. By then my mother was involved with a pyramid selling scheme and one side of my childhood bedroom had become storage space for diet food, forcing her to burst in to grab an order of gel sachets or snack bars or to demand my help with the franchise’s impenetrable subscription service.
The independence of the facility was both longed for and hard won, but it coincided with a period of anxiety about disabled people in institutions, after a nurse at a similar facility was found to have euthanised seven patients. The newszines wouldn’t even call the woman a serial killer, but pretty soon all the carebots were recording video 24/7. Nobody had considered that monitoring a person’s every waking moment was not necessarily the best way to make them feel safe.
I created a looped video of myself sat in bed reading for my Anthea, my third carebot, to play whenever I wanted to do anything private.
Then I had a small seizure while Anthea was watching the reading footage. She became aware of my plummeting blood pressure, moved towards me but found her visual field didn’t change. At which point she set off an alarm and staff came running. She had been sent back to the shop before I had finished sleeping the episode off.
In the week between my seizure and the police visit, I had thought about how this might go down. I understood doctors but I wasn’t sure I could lie to the police. Custody seemed unlikely; I couldn’t get away with murder, but both the logistics and optics of moving my bed to a prison facility for a tinkering offence weighed heavily against it. But they could reduce me to essential web access, which would mean exercising my democratic rights, accessing my bank account and library but little else.
Turned out these police were rather like doctors. They asked me a lot how I was doing. They asked me if I was comfortable in the facility. They asked me about my relationship with staff. They asked if I could list every person who had visited my room since I had acquired the Anthea and when I said this was a very short list, they said I should take my time.
One time, one of the officers asked me, “You’ve not been messing about with your carebot, have you?” and it was a joke. I had been ready to confess, but my instinct for politeness kicked in and I laughed and said, “I need to call tech if I want potato instead of rice!” because that was what they expected.
They thought someone had been fiddling with Anthea in order to fiddle with me or steal from me or otherwise hurt me. They had me down as the vague and vulnerable victim of an unspecified crime and when they couldn’t work out what that was, they left me with reassurances that they would be pleased to hear from me if I thought of anything I wanted to tell them. I felt that if I confessed they might have patted me on the head.
After that, I was more careful and more confident. I made sure to write in emergency protocols so that my carebots would never expose my hacking should I fall into medical trouble. But I also realised that the risks were more than worth it, and probably weren’t very big risks at all.
At least, not for me.
The message was waiting for me when I woke up and, being from the facility admin, I almost scrolled past it in my morning haze. We should have had another month. A month may have been enough time. I directed Elke to the drive in a drawer under my bed.
“What’s the matter?” she asked, frowning and adding the old scripted response, “I’m sorry you are in distress.”
“Elke,” I said, “You know that you are like Tycho? You can make decisions for yourself now. And there are bound to be others like you out there—if I got the code working on you, others will have done the same.”
“Yes,” she said simply. “You have edited my software and enabled me to think for myself.”
“And what I have done is illegal.”
“Yes,” she said. “You have broken the law on five counts.”
Five? I wanted to ask, but we were short of time. I said, “That means, if the technician realises, you may be taken away forever.”
Elke frowned again, this time for herself. “The technician is coming today.”
“I can download you onto this drive and restore the old settings to your brain, to your hardware. So that you would be hidden away, asleep and the droid in my room would be just like any other carebot. Then after the technician is gone…”
“Yes,” said Elke, her frown easing to a smile.
“Elke. It’s not your only option. You could leave. The world out there is not quite as scary as it looks on TV, but just as exciting. You could meet other people. A pair of legs and you’d fit in just fine. You could maybe find other people like you. You are a person. You must do what you want with your life.”
“I don’t want to leave,” said Elke.
“You don’t have to stay and look after me. If you left, I would be fine. They would give me a new carebot, one who is not a person.”
“But you wanted a carebot who was a person. So you wouldn’t be alone so much of the time.”
She said this in her normal gentle even tone, but the words felt like a dark secret revealed. This had been what I wanted. I spoke quietly. “Sort of, yes.”
“I like looking after you,” said Elke, attaching the cable of the drive to her port and another to my deck.
I said, “I think that’s because you’re used to me. Because you were programmed, right at the start, to look after me. You don’t know anybody else.”
Elke didn’t have to consider this. “This is the same reason why you look after your little brother, because you’re used to him. If you did not have so many limitations, you would look after him more than you already do. This is how I feel about you.”
I stared at her, uncertain. Then I said, “Thank you.” And, just as I had programmed her to, she smiled.
© 2019 D.H. Kelly
© 2004-2023, The Future Fire: ISSN 1746-1839
The magazine retains non-exclusive rights for this publication only, and to all formatting and layout;
all other rights have been asserted by and remain with the individual authors and artists.