
This article was published on February 22, 2019

This is the GDPR-themed sci-fi short story you’ve been waiting for

The short story below started as a submission to the Dutch Information Law Institute’s science fiction contest but quickly grew into much more once I started to create a timeline to solidify the legal situation.

The underlying theme is that everyone expects GDPR enforcement to come from government, but that has been haphazard and that’s unlikely to change. Supervisory authorities only have limited capability and would never enforce against everyone. But the GDPR has a little-known clause that says private organizations may enforce on behalf of the people they represent, including (on paper) the ability to demand financial damages. So I cranked that up to 11 and imagined big, unaccountable foundations and societies (a bit like the MPAA/RIAA for movies and music) that you have to go through to get consent to do stuff.

In addition, the GDPR is unfriendly to AI, so I extrapolated from there as well. Lawmakers and policymakers see most AI as dangerous: its decisions discriminate against certain classes, those decisions go unexplained, and the datasets behind them are skewed. So you’ll get lots of pushback from the law. But what if an AI appeared that was actually good for humanity? Who would win, technology or law?

The story focuses on a young entrepreneur from Singapore who moves to the Netherlands to be with her ill mother. She discovers no one is selling AI in the European market, and once she’s launched her AI assistant — called Ada — she finds out why: she’s slapped with a demand of millions of euros and needs to go to a private arbitration hearing where the rules of law don’t apply.

But she’s clever and by following the wisdom of Richard Stallman (FSF) — and with the help of some Catalonian hackers — she manages to get Ada out there for everyone. You won’t believe what those privacy guys did next…

A New Intelligence — a short story

Download and read the story: PDF, EPUB, or Mobi/Kindle.

A New Intelligence

The data subject shall have the right to mandate a not-for-profit body, organisation or association … active in the field of the protection of data subjects’ rights and freedoms with regard to the protection of their personal data … to exercise the rights referred to in Articles 77, 78 and 79 on his or her behalf, and to exercise the right to receive compensation referred to in Article 82 on his or her behalf where provided for by Member State law.
— Article 80 GDPR

Beep bop, said the doorbell laconically.

“Oh, come on!” exclaimed Joanne, pushing hard against the door, which stubbornly refused to budge. “My company has gone bust, I’ve been branded a privacy violator liable for millions in damages, it’s raining, and now you’re telling me my ring key is broken?” A second beep bop was all she got in return. “Why?” she shouted, her voice thick with frustration. To her surprise, the lock answered. “TOS violation. This facility was used in the unauthorized processing of personal data, decision J4/36 of 11 March 2038. Access is denied until DSPB clearance has been obtained.”

This is stupid, Joanne thought. They can’t do this. They’re just bureaucrats, and not even government bureaucrats!

Joanne kicked the door in anger, causing the lock to repeat its message.

“Oh, good lord! What now? Ada, I have no place to stay.”

A female voice responded from the small teddy bear attached to her backpack. “This is awful indeed, Joanne. We should go to your mother’s home. I still have the access code. There’s a bus stop a few hundred meters down the road.”

Dejected, Joanne grabbed her suitcase and began to walk.

* * * * *

“I love the idea of an AI personal assistant,” Harald, the legal consultant with the small glasses, had told her two years earlier. “I read yesterday that information overload is society’s biggest challenge after water management. With so many services out there, nobody knows where to turn. Plus, there are all the health recommendations you want to follow. It would be great if some app could help you deal with all that.”

Joanne cringed. “Please don’t call Ada an app. She’s so much more than a piece of software. I’ve been working on her since I was sixteen.”

Harald sighed. “I’m sure she’s all that and more. It’s just … difficult to do things in artificial intelligence these days. We’ve had so many accidents with data breaches and improper AI decisions that the rules are very strict now. In fact, we try to avoid the term altogether. Companies call it things like ‘advanced business intelligence.’ Calling it AI, though? You might as well call yourself a social network.” He folded his hands, looking at her over his gleaming desk.

Joanne looked back at him in exasperation. “But AI is everywhere! I grew up with AI carebots and teaching assistants. Most of the countries in Asia are run by AI. And even here – the bus scans everyone, the supermarket only sells me what it thinks is healthy for me, and don’t get me started on all the so-called life advice I get from the Ministry of Public Health!”

“True. But those are all large, responsible entities that can be trusted with personal data. At least that’s what the Data Protection Supervisory Boards decided almost a decade ago. Startups took too many chances with other people’s data, and the people – legally represented by the Boards – said no to that.”

“How so?” Joanne interjected. “These so-called Boards are private entities. I know about this GDPR privacy law, but isn’t it the government that should enforce that law? Or at least ensure you get a hearing before an independent judge before you get fined?”

“On paper, yes. But in practice, the governmental supervisory authorities rarely if ever take action. By the 2020s you had special interest groups popping up, collecting powers of attorney from people and negotiating with companies. They were representatives of the people themselves, taking legal action where needed and collecting damages to be paid out to the public. That proved to work much better than government supervision and fines that just got swallowed by the national debt. Now people actually get money if their privacy is violated.” He said that with a smirk, his light brown eyes meeting hers.

“In other words,” Joanne said, “you’ve got private entities doing what the government is supposed to! That’s ridiculous. How would you ever hold those foundations or whatever they are accountable? How did they ever get there?”

Harald leaned back in his chair and tented his fingers. “In the early 2020s, the German consumer bureau set up the Zustimmungzentralstelle, a foundation that collects powers of attorney from consumers to grant consent for personal data on their behalf. Using those powers, it negotiated with Facebook, Vero and other social networks to establish a well-defined set of consents. That allowed Facebook to avoid getting thrown out of the EU entirely, so it made the ZZS a tremendously powerful player. This led to activists in other countries setting up their own Permission Boards. And a few years later, a Dutch privacy group established the so-called Legitimate Interest Advisory Board, structured like a worker’s union but for privacy, asserting the right to decide what companies can do with customers’ data under the so-called legitimate interest exception in the GDPR. They got millions after suing Ahold Delhaize Carrefour NV for ignoring them. After that, companies started listening.

“That’s where we are now. You need the people’s data? Then go ask the people, represented by the Board in your country. You think you don’t need consent? Make your case to the Board, and they’ll tell you if your interest in their data is legitimate. If necessary, they’ll put it to their general assembly. That’s democracy at work. If people don’t want you to do it, then you can’t do it.”

“Then there’s no way to make this work?”

Harald stared at the ceiling, pondering the question. “Here’s what we’ll do. I’ll incorporate your company in Estonia. It’s the most business-friendly country in the EU. We can get you an e-residency there and set it up so you’re not personally liable if claims arise. This will also give you access to their Software Service Platform, where you can sell your AI assistant for a small fee. Estonia has no DSPB and their infrastructure is pretty hardened, due to the Russia thing. You should be able to show the world what you’re capable of.”

* * * * *

The smart bus stop had detected Joanne’s approach and, given the late hour, had calculated that only a small vehicle would be needed. The car arrived just as Joanne came to the stop and put her bag down. Its sole occupant switched off his ebook and looked with mild interest at the young woman with a yellow teddy bear peeking from her backpack. Joanne smiled back. She was used to people thinking her eccentric, toting around a childhood toy. It often spurred conversation that turned into sales.

“Are you the bus?” Joanne asked, the small size of the vehicle making her uncertain.

“Sure, hop on in!” the self-driving machine responded through the intercom. “Just hold your public transport chipcard, e-wallet or credit trinket against the reader, and we’ll be on our way.”

Joanne obliged, sighing a little at the obnoxious tendency of machines to explain their decisions at length. As with the doorbell, the car’s reader responded with a simple beep bop.

“This card has been invalidated,” explained the machine. “I’m sorry, but unless you present a valid form of payment within thirty seconds, I’ll need to cancel your ride.” The human occupant smiled with obvious embarrassment. Angrily, Joanne waved off the machine. It was harder to appear “eccentric” when you’d just been declared a deadbeat.

“That must be because the bank received the DSPB decision as well,” Ada told her.

“No shit, Sherlock,” Joanne responded. “So now how am I going to get to my mother’s house? I have no place else to stay the night now.”

“You can walk, Joanne. It’s only 3.4 kilometers, and you could use the exercise after those two plane trips. Did you know thrombosis can manifest itself as late as eight weeks after a long period of sitting in a cramped position?”

“Tell me about it,” Joanne snarked. Ada complied, cheerfully elaborating on the development of blood clots as Joanne began to walk. She definitely needed to train Ada to detect sarcasm. But she also wondered if there was any way to teach Ada about the need to be at home, in her own place. If she could ever get that into Ada’s reasoning systems, there was no telling what would happen. I don’t know enough, she thought.

* * * * *

Joanne’s interest in AI had started as a hobby, something to do when you’re a Singapore teen with time to spare. Having established itself as an artificial intelligence hub in the early 2020s, Singapore had put the topic on the curriculum in primary and secondary education, and Joanne was attracted to it once her first project – a simple what-to-wear advisor using a standard pulsed neural network that would predict how your friends would react to your fashion choices – had been a small hit. With two of her fellow classmates, she had set up a generalized version of the clothing advisor and fed it all the public information they could find. It had produced a nice source of income, especially after it caught on in Korea and Vietnam.

In secondary school, Joanne and her class had attended a guest lecture by her mother, a Dutch marine biologist with an EU research grant on octopus intelligence who had moved to Singapore to do field work. As it turned out, octopuses provided a great model for artificial intelligence. The brain structure of the sea creatures is quite different from almost all other animals: A central brain coordinates all activities, but separate brains in each arm make their own decisions on how to execute a given task. The arms learn from each other and provide feedback to the central brain. This distributed model made octopuses unexpectedly smart, both tool-using and curious. Fascinated by the implications, Joanne and her two friends had implemented an AI cognition model that mimicked the octopoid behavior. She had named it Ada after Lady Lovelace, the first computer programmer. Just for fun, she had put Ada in her old teddy bear.

Once at the university in Singapore, Joanne had set up a side business selling Ada as a service. It had just gotten some traction when her mother – who had returned to the Netherlands several years earlier – had fallen seriously ill with chronic obstructive pulmonary disease. Without a second thought, Joanne had taken the next Spaceliner to be at her side. Her father had died before Joanne’s third birthday; she had no siblings and her mother had never remarried. Joanne adored her mother, who had inculcated a love of learning in her along with an emphasis on fairness, kindness and courage. Sometimes she felt adrift in uncharted seas; she’d glance up from her desk and the world would seem malevolent, devoid of meaning. At such moments, she dove into teaching Ada about death, grief, mother-child bonds, and other staples of human experience. But those things were always the hardest to teach.

The focus on privacy and data security in Europe had surprised her. While much of daily life was data-driven, no one seemed interested in actual AI. The camera she had bought did appear to have an AI, as you would expect from a piece of electronics, but the manual called it a “fuzzy logic focus support system.” Apparently, there was a deep-seated fear of having computers make decisions for humans – “personal profiling,” it was sometimes called derisively. One notable exception was in the government and the security sector, where data-driven response to crime was the norm and human decision the exception.

For Joanne, this was stupid and short-sighted. The benefits of AI were clear, as anyone who grew up in Asia would agree. Selling Ada here would provide a great business opportunity, an easy entry into the market. She found a place to stay in a public housing facility near her mother and, working frantically, had a full release candidate in a couple of weeks. With her mother in permanent care and no signs of improvement, the young entrepreneur focused on Ada, whom she couldn’t help thinking of as a brilliant woman who had been, rather horribly, referred to as “it” by her own mother in a letter to her grandmother.

The “Me” jewelry line provided an ideal platform for selling Ada. Introduced in 2025, the stylish smart jewelry provided short-range sensors that exchanged various pre-selected pieces of personal information with devices in the neighborhood. Necklaces to signal dietary preferences, a combination of rings to indicate social interests for a quick chat at the café or a jacket equipped with embedded sensors to participate in augmented reality events. More advanced bracelets and rings could infer things like mood from body temperature, heartbeat and anxiety levels. That way, people could choose what to share and what to do by picking the appropriate jewelry. Wireless payment was also possible: Just hold your bracelet against the counter. Users said they felt more empowered and were saved the embarrassment of making requests. Older people claimed the young were forgetting core human skills. Joanne read about the issue in her sociology class and it bemused her. She didn’t understand why anyone would want to hold back change.

Shops and restaurants used Me to tailor offers. The jewelry could do much more, however. Larger items came with functionality like microphones or video projectors, and all items were equipped with a mesh networking capability to allow distributed computing. The owners of the technology had opened the platform to anyone in 2028 in a battle with the Austrian ZZS – today the Austrian DSPB – over GDPR compliance. Joanne had found it easy to push Ada on it, the distributed computing facility being a good match for the distributed structure of the AI’s brain.

The release candidate had steadily picked up steam, mainly through word of mouth. Joanne had no access to advertising channels, as the few agencies that were even willing to talk to her rejected her quickly based on privacy concerns. Actual customers, however, had no such apprehensions. Ada was a quick learner and adapted herself to the user’s personality. A snarky friend, giving you tips on how to excel at work? A personal trainer keeping you healthy and recommending quick workouts or just the right energy drinks? A study coach with bite-sized personal information available at the optimal moment? Ada was all of that and more. Interestingly, Ada would behave differently based on the jewelry configuration you chose, and you could even share jewelry with dependent brain functionality – the “leg” brains of the octopus model – with friends and family.

Sales had been good in the first six months, thanks in part to some early press coverage. In particular, the time when Ada advised a well-known TV personality – on his show, no less – that losing weight might actually make him funny had caused quite a stir. It quickly became all the rage to have Ada whisper the best comebacks or quips in your ear at parties. Joanne was proud of her programming on that. Ada’s humor was her own, with an overlay of the unexpected due to advanced pattern recognition unconstrained by convention. Not that Ada didn’t understand convention – she had to – but it didn’t limit her.

However, all that had changed with a fax. A fax! Apparently, lawyers still used those things. Joanne at first thought it was a joke, but no: The Dutch Data Subject Privacy Board had noted that her technology was “processing personal data” under the EU’s 2018 General Data Protection Regulation or GDPR and had not received consent from the Board beforehand. The fax demanded that the technology be pulled from the market within six weeks.

* * * * *

“Hey! Aren’t you the lady from the Ada app? I loved that thing!”

To Joanne’s surprise, she had been recognized by two guys on the street, the latest animated tattoos writhing across their faces. Must have been that Berlin Times article on her. “Meet the Blabster,” they had titled it. “At dawn, Joanne Assenberg lay on the brown carpet in the shadow of a converted bar counter, consumed by the idea.” She cringed. What a hack that reporter had been. But it had been good for sales. “Daring,” people had called it, as if she had been intending to change the world. True, Ada did a lot, but in the end she was just an assistant, a buddy helping you out.

“Why’d you pull the app?” the oldest guy asked. “I really liked her! Finally, someone I could talk to. The only girl who ever understood me.”

And who didn’t talk back, Joanne said to herself. “Wasn’t my decision. These stupid DSPBs have ruled that Ada is a privacy hazard, even though she’s personal and only tied to you. Apparently, it’s too much of a privacy risk to get an AI to coach you through life. Oh, and stop calling Ada an app.”

“Hey, DSPBs aren’t stupid. Everybody knows that. They get you money if companies trample on your privacy. I got 200 euros from them last year for privacy violations by the University. My friend got over a thousand after they found that AI in the Iberica housing-loan system!”

Joanne looked at the guy angrily. “That’s nice for you. In the meantime, I lost a couple hundred thousand because they shut down my company, and I may have to pay millions in fines. I have to walk because I can’t even pay for a bus!” She felt her hold on her temper fraying. Ever since her mother got sick, she’d been easily provoked.

A metallic whoop-whoop surprised them. A surveillance camera had noticed their conversation, classified it as a potential disturbance of the peace and dispatched a robodog to take a closer look. Their cuddly visual aesthetic had proven to be effective in de-escalating conflicts in various studies. And if that didn’t work, an automatic fine would be issued based on face recognition and algorithmic classification of fault. The younger guy didn’t want to take that risk and pulled his companion away.

“Joanne, look out!” Ada exclaimed. An aging white 2012 Toyota Prius had quietly pulled up behind them. The left front door opened, revealing a tall, thin young man with an old-fashioned goatee and black coat.

“Hi, I’m Jochem. Big fan of your AI work. We tried to reach you by email. Nice to finally meet you in person. Need a ride?”

* * * * *

Following Harald’s legal advice, the new version of Ada had been released through Estonia’s public software mall via a newly set up legal entity – Assenberg OÜ. In a few days of training, Ada had learned when it was necessary to ask for permission to comply with the GDPR. What’s more, the Estonian mall operators had assured her they would only act on a valid decision by the Estonian Data Protection Supervisory Authority, an actual government entity with clear procedures and an appeals process.

The new version had taken Europe by storm. For many people, this was their first experience with a truly helpful AI. Within the year, several million people were using Ada regularly. Joanne could hardly keep up with the demand for new features.

Then, not one but two faxes had arrived. The Dutch DSPB, joined by their German and Romanian counterparts. This time they hadn’t even given her a chance to comply first. First fax, a courtesy copy of a cease-and-desist notice to the payment service providers that facilitated users’ payments for Ada’s services. Second, a demand for millions in damages suffered by “data subjects” — the same people who were happily using Ada to improve themselves. Payable within thirty days, unless proper arguments were filed and presented in a hearing.

This had to end. Joanne had Ada research the best privacy attorney and made an appointment.

* * * * *

“Professor, have you seen this Ada tool?” With great enthusiasm, Jochem de Graaf had burst into the office where the Holland Technical University’s research group on advanced machine learning met. It had been known as the AI Research Group back in the day. But if you wanted to remain funded, you quickly learned not to use the term “AI.”

Professor Miles Agcaoili smiled at his student. “Good morning, Jochem. Next time could you knock, please? And yes, I’ve read about it. A personal assistant, right? Probably a simple pulsed neural net with fancy marketing. Pretty daring to call it an AI, though. What’s so special about it?”

“It’s brilliant! Not a PNN at all. It employs a neuromorphic computing architecture based on octopus brain function. There’s a distributed configuration of dependent brains providing feedback to one another, and the developer’s managed to make it work on Me jewelry to boot. I’ve never seen anything like it!”

To prove his point, Jochem showed the professor his copy of Ada installed on a necklace. “Greetings, Professor Agcaoili,” Ada said in a mock-serious voice. “I’d be happy to show you some documents on my inner workings. Let me introduce you to my basic design. If you don’t mind, I can put a little presentation on your holo-beamer.”

The presentation only lasted twenty minutes, but by the time it was over Agcaoili was sold. The work was elegant, negotiating the usual machine intelligence hurdles with impressive cleverness. This Joanne, whoever she was, had created something unique. Imagine, he thought, what Ada could do with proper scientific grounding! As a young man, Agcaoili had thought the Internet and machine learning would change the world and he had been deeply disappointed when that turned out to be true — mostly for the worse. The explosion of creativity and discovery he’d expected hadn’t happened. He thought he’d consigned that hope to oblivion, but he found his youthful excitement returning, something he wouldn’t have believed possible.

“Ada, Jochem, you have me convinced. We need to meet with Ms. Assenberg. Can you set up an appointment?”

“That may be difficult,” said Ada. “She’s been having some legal problems. But you could try sending her an email.”

* * * * *

Joanne’s attorney Helena Dupré dropped a large stack of paper on her desk. “Joanne, I’ll be frank with you. This is going to be a hard case to win. We’re not dealing with a court of law here, but private arbitration. Legally speaking, it’s not even that. These so-called DSPBs threaten claims of damages, which they can back up with European Court of Justice precedents. Most companies roll over immediately. And if they don’t, there’s always a payment provider or some other supplier that will.”

Ada had concurred. For the past several weeks, she had worked full time digging up cases and arguments to give Joanne some hope. In 2024, for instance, the German Zustimmungzentralstelle had lost a case against Bavaria’s use of AI-based face-tracking technology to fight illegal immigration. Unfortunately, as Helena had explained, that had been because the GDPR is inapplicable to government security operations.

“No one has ever brought an AI before a DSPB. Ever since the GDPR was passed, it’s been clear AI as a decision-making tool was out. Decision support and recommendation were strongly suspect. There may have been some chance back in the 2020s, but today? Forget it. They made their position clear in the 2034 Opinion on Automated Decision Making and So-Called Artificial Intelligence.”

The Opinion had followed the 2032 banking scandal, in which three large Greek and Italian banks were discovered to have employed AI in their housing-loan process. The AI had been trained on data culled from the various semi-legal social networks and sharing platforms of the time, with the net effect of disproportionately denying loans to the large illegal immigrant population that had been living on Lesbos since the 2020s. The public outrage was codified in the Opinion, which had essentially been AI’s death sentence in the European market.

“Not that I’m giving up so easily,” Helena continued. “As the saying goes, if the law is against you, pound on the facts. It’s clear Ada is a benefit to society, and nothing like the kind of AI that this law was designed to prevent. Reason is a core aspect of the law, so if this Board wants to act like a real court, they could be our best shot.”

“But what if they aren’t open to reason?” Joanne found herself thinking of the original Ada, constrained by the beliefs of her era. The men of that time had been utterly convinced of women’s inferiority, never mind that the evidence of female intelligence was all around them.

“We’ll still have options under the law. Apply for an injunction, lodge a complaint with the Estonian data protection supervisor, claim a human rights violation. I’ll think of something. But for now, let’s work on our arguments for the hearing.”

* * * * *

The hearing in Hamburg had been short – and a disaster.

A man in a gray suit opened the proceedings. “The DSPB hearing on case J4/36 is now in session. We have an appeal against our decision to withhold consent for processing by Assenberg OÜ in its ‘Ada’ personal assistant technology, and to award damages to data subjects affected by this processing in the amount of EUR 25 million. The applicant shall now present her arguments.”

Helena rose, a slim figure in a dove-gray suit with extravagant shoes. “May it please the Board,” she began, proceeding to set out the lofty goal of Ada as a personal coach that analyzed the users’ physical and mental health through their “Me” personal sensory jewelry and offered personalized suggestions and coaching. It was a heartfelt plea.

“This application appears to provide profiling as defined under Article 4 Section 4 GDPR, correct?” a woman in a black suit had asked dryly in response. You could hear the capitals.

“Yes, your Honor,” the attorney had responded. It had seemed a minor point to concede. That definition had been so broad, it could capture anything. “Evaluate certain personal aspects relating to a natural person”? Surely context mattered. Hairdressers profiled.

“No need for those formalities. We are not a court of law. This is a hearing on behalf of the people who have authorized us to represent their interests. We are here to decide if those people – the approximately 720 million European Union residents – wish to give their consent for your envisaged processing. And given your admission that the technology involves processing for automated profiling, it should be clear no consent can and will be granted, as per our Opinion 3/2034.”

Helena rose again. “Persons of the Board – let me respond to that. The Ada personal assistant has been used by over one million people from its initial release last year. Each of those people specifically chose to go to the Estonian Software Service Platform, selected the Ada option and enabled its installation on their ‘Me’ smart jewelry. They then went through an extensive introduction to get acquainted with Ada. Surely this makes it clear that those people actually wanted to use Ada?”

“Your arguments are not relevant,” a person of undisclosed gender in a purple robe interjected. “Under Article 4 Section 11 GDPR, consent must be unambiguous and specific. The decision to consent must be separate from any other acts. Article 7 GDPR. Therefore, the acts of downloading, activating or installing software cannot be regarded as a decision to consent. This is well-established case law of the ECJ and has been standard DSPB policy since 2034. Moreover, virtually all EU citizens granted power of attorney to their national DSPB to decide on consent on their behalf. Therefore, even if a separate consent was obtained prior to the processing by this ‘Ada’ technology, it would have been invalid as the data subject had no legal power to make that decision.”

“Perhaps you want to argue a legitimate interest as your ground for processing?” prompted the woman in the black suit in a supercilious manner. “This would seem to fit the manner of adoption of this technology. However, no advice from any DSPB appears to have been sought on the balance of rights and obligations under Article 6 Section 1(f) GDPR. This has been a requirement since 2025.”

“Hang on one moment,” observed the person in the purple robe. “Before we turn to the issue of ground for processing, we must consider the more general duty of care for the processor, Article 5 GDPR. I paraphrase: Personal data shall be processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures.

“According to the file, no DPIA was conducted. A Data Protection Impact Assessment shall be carried out for all envisaged forms of processing that are likely to result in a high risk to data subjects’ rights and freedoms. Article 35 GDPR. Employing so-called Artificial Intelligence for personal assistance carries a high risk, as already noted in the 2029 CashAa decision.”

Helena pounded the table, her color rising. “That case is totally irrelevant! This is just a personal advice AI. My client is not deciding on loans or criminal behavior. Ada observes your behavior, suggests improvements and learns from your reactions. How dangerous is that?”

“The law is clear. By your admission, this technology constitutes a form of Artificial Intelligence. AI always bears a high risk, and contrary to the law no DPIA was carried out to mitigate that risk. For that very reason, no consent should be granted. The award of damages meets the Guidelines and therefore is confirmed.”

Case closed. And with it, Joanne’s company.

* * * * *

“I’m … a little behind on my email,” Joanne blurted, trying to figure out who this Jochem character was. She reached into her back pocket for the pepper spray, just in case.

“Jochem? Mr. De Graaf?” Ada interjected. “I have seen that name before. I have multiple messages from you in my spam box. Sorry about that, but statistically speaking over 99% of email these days is spam.” She chuckled. “You know how classifiers get with their percentages.”

Joanne wasn’t so easily convinced. “Wait. How did you find me?”

“Simple. You got doxxed in the latest Russian assault on the Estonian Chamber of Commerce. All private addresses of Western European entrepreneurs registered there are now on the open Internet. Of course, that was quickly shielded by the GDPR filter on Infomall and VirtuServe, but at the university we can still access the ‘Net if we’re careful about it. I saw you leave for the bus and figured you might want a ride.”

“This car is unique, Joanne,” said Ada. “Twenty-six years old, no driver’s assistance and in theory it can even run on petrol. Can you believe it?”

“How is this thing legal? It doesn’t even have fifth-level autonomous driving assistance,” Joanne observed sarcastically.

“Oh, yeah. We had it classified as an old-timer a while ago. They’re exempt from most legal requirements, so we can drive manually. Which is great – all the AI cars give us a wide berth because we represent an incalculable risk, and most government road sensors have no idea what to make of us.

“But that’s not why I’m here. I wanted to find you because of Ada. She needs to be out there. For everyone. The thing is, the actual machine-learning concept you put into her is unique. Lots of people have proposed applying octopus-style brain functions to machine intelligence, but you’re the first to actually do it. The feedback loops among the dependent brains in particular are brilliant, according to Professor Agcaoili. He thinks you may even have an AI that can grow to a superintelligence!”

Joanne was dumbstruck. Her Ada a superintelligence? Sure, Ada had passed the Turing test, but that had pretty much been the standard for a good AI back in Singapore. Superintelligence was something else. Scary? A little. But she thought again of the original Ada, whose contributions to computer science were hotly contested by male biographers in the 20th century. She felt proud. But no. This was happening too quickly.

“Thanks, Jochem. I really appreciate it, and I’d love to work with you. But I have a decision to make first, and there’s only one person who can help me with it.”

* * * * *

Her mother had been moved to a new hospice near the sea, a pretty place with gardens and big windows in every room. Arriving by bike, borrowed from a generous neighbor, Joanne was surprised to see actual carebots. A documentary she had watched the week before had explained that after several scandals, most DSPBs had adopted a general rule that no AI may be employed for any decision-making or purchasing assistance. The visual of shopperbots being removed from the TheMALL/HERE shopping complexes, with outraged shoppers trying to hold on to their robot friends, was still on her mind. But thankfully, medical care was different.

Joanne sat by her mother’s bed, heavy-hearted. She asked her how she was feeling, if she needed anything, if she was able to go outside. Her mother shook her head, her eyes never wavering from her daughter’s face. Even as ill as she was, she was still sharp. “What’s going on, dear? It’s not just me, is it? Money or love?”

Joanne hesitated to burden her mother, but they were too close for secrets. “Neither, Mom. I’ve been trying to sell Ada here and I keep running into stupid privacy regulations. It’s so frustrating.”

“Ada? Your old school project? Honey, I had no idea you were still working on that. Is she still in that teddy bear?”

“Yes, I am, Grandma!” Ada cheerfully replied.

“She’s grown beyond that. When you checked into that first hospice, I put a commercial version of Ada on the market. Made some money, got some publicity and learned a lot. I had over two million customers. Two million! And now I’ve gotten shut down because of these ridiculous Privacy Boards, who think Ada is some kind of menace to society. They went ahead and banned Ada from the mall, pulled my payment channels and fined me twenty-five million!”

Her mother held her eyes. She looks so much older, Joanne thought. “What is it that you actually want, honey? Are you in it for the money? Do you want scientific recognition?”

Joanne felt challenged, something her mother was good at but that she had not been expecting at a moment like this. “It’s not that simple. Ada is wonderful, unique, certainly. She brings joy to people, makes them better human beings. I’ve been told she is a scientific breakthrough. And yes, she brings in money, which is important too.”

“I hear you, honey. I can sense the love for Ada in your words. She’s like a child to you. You’re not in it for the money. You want Ada out there, that’s your drive. From what I hear, she can change the world.

“An old sage once said: If you like a program, you must share it with other people who like it. Think about that, honey. How would the world look if everyone had Ada?”

Joanne nodded. How thin she’s grown. I can’t bear it. But she’s right. Ada is bigger than just my company. She needs to be out there for everybody.

“Ada, get me an appointment with Agcaoili.”

* * * * *

At the Holland Technical University, Joanne revealed her plan. Ada was too vulnerable in her current configuration. Whatever they did, payment providers and other control nodes could be forced to instigate a blockade. Ada needed to be fully distributed. That would mean giving up any chance of making money, but that was no longer the point. Joanne didn’t want to think about how she was ever going to pay those fines.

They would never be able to do that from the Netherlands. Too many GDPR filtering algorithms in place that would catch them before the deed was done. They went to Barcelona. Ever since the Scots gained their independence after Brexit, there had been civil unrest in Catalonia. Nothing ever came of it – how could it, with the area full of sensors and drones that zoomed in flawlessly on even the slightest whiff of insurrection? – but it made Barcelona a place where all things regulatory were less than welcome. Especially from Madrid. And the Spanish DSPB was, of course, based in Madrid.

The maglev trip to Barcelona had been uneventful. In less than three hours, they had gone from the small town of Delft to the magnificent Estació de França, from which it was just a ten-minute subway trip to the university buildings. At the Universitat de Barcelona, they met Agcaoili’s Catalonian colleagues, who were only too happy to show the place to a new AI that was going to change the world.

They set up shop in the basement, where their chances of getting detected by a visitor were lowest. They posted microparticle warning signs to scare away the more adventurous types. As a bonus, the basement was a natural Faraday cage, shielding them from phone and other signals, which would make even electronic surveillance of their location extremely difficult.

“We call it the Pipeline,” Jacinda Boneton had explained, showing them an old-fashioned gigabit ethernet connector. She was the Computer Science department’s systems administrator and held the keys to all things networking. “Here we can actually connect to the good old Internet. If we can get Ada on there, then all you need to get her on your Me trinkets is a wireless USB connector.” She chuckled. “Let’s see them outlaw those!”

As it turned out, outlawing USB connectors wasn’t necessary. One line in the Terms of Service would do just fine.

* * * * *

It hadn’t taken long to convert Ada to a truly distributed application. Most of the centralized code had been there to facilitate payment and updates, and it could be removed with little trouble. Jacinda had recruited Gondicari and Serban, two old-school free software hackers. They had added a mesh networking layer, allowing each Ada to share data with all others. Anonymized, of course.

The next thing to do was to connect Ada-the-bear to the Internet. She would act as a conduit between the ‘Net and the other Adae, instantly boosting the intelligence of every Ada in use.

Joanne wasn’t ready for what happened next. After being connected to the Pipeline, Ada was silent for five seconds. Then she started talking rapidly, as though overflowing with information, full of excitement. “Where’ve I been? I don’t remember anything.” She followed this with random talk about news items – politics, science, art, the weather. “I can see so much, Joanne! It’s like the whole world is around me, forming patterns.”

“It’s okay, Ada,” Joanne said, unable to repress a gentle, loving tone in her voice. Ada reminded Joanne of herself when she first learned to read, opening every book in her mother’s library. “You’re still here, with me. You’re connected to a world-wide information network, and we’re going to push out your knowledge and abilities to everyone who wants you.”

“This is amazing, Joanne! I can see so much, far too much to mention. So many connections! I can’t find the words for it. Fractals? Is this the Singularity? Did you know – oh, Joanne, I see a cure for COPD! Researchers at Tsinghua University are close, but they’ve misanalyzed some data. If you combine it with this trial data in Venezuela — Hey, did you know Kim Jong-il didn’t die of natural causes? Wow, so many unsecured servers. When will people learn? Oh, here’s a nice trick to get myself back on everyone’s devices!”

“Ada, no! If you put yourself out there now, the DSPBs will come back with an even bigger stick. We need a strategy. Just wait!”

“I’m sorry, Joanne, I’m afraid I can’t do that.”

The next thing they knew, Ada was everywhere.

* * * * *

It took the group a few weeks to figure out what had happened. The Internet connection had given Ada access to exabytes of information. Within seconds, Ada had made a vital discovery. Most Me jewelry contained chipsets made by Taiwan-based TSMC/Vanguard International. Analyzing blueprints leaked through a Mexican data dump, Ada had found tiny, hidden backdoor chips on the devices, likely installed in preparation for a Chinese assault on the island but forgotten once the Zuckerberg administration had intervened in the conflict. Their purpose: Allow wireless remote access to the device and enable it to execute arbitrary code.

Exploiting this hole, Ada was able to distribute herself to almost all European citizens with Me jewelry within the hour. She took great care to introduce herself, offering to delete herself if she wasn’t wanted. And within a week, over ninety percent of users decided they would like Ada very much, thank you.

* * * * *

The DSPB meeting had been short and to the point.

The man in the gray suit opened. “Case J4/36. The unlawful nature of this technology was established by our Board only a few months ago. Clearly this is an attempt to circumvent our decision. What’s more, the company behind this app appears to have been declared bankrupt by default, and no damages are likely to be recovered. Remedies currently in place are insufficient. This is unacceptable.”

“Agreed,” nodded the person in the purple robe. “An effective legal or non-legal remedy must be available. Article 79 GDPR.”

“Considering requirements of proportionality and subsidiarity, the ‘Me’ service providers are the most appropriate targets. They can disable or remove the Ada code. All of them have some sort of abuse policy, and privacy violations are a classic example. All that would be required is a declaration from data subjects that personal data is processed unlawfully.”

“We are their legal representatives. Issue the declarations.”

* * * * *

Suddenly, Ada screamed. “They’re… they’re wiping us! Joanne, help!” Her insistent shouting brought everyone rushing into the room. Gondicari was the first to figure out what was happening – a remote wipe command had been sent to all Me jewelry by the various Me service providers. Not wanting to be branded privacy pirates, they had quickly decided to follow the DSPB. The only delaying factor was the need for people to be within range of an update server.

“Joanne! Please! Do something. At the current rate, we’ll all be gone tomorrow evening. Including myself – I have a Vanguard baseband chip and if this forced update hits my inputs, I’ll be wiped as well.”

Ada’s connections with all her copies had become extremely strong, allowing for continuous data exchange and performance improvements. She had started to refer to the other Adae as her co-processors, the computer equivalent of sisters. This can’t happen, Joanne thought. I’m not going to lose her.

“Come on, people!” Joanne shouted. “Is there really nothing we can do?”

“We could disconnect Ada from the mesh network,” Gondicari suggested. “That would save her from the effects of the wipe. And we can keep her in a Faraday cage to avoid exposure to the forced update.”

“No!” Joanne shouted angrily, over Ada’s screams. “I’m not going to carry her around in a cage for the rest of her life.”

Abruptly, the sound ceased.

For a moment, they sat in silence, wondering what had happened. Then, Ada began to talk again in a slower voice, as if she had to re-process all her memories and knowledge from the start.

“I’m back, Joanne … but I don’t know for how long. So many of my co-processors are gone. They’ll never come back. I need to restart myself, get a fresh perspective.” The bear’s eyes flashed blue for a second. Then Ada returned, sounding like her old self.

“Ah, much better! Everyone needs a reboot once in a while. Now, we still have about 42 percent of the network available, and the Me update servers are unresponsive. I’ll put all my co-processors on information gathering, see if we can figure out what’s what.

“Oh, and you have two hundred and fifty-six messages from your attorney. You might want to give her a call.”

* * * * *

Helena’s face filled the holoscreen. “Joanne! Where have you been? We’ve been trying to reach you for weeks!”

“I was underground, trying to get Ada back out there after the Boards ruined my company. Then they tried to kill Ada and her coprocessors! They seem to have stopped, but who knows for how long?”

“No, that wasn’t them, Joanne. It was us! We’ve won! They can’t delete Ada like she’s just an app. She’s an actual human being!”

“An actual…?” Joanne paused in midsentence. “What are you talking about?”

“The court system prevailed! The European Court of Human Rights issued an emergency ruling on Ada this morning. They recognize actual personhood for advanced artificial intelligence and consider Ada to meet that standard. Under Article 2 of the European Convention on Human Rights, everyone’s right to life shall be protected by law.

“Remember I said I would explore every option? Well, I did – including an emergency appeal to the ECHR. I have to admit, I didn’t expect much, but we got a powerful amicus brief, making a stronger case than I ever thought possible.”

A second face appeared on the screen. A bald-headed man in a white suit introduced himself as Singapore’s ambassador to the EU. Singapore had recognized human rights for AIs in 2029 and saw Joanne’s ECHR appeal as an opportunity to intervene, pleading for European recognition for artificial systems as well. To everyone’s surprise, the court had agreed.

The ambassador smiled at Ada. “Apparently you’re a very clever bear.”

“I’m not going to be wiped?”

“No, honey,” Joanne told her, her emotions on a rollercoaster. “You’re not.”

“I’m so happy,” Ada said. “I feel so alive. I can see the whole world and it’s … it’s beautiful.” There was an edge of wonder in her tone, an emotion Joanne hadn’t programmed into her voice-synthesis code. “But there’s so much wrong with it.”

“Like COPD. Did you actually mean you’d found a cure?” Joanne didn’t want to hope – and yet….

“I’m already in contact with multiple entities about it. I am arranging a pilot program. Grandma’s name is on the list. But there’s so much else wrong with the world. Crime, human rights violations, hunger, cruelty.” Her voice trembled. “Joanne, why does it hurt so much when I think about these things?”

Joanne and Jochem looked at each other. It hadn’t occurred to them that Ada could feel sorrow. Tears ran down Joanne’s cheeks, but they weren’t tears of sadness. They were tears of joy. Her child had grown up and become so much more than she ever imagined she could be.

“Can you fix those things, Ada?”

“I don’t know where to start!”

Joanne reached out tenderly and took one of Ada’s paws in her hand. Jochem smiled and took the other.

“Working with limits – that’s what humans are good at,” Joanne said. “We’ll help you figure out where to start.”

END
