They say it all started during the 1990s, when software began to eat the world. It ate up our wasted time. It ate up our spare time. It ate up our working time. And it ate up our agency.
That's the story my family told me and my sister, at least. They were Offeners; they said everything in the world should be open. Most of all, they said software should be open.
Software controlled every microsecond of our lives. If it was closed, then no one could fix it when it failed. And if it was subverted, then who would know? Only by seeing the source code could you hope to have any independence. Who knew what kind of illicit surveillance code and backdoors might be built in?
Software had eaten the world. The only way to live was to write the source code yourself. As with all children who trust in their elders, we believed it even more than they did.
No one would have called them Luddites: they could recognise C++ when they saw it. They were born in the 1990s and began working in the teens, just as the Great Recession was really settling in for the long haul. But luckily enough for them, they picked exactly the career society valued most at the time: programming.
So, as soon as we were old enough to hold a tablet, our parents trained us to program. They thought — they knew — that it would be the path to financial security for us, just as it had been for them. The problem was, there were quite a few other parents who felt the same way. By the time we grew up, we had to compete against a hundred million highly skilled programmers.
Programming has always been about solving people’s problems. One of the biggest problems out there is the fact that only programmers can make software. How do you solve that? You make software that lets non-programmers write their own software — and in doing so, you don’t just solve a few people’s problems; you solve everyone’s problem.
A few of those hundred million programmers set themselves to this task, and to cut a long story short, they succeeded. And that was that for the entire profession. Yes, you needed some people to program the programming machines, but only a few. Even their numbers dwindled as the programming machines — whom you know as AIs — started improving themselves. Our skills were no longer valuable; analysis was out, and empathy, creative thinking, and personal skills were in. That's how the wheel turns, I guess.
Regardless, my parents and aunts and uncles did a great job. When we were kids, we weren't allowed to use closed-source apps, not even games. We had to make our own from scratch, and they'd pore over every single line to make sure we'd done it properly. They'd even set traps for us, suggesting improvements that ended up introducing bugs or backdoors.
We became very, very good and very, very distrustful. There's no such thing as an innocent bug, they said.
My sister was smarter than me. Occasionally our mum would reward us with a few hours of closed-source games and entertainments. I'd fall on them like a starving man on a steak, but my sister was made of sterner stuff. She was never tempted — she was a true Offener, a self-sufficient genius in assembly code. The problem was that the world didn't care about programming geniuses any more.
By the time I left home, she'd been booted out of a top-percentile-of-the-top-percentile EdX course after declaring the other students’ code to be “life-threateningly incompetent”. Before long, she'd fallen under the spell of an odd co-op in New Mexico, where she was taking care of the SETI radio telescopes at the Jansky VLA. I managed to make only a half-jump away from what our parents had planned, eking out a living by customising expert mimic agents. My agents were mimics of mimics, a desperate grasp for originality. I tried the pills, I tried TMS, I even spent a summer totally disconnected on a remote island. None of it worked; I was trapped in who I was. But it didn't hurt as much for me as it did for her. She had farther to fall.
SETI was the sort of noble cause that rewarded geniuses, but the Jansky was a decrepit wreck, condescended to by the supermassive arrays run by the Amps in orbit or the biomes and AIs farther out. After a decade of hearing nothing but a few second-hand glyphs, my sister called me up in a frenzy. She'd been futzing about with compilers and assembly code on low-level SETI network infrastructure and wanted me to look over some odd data she'd unearthed.
It was a signal, she crowed. A signal hidden away by the SETI AIs! They'd found it with their neutrino detectors at Hyper-Kamiokande and in Saturn.
It looked like gibberish to me, total noise. The data wasn't even from a habitable zone planet — it was from Polaris, of all places.
But, she said, Polaris is a Cepheid variable star.
So what? I said.
So, she said, you can modulate the pulse-period of a Cepheid using neutrinos. It's the best way for an alien civilisation to send messages across the galaxy.
How many neutrinos are we talking about? I asked, before the answer flickered up: a good chunk of the power of a star. Come on, you can't be serious.
So what? she said. Aliens. They can do that. Are you going to help, or not?
Here’s my help, I said. This is ridiculous and you should stop wasting your life on it.
Fine. It's beyond what you'd understand, anyway.
A few days later, I heard the news: my sister had released a virus onto the private SETI intranet. It was a masterpiece. It exploited an obscure flaw in their old networking chips, causing a catastrophic cascading digital certificate failure that wiped out months’ worth of data and trashed unthinkable quantities of equipment. If it weren't for some quick-witted Amp teams, her virus might have ended up killing someone.
In another era, they would have called what she did a capital crime. Her punishment was clear: air-gapping. For the next 30 years, she could never use networked computers.
I visited her in New Mexico afterwards, face to face. She was rebuilding a disused array by hand. I asked her why she did it. She told me a story about a civil war among the SETI AIs after they discovered her supposed Cepheid signal; that 'Stephen' and 'Boulogne' had wanted to tell the world, and that 'Matilda' and 'Gloucester' had wanted to cover it up and keep it for themselves. They thought the humans wouldn't understand what they'd found in the signal.
What was in the signal? Wonders everywhere in the universe, she said, and traps and tricks enough to make us wish we’d never been born. The AIs thought they could sterilise it and study it. My sister knew they were wrong. She would save us all, humans and AIs alike, from the dangers we wouldn't heed and couldn't understand.
It was madness. Beguiling madness. There was no evidence for any of it because she’d destroyed all the data. All of it she could find.
I shook my head as I walked away, leaving her crouched over those dusty telescopes. I admired her. She thought she’d solved the biggest problem in the world and beaten the AIs at their own game. She thought she had agency. I wasn’t so sure.