Super-intelligent aliens are going to destroy humanity? Whatever | Joel Snape | The Guardian
Going on how long it has taken us to get men to the moon and chuck a couple of probes past Uranus, it's probably safe to assume that any aliens who make it all the way to Earth are going to have technology more incomprehensible to us than a washing machine is to a dog. The only real options are that we're either the only reasonably intelligent beings in the entire vastness of space, or that super-intelligent beings are out there but we haven't heard from them for reasons that, I stress again, are mostly terrifying.
This is called the Fermi paradox, and if you haven't heard of it before then I'm about to ruin your day. The basic version goes like this: there are countless billions of stars in the universe and a decent number of them are orbited by Earth-like planets. Even if you assume the chances of intelligent life spontaneously evolving on one of those planets are very low, and even if those planets are astonishingly far away, the age of the universe and the staggering numbers of planets involved mean there should be at least a few alien societies far more advanced than our own by now, in possession of technology that means they could pop by and say hi.
The fact that they don't may mean that we are beneath their attention, can't comprehend their existence or are exhibits in some sort of weird alien zoo. It could also mean that we are flying under the radar of one or more malevolent super-civilisations, in which case the funny sketches and cheery greetings we occasionally fling into the void are the equivalent of standing on the edge of a jungle and yelling: "Hellooo?"
Still, most of these are preferable to option two, which is that some sort of horrible event, typically referred to as the "Great Filter", happens to these civilisations with some regularity, wiping them out before they can make it over here.
This is especially worth pondering at the moment because so many of the things we are doing seem a bit, well, Filter-y. There's a decent argument to be made that no society can make it through its early stages without relying on fossil fuels. If we use them all up without shifting wholesale to a viable alternative, we won't be making it to the stars. Similarly, maybe no galactic civilisation has ever managed to crack space travel without causing a degree of climate change that wipes it out. Nuclear warheads, biological warfare, the inability to ever agree on anything that would be collectively good for us: these might all be problems that no society has ever surmounted. And then, of course, there's AI: depending on which tech-bro you listen to, the thing that is either going to unlock unimaginable advances in science or turn us all into paste.
Personally, I've hit on a solution to all this: stop caring about any of it. In galactic terms, we're here for less than a blink: it seems unlikely that some galactic super-predator is going to wipe us all out in the 80 years I'm alive, but in the event that they do, it's even less likely that we will know about it before it happens. Maybe we're completely alone in the universe apart from a few bacteria, which would be insane and amazing, but fine. Or maybe super-intelligent AI is doable, and will actually usher in a new Age of Aquarius (evidenced by the fact that no other civilisation's rogue bots have turned us into their equivalent of paperclips yet), and everything is going to be fine. Until then, Men in Black is still a banger.