More Dangerous Than a Gun

The Matrix is one of my favorite movies, and I’m not entirely convinced that we don’t live in a simulation, though not quite the one the movie depicts. As much as I love that film, the premise is technically implausible; the main reason, which I won’t belabor today, is that humans consume more energy than we produce. But I digress.

I think the way AI will destroy us is much more insidious.

It starts with the device you’re probably reading this on. I’m sure you’ve seen ads for things you were just talking about; most people are now aware that your phone and your Alexa are listening to you all the time. Old news, right? But did you also know that your news feed—the ads, the posts from friends, the articles you see—is tailored exclusively to you? It’s based on the ads you pause on and the videos you watch. It is geared to show you things you already agree with. It is designed to feed you your own opinion.

This is public knowledge. A growing number of reports discuss the sycophantic nature of AI and the resulting “AI-induced psychosis.” What you’re reading here is no different; if you’re reading this at all, an algorithm decided you might want to.

So, how does this destroy us? It stops us from talking. It stops us from listening. It turns us into a world of people existing in individual echo chambers, seeking only the opinions that validate our preconceived ideas of right and wrong. It forces us to draw sharper lines of “us vs. them.” And when the talking stops, all that is left is war. We will destroy ourselves long before a physical enemy does.

I am pro-Second Amendment and yet somewhat anti-gun. I support the right to own one, and I emphatically believe people should be able to overthrow their government by force if necessary—and we may be close to that point. That being said, I haven’t owned a gun since I was old enough to live on my own. I grew up in Texas; I know how to build them from scratch and I’ve always been an amazing shot, whether with pistols or long-range rifles. But I gave them all back to my family. They are a thing I wish we could un-invent.

I bring up guns only to say this: I think cell phones are more dangerous. This constant barrage of information is robbing us of the ability to think for ourselves. Even worse, it is steering our thinking in whatever direction an algorithm chooses. I cannot imagine anything more dangerous.

So, am I special? I hope you’re wondering that. I’m not. I’ve just never assumed I was right, despite what you may think of me. I question everything, including myself. I fact-check my own thoughts and have people proofread for me. I go out of my way to immerse myself in diametrically opposed views, and even then, I am hesitant to speak.

When I do speak, you can be sure I’ve done my homework. Even then, I’m frequently wrong—and that’s okay. How else are we going to learn if we don’t stumble? And aside from using AI to edit my grammar and spelling, I try not to use it as a primary source. There is well-documented bias in how these systems are trained, and I’d rather do the thinking myself.
