AI has been around for quite some time, and it keeps getting better and more sophisticated. It's no wonder that many people, I would even say the majority, worry that AI might surpass us to the point that it is no longer under our control but instead develops an artificial consciousness of its own. That would mean it has gained the ability to do what it wants and, scarier still, what IT thinks is best, which could encompass many actions that we, actual people, would deem horrific.
This topic has been discussed at length, so I will talk about something else related to AI: future warfare and drones. It has been almost ten years since I first discussed this with my older brother. The topic came up because we both love talking about history and wars, and this was the era when RC toy helicopters were becoming popular and widespread. One day, during a walk with our dog, we started talking about how dangerous drones could become if weaponized.
Sadly, as we've seen in Ukraine over the past three years, drones have become a large part of warfare. It is not just big, sophisticated drones like the MQ-1 Predator, but the simplest, cheapest, and most disposable drones that shape the course of battle. I've even seen drone footage showing soldiers beneath it begging the operator to spare them, as they can do nothing else except plead for mercy. I've also seen drone footage of a mortar shell being dropped on the soldiers below.
Can it get worse?
However, what the two of us talked about was far worse than what we see happening in Ukraine today. What we predicted were mass-produced, weaponized, cheap, and disposable drones that could be dropped from a cargo plane at high altitude and in large numbers. We were talking about swarm attacks with hundreds or even thousands of drones. Those drones could be only a few centimeters in diameter and would need to carry only one bullet or a small explosive charge. At the time, it sounded like science fiction horror, because such an attack would require a huge number of drone operators, and many of the drones would probably collide with each other and fall to the ground.
We realized that such attacks probably can't be done without advanced artificial intelligence, but once such AI is developed, they would become alarmingly easy to conduct. AI could easily control every drone without collisions, and even use facial recognition for targeting specific people. How could anyone escape a swarm of 1,000 drones the size of a wasp (or probably even smaller in the future), each equipped with a small explosive charge, controlled by AI using facial recognition (or some other method) to identify you, ram into your head, and detonate?
I believe that AI today has reached, or is about to reach, the point where it could conduct such an attack. This is what scares me the most, by far. Such technology could be the end of us. What bothers me is how much attention is given to the danger of AI outsmarting us, while people fail to see the risk of weaponizing AI, and how such a weapon could be more dangerous than nuclear weapons, as it would be accessible to almost anyone.