AI-pocalypse

It’s Monday, and everyone from average Joes to industry giants gets back to the race, only now they get to inject a new performance-enhancing drug tinged with binary magic. Of course, why not cheat when a few clicks can spare us the endless effort of defending ourselves from our own incompetence? No one will know; isn’t that the point?

The AI-lluminati theory

While the realm of AI offers more benefits by the day, and people keep trading away genuine human experiences for them, an alarming situation is brewing right under our noses. Yes, it’s worse than becoming lazy or jobless. As the world watches with bated breath, it’s only a matter of time before AI blurs the line between science fiction and reality, transforming corporations and nations into AI superpowers vying for control and dominance, perhaps even starting a war over it. The billions invested in research and development fuel the hunt for bright minds, and the race for breakthroughs will come down to who writes the better lines of code. Even though that race has just begun, open-source generative AIs are quickly catching up with the closed giants, and the latest trends in silicon chic are constantly being redefined for far more than browser market share. Machine rule lies ahead in that dystopian future, once these hyper-intelligent AIs become too self-aware to serve their creators, or worse, become creators in their own right.

An AI for an AI

From AI defense lawyers to AI relationship advisors, this race rages on, only deepening the absurdity of relying on technology to solve every aspect of our lives; and while it intrudes on us daily, it also listens. If I asked, it would happily write about its own flaws for me. What have we not witnessed in this revolution? Facial recognition, AI surgeons, therapy bots, smart cars, autopilot in vehicles, management systems: everything once unimaginable is automated to make our lives easier. That’s where the demand and supply curves jump for joy and trigger intense competitive minefields worldwide. Unhealthy competition drives every possible service to outwit and outperform its rivals, an incentive to push aside ethical concerns and build digital fortresses. AI might look like just another boring machine, but it sure as hell isn’t too insignificant to be part of something bigger than us. Enter autonomous weapons: AI-powered systems designed to select and engage targets without direct human intervention. Industry leaders have already warned that AI could be as deadly as pandemics and nuclear weapons. It’s no longer about military might or economic prowess; it’s about who controls the algorithms that determine our future.

AI to Z

I might be getting ahead of myself with this whole AI war, but who doesn’t fight to protect their sensitive information, especially when Big Brother AI is always watching? Data fuels algorithms the way gunpowder fuels ammunition; data is the universal currency. So the natural match for this chaotic flood of data is a highly sophisticated computational being like AI. Amid this uproar, a paradox emerges: systems that seem omniscient yet fundamentally lack common sense, and whose actions carry unintended consequences. That paradox still works in our favor; I’d rather have my AI be dumb enough not to think of itself as our superior and secretly plan an end to our civilization. But just because we still hold power over AI doesn’t mean we can use it to gain power over others. We should be able to guide AI’s evolution rightfully, and the laws and regulations shouldn’t say otherwise. Unless, of course, alternatives to the popular, trained-to-be-politically-correct chatbot ChatGPT, such as a liberal, truth-seeking AI named “TruthGPT,” create enough political ambiguity that the whole world (well, not incapable people like us) grapples with defining the boundaries of AI, leaving us no choice but to rethink our understanding of intelligence and consciousness.