Prelude
Before reading this post, read the post on three layers of criticism.
A Lot of Nonsense on AI Out There
AI is the hot topic of the day (perhaps for good reasons). Like with everything else, most of the people talking about it don’t have much of a clue. I hear statements such as: generative AI is as intelligent as an X-year-old human, or the new model built by company X has a higher IQ than the average human, or AI is soon going to be more intelligent than humans. While one cannot deny the impressive progress in the field of AI, especially the large language models and gen AI that have created so much excitement in the past couple of years, IQ and human intelligence should be defined more precisely. If the claim is that gen AI can do a certain task better than humans, that is not the same as having a higher IQ. It’s just yet another task that computers can do better than humans, the same way computers beat humans at chess over two decades ago, or beat humans at certain vision-related tasks once neural networks advanced.
If the claim is that AI is able to destroy humanity, atomic and hydrogen bombs are far more effective and capable in this regard. If the claim is that gen AI is sentient like the computer in “2001: A Space Odyssey”, there is not much to say, since the claim is not serious enough to deserve serious attention. If the claim is that we are fast approaching sentient or superior artificial intelligence, then you don’t know much about AI and how it works. Again, this is not about the feasibility of sentient AI or even AGI; it’s about how close we are to building a system that outsmarts us as a new species rather than as a technology.
The litmus test of AI
Fine, so you think AI has already surpassed human intelligence or is about to do so very soon. Let’s make a bet and let time be the judge. When you compare the most brilliant individuals who ever lived with average human beings, only a slight increase in intelligence separates someone who comes up with special or general relativity from every other individual who has ever lived on this planet. Now, you are claiming that AI is going to surpass human intelligence in tangible if not significant ways. Fine; then within the next five years, AI should solve the puzzle of quantum gravity. If you think AI is not capable of doing this, yet you believe it has surpassed or is about to surpass human intelligence, take a break and choose a different topic to talk about.
But can AI surpass human intelligence?
Hopefully, yes. I’m not sure whether it will even be the same kind of phenomenon as what we call AI these days. It might be a fundamentally different technology, but hopefully there will come a day when we have a cleaner and more abstract form of intelligence, one that doesn’t need to generate novel thoughts by shoving pasta and veggie burgers down its throat.