AI Pioneer Yann LeCun Says AI Is "Dumber Than A Cat" - But Maybe That's Smart Enough
The AI researcher argues that AI is still far below human intelligence: it lacks true understanding and creativity, and merely fools us by making predictions based on statistical patterns learned from huge data sets. But isn't that enough for now?
In an article on October 11, The Wall Street Journal quoted Turing Award winner Yann LeCun as saying that AI is far from matching our intelligence and is in fact "dumber than a cat". AI makes predictions based on probabilities calculated from the mountains of data it has been trained on. Behind these probabilities, LeCun goes on to say, there is no understanding and certainly no creativity, and thus no real intelligence, just a kind of statistical trickery. Or as he puts it: “You can manipulate language and not be smart, and that’s basically what LLMs are demonstrating.”
From the perspective of a scientist like LeCun, it may be reasonable to equate intelligence with creativity and deep understanding - true intelligence is more than the ability to recognize and predict patterns. Outside the ivory towers of science, however, most of the tasks we perform in our daily lives do not require much creativity or deep understanding. In fact, many activities are routine and follow clear rules. This includes jobs that supposedly require high levels of intelligence, such as software development.
Programming is often thought of as an activity that requires a lot of creativity, yet modern programming paradigms and the languages built on them encourage programmers to write code that is as uniform as possible. Consistent, homogeneous source code is vital for software maintained by multiple programmers; if each of them were free to exercise their own creativity, the result would likely be spaghetti code that is nearly impossible to maintain and extend. It is precisely this lack of required creativity that makes AI a reliable programmer in most cases. What percentage of programmers hired worldwide actually work on complex algorithms that demand a high degree of creativity and deep mathematical understanding? Let's face it: most of the work consists of adding functionality through boilerplate code, and this is where AI can bring huge efficiency gains, as the sketch below illustrates.
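To make "boilerplate" concrete, here is a minimal sketch in Python, using hypothetical names such as Customer and CustomerRepository, of the routine create/read/update/delete code that fills much of everyday development. Nothing in it requires insight or invention; it is a well-worn pattern, and reproducing well-worn patterns is exactly what today's AI coding assistants do reliably.

```python
# A deliberately unremarkable CRUD repository: repetitive, pattern-following
# code of the kind that makes up much of day-to-day programming work.
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class Customer:
    customer_id: int
    name: str
    email: str


class CustomerRepository:
    """In-memory store with the usual create/read/update/delete methods."""

    def __init__(self) -> None:
        self._customers: Dict[int, Customer] = {}
        self._next_id: int = 1

    def create(self, name: str, email: str) -> Customer:
        # Assign the next free ID and store the new record.
        customer = Customer(self._next_id, name, email)
        self._customers[customer.customer_id] = customer
        self._next_id += 1
        return customer

    def get(self, customer_id: int) -> Optional[Customer]:
        # Return the record, or None if it does not exist.
        return self._customers.get(customer_id)

    def update(self, customer_id: int, name: Optional[str] = None,
               email: Optional[str] = None) -> Optional[Customer]:
        # Update only the fields that were actually provided.
        customer = self._customers.get(customer_id)
        if customer is None:
            return None
        if name is not None:
            customer.name = name
        if email is not None:
            customer.email = email
        return customer

    def delete(self, customer_id: int) -> bool:
        # Report whether a record was removed.
        return self._customers.pop(customer_id, None) is not None


if __name__ == "__main__":
    repo = CustomerRepository()
    alice = repo.create("Alice", "alice@example.com")
    repo.update(alice.customer_id, email="alice@newmail.example")
    print(repo.get(alice.customer_id))
```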
LeCun argues that we are a long way from true artificial general intelligence and that concerns about AI overtaking us are completely overblown (in fact, he calls them "complete B.S."). This leading AI researcher knows what he is talking about, and he is probably right that today's LLMs are nowhere near "true" intelligence. At the same time, he does not deny that the impact of AI on his employer, Meta, has been "enormous". But perhaps being "dumber than a cat" is enough to disrupt our economies?