They already did. AGI - artificial general intelligence.
The thing is, AGI and AI are different things. As for your “LLMs aren’t real AI” claim: large language models are a type of machine learning model, and machine learning is a field of study within artificial intelligence.
LLMs are AI. Search engines are AI. Recommendation algorithms are AI. Siri, Alexa, self-driving cars, Midjourney, ElevenLabs, every single video game with computer players: they are all AI, because the term “artificial intelligence” by itself is extremely loose and covers the kinds of narrow AI all of those are.
Which then get hit by the AI effect and become “just another thing computers can do now”, and therefore “not AI”.
That just compares it to human-level intelligence, something we cannot currently even quantify, let alone understand. It’s ultimately a comparison, a simile, not a scientific definition.
Search engines have always been databases with interfaces programmed by humans, not AI. They’ve never suddenly gained new functionality inexplicably; if there’s a new feature, someone programmed it.
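To make that concrete, here’s a toy sketch (my own illustration with made-up documents, not any real engine’s code) of what a keyword search engine boils down to: an index built by explicit code and a lookup over it.

```python
# Toy "search engine": an inverted index plus a lookup, all written by a person.
documents = {
    1: "how to train a neural network",
    2: "rocks and glue are not food",
    3: "neural networks and glue recipes",
}

# Build the inverted index: word -> set of document ids containing it.
index = {}
for doc_id, text in documents.items():
    for word in text.split():
        index.setdefault(word, set()).add(doc_id)

def search(query):
    """Return ids of documents containing every query word."""
    results = None
    for word in query.lower().split():
        hits = index.get(word, set())
        results = hits if results is None else results & hits
    return sorted(results or [])

print(search("neural glue"))  # [3] -- every behaviour here was explicitly programmed
```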
Search engines are, however, becoming LLMs and are getting worse for it, unless you think eating rocks and glue is particularly intelligent. There is no comprehension there; the model is simply trying to make its output match patterns it recognizes. That is a precursor step, but it is not “intelligence”, unless a program doing what it’s programmed to do counts as artificial intelligence, which is such a meaningless measure that it would make Notepad artificial intelligence, Windows artificial intelligence, Linux artificial intelligence.
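As a rough illustration of what “matching patterns it recognizes” means mechanically, here’s a toy bigram generator (my own sketch with a made-up training string, nowhere near an actual LLM) that produces plausible-looking word sequences purely from co-occurrence counts, with no comprehension involved.

```python
import random

# Count which word follows which in the training text: pattern statistics, nothing more.
training_text = "the cat sat on the mat the cat ate the fish"
words = training_text.split()

follows = {}
for prev, nxt in zip(words, words[1:]):
    follows.setdefault(prev, []).append(nxt)

def generate(start, length=6):
    """Emit words that statistically tend to follow the previous word."""
    out = [start]
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:
            break
        out.append(random.choice(candidates))
    return " ".join(out)

print(generate("the"))  # fluent-looking output produced by pattern matching alone
```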
You can argue for what you think the words should mean in the field of artificial intelligence, and I agree with some of your points.
It just doesn’t change them.
You can’t just throw out random Wikipedia links. For example, the article on AGI explicitly says we don’t have a definition of what human-level cognition actually is, which is exactly what the person you were replying to was saying. You’re making a fallacious appeal to authority, except that the authority doesn’t agree with you.
That’s a disturbing handwave: “We don’t really know what intelligence is, so anything we call intelligence is fair game.”
A thermometer tells me what temperature it is. It senses the ambient heat energy and responds with a numeric indicator. Is that intelligence?
My microwave stops when it notices steam from my popcorn bag. Is that intelligence?
If I open an encyclopedia book to a page about computers, it tells me a bunch of information about computers. Is that intelligence?
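To put those three examples in code: a rough sketch (with invented sensor values, thresholds, and entries) of what each device is actually doing, namely a fixed, human-written rule applied to a reading, or a plain lookup.

```python
# Each "smart" behaviour reduces to a sensor reading fed through a hard-coded rule.

def thermometer(sensor_millivolts: float) -> float:
    """Convert a sensor reading to degrees with a fixed linear formula."""
    return sensor_millivolts * 0.1  # calibration constant chosen by a person

def microwave_should_stop(humidity_percent: float) -> bool:
    """Stop heating once the steam sensor crosses a hard-coded threshold."""
    return humidity_percent > 70.0

encyclopedia = {"computer": "A machine that executes stored instructions."}

print(thermometer(215.0))           # 21.5 degrees
print(microwave_should_stop(82.0))  # True
print(encyclopedia["computer"])     # a lookup, not comprehension
```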
If AI helps us realize that a thermometer fits the definition of intelligence when it shouldn’t, then it’s entirely valid to refine the definition.