Take Chang and Perrig on emergent algorithms regarding cluster formation — Akanthinos
Your quote from the developer of a particular software application claiming his program shows "emergence" is an example of what I'm talking about. A word with a vague and tenuous meaning, but that sounds impressive if you don't think about it too much.
Your example illustrates and supports my point.
But never mind that. I've given all this some thought and I retract my former statement that no AI shows emergence.
Rather, I say that pretty much everything shows emergence, from simple programs to the world around us. By itself, emergence is a vaguely defined term that conveys little or no useful meaning.
Wikipedia defines emergence as
In philosophy, systems theory, science, and art, emergence is a phenomenon whereby larger entities arise through interactions among smaller or simpler entities such that the larger entities exhibit properties the smaller/simpler entities do not exhibit.
https://en.wikipedia.org/wiki/Emergence
There's a more detailed and nuanced definition or set of definitions in the main part of the article, but this will do for our purposes.
Well, what's emergent then? A glass of common tap water is emergent. Hydrogen's a gas, oxygen's a gas, put them together and you get water. Water is wet. Hydrogen and oxygen "self-organize," a phrase used in the article, into something with properties neither ingredient has.
That's emergence.
Speaking of ingredients, how about some nice light fluffy scrambled eggs? Eggs by themselves are not fluffy. The fluffiness comes out in the presence of a master chef. Is the chef fluffy? No. The chef is not fluffy. The egg is not fluffy. But the chef-egg system produces scrambled eggs that are fluffy. Emergence.
Let's take computer programs. Do AIs do things that you couldn't predict from just looking at the code? Sure. But that's a commonplace in software. Pretty much every ancient legacy mainframe program from the 1960s (and a lot of that code is still running in the back rooms of the banks, the insurance companies, and the military) is completely inscrutable to its current programmers. They do their best to fix problems and not break things. Nobody understands these old legacy systems.
Yet nobody thinks these old systems are conscious. They just consist of a lot of lines of code that nobody understands, and that produce outputs that their programmers did not expect and can't entirely explain.
Microsoft Windows is over 50 million lines of code. Windows does plenty of things that are a complete surprise to the maintenance programmers. The original designers of that system are long retired with their stock options.
So inscrutability of the output is not just something AI exhibits. Virtually every nontrivial software system soon becomes too complicated for any one individual to fully understand it.
But when it comes to AI, we're supposed to think that being surprised by the output is a big deal. Or that a Go program producing a "clever" move is any more meaningful than the first chess program I ever saw. It came on a floppy disk. It could beat me. I never said, "Wow, it's emergent." That's just a buzzphrase. We program it to compute something and it computes something. Computers are good at that, and humans have become absolutely brilliant at programming computers. But they're just computer programs.
So what does emergence really mean, in the world or in the context of AI? Sometimes people say that self-awareness is an emergent quality of the brain. Maybe it's true. I don't think that's a very well-defined notion.
But what would it mean for an AI? To say that an AI has emergent behavior means nothing. All it means is that the program has outputs that surprise the programmers. That it makes "clever" moves. We programmed it to do that, why shouldn't it find moves a human wouldn't see? Computers are just great at that kind of stuff. Anything that has rules.
It most definitely does not mean that AIs have any kind of self-awareness or some kind of elevated consciousness. It means that we've gotten really good at programming them to play Go and recognize faces and mine and organize data. We've been organizing data for a long time, since the ancient IBM databases of the 1960s. Neural nets are a really clever way of organizing data and assigning probabilities for what logic branches the program should take. But it's just a program. Code executing against data. In principle, nothing we haven't been doing for sixty years.
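To make that concrete, here's a deliberately toy sketch (names and weights are my own hypothetical example, not any real system) of what "assigning probabilities for what logic branches the program should take" amounts to at bottom: stored numbers, ordinary arithmetic, and an if-statement.

```python
import math

# Hypothetical toy "neural net": the learned parameters are just
# numbers stored in the program. These values are made up for
# illustration, not taken from any real model.
WEIGHTS = [0.8, -0.3]
BIAS = 0.1

def sigmoid(x):
    # Squash a number into the range (0, 1) so it reads as a probability.
    return 1.0 / (1.0 + math.exp(-x))

def predict(features):
    # Weighted sum of the input data, then the squashing function.
    total = BIAS + sum(w * f for w, f in zip(WEIGHTS, features))
    return sigmoid(total)

def decide(features, threshold=0.5):
    # The "probability" just determines which branch the code takes.
    return "accept" if predict(features) >= threshold else "reject"

print(decide([1.0, 0.5]))   # plain arithmetic, plain branching
print(decide([-1.0, 1.0]))
```

Real networks have millions of weights instead of two, but the point stands: it's code executing against data, nothing more mysterious than that.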
And remember, all of these AI programs run on conventional hardware. They are no different in principle from a Turing machine implementing the Euclidean algorithm. Just a set of instructions.
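And the Euclidean algorithm itself really is just a few instructions. A minimal sketch in Python (the standard remainder formulation, not a literal Turing machine tape):

```python
def gcd(a, b):
    # Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
    # until the remainder is zero; the survivor is the greatest
    # common divisor. A loop over two integers, nothing else.
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # → 6
```

Everything an AI does on conventional hardware reduces, in principle, to steps of this same mechanical character.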
So this mysticism and gee-whiz-ism around AI is what I'm objecting to.
And the word emergence is a symptom of that.
So can you tell me, when you say an AI shows emergence, what does that really mean at the end of the day? Specifically? Start here:
Take Chang and Perrig on emergent algorithms regarding cluster formation — Akanthinos
Beyond sounding cool, what does that really say? What does it really mean? That water is wet but hydrogen and oxygen aren't? You have to do better than that if you intend to say something meaningful about machine intelligence.
ps -- SEP says that "Emergence is a notorious philosophical term of art. A variety of theorists have appropriated it for their purposes [my emphasis] ever since George Henry Lewes gave it a philosophical sense in his 1875 Problems of Life and Mind," and that "Each of the quoted terms is slippery in its own right ..."
https://plato.stanford.edu/entries/properties-emergent/
So I'm not the first person to question the claim that because something shows "emergence," I should therefore buy whatever the speaker is selling.