Tag: evolutionary computation

AI vs. EC

Since the 1940s, computer scientists have been trying to make machines perform the same kinds of tasks as humans. This pursuit of artificial intelligence (AI) has yielded some impressive results, such as Deep Blue beating a chess grandmaster, but it has fallen short of people’s expectations: machine translation, for instance, is still a long way off.

And so computer scientists started emulating evolution by natural selection, a process about as far removed from intelligence as possible: try everything and see what works. A process so amazingly stupid that even inert, nonliving material can perform it. This research seems to have succeeded much better than anyone expected.

Which just goes to show that artificial intelligence is no match for artificial stupidity.

Front-Loading: I Do Not Think It Means What You Think

One of the ID creationists’ favorite words is “front-loading”. From
context, I gather that it means that the output of an algorithm is
inherent in the algorithm itself. In other words, if you write a
detailed program that calculates the square root of 16, then that’s
just a long-winded way of having it print “4”. You could have saved
yourself a lot of time by just having it print “4” in the first place.
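
To make that concrete, here is a minimal sketch in Python (the Newton’s-method routine and its tolerance are my own illustration, not anyone’s actual code):

    # The long-winded way: compute sqrt(16) by Newton's method.
    def sqrt_newton(n, tolerance=1e-9):
        guess = n / 2.0
        while abs(guess * guess - n) > tolerance:
            guess = (guess + n / guess) / 2.0  # average the guess with n/guess
        return guess

    print(sqrt_newton(16))  # 4.0, eventually

    # The "front-loaded" way:
    print(4)

On the front-loading view, the first version is just the second with extra steps.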

Front-loading comes up in two arguments: 1) evolutionary algorithms do
not demonstrate that evolution works, because the solution is hidden
in the code, and 2) the fact that complex organs exist is evidence of
the unfolding of God’s, er, an unspecified intelligent designer’s
plan; the appearance of limbs and organs in the fossil record is part
of the unfolding of the designer’s plan and was front-loaded at
creation, sorry, at some unspecified point in the distant past.
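
To keep the first argument honest, here is what a bare-bones evolutionary algorithm looks like, as a Python sketch of my own using the standard OneMax toy problem (maximize the number of 1 bits; the function names and parameters are invented for illustration). Note what the program actually contains: a rule for scoring candidates and a source of random variation, not the answer itself:

    import random

    def fitness(genome):
        return sum(genome)  # score: count the 1 bits

    def mutate(genome, rate=0.05):
        # copy the genome, flipping each bit with a small probability
        return [bit ^ 1 if random.random() < rate else bit for bit in genome]

    genome = [random.randint(0, 1) for _ in range(50)]
    for generation in range(1000):
        child = mutate(genome)
        if fitness(child) >= fitness(genome):  # keep the variant if no worse
            genome = child

    print(fitness(genome))  # climbs toward 50; the all-1s genome appears nowhere above

Whether a scoring rule counts as “hiding the solution in the code” is exactly the point under dispute, but at least the dispute is then about something concrete.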


Gil Dodgen: Uncommonly Dense

Gil Dodgen posted the following over at Uncommon Descent:

All computational evolutionary algorithms artificially isolate the effects of random mutation on the underlying machinery: the CPU instruction set, operating system, and algorithmic processes responsible for the replication process.

If the blind-watchmaker thesis is correct for biological evolution, all of these artificial constraints must be eliminated. Every aspect of the simulation, both hardware and software, must be subject to random errors.

Of course, this would result in immediate disaster and the extinction of the CPU, OS, simulation program, and the programmer, who would never get funding for further realistic simulation experiments.

All I can say is “wow”. Either Dodgen is having us all on (which I doubt, since he’s started a new thread to respond to the charge that he doesn’t know WTF he’s talking about), or he honestly doesn’t understand the difference between the simulated environment and the machine doing the simulating.
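
For anyone genuinely confused on this point, here is roughly what “subjecting things to random mutation” means inside a simulation, in an illustrative Python sketch of my own (the population size and mutation rate are arbitrary):

    import random

    # The simulated organisms are ordinary data: lists of bits.
    population = [[random.randint(0, 1) for _ in range(20)] for _ in range(10)]

    def mutate(genome, rate=0.1):
        # Random errors are applied to the simulated genomes only.
        # Nothing here perturbs the CPU, the OS, or this program itself.
        return [bit ^ 1 if random.random() < rate else bit for bit in genome]

    population = [mutate(g) for g in population]

The CPU, the operating system, and the replication code play the role of the simulated world’s physics; demanding that they mutate too is a category error.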

Presumably he also believes that when NOAA simulates the effect of a hurricane hitting the Florida coast, they have to pour rain onto their computers. And that every time an orc dies in World of Warcraft, a real orc dies in some distant land.

I know that I’m often too rooted in the concrete and have trouble going from a collection of facts to a general principle, but damn!