A friend asked me what I meant by "even pigeons are capable of abstract thought".
The experiment goes something like this:
There are two levers, and the pigeon does not know which one will give food. Above each lever a circle is drawn.
To get the food, the pigeon has to pick the lever with the bigger circle.
Once the pigeons have grown used to this configuration, it is changed, the previously bigger circle now being the smaller of the two. The pigeons automatically chose the bigger circle, not the circle they were used to.
That means they are choosing the relatively bigger circle, not a circle of any concrete shape or size. That is exactly what is generally meant by abstract thinking.
You must realize that Bigger Than (>) is a so-called primitive: you cannot explain it, only use it. When saying or seeing that a>b, you cannot give any reason why this is so without using (>), the very thing you set out to explain.
Or you end up giving reasons which only make sense for beings endowed with the same brain you have.
Stone a is bigger than stone b because stone a has more surface. So "bigger than" would mean having more surface. But what is more surface? Try to bring this argument to its end, and you will wind up with something like:
more surface:
- my eye movements from one end to the other are different when looking at surface a than when looking at surface b.
- it takes two hands to hold stone a and only one hand to hold stone b.
- stone b fits in hole h but not stone a.
......
We only use these indirect "proofs" when the difference is really small, because otherwise we can just see/feel that two objects are different, one bigger and the other smaller.
In computers bigger means: 1>0. It is a convention which works well with our world and experiences. It is not a definition of bigger than, since there are no such definitions.
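To see how little the machine is actually doing, here is a minimal sketch (my own illustration, in Python, with numbers written as made-up bit strings) of unsigned binary comparison. The only primitive it rests on is exactly that convention: the bit 1 beats the bit 0.

```python
def bigger(a_bits, b_bits):
    """Compare two equal-length unsigned binary numbers, written as
    strings of '0' and '1'. The only "knowledge" used is the bare
    convention that 1 beats 0; nothing is ever measured."""
    for x, y in zip(a_bits, b_bits):   # scan from the most significant bit
        if x != y:
            return x == '1'            # first disagreement decides: 1 > 0
    return False                       # all bits equal: not bigger

# bigger("101", "011") tips to the first number; bigger("011", "011") does not.
```

Strip the convention away and the function is just a walk along two rows of symbols; the "biggerness" lives entirely in what we agreed 1 and 0 stand for.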
So when pigeons choose the bigger circle, they are not computing surfaces or anything of the kind:
when they look at the first two circles, they experience something we would be hard put to express in words (the very thing we were trying to express when explaining what bigger than means).
When they look at the second set of circles, they experience the same kind of thing as with the first set: the circle present in both tests triggers one set of experiences in the first case and another in the second. Why?
Let's take a simple example: I give you two stones and tell you to put the lighter of the two down; I then replace it with another one and ask you again to put the lighter down, and so on.
Whatever definition of lighter/heavier you choose, whatever neural explanation you come up with (with heavier objects more muscles are involved, more neurons, and so on), it will still be a circular definition.
Let us accept that when more neurons are involved in manipulating object a than object b, we say that a is bigger, heavier, rounder.....
It is just as good a definition as any. But then, how do we know that more neurons are involved?
That's just it: we know it only after the fact. We can say that, according to our definition, which is just another convention, more neurons must have been involved with object a than with object b (forget about brain scans for now).
More in one case might mean 500 vs. 2000, in another 2000 vs. 3000. Which means we cannot define what "more" means except through other conventions (like being higher up in the number series).
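That convention can itself be made explicit. Here is a small sketch (Python, my own illustration) that "defines" more purely through the number series: take one away from each pile until one runs out. Notice that it never uses >; it only pushes the primitive down into counting.

```python
def more(a, b):
    """Is a "more" than b? a and b are non-negative whole numbers.
    No comparison operator is used, only "is zero" and "take one away",
    i.e. each number's position in the counting series."""
    while True:
        if b == 0:
            return a != 0    # b ran out first: a is "more" iff it has anything left
        if a == 0:
            return False     # a ran out while b still had some
        a, b = a - 1, b - 1  # remove one from each pile and look again
```

So more(2000, 500) tips one way and more(500, 2000) the other, but only because we have already agreed on what counting down to zero means.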
Does the brain have a way of counting or otherwise distinguishing the number of neurons involved in each situation? If you say yes, you believe in the little man in our brains: the homunculus (google it, very interesting). This little man also needs a little man in his brain, who also needs one, and so on.
So, we have now this:
More neurons (or whatever other explanation you think is better) ==> we say a is bigger. In other words, we give a name to the sensation that more neurons are involved.
So there must be, somewhere in the brain, a "bigger than" processor that gets activated when more neurons are involved in case a than in case b. This processor would then point at object a, and that would allow us to say that object a is bigger than object b. "Processor" might be too strong a word here. It is really more of a scale: it tips to the side holding the heavier object.
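Such a scale could be sketched like this (Python, with made-up "activity" numbers standing in for whatever the neurons do). Notice that the sketch itself quietly relies on the very primitive (>) it is supposed to explain, which is the whole point of the argument.

```python
def scale(activity_a, activity_b):
    """A hypothetical inner scale: it measures nothing, it just tips
    toward whichever side receives more activity. The comparison
    inside is the unexplained primitive smuggled back in."""
    if activity_a > activity_b:
        return "a"
    if activity_b > activity_a:
        return "b"
    return "balanced"
```

Calling scale(2000, 500) tips the scale to "a"; the function cannot say why 2000 beats 500 any more than we can.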
Imagine you are living in a post-apocalyptic era (Skynet has almost obliterated humanity before committing suicide and disintegrating all machines), and you want to know which of two stones, which feel the same to you, is the heavier. You throw a rope over a branch, attach each stone to an end, and... You must make sure that both stones are AT THE SAME HEIGHT before deciding which is heavier.
This is very interesting, because it means that, before you let Mother Nature take over and decide which is heavier, you have to tell her where to begin!
Did we land in relativity land without knowing it? That is another theory I would really love to understand in its finest details, but since I do not, I will just leave it at that.
But I do not think that it makes the test any less "objective". An ancient Greek philosopher said that "Man is the measure of all things", but I think he went a little too far. After all, the starting situation (both stones at the same height according to you) is just one of the infinite possibilities that Mother Nature can handle. It is not something that falls outside of natural laws. And the first thing scientists do is play with the starting situation and observe the different behaviors.
So back to our pigeons. Do they also have a "heavier-than scale"? Apparently they do, that is, if such a thing exists at all.
But then, we must also have a "same-or-equal scale", a "longer-or-shorter scale", and so on.
When you look at computer instructions, be they digital or quantum, they always involve movement of data from memory to the registers and vice versa, and comparisons before the actual operations (add, sub, mult....). Data movements can easily be compared to human bodily movements, and the various comparisons look very much like our inner scales. The other operations are also easily identified with our own actions.
Computers have "equal", "bigger" or "smaller" for every kind of difference we humans have special words for (longer, redder, nicer, more interesting...).
Maybe that is the problem computers face when trying to emulate humans: they just do not have enough inner scales. Their architecture is much too coarse. All a computer can say is that something is bigger than, equal to, or smaller than something else. So all the human nuances must be reduced to this single dimension.
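That reduction can be sketched as follows (Python; the redness function is of course a made-up stand-in for whatever mapping would really be needed). Every human comparative must first be squeezed into a number, after which the machine's single comparison primitive does all the work.

```python
def redness(rgb):
    """Hypothetical stand-in: how "red" a color is, taken here to be
    simply the red channel of an (r, g, b) triple."""
    return rgb[0]

def compare(a, b, key):
    """Reduce any human comparative ("redder", "longer", "nicer") to
    the machine's one-dimensional primitive: map to numbers, compare."""
    ka, kb = key(a), key(b)
    if ka > kb:
        return "first"
    if ka < kb:
        return "second"
    return "equal"

# compare((200, 10, 10), (90, 80, 80), key=redness) tips to the first color.
```

Swap in a different key function and the same three-way primitive must serve for "longer", "nicer", "more interesting" — everything rides on that one coarse axis.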
We can now rephrase our goal of an intelligent computer: either we find a way of expressing human sensations and feelings with the coarse digital/quantum architecture, or we will have to give computers a personality. By that I mean not only a way of distinguishing all the differences man can distinguish, but also the emotions/feelings that go with them. We need more Marvins, and preferably not all as neurotic as he is. We cannot of course endow computers with feelings and emotions, but maybe we can fake them?