Functionalism 2

An objection to identity theory: might silicon badgers feel pain?

The introduction of the concept of software running on the brain allows physicalists to improve on Identity Theory in one key way.

It gives something new, and perhaps more plausible, to identify phenomenal experience with - namely the 'calculation' that is being carried out by the programme at the relevant time. This gives us our three levels of description - neurophysiology, calculation being carried out, what it 'feels like'.

And this new equation - between what it feels like and calculation being performed - solves a problem that Identity Theory faces.

Notice that there are in a way two questions that look like one when you ask 'What is pain?'

If I currently have a toothache, there is the question of what that particular pain that I am having is. The Identity Theory answers (let us say) it is the firing of such and such a bunch of c-fibres in the nervous system. (We will go on talking about c-fibres even though neurophysiologists no longer think of pain as associated with a particular sort of fibre in this way.)

But I might be interested also in another question, namely what is pain in general? To this the Identity Theory, it seems, can only answer: whenever such and such a bunch of c-fibres goes off, that is pain. But this links pain indissolubly to the firing of c-fibres. If you come across an animal or life-form that doesn't have any c-fibres, the Identity Theory implies that it couldn't in principle ever experience pain.

Badger news

Suppose you found a sort of animal on another planet, which seemed to behave very like a badger, but which turned out to be made of silicon or something, with not a c-fibre to its name. If pain simply was the firing of c-fibres, we could never entertain the idea that this new creature might be capable of feeling pain. Or think not of a Martian badger-type creature made of silicon, but of one of our own dolphins. A very sophisticated animal we might think, and then discover that its nervous system was not at all like ours; and in particular that it didn't have any c-fibres. Again, if it were true that pain is the firing of c-fibres, we wouldn't be able to think of the dolphin as capable of feeling pain.

BUZZ: do you think that a creature with a very different nervous system from our own might nevertheless feel pain? Does this prove physicalism wrong?

Some at any rate find the Identity Theory position on this point implausible.

But Machine Functionalism allows them to avoid this problem. Machine Functionalism identifies the feeling of pain not with the firing of c-fibres but with the calculation that, in the case of my neurophysiology, is carried out by the firing of a particular bunch of c-fibres. If in another body that same calculation occurs but is carried out by different hardware, pain will be experienced. Machine functionalism says the calculating is the pain; and if the calculating is there, so is the pain, even if some other structure is carrying it out.
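The point can be sketched in code. In this toy example (all names and the threshold are my own hypothetical inventions, not anything from the philosophical literature), two 'realizers' with quite different internals carry out the same calculation, and the functionalist test looks only at the input-output mapping, not at the hardware:

```python
class CFibreSystem:
    """Carbon 'hardware': fires when the summed damage signals pass a threshold."""
    def compute(self, signals):
        return sum(signals) > 5


class SiliconBadgerChip:
    """Silicon 'hardware': different mechanism, same function computed."""
    def compute(self, signals):
        total = 0
        for s in signals:          # accumulate one signal at a time
            total += s
        return not (total <= 5)    # logically equivalent to total > 5


def same_calculation(m1, m2, inputs):
    # Machine functionalism cares only about the function computed:
    # if the input-output mapping agrees, the 'calculation' is the same,
    # whatever structure carries it out.
    return all(m1.compute(i) == m2.compute(i) for i in inputs)
```

On this picture, `same_calculation(CFibreSystem(), SiliconBadgerChip(), ...)` coming out true is all that matters for the two systems being in the same mental state - a sketch of the claim, not an endorsement of it.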

Type / token distinction

To put what I am saying here into gear with the literature, there is a piece of jargon I had better explain: I mean the 'type/token' distinction.

TOKENS

Suppose I am told that the pie costs £2.

I have £2 in my pocket.

You have £2 in your pocket.

Which £2 does it cost?

Well - either. When you say a thing costs £2 you leave it open whose or which £2!

A piece of jargon is used in philosophical contexts when it matters whether you mean the particular £2 in my pocket or, as it were, the £2 in general that the thing costs.

I have a token £2 in my pocket. You have another token £2 in your pocket. The general £2 in the assertion that the pie costs £2 doesn't refer to a token at all. It is saying something more general. Use 'type' for the general thing as opposed to the token.

Similarly, take the belief that Fodor is a difficult writer. Some people say if you have that belief, and I have it also, then you have a token of that belief in your mind or head, and I have another token of the same belief.
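Programmers may recognize a loose analogue of the type/token distinction in the relation between a class and its instances (the analogy is mine, not the text's): two people holding the same belief is like two distinct instances of one class with the same content.

```python
class Belief:
    """A belief; each instance is one token of a belief."""
    def __init__(self, content):
        self.content = content


# Two people believe that Fodor is a difficult writer:
yours = Belief("Fodor is a difficult writer")
mine = Belief("Fodor is a difficult writer")

print(yours is mine)                  # False - two distinct tokens
print(yours.content == mine.content)  # True - tokens of the same type
```

The analogy is only loose - a type is not literally a class - but it captures the key contrast: distinct particulars (tokens) falling under one general kind (type).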

 

BUZZ

Write down a sentence in which a token intention is referred to. And another in which an intention but not a token intention is referred to.

Suggestion

 

We can use the type/token distinction to express the claims of the identity theory and the objections it is open to.

The identity theorist says: my twinge of toothache is to be identified with the firing of a particular bunch of c-fibres in my nervous system. S/he is talking here about the token-pain - the one I have at a particular time.

But what about type-pain? What is the Identity Theory's claim on that?

The answer that suggests itself is: type-pain is the type-firing of type-c-fibres.

The machine functionalist has a different account.

If the brain is a computer running a program, at any particular moment - say the moment at which a pain is felt - there will be neurophysiological activity which is carrying out some part of the program, some calculation or subcalculation. Machine functionalists argue that it is this that is to be identified with the pain - the function (calculation) that is being carried out, not the neurophysiological sequence considered as a neurophysiological sequence in itself.

So using the type/token distinction we can say that the machine functionalist account of pain is this:

Token physical states are token pains.

Type-pain is the calculation performed by those physical states.

I've explained this in terms of pain, but the machine functionalist approach generalizes the point.

So much for the type/token distinction, which I draw attention to here because it is used in the literature in connection with this problem, but also with others.


Different hardware, same calculation

There is another element of the Machine Functionalist case that I want to draw out explicitly.

This is the notion that where a program requires a particular calculation to be carried out, it is quite possible for that calculation to be carried out by any of a variety of different machinery.

We can get a given calculation done by an electronic calculator, or by an old-fashioned mechanical adding machine, or with paper and pencil, or by putting people in rows and giving them clear and definite instructions.
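The same point can be made within a single program (a sketch of my own, not from the text): here addition is carried out once by the machine's built-in arithmetic and once by a simulated 'mechanical' ripple-carry procedure that only shuffles bits. Different machinery; the same calculation.

```python
def adder_electronic(a, b):
    # 'Electronic calculator' hardware: the machine's built-in addition.
    return a + b


def adder_mechanical(a, b):
    # Simulated 'mechanical' hardware: ripple-carry addition using only
    # bitwise operations, propagating the carry one step at a time.
    # (Assumes non-negative integers.)
    while b:
        carry = a & b      # positions where both have a 1 generate a carry
        a = a ^ b          # add without carrying
        b = carry << 1     # shift the carry into the next column
    return a


print(adder_electronic(17, 25))  # 42
print(adder_mechanical(17, 25))  # 42
```

Two quite different mechanisms, one function: anything that cares only about the calculation performed cannot tell them apart.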

Prompt

Just to settle this idea in, can you think of another example where you have a variety of different hardware, each variety doing the same thing?

Suggestion

 


Reprise: the three levels of description

The type/token distinction also helps me make the point about there being three levels of description in slightly different terms.

Think of the network of c-fibres in me and the token-state they are in at a particular time.

I can offer a neurophysiological description of that state. (Arrangement of neurons, their size, their activity etc.)

But if the machine functionalist perspective on the brain is right, I can say a second thing about that system of c-fibres in me: it is realizing a function - performing a particular calculation specified in the program.

And thirdly, there is the pain as an experience. The machine functionalist says we can understand this as the calculation that is being carried out via the c-fibre activation. On any particular occasion, for us human beings, this is an identity between the experience and the activity of c-fibres in our nervous system. But if we are interested in type-pain - in pain as a phenomenon - we are to think of it as identical not with c-fibre activity, either type or token, but with the calculation the brain program demands when, in a token human case, it demands the activity of token c-fibres.

So type-pain - pain as a phenomenon - is to be identified with the type calculation set out in the program.

One and the same c-fibre activation therefore has three descriptions.

So three levels of description of one physical system:

a neurophysiological description of the activity itself;

a description of it as a calculation being carried out;

a description of it as an experience - what it feels like.

So for machine functionalism there is a divorce, though of a subtle kind, between mental and physical.

The physical supports the mental, but only in virtue of the calculation the physical performs.

Lycan puts it like this:

 

'And so there is after all a sense [if functionalism is right] in which 'the mental' is distinct from 'the physical': though there are no nonphysical substances or stuffs, and every mental token is itself entirely physical, mental characterization is not physical characterization, and the property of being a pain is not simply the property of being such and such a neural firing.'

Lycan, Mind and Cognition, 2nd edition, p. 6.


END

