Higher-Order Theories of Consciousness and the Phenomenology of Belief

Next week I am heading up to SUNY Fredonia to give two talks as part of the Young Philosophers Lecture Series. Here is a rehearsal of the first talk, which is my most recent attempt to show that Rosenthal’s HOT theory is committed to cognitive phenomenology.


  1. John Gregg

    First, does anyone really say that thoughts, beliefs, etc. have no qualitative properties? If that were true, how on earth would you ever know you had a thought? It is for considerations like this that I call the redness of red the gateway drug of qualia. It’s great to bring the intuition that qualia exist to people who have never considered it before, but it is the proverbial tip of the iceberg.

    Secondly, however, I still don’t see the plausibility of HOT theories. Many of the Rosenthal quotes in the video were straightforwardly circular in their definitions or explanations of consciousness. Let’s look at Jerry in pain. The HOT is the second balloon pointing at the first. OK – as naturalists, let’s talk about that arrow, the pointy part of the HOT balloon. Some information channel? Some number of bits per second? Perhaps bidirectional? It had better be something like this. If the HOT is doing all the work in terms of consciousness, then we could swap out the FOT (first-order thought) entirely with something like a playback tape that merely maintained the appropriate conversation on that communications channel. Then, barring anything spooky and mysterious, the HOT would never know the difference, and would still be conscious in every way it would have been in a case of “real” pain (for example). So then all we’ve done is push the Hard Problem up a level (lather, rinse, repeat). Why should the HOT be conscious based on the bits over that particular wire, and why isn’t that just the same as the original question of why we should be conscious given the bits on the wires from our senses?

    In general, people who try to “naturalize” consciousness try to sneak an awful lot of magic into the innocent-seeming notion of intentionality (The HOT is about the FOT, therefore consciousness. We are self-representing in some suitably integrated way, therefore consciousness. We are embodied representational systems, therefore consciousness . . .).

    -John Gregg

  2. Richard Brown

    Hi John, thanks for the comments!

    yep, plenty of philosophers really do say that thoughts lack qualitative properties…there are different ways that they would handle introspection but, for instance, on the higher-order thought view one would know that one had a conscious thought by coming to have a thought to the effect that one had a conscious thought…
    The arrow was merely meant to capture the idea that the HOT is about, or targeting, the first-order state. I happen to think that this is captured by causal connections between first- and higher-order states, but David thinks that it is captured by the intentional properties of the content of the thoughts…and yes, on David’s view, if we swap out the first-order state but preserve everything else, then what it is like for the person follows the higher-order thought’s content…but according to David we haven’t pushed the problem back, because the higher-order state is not a conscious state (if it were, there would be a further higher-order thought about it). It is not simply that the HOT is about the FOT and therefore consciousness…it is that the HOT represents oneself as being in some state, which makes one aware of the state that one is in, which means that from one’s point of view it will seem as though one is in the FOT; and that’s all we ever meant by ‘consciousness’…it is simply the way things seem from the first-person point of view; HOTs capture that and so explain consciousness. That is not circular; it is a hypothesis about the nature of consciousness. But I agree that it is controversial, and I have previously discussed the kind of objection that you seem to be raising here, if you are interested…
  3. John Gregg

    The circularity I was talking about is evident in statements like “…it is that the HOT represents oneself as being in some state, which makes one aware of the state that they are in”. I find that we can think more clearly about things if we do not use prejudicial terminology. Let’s not talk about the FOT and the HOT. Let’s talk about A and B, and the bits-on-a-wire channel between them. If the consciousness actually happens in B, regardless of how we implement A (as long as it keeps up its end of the conversation properly), then we have the good old Hard Problem: why should B be conscious just because it gets some bits over a wire? We can say the bits are “about” A or “targeting” A if you like, but such terminology does not answer the question. “The HOT represents oneself as being in some state” – what does this mean? It has a self-model? Like the publisher’s catalog that lists itself? You say you believe that the connection between A and B is based on causal connections, which means I assume you won’t object to my bits-on-a-wire characterization. Rosenthal says no, it is really “intentional properties of the content of the thoughts”? I have no idea what this means. If we are really naturalists, we must sharpen this up a lot. Bits, bytes, and billiard balls. If he believes in some extra-physical force called “intentionality”, or some being that transcends an actual data structure called the “content” of that data structure, we have left naturalism behind on the side of the road.

    “…From their point of view it will seem to them as though they are in the FOT”: if we are going to be materialists, we don’t get to say “from their point of view” anything “seems” like anything. To a computer (broadly construed as a functional/physical system of the type that materialists believe could support consciousness given the right algorithms and data structures), everything is purely dispositional. There is no point of view, no seeming. Don’t anthropomorphize computers – they don’t like it.

    Basically, what I’m getting at is this: if you can explain consciousness in terms of purely dispositional states, in terms of physical systems, great. But I am extremely wary of people sneaking the ghost into the machine by loading their arguments with terms like “about”, “content”, and “seems”.

    -John Gregg


  4. Richard Brown

    There is no ghost being slipped in here, and there certainly is no circularity. The theory employs psychological terms because it is a psychological theory…you want to talk about the neural implementation of this psychological theory, and that is fine, but we need the psychological theory…intentional properties can be perfectly natural properties; I, and many other philosophers and cognitive scientists, think so, so no one has left naturalism on the side of the road…Rosenthal has written a lot on intentionality, if you are interested…

    I don’t really get the complaint about anthropomorphizing computers. I didn’t mean to be doing that. Consciousness is a matter of how things seem to be from the first-person subjective point of view. So when we are giving a theory of consciousness we will be giving an account of how things seem from the subjective point of view. The higher-order theory is a theory of consciousness and so aims at giving an account of how things are from the first-person subjective point of view. One general argument for this view goes as follows:
    1. Thoughts of the appropriate kind result in our being conscious of (aware of) things in the environment. When I have a thought that a certain object is present, I become aware of that object.
    2. So when I have a thought that a certain mental state is present I become aware of that mental state.
    TP: Having a conscious mental state consists in being aware of myself as being in that state
    TP is the Transitivity Principle, which is supposed to be something like a common-sense platitude. From TP and 2 one gets the HOT theory. Without the higher-order thought, it is a mystery why the state is conscious.
    We can give a separate argument that supports the HOT claim that phenomenal consciousness is capturable by thoughts.
    3. Changing the conceptual content of thoughts changes the phenomenal feel of the experience. For example, if one is expecting to taste Ranch dressing but one instead actually is eating Blue Cheese, it will taste to you like you are eating disgusting Ranch dressing. When you find out that it is actually Blue Cheese, it now tastes good…so too, learning new concepts changes what one’s experience is like. For instance, if you were to naively listen to two John Coltrane songs you might think that they sounded the same. But if I then told you that one had a bass clarinet and the other didn’t, you would then notice that they sound different.
    4. If applying different concepts to one’s first-order states results in different phenomenal experiences, then applying concepts to one’s first-order states in the first place results in phenomenal experience in the first place.
    Now you may not agree with all of the steps in the argument, but the point I am trying to make is that there is no place in this argument where we have to appeal to some ‘extra-physical force’. This is the theory at a higher level of abstraction; what we do then is look into the brain and see if we find things in there that match up to the postulates of the theory. The theory commits us to things like beliefs, thoughts, desires, itches, tickles, pains, seeings of blue, aboutness, etc.; that is true, and I suppose it is also true that we may look into the brain and not find these things there, but I think we have already started matching the items from our psychological theories to candidates in the brain…we can debate that if you want, but again the point is just that it is wrong to say that since the theory employs terms from psychology it is non-naturalistic.
  5. John Gregg

    I must say I’ve never read a convincing account of how intentionality might be naturalized. Of course nothing keeps us from speaking figuratively, and saying that photons from distant stars represent those stars. But defining intentionality so broadly buys us nothing beyond physical causation.

    The ghost I think you are slipping into the machine, the anthropomorphism, comes from talking about inanimate systems as having points of view, of them being aware, or of things seeming to them like anything at all. When we describe what goes on in our minds, we naturally use psychological terms. But merely restating our own intuitions in this way is different from explaining them naturalistically. Yes, when I have a thought that I am in a certain mental state I become aware of that mental state, etc., but this doesn’t make a dent in the Hard Problem, and the explanatory gap is as unbridgeable as ever. I have no problem when you keep things in the first person (I apply concepts to my first-order states, I represent the world in a certain way), but the main problem is explaining how something other than “I” could do all that (even a brain, naturalistically described).

    Seeing red on one hand, and knowing things on the other, seem to us like different categories of mental stuff. Yet there seems to be no space for sunlight between seeing red and knowing that I am seeing red. They appear to be two sides of the same coin. My general complaint about HOT theories is that they simply restate this admittedly mysterious situation without actually dissolving it.

    -John Gregg
