ASSC XIII finished on Monday, but it was so packed that I needed a couple of days to rest and ruminate. In contrast to the somewhat disappointing TSC in Budapest in 2007, ASSC XIII was excellent, and as Thomas Metzinger stressed a couple of times, you could really see that the field is becoming mature.
There were almost 200 posters, 36 talks, 4 plenary symposia (with 3 talks each), and 6 keynote lectures… Yet there were recurring themes – many talks focused on the very same ideas.
A recurring theme was the integrated information theory of consciousness, introduced at the start of the conference by Giulio Tononi during his presidential address. The idea of integrating information, and of actually measuring that integration, came back in a submitted talk by Christof Koch, who improved on Tononi’s measure of effective information by averaging it over all possible states of the network. Clearly, neither Tononi’s nor Koch’s measure is effectively computable for any non-trivial network, so an excellent symposium on measuring consciousness, chaired by Anil Seth on the last day of the conference, was a nice follow-up. Seth reviewed many of the measures offered so far and included his own idea of measuring the integration of information by causal density (something which reminded me of Herbert Simon’s frequency of interaction, used to mark off the boundaries of nearly decomposable systems). In short, all these accounts attempt to measure the intensity of information integration, which in turn means that they measure the degree to which the system is not simply an aggregate of individual information processors but a genuine system. In other words, the background assumption is that consciousness is made possible by highly integrated, information-processing systems, or that it emerges in such systems (is an emergent property of the information network, in Bill Wimsatt’s sense, relative to the individual information-processing elements).
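To make the general idea a bit more concrete, here is my own toy sketch in Python – not Tononi’s Φ or Koch’s averaged version, and the network and data below are entirely made up. It treats “integration” as the gap between a network’s joint behaviour and what you would expect if its parts were statistically independent:

```python
# Toy illustration of the idea behind integration measures (NOT Tononi's Phi):
# quantify how far a system's joint behaviour departs from what you'd expect
# if its parts were independent. Here "integration" is the multi-information
# I(X) = sum_i H(X_i) - H(X), estimated from sampled binary states of a
# hypothetical 3-node network.
import numpy as np
from collections import Counter

def entropy(samples):
    """Shannon entropy (in bits) of a sequence of hashable states."""
    counts = Counter(samples)
    total = sum(counts.values())
    probs = np.array([c / total for c in counts.values()])
    return -np.sum(probs * np.log2(probs))

def multi_information(states):
    """states: array of shape (n_samples, n_nodes) with discrete values."""
    joint = entropy(map(tuple, states))
    marginals = sum(entropy(states[:, i]) for i in range(states.shape[1]))
    return marginals - joint  # 0 if the nodes are independent ("mere aggregate")

# Made-up data: node 2 copies node 0 (integrated), node 1 is independent.
rng = np.random.default_rng(0)
n0 = rng.integers(0, 2, 10000)
n1 = rng.integers(0, 2, 10000)
states = np.column_stack([n0, n1, n0])
print(multi_information(states))  # ~1 bit: the system is not a mere aggregate
```

If all three nodes were sampled independently, the measure would drop to roughly zero – the “mere aggregate” case the theories want to rule out.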
Of course, if you mention emergence, you start thinking about Jaegwon Kim. In his keynote lecture, he tried to downplay emergence (in a similar way to how he did in Kirchberg) and defended qualia epiphenomenalism, consequently suggesting that there cannot be a science of consciousness, only of the things on which it supervenes. Again, I am not at all convinced by his arguments, as he makes his job too easy by attacking straw men. Kim’s lecture, entitled Armchair Reflections on Consciousness and the Science of Consciousness, was indeed quite remote from the way philosophers and scientists alike took their stances on consciousness. The only other conceptually focused keynote lecture was given by David Papineau, who reassured us that we shouldn’t worry that much about the explanatory gap (there were slight changes in his position on the antipathetic fallacy, by the way). Anyway, the other keynotes and symposia were much more experimental.
Michael Tomasello and Susan Carey focused on social cognition, shared intentionality, and theory of mind as relevant to consciousness (by the way, there was also a poster by Alison Gopnik). A talk by the William James Prize winner, Joel Pearson, on imagery influencing perception, was an ingenious demonstration of the role of imagery and top-down effects in consciousness (see his paper in Current Biology).
There were also many interesting symposia and lots of excellent posters. The poster Do Dissociations Work by Elizabeth Irvine was awarded the prize for the best poster by a special commission that included Ned Block and Michael Tye. You can find all the inspiring abstracts here. My own talk on computationalism fits, as you probably guessed, the information-integration theme.
Marcin, thanks for telling us about the conference!
Sounds fun. I remember how exciting it was to go to the first ASSC conference in Tucson long ago as an undergrad.
I’ve never been too impressed with these measures of Tononi’s. I can build a little circuit gadget that integrates information from multiple sources, but that doesn’t seem enough to make it conscious. It seems like another buzzword that philosophers probably don’t understand (because, for the most part, they don’t understand information theory), and once they learn it they will be less impressed (much as happened with artificial neural network theory). In other words, I’ll come back in five years when the hype over this stuff has died down and say “I told you so.”
What I have seen is that we end up with some quantity that is hard to interpret, which isn’t helpful for making predictions about how brains work, which doesn’t tell you whether the system is conscious, and which doesn’t really reveal much new about what consciousness is or what its function is.
On the other hand, perhaps their measures will have some independent usefulness for understanding global brain function. I doubt it, but time will tell.
Eric,
I share your sentiment. The opposite seems true of Jackendoff’s intermediate-level representation account in his wonderful “Consciousness and the Computational Mind” (1987). I recently saw a great talk by Jesse Prinz suggesting that contemporary neuroscience has largely confirmed his hypotheses.
Hmmm. That would be news to me.
Well, Tononi’s measure is definitely not the best one (Koch showed how it varies over time, for example). However, it might be applied in quite surprising ways – there was an interesting talk arguing that fish cannot be conscious because there isn’t enough integration in their neural machinery.
I think that integration is just a side-effect of complex information processing, and I doubt it would be sufficient. Yet there are reasons to think it’s necessary.
Another thing is that there are fine statistical measures that could probably show a lot more (take all the measures based on mutual information, for example – they seem quite good candidates, but I guess you need to account for the actual signal, and not just for the possible signals in the circuit, to measure integration realistically).
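To give a rough idea of what I mean – just a toy Python sketch with made-up data, not any of the measures presented at the conference – you would estimate the mutual information from the distribution of states the circuit actually visits, rather than from the set of states it could in principle occupy:

```python
# Sketch of the point above: estimate mutual information from the signals the
# circuit actually produces (an empirical joint distribution), not from all the
# states it could in principle occupy. Toy binary series, made up for illustration.
import numpy as np

def mutual_information(x, y, bins=2):
    """Plug-in MI estimate (in bits) from two observed discrete/binned signals."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(1)
driver = rng.integers(0, 2, 5000)
follower = np.where(rng.random(5000) < 0.9, driver, 1 - driver)  # noisy copy
print(mutual_information(driver, follower))  # well below 1 bit, reflecting the noise
```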