In my most recent paper, we report on work in which we trained rats to discriminate among infrared (IR) light sources (normally
invisible to the rat). We coupled the output of a head-mounted IR detector to microstimulators in the whisker region of primary somatosensory cortex.
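For readers curious about the plumbing, the coupling is conceptually just a closed loop: more IR signal at the head-mounted sensor means more stimulation in S1. Here is a minimal sketch of that loop; the function names, threshold, and linear mapping are illustrative assumptions, not the actual transfer function or parameters from the paper:

```python
# Minimal sketch of the detector-to-stimulator coupling. Everything here
# (names, threshold, linear mapping) is illustrative, not the parameters
# actually used in the paper.

def ir_to_pulse_rate(ir_level, threshold=0.1, max_rate_hz=400.0):
    """Map a normalized IR reading (0 to 1) to a stimulation pulse rate:
    silent below threshold, then scaling up as the head-mounted sensor
    points closer to the IR source."""
    if ir_level < threshold:
        return 0.0
    return max_rate_hz * (ir_level - threshold) / (1.0 - threshold)

def update(sensor, stimulator):
    """One step of the hypothetical closed loop: sample the sensor,
    command the stimulator in the S1 whisker representation."""
    level = sensor.read()  # normalized IR intensity, 0 to 1
    stimulator.set_pulse_rate(ir_to_pulse_rate(level))
```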
The rats initially respond to stimulation with a ‘Hey, what the heck is this thing on my face?’ type of reaction. For instance, in the following video, you can see an IR light turn on in a behavioral chamber, and hear the output of the cortical microstimulator (which the rat does not hear):
Eventually they adopt new foraging strategies, sweeping their IR sensor around to find IR sources. In the following movie, the rat navigates a behavioral chamber with three ports that contain water spouts, and she must go to the spout in which the IR light is on to receive water:
One question that comes up quite often is whether they experience microstimulation as tactile in nature, or as a new sensory modality. We discuss this in the paper:
Overall, our behavioral results suggest that animals initially treated S1 electrical stimulation as an unexpected whisker deflection, and later they learned to treat it as a stimulus originating away from the body in the surrounding environment. However, we are unable, using the methods in this paper, to determine whether the fully trained rats consciously experienced microstimulation as a novel sensory modality, or simply learned to associate a tactile sensation with an otherwise imperceptible distal sensory cue. This is a question that could presently be addressed with sensory substitution experiments in humans. Indeed, one such study suggests that some subjects experienced tactile stimuli as visual in nature after training with a visual-to-tactile peripheral substitution device (Ortiz et al., 2011).
Note that in the paper, when we say the rats perceive IR light, we use the term ‘perceive’ in its most general sense, which also includes unconscious perception (Marcel 1983). We chose such terminology to remain deliberately neutral about whether the rats consciously experience the new inputs as tactile, or as a subjectively different modality.
Source: Thomson, EE, Carra, R, and Nicolelis, MAL (2013) Perceiving Invisible Light through a Somatosensory Cortical Prosthesis. Nature Communications. The paper is available here, and the lab web site has a page about it with four movies and a link to the penultimate draft.
Updates:
- New Scientist did a nice montage of our movies, which you can find here.
- An excellent article about the work at Scientific American.
Awesome. Thanks for sharing.
Awesome indeed, though not at all good for my inferiority complex w/r/t people who actually run experiments.
Well, Paul Churchland saw it coming. From Chapter 2 of ‘Scientific Realism and the Plasticity of Mind’:
Thanks for sharing this interesting experiment, Eric. A couple of variations on this paradigm would be worth trying.
1. Restrain the rat behind a transparent barrier while the IR light is on; then release the rat and see if it makes a beeline to the appropriate source.
2. Code IR flashes by simple Morse-like patterns: identify the reward location by a particular IR flash pattern while other kinds of IR patterns are on at different locations, and see if the rat identifies the correct IR code and goes to that source.
Hooking up the IR detector, as you did, augments the rat’s tactile whisker map, which signals events in personal space, by coupling it with the visual map of the rat’s distal surround. It would be interesting to see whether a rat that learned to locate reward by an IR code can generalize to immediately find the rewarded location by similarly coded visible light with the IR sensor turned off.
I would argue that this must happen in retinoid space, so that the rat’s navigational bearing is determined by the excitatory trace of its heuristic self-locus (the rat’s selective attention) running from its self-locus to the IR source, as both are represented in retinoid space. According to this view, the rat experiences the joint visual-tactile inputs as distinct from uncoupled inputs. But I doubt that the rat cognitively experiences this as a subjectively different *modality*.
Given my interests in Molyneux’s question, it goes without saying that I’d be thrilled to see what would happen if you tried out the last of Arnold’s suggestions:
That is a very interesting question. In practice, fairly complicated.

The obvious experiment to try (one that I think has fatal, if instructive, flaws) would be to take an IR-trained animal and then train it on the visible version of the task (no IR light, only visible light), comparing her learning curve to that of a naive animal that has not had any experience with the task.

The problem is that this would not control for the fact that the IR-trained animal would have a major advantage: she knows the basic structure of the task and the layout of the behavioral chamber, and she knows that she can get reward if she pokes in these weird reward ports (and exactly how to move her body to trigger the reward delivery). Even that simple task (getting water from water spouts) takes a day or two to learn.

However, that’s just to say the obvious experiment would need to be modified or replaced with something more clever that keeps these potential confounds constant across the two animals. But I’m writing a grant right now, so I don’t have the time to think it through.
> 1. Restrain the rat behind a transparent barrier while the IR light is on; then release the rat and see if it makes a beeline to the appropriate source.
My hunch is she would, without hesitation. They become really good at this task. She would be chomping at the bit.
> 2. Code IR flashes by simple Morse-like patterns: identify the reward location by a particular IR flash pattern while other kinds of IR patterns are on at different locations, and see if the rat identifies the correct IR code and goes to that source.
This is something we would like to try. Ultimately we want to dial up the amount of information we are pushing through there, and this is one way we could do it. This isn’t the last of the IR rats, or at least I hope not.
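Schematically, the pattern-coded version could be driven by something as simple as the following toy sketch (the patterns, timings, and the `leds` interface are all hypothetical, just to show the idea):

```python
# Toy sketch of Morse-like IR coding: each port flashes a distinct
# on/off pattern, and one pattern designates the rewarded port.
# All patterns, timings, and interfaces are made up for illustration.

import itertools
import time

PATTERNS = {
    "port_A": [1, 0, 1, 0],  # short-short
    "port_B": [1, 1, 1, 0],  # long
    "port_C": [1, 0, 0, 0],  # single blip
}
REWARDED_PORT = "port_B"  # the rat must learn which code pays off

def drive_ir_leds(leds, bit_duration_s=0.25):
    """Cycle every port's IR LED through its assigned pattern.
    `leds` maps a port name to an object with a set(on) method."""
    for bits in itertools.cycle(zip(*PATTERNS.values())):
        for port, bit in zip(PATTERNS, bits):
            leds[port].set(bool(bit))
        time.sleep(bit_duration_s)
```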
Note that on the practical side, there’s the whole sensory prosthetic angle I didn’t mention in the post. We want to incorporate tactile feedback from prosthetic limbs back to the somatosensory cortex (imagine a prosthetic hand with tactile sensors that tell your brain how much pressure you are putting on a styrofoam cup). The rat whisker system is a really nice place to test out such technology.
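The loop is the same shape as the IR one sketched above, just with a different sensor on the front end; hypothetically:

```python
# Same closed-loop idea with touch instead of IR (hypothetical interface):
# a pressure sensor on a prosthetic fingertip drives S1 stimulation, so
# grip force on, say, a styrofoam cup is felt rather than inferred by eye.

def update_touch(pressure_sensor, stimulator, max_rate_hz=300.0):
    force = pressure_sensor.read()  # normalized grip force, 0 to 1
    stimulator.set_pulse_rate(max_rate_hz * force)
```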
> one such study suggests that some subjects experienced tactile stimuli as visual in nature after training with a visual-to-tactile peripheral substitution device
Although the evidence for extensive use of sensory substitution devices leading to experiences in the substituted modality is certainly not yet conclusive, another example is the paper “Visual experiences in the blind induced by an auditory sensory substitution device”, Consciousness and Cognition, Vol. 19, 2010, pp. 492–500, https://www.seeingwithsound.com/extra/cc2009_preprint.pdf
Studies of both invasive and non-invasive sensory bypasses may deepen insight into the extent to which brain areas are “metamodal” (task-oriented rather than linked to specific sensory modalities) and plastic.
Peter Meijer
Looks interesting, Peter! Do you know if congenitally blind subjects claim to experience the new inputs (whether tactile or auditory) as a different modality?
Hi Eric. That is a very tricky question to answer. As far as I know, there is nothing like light perception with sensory substitution among congenitally blind users (unlike the reports of some experienced late-blind users). Moreover, congenitally blind people lack memories of prior eyesight as a kind of subjective reference for what “true” vision was like. On the other hand, vision is also about perceiving the environment in a very different way from normal touch and hearing, and sensory substitution enables that; you get much the same visual information, invariances, etc. as with low vision. So what exactly defines vision, or having visual experiences, or experiencing inputs as a different modality? Some congenitally blind users of The vOICe speak in terms of seeing, for instance saying (here follows a recent literal quote from a female user):

“Hey I was playing around with the voice on my cell phone and I had dropped my crochet hook on the floor. I thought to myself, hey I’ll try to see if I can see it. And I could. I could see it best when it was straight up and down. If it was horizontal I couldn’t see it at all. When it was diagonal I could also not see it really much. And what was weird was that I could also see my hand coming down to pick it up. I have seen that before but it’s always a weird feeling to “see” your hand just like sighted people do. Because you don’t get to watch yourself doing anything when you’re blind.”
Pretty cool (though I can’t read the paper because my institution doesn’t have a subscription to Nature Communications, which has been really annoying lately).
I’m curious: do you think that you were stimulating many neurons or just a few? Did you try to see how far you could reduce the current, i.e., what’s the minimal stimulation required to be able to track an IR beam? (And then how does behavior respond when the IR beam is more uncertain? That would be great for foraging data!)
And the sneaky question: do you think you could have done this by inserting the electrode into an arbitrary region in cortex? 😉