SOMATOSENSORY CORTICAL ACTIVITY DURING GRASP: COMBINED
PROCESSING OF TACTILE AND MOVEMENT-RELATED SIGNALS.


Meller DM1, Naufel SN1, Helms Tillery SI1,2,3



1.   Graduate Program in Biomedical Engineering
2.   School of Biological & Health Systems Engineering, and
3.   Dept. of Psychology
      Arizona State University
      Tempe, AZ 85287-9709




CORRESPONDENCE:
    Stephen I Helms Tillery
    School of Biological & Health Systems Engineering
    ECG 334, MS 9709
    Arizona State University
    Tempe, AZ 85287-9709
    Email: Steve.HelmsTillery@asu.edu
    Phone: 480-965-0753
INTRODUCTION

Daily use of the hand reveals the striking capability of the sensory nervous system to quickly identify
objects being haptically manipulated. This skill, stereognosis, allows us to select unseen objects from
our pocket, to quickly release one of several held objects, to identify irregularities like bugs on our skin,
or to pluck a barely seen berry from the middle of a tangle of thorny branches. A complete understanding
of this capability will require a synthetic model of tactile sensation, as haptic discrimination requires the
ability to discern the fundamental characteristics of objects including shape, texture, hardness, and
temperature, as well as the spatial relationship between local regions of skin experiencing these sensations
(Lawson and Bracken 2011; Rincon-Gonzalez et al. 2011). Constructing such a model is both
conceptually and experimentally challenging. Despite careful classification of neurons from the
periphery to the cortex in terms of their primary response characteristics, it is clear, even from the
mechanical system that binds the receptors, that the various stimuli listed above can drive the activity of
multiple receptor systems (Johansson and Flanagan 2009). Thus, each receptor system responds in
complex ways to the mixture of physical parameters that occur in natural manipulations and enable
stereognosis.


The stimulus coding properties of many channels in the somatosensory system have been carefully
measured and quantified. These experiments provide good detail about the physical parameters that
drive the responses of each channel. Regarding cutaneous sensory receptors, a long experimental tradition
has analyzed the spatiotemporal response properties of sensory channels in response to stimulation with
punctate probes, oriented bars, textures, and gratings (Bensmaia et al. 2008a; Bensmaia et al. 2008b;
Sinclair and Burton 1993; Sinclair and Burton 1991). For example, recently a high density tactile display
has been used to characterize the fine details of responses from neurons in somatosensory cortex (Pei et
al. 2011; 2010). For deep systems, characterization has included correlation with changes in joint angle
or arm configuration, applied force or perturbation force, and response to sinusoidal or random force
application.


Proprioceptive parameters have also been decoded from neurons with tactile responses. For example,
neurons in SI with cutaneous receptive fields on the skin encode the static posture of the arm expressed in
terms of shoulder and elbow joint angles (Cohen et al. 1994; Kalaska et al. 1990; Prud'homme and
Kalaska 1994; Tillery et al. 1996). This is not a highly surprising finding given that skin strain manually
introduced at digital joints was sufficient to induce proprioceptive psychophysical illusions (Cordo et al.
2011; Edin 1992; Edin and Abbs 1991; Edin and Johansson 1995). In principle, it should be possible to
decode the characteristics of a variety of peripheral somatic stimuli from their associated responses in
sensory cortex. This would echo the successes of decoding volitional movement direction from the
activity of neurons in primary motor cortex (Carmena et al. 2003; Georgopoulos et al. 1983;
Georgopoulos et al. 1982; Georgopoulos et al. 1984; Georgopoulos et al. 1986; Serruya et al. 2002; Taylor
et al. 2002; 2003; Velliste et al. 2008). Decoding grasped objects from neural firing patterns represents a
crucial first step in encoding somatic sensation for neuroprosthetics. Thus, the goal of this work is to
disentangle the various elements of external stimuli that drive somatosensory neural firing during
naturalistic reach-to-grasp movements.


This goal is not trivial: cortical recording experiments have revealed a daunting complexity in the neural
response to a variety of experimental tasks. From some of the earliest recordings in somatosensory and
motor cortical areas in non-human primates, investigators have reported a substantial overlap in response
properties in these two areas. Neurons in primary motor cortex (MI) naturally encode movement, but a
portion of neurons in SI also alter their firing rates prior to movement and appear to have movement-
related activity that is similar to activity in MI neurons (Fromm and Evarts 1982; 1977; Fromm et al.
1984; Soso and Fetz 1980). At the same time, a portion of MI neurons, particularly in the rostral bank of
the central sulcus, have been shown to have tactile responsiveness (Evarts and Fromm 1977; Strick and
Preston 1982). One might argue that the motor-like responses in SI follow from the slowly-adapting type
II (SAII) responses described by Edin et al., but it might also be the case that there is an element of
efference copy in the activity of these neurons. Likewise, sensory responses in MI might participate in
cutaneous reflex loops, but might also enter into the perception of tactile events.


Our primary approach to this question has been to attribute consistent and repeatable patterns of neural
activity to specific elements of the sensory input that occurs in natural haptic manipulations (Rincon-
Gonzalez et al. 2011). We identified cortical neurons with cutaneous receptive fields on the hand and, by
comparing firing rates during movements with similar hand motion but very different tactile experiences,
were able to show that many cutaneous neurons are significantly and repeatably modulated not only by
direct contact, but also by other stimuli such as the movement of the hand itself. We recorded the
responses of 285 single units in SI with cutaneous receptive fields on the dorsal or volar surfaces of the
palm and digits in monkeys (Macaca mulatta) trained to reach for and grasp real (physical) objects
presented in the workspace, while visual-only trials (object presented only in the visual display) were
randomly inserted into the task. This allowed us to unambiguously distinguish neural activity resulting
from tactile interaction (“contact-driven”) from activity that was not driven by contact with the grasp
object (“movement-driven”). We identified neurons that encoded only contact-driven stimuli, neurons
that encoded only movement-driven stimuli, and neurons that encoded a combination of the two. We also
found that signals arising from these two modalities appeared to combine linearly in the firing of SI
neurons.
METHODS

The key element in the experiments described here is the separation of somatosensory signals arising
from self-movement from those arising from contact. To achieve this, we used an immersive virtual
environment. The experiments described here were carried out on two male rhesus macaques (Macaca
mulatta; monkey F, 11 kg; monkey I, 6.4 kg). One of the animals (F) has since had stimulation
arrays implanted into the hand area of SI mapped in these experiments and is currently involved in
somatosensory stimulation experiments. The second animal (I) suffered a clinical event and was
euthanized according to approved methods. For animal I, MR images were obtained at Barrow
Neurological Institute, and the data imported into Monkey Cicerone © (Miocinovic et al. 2007) for
surgical planning and penetration analysis. All experimental protocols were approved and monitored by
the Arizona State University Institutional Animal Care and Use Committee and conformed to the “Guide
for the Care and Use of Laboratory Animals” (National Research Council, 1996).


Behavioral Task

Experimental Setup
Fig. 1 illustrates the overall experimental setup used in this study. Two male rhesus macaques were
trained to perform a behavioral task while seated in a restraining chair with the head fixed. The left arm
was restrained throughout the task. A mirror was located 4 inches in front of the monkey at a 45-degree
angle to reflect the screen image from a 3-dimensional (3D) monitor (SeeReal Technologies) mounted
horizontally and directly above the seating area. The monitor displayed the robot enhanced virtual reality
environment (reVRE) and provided all visual cues during the task, including a virtual hold pad, hand and
grasp objects. Hand position and kinematics were captured using an active marker motion capture system
(Phasespace, Inc.). Markers were placed at the distal phalange of each finger along with a triad on the
back of the hand, and these were sampled at 100 Hz. At no time could the animal see its hand or the objects
being presented in the workspace immediately behind the mirror. Grasp objects were presented at a single
location near the body midline and at shoulder height by a 6-axis robotic arm (VS-6556-G, Denso
Robotics) fitted with a pneumatic tool changer (QC-11, ATI Industrial Automation) on the end effector. A
6 degree-of-freedom (DOF) force and torque sensor (Mini85, ATI Industrial Automation) also mounted
on the robot’s end effector sensed contact events with the grasp objects.


Task Time Line
A trial began when a subject placed its right hand on a 4-in square hold pad located at mid-abdominal
height. A small touch sensor (Touch Mini v1.2, Infusion Systems) mounted directly beneath the pad
monitored hand contact. The subjects always operated in the virtual reality environment, and were trained
to recognize cues on the screen, which included a virtual hold pad, object, and hand. The latter was visible
and tracked at all times during the experiment. The virtual hold pad appeared in the reVRE after an inter-
trial interval of one second, but trials were self-paced and no explicit instruction was given to initiate a
trial repetition. Contact with the physical hold pad, indicated by a color change from red to green, was
required for a randomized period of 1500-2000 ms. The trial was immediately stopped if the subject
failed to maintain constant contact with the hold pad at any time during the hold period. Once the hold
period had ended, the virtual hold pad disappeared and an audible go cue was played, signaling that the
subject was free to reach for the object. No maximum time limit was set for the subject to react to the go
cue. The reach portion of the trial began at the moment the hand left the hold pad (Hold Pad Release,
HPR), and the subject was then required to complete the trial within 5 seconds of HPR.


Successful completion of a trial was signaled by an audible success cue and removal of both the physical
and virtual objects. A juice reward was delivered approximately 500 ms after these cues. Failed trials were
signaled by a distinct audible cue and immediate removal of both the physical and virtual objects. No
juice reward was delivered for failed trials.


Grasp Objects
Two objects were used in this task to elicit distinct hand postures during grasping. The small object
required a grasp aperture (linear distance between distal phalanges of thumb and index finger) of 1.5 in,
while the large object required an aperture of 3.0 in. Subjects were trained to grasp the objects using a
precision grip with strict requirements on finger placement to encourage maximum engagement of the
glabrous digital surface with the object. Each object was equipped with three small (∅0.75 in), thin (0.04
in) touch sensors (Touch Mini v1.2, Infusion Systems) mounted to the surface in the desired contact
location of the thumb, index and middle fingers. Successful task completion of the physical task required
simultaneous contact with at least the thumb and index sensors for a period of 250 ms.


In visual-only trials, the animal was required to intersect a virtual rendering of the object using only visual
cues. Successful completion of a visual-only trial required the virtual hand to make virtual contact with
the virtual object in the reVRE. The visual-only task was not modeled in sufficient detail to require
contact of specific virtual digits with virtual touch sensors; rather, an object collision event (virtual model
mesh collision detected by the VR software) was required for a period of 250 ms. Virtual contact was
signaled by a change in the color of the virtual object from white to green.
Block Design
Object presentations were either visual-only (a virtual rendering of the physical object) or physical (a
physical object was also presented in register with the virtual object within reach in the workspace), with
visual-only trials inserted randomly into blocks of object size in a ratio of 60:40 (physical:visual-only).
Blocks were limited to 10-20 trials in order to maximize the possibility that all task conditions would be
covered while maintaining isolation of the unit being recorded. For each unit analyzed, at least 60
successful task repetitions were completed, which guaranteed at least 15 trials in each of the four
conditions: large physical, large visual-only, small physical, small visual-only.


We took two measures to ensure that the animal did not know until the end of the reach whether a trial
was physical or virtual. First, in every trial we ran the robot through a sequence of movements that was
similar to the sequence in the physical trials, but in the visual-only trials ended with the object well out of
the animal’s reach. Second, we broadcast audible white noise throughout the experiment to mask
extraneous noise cues. An analysis of the reach kinematics showed that the overall hand trajectory was
indistinguishable for the physical and visual-only tasks up to the moment of object contact, with the
exception of more corrective movements in the visual-only trials (see Results).


Definition of Task Phases
Four task phases were defined for every repetition of the behavioral task: Hold, Reach, Contact and Grasp
(see Fig. 1B). The Hold phase was defined as the interval [-500, -50] ms with respect to HPR, and was
therefore the same duration for all trials. The short offset from HPR (50 ms) excluded neural activity
caused by slight pre-reach anticipatory movements or by tactile responses arising from changes in contact
with the hold pad itself. Definition of the remaining phases depended on the task and the analysis.
single trial results, the task phases were tied directly to events, whereas in the grouped-trial analyses,
phases were defined by the distribution of events across blocks of trials. These definitions diverged after
the Hold phase.


For single trial analyses, the Reach phase extended from 50 ms after HPR until the first object contact
(FOC) event. The Grasp phase was defined as the interval from FOC until the trial end. In physical
trials, the FOC event was defined as the moment that the torque on the wrist sensor exceeded 0.2 N-m. In
visual-only trials, FOC was the moment of collision between the hand mesh and the object mesh. Since
this event relied on the virtual environment, which was updated at the frame rate (67 frames/s), these
events had a temporal resolution no better than about 15 ms. Thus, the FOC event for visual-only
trials was temporally uncertain.
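As a concrete illustration, FOC detection for physical trials could be sketched as follows (a minimal example: the 0.2 N-m threshold comes from the text, while the function name and data layout are hypothetical):

```python
def first_object_contact(times_ms, torques_nm, threshold_nm=0.2):
    """Return the time (ms) of the first sample whose wrist-sensor torque
    exceeds the contact threshold, or None if no contact occurred."""
    for t, tau in zip(times_ms, torques_nm):
        if tau > threshold_nm:
            return t
    return None
```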
For the analysis of blocked data, we defined task phases in a way that allowed them to be readily
extracted from blocks of trials. Specifically, task phases were defined by the distribution of events in a
group of related trials. For the Reach phase, the duration was bounded on one end by the end of the Hold
phase (HPR + 50 ms) and on the other by the shortest single-trial Reach phase (earliest First Object
Contact) of the trial group. This definition ensured that only non-tactile task activity (reaching, hand
shaping) would fall in the Reach phase since, by definition, the subject’s hand had left the hold pad, but
had not yet contacted the grasp object. The Contact phase for a given unit was defined as the interval
containing the FOC events of all of the physical trials (earliest FOC to latest FOC). Individual trials
with a Reach phase shorter than 300 ms or longer than 700 ms were discarded in order to remove
outliers from the Contact phase. Rather than relying on the limited time resolution of the object contact
event in visual-only trials, we assumed that the distribution of contact times was similar to that of the
physical trial contact times, since the visual-only trials were inserted randomly during a block of trials for
object size, and thus used the same bounds determined from the physical trials. Finally, the Grasp phase
for both physical and visual-only trials began at the end of the Contact phase (last FOC event), and lasted
until the time of the success cue for a total trial length of 1200 ms.
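The blocked phase boundaries described above can be sketched as follows (a minimal example: the 300-700 ms outlier bounds and the 1200 ms total trial length come from the text, while the function name and data layout are hypothetical):

```python
def blocked_task_phases(foc_offsets_ms, trial_end_ms=1200):
    """Phase boundaries (ms, relative to hold-pad release) for a block of
    physical trials, given each trial's first-object-contact (FOC) time."""
    # Discard outlier trials whose Reach phase falls outside 300-700 ms.
    kept = [t for t in foc_offsets_ms if 300 <= t <= 700]
    if not kept:
        raise ValueError("no trials within the 300-700 ms Reach window")
    return {
        "Hold": (-500, -50),                # fixed relative to HPR
        "Reach": (50, min(kept)),           # HPR + 50 ms to earliest FOC
        "Contact": (min(kept), max(kept)),  # earliest FOC to latest FOC
        "Grasp": (max(kept), trial_end_ms), # latest FOC to success cue
    }
```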


Surgical Procedures and Recording

Early in the process of behavioral training, three titanium head holding pedestals (Thomas Recording,
GmbH), were surgically fixed to the skull. A period of at least 2 months was allowed before restraining
the head to allow for sufficient healing and osseointegration of the bone screws. Once a monkey had
reached an acceptable level of competency in task performance (minimum 2 hours of work at 85%
correct), a recording chamber was surgically implanted over the primary sensory cortex contralateral to
the working hand. In monkey F, the rectangular chamber was fabricated from titanium, with inner
dimensions 30 mm x 20 mm. The long axis of the chamber was oriented at an angle of approximately
15 degrees counterclockwise with respect to the rostral-caudal axis, and the stereotaxic location of the
chamber center was approximately 18.1 mm anterior to interaural zero and 18.8 mm lateral to the midline.
In monkey I, the chamber was fabricated from a medical grade, biocompatible polyetheretherketone
(PEEK) polymer (PEEK-OPTIMA®, Invibio™) to allow a more sophisticated design and to facilitate
fabrication (McAndrew et al.). The inner wall of this chamber had a circular cross-section (∅20 mm) and
the stereotaxic location of the chamber center was approximately 17.0 mm anterior to interaural zero and
18.4 mm lateral to the midline. In both cases, the chambers were oriented such that the penetration
direction was aligned with stereotaxic vertical.


Parylene-coated tungsten microelectrodes (1.0 MΩ, Harvard Apparatus) were driven into the cortex using
a microdrive (NAN-CMS, NAN Instruments Ltd.) mounted to the chamber. The times of occurrence of
action potentials from isolated units were recorded, and the instantaneous firing rate was calculated using
binned time intervals of 20 ms, smoothed with a triangular convolution kernel (Nawrot et al. 1999).
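A minimal sketch of this firing-rate computation (the 20 ms bins and triangular kernel follow the text; the kernel support `kernel_bins` is an assumed free parameter, since the kernel width is not stated here):

```python
import numpy as np

def instantaneous_firing_rate(spike_times_s, duration_s, bin_ms=20, kernel_bins=5):
    """Binned firing rate (spikes/s) smoothed with a unit-area triangular
    kernel, in the spirit of Nawrot et al. (1999)."""
    bin_s = bin_ms / 1000.0
    n_bins = int(np.ceil(duration_s / bin_s))
    counts, _ = np.histogram(spike_times_s, bins=n_bins, range=(0, n_bins * bin_s))
    rate = counts / bin_s                        # spikes per second in each bin
    kernel = np.bartlett(kernel_bins + 2)[1:-1]  # triangular window, nonzero taps
    kernel /= kernel.sum()                       # unit area preserves spike count
    return np.convolve(rate, kernel, mode="same")
```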


Recording location (stereotaxic coordinates, recording depth), cutaneous receptive field location and real-
time neural response attributes were used to guide the selection of recording sites to the hand
representation of SI, similar to the procedure described by Mountcastle et al. (Mountcastle et al. 1990).
Recording began in posterior SI where unit responses to passive joint flexion and extension were common
and cutaneous responses were rarely observed. Motor responses were not observed in this region, and
any electrode penetration deeper than 3.0 mm invariably encountered white matter. Subsequent recording
sites were advanced rostrally until motor responses were consistently observed, thus approximating the
location of the rostral bank of the central sulcus.


In addition, for monkey I, MR imaging data were combined using the Monkey Cicerone© software
package (Miocinovic et al. 2007) to reconstruct the chamber location and track electrode penetrations.
Post-mortem examination verified that the bulk of the penetrations were in the hand area of the post-
central gyrus, in agreement with the reconstruction using Monkey Cicerone (see Fig. 2). Finally,
intracortical microstimulation (ICMS) experiments were carried out in both subjects. We observed
several instances of withdrawal reactions of the hand on stimulation, but only with high-amplitude ICMS
(> 90 µA) did we observe any movement twitches. These twitches occurred while stimulating in the
rostral section of the recording sites in monkey I (see Fig. 2C).


The initial search for task related units began in the caudal chamber and proceeded rostrally until motor
responses were encountered. These were readily identified from audio and visual feedback of neural
activity correlated with subjects’ volitional arm and hand movements. Motor responses were identified
using simple observational criteria. Upon isolating a task related unit response, the subject’s hand and
distal limb were searched for cutaneous or deep receptive fields using mechanical stimuli applied to the
skin (gentle pressure, von Frey hairs, light brushing) or by passive flexion and extension of the digits and
wrist. The subjects were accustomed to the procedure and remained passive throughout the process. In
many cases, no manner of cutaneous stimulation or joint manipulation produced a significant or
correlated neural response. Such units were disregarded.


Some units had vigorous responses during movement. Verification of a motor response came when the
subject’s hand was released, at which time vigorous and sustained neural activity correlated with
volitional hand or arm movement resumed. Units with responses during movement but no clear
responses to somatic stimulation were not analyzed in this study. When neurons with these characteristics
were consistently encountered in a particular electrode penetration, that chamber location was disregarded
for the remainder of the experiment and its location was not revisited in subsequent recording sessions.
After establishing the approximate boundary separating primary motor and sensory cortices, subsequent
recording penetrations were made 1-2 mm caudal to the boundary. Lateral and medial boundaries of the
SI hand representation were established by identifying sensory cutaneous receptive fields on the face and
distal forearm, respectively.


Sensory Receptive Field Classification

All units retained and analyzed for this study had clearly identifiable cutaneous receptive fields on the
volar or dorsal surface of the hand or digits. Cutaneous receptive fields were identified using mechanical
stimuli and passive joint manipulation, as described above. Units with only a perceptible response to
passive joint movement were discarded. Receptive fields were classified according to size (area) or
extent (joint crossings) into six categories, with sizes ranging from less than a single digital phalange
(class 1) to fields covering most of the palm and digits (class 6). Receptive fields extending across a joint
were often assigned to the next highest category even if their area did not cover the entirety of the
phalanges involved. Units were not distinguished on the basis of slowly or rapidly adapting
responses.


Firing Rate Analysis

Individual task trials were grouped into one of four primary task conditions based on the type of object
presentation: Small Physical (SP), Large Physical (LP), Small Visual-Only (SV) and Large Visual-Only
(LV). The instantaneous firing rate for individual trials was calculated as described above, and the
individual trial firing rates for a given task variant were averaged into a single mean firing rate.


Unit Response Classification

A unit response was considered task-related if the mean firing rate during any single task phase was
significantly different from the mean rate during any other task phase. Statistical significance (α = .05)
was assessed using an unbalanced ANOVA test of mean firing rate bins grouped by task phase. Only
task-related units were analyzed in this study.
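The phase-based significance test might be sketched as follows (using SciPy's `f_oneway` as a stand-in for the unbalanced one-way ANOVA described above; the function name and data layout are hypothetical):

```python
import numpy as np
from scipy.stats import f_oneway

def is_task_related(rate_bins_by_phase, alpha=0.05):
    """rate_bins_by_phase: dict mapping phase name -> firing-rate bins
    (spikes/s). Returns True if the mean rate differs significantly
    across phases; f_oneway tolerates the unequal group sizes of an
    unbalanced design."""
    groups = [np.asarray(v, dtype=float) for v in rate_bins_by_phase.values()]
    _, p_value = f_oneway(*groups)
    return p_value < alpha
```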


Task-related units were assigned primary and secondary response traits based on patterns of statistically
significant firing rate modulation. Neurons that had responses occurring during the contact interval in
Physical trials were labeled contact-driven. Neurons with responses during the movement or contact
intervals which were similar in the Physical and Visual-Only trials were defined as movement-driven.
Neurons which had only movement-driven or contact-driven responses were called Simple. Neurons in
which the responses had both movement- and contact-driven characteristics were labeled Mixed. In the
cases of neurons with Mixed responses, an attempt was made to discern whether the contact-driven or
movement-driven response involved a larger change in firing rates, and that response was then termed the
Primary response. Primary response types are indicated here with a capital letter, either C for Contact-
driven or M for Movement driven. The secondary response type was indicated with a lower-case letter.


Statistical Methods

Statistical comparisons of grouped data throughout the study were evaluated using ANOVA at the 95%
confidence level (α = 0.05). Multiple comparisons of grouped data used Tukey's Honestly Significant
Difference criterion (Tukey-Kramer) to provide an upper bound on the probability that any single
comparison would be incorrectly found significant.
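As an illustration of the multiple-comparison procedure, SciPy (≥ 1.8) provides `tukey_hsd`; the firing-rate values below are hypothetical phase means for one unit:

```python
import numpy as np
from scipy.stats import tukey_hsd

# Three hypothetical groups of phase firing rates (spikes/s).
hold  = np.array([4.8, 5.1, 5.0, 4.9])
reach = np.array([5.0, 5.2, 4.9, 5.1])
grasp = np.array([21.0, 19.5, 20.4, 20.1])

res = tukey_hsd(hold, reach, grasp)
# res.pvalue[i, j] is the family-wise-corrected p-value for the (i, j)
# pair of groups, so each comparison can be read off at alpha = 0.05.
```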


RESULTS

Neural Population Analyzed

A total of 371 single units with tactile receptive fields on the hand were recorded in SI in two male rhesus
monkeys (monkey F: 63%, monkey I: 37%). Of these, 285 units (77%) exhibited statistically significant
task related activity and were included in the analysis (monkey F: 59%, monkey I: 41%). The remaining
86 unused units were not significantly modulated by the experimental task.


We evaluated recording locations by a combination of physiological, imaging, and post-mortem anatomic
criteria. A reconstruction of the recording sites for monkey I is shown in Fig. 2. The recording sites
determined by physiological criteria were confirmed by both the stereotactic imaging
shown in Fig. 2A and the clear location of penetrations on the surface of the hemisphere, shown in Fig.
2B. These penetrations in monkey I came from a restricted portion of the chamber spanning 6 mm
medial-lateral and 3-4 mm rostral caudal, which agrees with the dimpling observed on the surface of the
cortex. Neurons from monkey F came from a similarly restricted region of the chamber, spanning 3-4
mm rostral-caudal and 2-3 mm medial-lateral. In both cases, anatomic and physiological indicators
showed that the bulk of the neurons reported here came from primary somatosensory cortex. It is likely
that we recorded some neurons in areas 4 and 5, but we could find no trends relating the neuron
classifications given below to the distribution of recording sites within the chambers in either animal.
Thus, we are confident that our observations capture properties of SI.
Kinematics of the Hand and Arm

Analysis of a subset of trials showed that subjects’ hand and arm kinematics up to the moment of object
contact were independent of whether the trial was physical or visual-only. That is, hand and arm
movement kinematics were highly stereotyped during both physical and visual-only tasks. The analysis
further showed that subjects generally stopped moving shortly after the collision event, although enough
trials had corrective movements that, on averaging, a second velocity peak was clearly visible in the
visual-only trials.


Examples of trajectories to the small physical (SP) and small visual-only (SV) targets are shown
schematically in Fig. 3A and B. The trajectory shown in B is an example of a worst case correction
occurring in a visual-only trial when the animal does not encounter a physical object. When examining
the mean trajectories over a day’s recordings (panels C and D), a secondary velocity hump does occur in
the visual-only trials, corresponding to some degree of corrective movement in those trials, although the
secondary velocity peak is typically minor. Importantly, the movements for the two tasks are comparable
up to the moment of contact.


Simple Responses

Contact-Driven Responses
Figure 4A shows the responses of a unit with a cutaneous receptive field on the volar surface of the distal
phalange of the thumb that appears to be driven by tactile interaction with the grasp objects. During
physical task conditions (panels 1, 3, and 5) the firing rate was unchanged through the Hold and Reach
phases, but this neuron responded briskly at object contact and throughout the haptic task phases
(Contact, Grasp). The tactile response observed during physical trials is absent during the visual-only
trials (panels 2, 4, and 6), during which this neuron exhibited no significant modulation of neural activity
during any task phase. Thus, firing rates were statistically indistinguishable from the non-haptic phases
of the physical tasks (panel 6). Based on these observations, the primary response trait for this unit was
classified as contact-driven, and no secondary response trait was assigned. Although this particular unit
did not demonstrate significant differential modulation of activity with respect to object size (panels 5 and
6), 91% (81/89) of Simple contact-driven responses did show modulation with respect to object size.
Simple contact-driven responses accounted for 31% (89/285) of all task-related unit responses recorded in
SI.
Movement-Driven Responses
Figure 4B illustrates a complementary response to that of the contact-driven unit of Fig. 4A. The
receptive field of this neuron spanned the full volar surface of the intermediate and distal phalanges of the
index and middle digits. The response to physical and visual trials is identical for a given object size
(panels 7 and 8). Background firing rate during Hold is nearly zero (< 1 spike/s), yet within 50 ms of
hand departure from the hold pad the unit fires with increasing frequency during the Reach phase before
abruptly silencing approximately 100 ms prior to object contact. Neural activity during Contact and
Grasp then returned to the background rate of the Hold phase in all task conditions. Based on these
observations, the primary response trait for this unit was classified as movement-driven, and no secondary
response trait was assigned.


This unit demonstrated significant size-dependent modulation of activity, which was observed in 34%
(11/32) of all Simple movement-driven responses. The difference is evident when comparing firing rates
across object sizes within a common presentation type (e.g., Small Physical vs. Large Physical, panel 5).
In this case, the
SP firing rate is greater than that of the LP task, but both positive and negative correlations of object size
and firing rate were observed in the neural population. Out of 285 task-related neurons, 32 (11%)
responses recorded in SI were classified as Simple movement-driven.


Mixed Responses

Contact-Movement Responses
Figure 5A shows the response of a unit with a cutaneous receptive field on the volar surface of the
proximal phalange of the index finger. The response is similar to that of the contact-driven unit discussed
above, but with the addition of significant pre-contact neural activity in the SP task spike rasters (panel
1). This activity is also evident in the mean firing rate during Physical trials (both LP and SP) (panel 5)
and was effectively isolated in the SV task (panel 6). The response of this unit was primarily driven by
haptic interaction with the grasp objects and was therefore classified as contact-driven. The smaller pre-
contact activity evident in both physical and visual trials was classified as a secondary movement-driven
response trait, indicating that this component of the overall firing rate was driven by a non-contact
stimulus. A slight majority (52%, 149/285) of all task-related unit responses were Mixed, and 24%
(69/285) were classified as Contact-movement.


The Contact-movement unit in Fig. 5A also demonstrated significant size-dependent modulation of
activity, which was observed in 29% (20/69) of all Contact-movement responses. The difference is
evident when comparing firing rates during the Reach phase for tasks with a common object size (e.g.,
SP vs. LP, panel 5), and is isolated in the response to visual-only tasks (panel 6). The unit response to the
Small Physical task was significantly greater than the response to the Large Physical task.


In the comparison of visual-only task responses (panel 6) we also observed a small but statistically
significant “late” movement-driven response in the LV task that was masked in the mean firing rate of the
LP task (panel 5). The response was late only with respect to the SP task, a commonly observed feature
of the unit responses analyzed in this study. Both monkeys took slightly longer (50–100 ms) to reach for
and contact the large object. This observation was true for both physical and visual task conditions and
might be explained by the fact that successfully grasping the large object required the wrist to undergo
greater extension and the grasp aperture to widen considerably more than for the small object. Given the
timing of this response, beginning near the end of the Reach phase, this firing may be related to closing
the hand around the smaller object.


Movement-Contact Responses
Figure 5B shows the response of a unit with a cutaneous receptive field spanning the volar surface of the
distal and intermediate phalanges of digits two through four. In both physical and visual trials a strong
pre-contact response was observed. Physical trials featured a strong tonic response during Contact and
Grasp (panels 1 and 3). In visual-only trials, the tonic response to haptic task phases was absent and
firing rate was essentially zero (< 1 spike/s) during Hold, Contact and Grasp (panels 2 and 4). The larger
pre-contact response was assigned as the primary response trait and designated as movement-driven. The
comparatively smaller magnitude response during haptic task phases was assigned as the secondary
response trait and designated as contact-driven. Movement-contact responses accounted for 28% (80/285)
of all task-related responses.


The Movement-contact (Mc) unit in Fig. 5B also demonstrated significant size-dependent modulation of
activity, which was observed in 40% (32/80) of all Movement-contact responses. The difference is
evident in the primary movement-driven component of the response (panels 5 and 6), where the response
during the Reach phase of the SP task was again significantly greater than the response in the LP task.
Also notable in the response is the previously discussed delay of the large object tasks during the Reach
phase.


Odd Responses

Finally, a small group of neurons (5%, 15/285) exhibited a clear modulation of firing rate during the task
but did not fit the contact or movement-driven response classifications described above. We simply label
these as Odd responses. Figure 6A shows the response of an Odd unit with a cutaneous receptive field on
the volar surface of the distal phalange of the middle finger. The response of this unit was preferentially
elicited for the small object, with almost no modulation during large object tasks. The pre-contact activity
during small object tasks began immediately after the hand left the hold pad and quickly ceased at object
contact (panels 1, 2, and 7). Leaving aside the lack of response to the large object, such a response would
normally have been classified as Simple movement-driven if not for a sudden transient increase of
approximately 40% in the SV firing rate (with respect to the SP rate) at the moment of expected contact
with the object. Firing rate then declined at a rate exceeding that of the SP task, undershooting the SP
response before silencing completely near the end of the Grasp phase. Further analysis showed that the
transient response occurred when a physical trial was followed by a virtual trial, but not when a virtual
trial followed another virtual trial.


Figure 6B shows the analysis of another Odd unit with a cutaneous receptive field on the volar face of the
intermediate phalange of the middle finger. Spike rasters for the small and large physical task conditions
show an almost complete lack of neural activity over the course of 80 trials (panels 1, 3, and 5). This
appears to indicate that the unit response is driven by neither contact nor movement-related stimuli, but
recall that all units recorded and analyzed for this study exhibited unambiguous responses to cutaneous
stimulation on the hand. In contrast, the response is strongly modulated during visual-only trials early in
the Contact phase (panels 2, 4, and 6), but falls silent before the start of Grasp. Note also that the response
is not differentially modulated with object size.


Odd responses were considerably more varied than the comparatively stereotyped contact-driven and
movement-driven classes into which 95% of all task related units were assigned. Still, a common
characteristic of these responses was the appearance of significant neural activity with no apparent causal
stimulus.


Response Traits and Classes Summary

Table 1 details the distribution of response traits and classes observed in the neural population in SI. A
slight majority of units (52%, 149/285) had Mixed responses divided nearly equally between primarily
contact-driven (Cm, 46%) and primarily movement-driven (Mc, 54%) response classes. Simple unit
responses (42%, 121/285) were mostly contact-driven (74% C vs. 26% M). Odd responses comprised
5% (15/285) of all task-related responses.
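The class counts scattered through the text can be cross-checked with a short tabulation. The sketch below is illustrative Python, not part of the study's analysis; the Simple contact-driven count (89) is inferred as the 121 Simple units minus the 32 Simple movement-driven units reported above.

```python
from collections import Counter

# Response-class counts assembled from the text; C (Simple contact-driven)
# is inferred as 121 Simple units minus 32 Simple movement-driven units.
classes = Counter({"C": 89, "M": 32, "Cm": 69, "Mc": 80, "Odd": 15})
total = sum(classes.values())          # 285 task-related units
mixed = classes["Cm"] + classes["Mc"]  # 149 Mixed units

for name, n in classes.items():
    print(f"{name:>3}: {n:3d} ({100 * n / total:.0f}%)")
print(f"Mixed: {mixed} ({100 * mixed / total:.0f}%)")
```

Running this reproduces the marginals reported in the text: 52% Mixed, 42% Simple, and 5% Odd responses.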


Fifty-five percent (158/285) of all primary response traits were contact-driven, and 54% (80/149) of all
secondary response traits were contact-driven. The distribution of primary response traits was skewed
toward contact-driven responses in monkey I (78% contact vs. 17% movement), but the two traits were
more equally represented in the sample from monkey F (40% contact vs. 54% movement). The
distribution of secondary response traits in monkey I was also skewed, but in favor of movement-driven
responses (26% contact vs. 74% movement). Secondary traits were more equally distributed in monkey F
(65% contact vs. 35% movement). We expect that these differences are largely due to slight differences in
the distribution of recording sites between the two monkeys, and discuss this further below.


Receptive Field Characterization

Figure 7 details the location, size classification and recording depth of all task related units recorded in SI.
Panel 7A shows the location of receptive fields for all units mapped prior to recording in the behavioral
task. The vast majority of cutaneous receptive fields (81%, 231/285) were located on the volar surface of
the palm and digits, most densely concentrated on the distal phalanges and metacarpophalangeal (MCP)
joints of digits one through three. Receptive fields on the dorsal hand surface (8%, 24/285) were
concentrated along the ulnar face of the thumb and radial face of the first digit, especially near the MCP
joint. Receptive fields classified as both (11%, 30/285) occupied both volar and dorsal skin surfaces (see
also panel 7C).


Panel 7B shows the distribution of receptive field size classes in order of increasing area. A majority of
cutaneous receptive fields (85%, 243/285) were small (classes 1 and 2), occupying no more than two
digital phalanges or an equivalent area on the palm. Large receptive fields extending across more than
two digits (classes 5 and 6) were uncommon (7%, 21/285) and rarely demonstrated clear or significant
task tuning. Panel 7D shows the distribution of recording depths of all task related units recorded in SI.
Depths were distributed about two primary modes with approximate means of 1.27 ± 0.57 mm (75%,
215/285) and 4.51 ± 0.79 mm (25%, 70/285). Interpretation of these numbers is difficult because
electrode penetrations were oriented in stereotaxic vertical, rather than perpendicular to the cortical
surface.
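The two depth modes can be illustrated with a simple threshold split. The sketch below simulates depths from the reported mixture (means, spreads, and weights taken from the text); the 3.0 mm split point is our assumption, and the code is an illustrative sketch rather than the analysis used in the study.

```python
import random

random.seed(42)

# Simulate 285 recording depths (mm) from the two reported modes:
# ~75% of units near 1.27 +/- 0.57 mm and ~25% near 4.51 +/- 0.79 mm.
depths = [random.gauss(1.27, 0.57) if random.random() < 0.75
          else random.gauss(4.51, 0.79)
          for _ in range(285)]

# Assign each unit to a mode using an assumed 3.0 mm split point.
shallow = [d for d in depths if d < 3.0]
deep = [d for d in depths if d >= 3.0]

def mean(xs):
    return sum(xs) / len(xs)

print(f"shallow: n={len(shallow)}, mean={mean(shallow):.2f} mm")
print(f"deep:    n={len(deep)}, mean={mean(deep):.2f} mm")
```

With well-separated modes like these, a fixed threshold recovers per-mode means close to the generating parameters; overlapping modes would require a proper mixture fit instead.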


While the same general trend of small receptive field size in the complete sample was true for all
individual response classes (Fig. 8, A-D, center panels), a few statistically significant (α = .05) differences
were observed. Among individual classes, the receptive fields of Movement-contact cells were somewhat
larger than those of both Contact (∆meanSizeClass = 0.29, p = .015) and Contact-movement units
(∆meanSizeClass = 0.42, p < .001). More generally, units with primary movement-driven response traits had
slightly larger receptive fields than those with contact-driven primary response traits (∆meanSizeClass =
0.30, p < .001).
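Comparisons of ordinal size classes like these are often handled nonparametrically. The sketch below runs a permutation test on the difference in mean size class between two hypothetical groups; the group sizes follow the primary-trait counts in the text, but the size-class distributions, the built-in offset, and the test itself are our assumptions, not the study's method.

```python
import random

random.seed(0)

def perm_test(a, b, n_iter=2000):
    """Two-sided permutation test on the difference of group means."""
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b  # fresh list; shuffling does not mutate the inputs
    hits = 0
    for _ in range(n_iter):
        random.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            hits += 1
    return hits / n_iter

# Hypothetical receptive-field size classes (1-6) for contact-primary (n=158)
# and movement-primary (n=112) units, with an offset built in for illustration.
contact = [random.choice([1, 1, 2, 2, 2, 3]) for _ in range(158)]
movement = [random.choice([1, 2, 2, 3, 3, 3]) for _ in range(112)]

p = perm_test(movement, contact)
print(f"p = {p:.3f}")
```

A permutation test makes no distributional assumptions, which suits discrete size-class data better than a t-test on the class labels.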
Several significant differences in mean recording depth (Fig. 8, A-D, right panels) were observed among
individual response classes. Contact units were recorded at comparatively shallower depths than both
Contact-movement units (∆meandepth = 0.74 mm, p < .001) and Movement-contact units (∆meandepth =
1.16 mm, p < .001). Movement units were encountered at comparatively shallower depths than
Movement-contact units (∆meandepth = 1.0 mm, p = .002). The general trend is further clarified by a
comparison of primary response traits: units with contact-driven primary responses were recorded at
comparatively shallower depths than units with movement-driven primary responses (∆meandepth = 1.60
mm, p < .001). These data indicate that contact-driven responses were characteristic of neurons located
at relatively shallow cortical depths, while movement-driven and finally odd responses were more
common with increasing depth.


Recording Grid Coordinates
Grid coordinate refers to the alignment grid of the neural recording drive platform used to position and
support electrode guide tubes during neurophysiological recording experiments. Alignment holes were
spaced at 1.0 mm intervals in a grid pattern and assigned integer coordinates along orthogonal axes
corresponding roughly to the caudal-rostral (CR) and medial-lateral (ML) directions. Unique recording
chamber geometry in each experimental subject required separate analyses of the statistical dependence of
response characteristics on grid coordinates. In neither monkey was the CR or ML coordinate a significant
predictor of response class (monkey F: pCR = .401, pML = .111; monkey I: pCR = .228, pML = .864).
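A test of this kind can be sketched as a permutation test on the Pearson chi-square statistic between grid coordinate and response class. The data below are hypothetical and generated independently of one another, mirroring the reported lack of association; this is an illustrative stand-in, not the analysis used in the study.

```python
import random
from collections import Counter

random.seed(1)

def chi2_stat(xs, ys):
    """Pearson chi-square statistic for two categorical sequences."""
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((joint.get((x, y), 0) - px[x] * py[y] / n) ** 2
               / (px[x] * py[y] / n)
               for x in px for y in py)

def perm_p(xs, ys, n_iter=1000):
    """Permutation p-value for association between xs and ys."""
    observed = chi2_stat(xs, ys)
    ys = list(ys)  # copy so shuffling does not mutate the caller's data
    hits = 0
    for _ in range(n_iter):
        random.shuffle(ys)
        if chi2_stat(xs, ys) >= observed:
            hits += 1
    return hits / n_iter

# Hypothetical, independently generated data: CR grid coordinate and class.
cr = [random.randint(0, 5) for _ in range(150)]
cls = [random.choice(["C", "M", "Cm", "Mc"]) for _ in range(150)]
p = perm_p(cr, cls)
print(f"p = {p:.2f}")
```

The permutation approach avoids the chi-square distribution's expected-count requirements, which matter when some grid positions contain only a few units.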
DISCUSSION

Single units in SI encode multiple sensory phenomena

The major finding of this study is that single units in the hand representation of SI appear to
simultaneously encode multiple distinct sensory phenomena in components of their overall firing rate.
Contact-driven and movement-driven responses to a stereotyped behavioral task were distinguished and
quantified with the aid of a virtual reality simulation by removing the correspondence between the
expected and actual task outcome. The results of our sample population show that a majority of single
units in the hand representation of SI with cutaneous receptive fields encoded information about both
contact-driven and movement-driven sensory modalities. The clear distinction between and
quantification of multiple, simultaneous information streams present in the activity of SI cortical neurons
has not been previously demonstrated. Although firing rate responses were grouped into a limited number
of discrete classes for the purpose of group statistical analysis, the great variety of individual unit
responses observed in our sample of the neural population suggests a broad distribution of response types
in SI with varying combinations of contact-driven and movement-driven response traits.


Additionally, a small percentage of units could not be classified using contact-driven and movement-
driven response traits. These units may encode yet another class of sensory information that is actually
more widely represented in the neural population but was not adequately captured by the experimental
methods of this study.


Limitations of Interpretation

Recording location
Three distinct sensory representations of the body are found within primary somatosensory cortex itself
(areas 3b, 1 and 2), each driven by inputs originating from cutaneous and/or deep receptors in varying
proportion. It is also well established that area 4 of primary motor cortex receives significant input from
both cutaneous and deep receptors and contains at least two complete sensory representations of the body:
a caudal motor area receiving mostly cutaneous inputs and a rostral motor area receiving mostly deep
inputs (Lemon and Porter 1976a; b; c; Strick and Preston 1982; Wise and Tanji 1981).


Despite these considerations, several lines of evidence strongly support the claim that the overwhelming
majority of neurons described in this work were recorded from areas 3b, 1 and 2 of primary
somatosensory cortex. As described in the Methods, criteria used by Mountcastle were used to
empirically identify the hand representation of SI during initial recording experiments (Mountcastle et al.
1990). Subsequently, neurophysiological recordings in the same region encountered almost exclusively
sensory responses with small receptive fields on the hand and distal forearm. Additional post-hoc work
reinforced this conclusion. Three-dimensional recording chamber reconstructions using MRI verified that
SI was the primary recording zone, and ICMS experiments conducted in the recording regions in both
subjects produced movement twitches only at large amplitudes of electrical current. We also observed on
the cortical surface of monkey I a clear region of penetrations (see Fig. 2). Despite these measures, it is
possible that a small proportion of the units analyzed in this study, in particular some movement-driven
neurons, were in fact recorded in MI. However, anatomical analyses such as that shown in Fig. 2C
indicate that neural response classes were evenly distributed throughout the cortical recording region.
Thus, even if some small proportion of the neurons analyzed here were indeed recorded in motor cortex,
the general conclusion still holds: neurons in SI encode multiple relevant parameters during naturalistic
reach-to-grasp movements.


Origin of Responses
Considerable care was taken throughout this study to include data only from single cortical units with
cutaneous receptive fields. Despite these measures, the possibility remains that some responses were
influenced by inputs from deep receptors, especially considering the extensive pre-cortical divergence and
convergence of sensory afferents in brainstem and thalamic nuclei. For example, it is estimated that a
single primary afferent from the hand may project to ~1,700 cuneate neurons, while a single cuneate
neuron receives projections from ~300 sensory afferents (Johansson and Flanagan 2009). This was also a
factor in the decision not to characterize unit responses with respect to peripheral receptor types (depth or
adaptation rate). Such designations are best suited to describe the response of individual receptors, or that
of several closely associated receptors of the same type recorded from peripheral afferents. They may
have little relevance to cortical responses driven by highly convergent inputs that might represent multiple
sensory modalities.


Another possibility is that movement-driven response traits (primary or secondary) might actually
represent an efference copy of motor commands originating in frontal motor cortical areas (Flanagan et al.
2003). Recent fMRI data suggest that frontal cortex has at least modulatory access to primary
somatosensory cortex (Christensen et al. 2007). This explanation cannot be ruled out, although certain
aspects of the observed neural responses do not support that conclusion. First, in the case of Simple
movement-driven responses, we did not observe significant neural activity that could be attributed to
secondary movements such as hand trajectory corrections during visual-only trials or hand withdrawal in
the late Grasp phase for both physical and visual-only trials. Second, recall that every unit analyzed in
this study, even those with simple movement-driven responses, had a cutaneous receptive field on the
hand. Attributing all movement-driven activity to a delayed copy of motor output cannot account for the
absence of contact-driven responses during Contact and Grasp for Simple movement-driven units with
cutaneous receptive fields in physical contact with the grasp objects. In the case of simple contact-driven
units, responses were clearly driven by tactile stimuli during Contact and Grasp. Evidence of efference
copy might be expected during Reach or even the late Grasp phase when the hand was removed from the
object and returned to the hold pad, but none was observed. Such activity would have been especially
evident during visual-only trials when contact-driven feedback was not present to mask it, but again, none
was observed. Additionally, such activity was not observed in Mixed responses. From these
observations, one must either conclude that a delayed copy of motor output was actively silenced in just
one of the four major response classes (Simple contact-driven) or that non-contact-driven neural activity
observed in the neural population of SI was at least partially attributable to sources other than efference
copy.


Simple Responses

Contact-driven responses were the most common trait observed in the neural population (primary: 55%,
primary + secondary: 84%). In a certain sense, exteroception was present in exactly 100% of the
population sample, since cutaneous receptive fields were identified for all units. Movement-driven traits
were also commonly observed (primary: 39%, primary + secondary: 64%), indicating that a significant
proportion of somatic sensory feedback, as represented in cortex, is multimodal. Perhaps the most
unexpected result was the discovery of apparently contact-driven units (a clearly identifiable receptive
field) with exclusively movement-driven (Simple movement-driven) responses. It is possible that our
cutaneous stimulation was inducing subtle joint movements, but neurons were selected specifically that
did not have a clearly differentiated response to overt joint movements. Instead, this might imply some
form of neural gating, since the cutaneous response so clearly evident when mapping the receptive field
was absent in the task response. An alternative explanation is that the mechanical stimulus provided by
object contact during the behavioral task was insufficient or sufficiently different from the stimuli used to
identify the receptive field in the first place. However, this explanation does not account for the fact that
most of the cutaneous responses identified for use in this study demonstrated extreme sensitivity to even
the lightest stimuli, often firing at rates in excess of 100 spikes per second. Thus, any contact with the
grasp objects was normally sufficient to evoke a strong neural response.


Mixed Responses

Mixed responses recorded in SI were the rule, rather than the exception, comprising 52% of unit
responses. Mixed responses with movement-driven traits (primary or secondary) also provide evidence of
response gating since movement-driven responses evident during Reach (physical and visual-only trials)
were absent during haptic task phases in visual trials when subjects made brief kinematic adjustments to
complete transition from the expected physical task to the visual task. As an example, consider the strong
movement-driven response shown in Fig. 5B. A comparison of physical and visual trials clearly shows
that the movement-driven portion of the overall response occurred before contact with the object.
Evidence of response gating is provided by the complete lack of a movement-driven response during the
visual tasks. During the reach phase of the task, subjects rapidly extended the arm toward the target
object, with an approach time typically in the range of 300 – 400 ms. In visual trials the hand would often
overshoot the expected object location after which subjects would quickly move to the appropriate
position in space to complete the visual task (see e.g. Fig. 3). Despite the strong movement-driven
response during Reach, no evidence of a similar response was observed while adjusting to the visual-only
task.


The distributions of primary and secondary response traits were highly skewed in monkey I, where 78%
of primary and 26% of secondary traits were contact-driven and 5% of units were classified as Odd. The
distribution in monkey F was somewhat more balanced, where 40% of primary and 65% of secondary
response traits were contact-driven, again with 5% classified as Odd. The cause of
this apparent difference is unclear and may simply reflect differences among subjects. We note, however,
that the recording region in monkey I was significantly more focused (~12 mm²) than that in monkey F
(~24 mm²); the larger region sampled in monkey F may have contributed to its greater diversity of
response characteristics. It was also observed
that movement-driven response traits were more prevalent with increasing cortical depth. An analysis of
recording depth by subject revealed that the mean (±σ) unit depth in monkey F was 2.7 ± 0.1 mm, while
the mean unit depth in monkey I was 1.1 ± 0.1 mm (Δ = 1.6 mm, p < .001). On average, units were
recorded at greater depths in monkey F, which may account for the greater occurrence of movement-
driven response traits.


Odd Responses

Odd responses were rarely encountered and difficult to classify in terms of contact-driven or movement-
driven response traits. One example is shown in Fig. 6A, where a unit with a fingertip receptive field
responded only to small object tasks and suddenly increased its firing rate by 40% during visual-only
trials at the moment of expected object contact. This brief phasic response might be explained by the
receptive field contacting the palm of the hand during grasp, or by a comparatively greater range of digit
flexion when the small physical object was not encountered in visual-only trials. Another example is
shown in Fig. 6B, where the only significant response came at the moment of expected object contact
with both small and large objects, and exclusively in visual-only trials. This response cannot be explained
by differential activation of sensory receptors with object size. Odd units might represent a distinct type
of sensory information, perhaps a complex integration of sensory inputs that defies simple classification
by contact-driven and movement-driven response traits. It is possible that these responses are much more
widely present in SI but were inadequately isolated or characterized by the experimental techniques of
this study.


Group Analysis

Group analysis confirmed the major differences between response classes and revealed trends not
apparent in the inspection of individual single unit responses. The most significant of these was a subtle
but revealing difference between Simple and Mixed responses between the Reach and Contact task
phases. During the Reach phase, M units were active at levels far above the task mean and were more
modulated than Mc units, while C units were active at levels far below the task mean and were less
modulated than Cm units. This relationship changed during the Contact phase, when all four response
classes were modulated at levels far above the task mean and Mixed classes exceeded the activity of their
Simple counterparts. This is significant because the Contact phase (especially the leading edge) is the
first time in the task that both contact-driven and movement-driven stimuli coincide. At this moment,
units encoding both contact and movement-driven stimuli consistently demonstrated a greater overall
depth of modulation than units encoding a single modality, suggesting that not only different but
quantitatively more information was relayed by Mixed response types when both types of stimuli were
present.
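The paper does not spell out its normalization here, but the phase-wise comparison can be sketched with a simple task-mean normalization. This is a minimal sketch under that assumption; the firing rates below are hypothetical, chosen only to mirror the qualitative pattern described.

```python
def phase_modulation(phase_rates):
    """Normalize each phase's mean firing rate by the task-wide mean."""
    task_mean = sum(phase_rates.values()) / len(phase_rates)
    return {phase: rate / task_mean for phase, rate in phase_rates.items()}

# Hypothetical mean rates (spikes/s) for a Simple movement-driven (M) unit
# and a Mixed movement-contact (Mc) unit across the four task phases.
m_unit = {"Hold": 1.0, "Reach": 40.0, "Contact": 5.0, "Grasp": 2.0}
mc_unit = {"Hold": 1.0, "Reach": 35.0, "Contact": 50.0, "Grasp": 30.0}

m_norm = phase_modulation(m_unit)
mc_norm = phase_modulation(mc_unit)

# The M unit sits far above its task mean during Reach; during Contact the
# Mixed unit exceeds its Simple counterpart, as in the group-level pattern.
print(f"M Reach: {m_norm['Reach']:.2f}, Mc Contact: {mc_norm['Contact']:.2f}")
```

Normalizing by each unit's own task-wide mean puts units with very different absolute firing rates on a common scale before comparing modulation across classes.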


Significance of Results

A key difficulty in somatosensory research is bridging the gap between understanding the detailed
structure of S1 receptive fields and understanding how neurons behave in naturalistic tasks. It is widely
acknowledged that a key problem in somatosensory neurophysiology is that the medium in which
receptors are embedded, the skin, is prone to a wide range of stimulation and deformation as an animal
moves. In particular, relative to the kinds of deformations which happen when the skin is directly
impinged by an outside object, the stretch and overlap observed with arm movements is likely to be
relatively subtle.


It is also known that cortical neurons with cutaneous receptive fields on the proximal arm can fire in
relation to arm movement, but it remains unclear how that relates to the deformation of the skin or how or
even whether that modulation is used by the central nervous system. Compared to the firing rates elicited
when the skin is directly probed, the depth of modulation of SI neurons with respect to arm movements is
likewise subtle.


Many of the mechanical subtleties become much more dramatic with hand movements. Finger joints can,
and often do, move over a span of 90 degrees, introducing substantial deformations and responses in
primary afferents of both the hairy (Edin 1992; Edin and Abbs 1991) and glabrous (Burke et al. 1988;
Goodwin and Wheat 2004) skin of the hand. Our concern in designing the experiments reported here
was to separate that element of modulation from the modulation driven specifically by contact between
the skin and objects in the environment. Our findings here of robust activity during movement in
neurons with cutaneous receptive fields on the hand suggest that the movement-related modulation is a
constitutive part of coding in these neurons.


We have shown that the contact and movement-related signals, when multiplexed, can be teased apart
experimentally. In fact, this is an important step towards the extraction of parameterized sensory
information from cortex, and eventually towards providing sensory information back into the cortex.
However, the results also suggest to us that these neurons are coding for something more substantial than
contact with the skin. In order to perform haptic tasks, it is necessary to combine traditionally tactile
signaling (what is the shape, texture, temperature, etc. of the surface) with proprioceptive signaling (how
are these receptive fields organized with respect to one another spatially). Similar skin strain patterns can
mean dramatically different things depending on whether the fingers are in a line or in a circle. And in
fact tactile illusions can be very sensitive to the posture of the fingers (Warren et al. 2011). It is possible
that SI is already carrying out the computations necessary for stereognosis. Several lines of evidence now
suggest how those computations may be structured (Rincon-Gonzalez et al. 2011).


All of this is likely to be important for bidirectional neuroprosthetics (O'Doherty et al.). In the area of
motor control, much experimental work has been carried out to extract useful information from the
activity of neural populations in motor planning and output areas of cortex. These efforts have led to the
successful development of computational techniques for extracting kinematic parameters for volitional
movement in both monkeys and humans based largely on the insight that the direction of intended
movement is represented by the directionally-sensitive tuning of single units in motor cortex
(Georgopoulos et al. 1982). The ability to generate volitional motor output is a necessary, but ultimately
insufficient, criterion for realizing the full potential of the hand as an exquisitely capable motor and
sensory organ, a fact convincingly demonstrated when Mott and Sherrington (Mott and Sherrington
1895), and more recently Twitchell (Twitchell 1954), removed all sensory feedback from the hands of
monkeys and observed a complete lack of use. Efforts to decode the flood of
sensory information arriving in cortex are crucial not only from a purely scientific perspective, but also
for efforts to restore lost or impaired sensory function in humans. The results of this study advance our
understanding of the particular strategies by which sensory information is represented in primary cortical
areas and represent a step towards the extraction of parameterized sensory information that can be utilized
in engineered neuroprosthetic systems.
CONCLUSIONS

The results of this study demonstrate that single units in primary somatosensory cortex with cutaneous
receptive fields on the hand encode both contact-driven and movement-driven sensory information in
their firing rates. A majority of units simultaneously encoded both sensory phenomena during normal
hand use. An analysis of normalized firing rate modulation amongst different neural response classes also
demonstrated that Mixed responses encode not only different sensory information, but quantifiably more
such information. Unit responses also demonstrated evidence that sensory feedback from the hand is
gated in a task-dependent manner. We hypothesize that the considerable variety of responses observed in
the neural sample of SI represents a continuous distribution of response types, with varying combinations
of contact-driven and movement-driven response traits. Future neuroprosthetic systems must distinguish
and decode multiple layers of sensory information present in the activity of single cortical units to
correctly interpret peripheral sensory events.


ACKNOWLEDGEMENTS

The authors thank Michelle Armenta Salas for invaluable assistance in analyzing the kinematics of this
task and Rachele McAndrew for her work with the animals. This study was supported by the National
Institutes of Health Grant R01 NS-S050256-03.
FIGURES AND TABLES



Table 1. Response traits and classes for all task-related units.
Figure 1A: The robot-enhanced Virtual Reality Environment. The animal sits in a primate chair and
views an immersive virtual environment through a mirror onto which the 3D display is projected. The
animal’s motions are captured by a camera-based motion capture system and visualized in the display
along with movement targets. Targets are displayed in some trials in register with an instrumented grasp
object which is presented on some trials by a small robotic arm. B. Time course of a trial in the Reach-to-
Grasp task.
Fig. 2 Recording locations from monkey I. A screen-shot from Monkey Cicerone© showing an MR
section through the hand area of somatosensory cortex and the chamber placement in monkey I. B. A
view of the surface of the left hemisphere in monkey I showing the section of post-central gyrus with
penetrations indicated by the red line. C. The 2-D chamber map for the recordings in monkey I, with a
tentative indication of the location of the central sulcus in the chamber, along with markers showing
penetrations, cell classifications found in each penetration, and the location of the single ICMS site in our
recording area that elicited a twitch, at over 90 µA.
Figure 3: Kinematic analysis. An example trajectory for a small physical trial, B. Example trajectory
for a small visual-only trial. C. A day’s average index finger speed for small physical and small visual-
only trials. D. A day’s average index finger speed for large physical and large visual-only trials. See
text for details.
Figure 4. Simple response type cells. A. Contact-driven neuron. B. Movement-driven neuron. Rasters
(panels 1-4) in this and the subsequent figures are broken out by trial condition: large (L) and small (S)
objects in physical (P) and visual-only (V) trials. Panels 5 and 6 show overlaid histograms comparing
small (S) vs. large (L) trials. Panels 7 and 8 show histograms comparing P vs. V trials.
Figure 5. Mixed response type cells. A. Primarily contact-driven neuron, with a small additional
component related to movement to the smaller of the objects (panel 7). B. Primarily movement-driven
neuron, with additional firing during contact with the physical objects (panel 5). Format as Fig. 4.
Figure 6. Odd units: neurons that were clearly task-related but nonetheless difficult to classify. A. A
neuron that fired only during reach toward the small object, but at a significantly higher rate for
interactions with the virtual object than with the physical object (panel 3). B. A neuron that responded
only when the animal interacted with a virtual object (panels 2, 4, and 6). Format as Fig. 4.
Figure 7. Summary of receptive field types and recording locations. A. Location of the receptive fields
for all of the neurons reported here. B. Classification according to the size of the receptive fields. C.
Breakdown of cell properties based on the location of the receptive fields. D. Depth relative to first
cortical contact for the neurons reported here.
Figure 8. Breakdown of cells by classification type. Receptive field locations, sizes, and recording
depths for each of the four classes of neurons. Format as Fig. 7.
DISCUSSION

REFERENCES
Bensmaia SJ, Denchev PV, Dammann JF, 3rd, Craig JC, and Hsiao SS. The representation of
       stimulus orientation in the early stages of somatosensory processing. J Neurosci 28:
       776-786, 2008a.
Bensmaia SJ, Hsiao SS, Denchev PV, Killebrew JH, and Craig JC. The tactile perception of
       stimulus orientation. Somatosens Mot Res 25: 49-59, 2008b.
Burke D, Gandevia SC, and Macefield G. Responses to passive movement of receptors in
       joint, skin and muscle of the human hand. Journal of Physiology (London) 402: 347-361,
       1988.
Carmena JM, Lebedev MA, Crist RE, O'Doherty JE, Santucci DM, Dimitrov DF, Patil PG,
       Henriquez CS, and Nicolelis MA. Learning to control a brain-machine interface for
       reaching and grasping by primates. PLoS Biol 1: E42, 2003.
Christensen MS, Lundbye-Jensen J, Geertsen SS, Petersen TH, Paulson OB, and Nielsen
       JB. Premotor cortex modulates somatosensory cortex during voluntary movements
       without proprioceptive feedback. Nat Neurosci 10: 417-419, 2007.
Cohen DAD, Prud'homme MJL, and Kalaska JF. Tactile activity in primate somatosensory
       cortex during active arm movements: correlation with receptive field properties. J
       Neurophysiol 71: 161-172, 1994.
Cordo PJ, Horn JL, Kunster D, Cherry A, Bratt A, and Gurfinkel V. Contributions of skin
       and muscle afferent input to movement sense in the human hand. J Neurophysiol 105:
       1879-1888, 2011.
Edin BB. Quantitative analysis of static strain sensitivity in human mechanoreceptors from hairy
       skin. J Neurophysiol 67: 1105-1113, 1992.
Edin BB, and Abbs JH. Finger movement responses of cutaneous mechanoreceptors in the
       dorsal skin of the human hand. J Neurophysiol 65: 657-670, 1991.
Edin BB, and Johansson N. Skin strain patterns provide kinaesthetic information to the human
       central nervous system. Journal of Physiology (London) 487: 243-251, 1995.
Evarts EV, and Fromm C. Sensory responses in motor cortex neurons during precise motor
       control. Neurosci Lett 5: 267-272, 1977.
Flanagan JR, Vetter P, Johansson RS, and Wolpert DM. Prediction precedes control in motor
       learning. Curr Biol 13: 146-150, 2003.
Fromm C, and Evarts EV. Pyramidal tract neurons in somatosensory cortex: central and
       peripheral inputs during voluntary movement. Brain Res 238: 186-191, 1982.
Fromm C, and Evarts EV. Relation of motor cortex neurons to precisely controlled and
       ballistic movements. Neurosci Lett 5: 259-265, 1977.
Fromm C, Wise SP, and Evarts EV. Sensory response properties of pyramidal tract neurons in
       the precentral motor cortex and postcentral gyrus of the rhesus monkey. Exp Brain Res
       54: 177-185, 1984.
Georgopoulos AP, Caminiti R, Kalaska JF, and Massey JT. Spatial coding of movement: a
       hypothesis concerning the coding of movement direction by motor cortical populations.
       Exp Brain Res Suppl. 7: 327-336, 1983.
Georgopoulos AP, Kalaska JF, Caminiti R, and Massey JT. On the relations between the
      direction of two-dimensional arm movements and cell discharge in primate motor cortex.
      J Neurosci 2: 1527-1537, 1982.
Georgopoulos AP, Kalaska JF, Crutcher MD, Caminiti R, and Massey JT. The
      representation of movement direction in the motor cortex: single cell and population
       studies. In: Dynamic Aspects of Neocortical Function, edited by Edelman GM, Gall WE,
       and Cowan WM, 1984, p. 501-524.
Georgopoulos AP, Schwartz AB, and Kettner RE. Neuronal population coding of movement
      direction. Science 233: 1416-1419, 1986.
Goodwin AW, and Wheat HE. Sensory signals in neural populations underlying tactile
      perception and manipulation. Annu Rev Neurosci 27: 53-77, 2004.
Johansson RS, and Flanagan JR. Coding and use of tactile signals from the fingertips in object
      manipulation tasks. Nat Rev Neurosci 10: 345-359, 2009.
Kalaska JF, Cohen DAD, Prud'homme M, and Hyde ML. Parietal area 5 neuronal activity
      encodes movement kinematics, not movement dynamics. Exp Brain Res 80: 351-364,
      1990.
Lawson R, and Bracken S. Haptic object recognition: how important are depth cues and plane
      orientation? Perception 40: 576-597, 2011.
Lemon RN, and Porter R. A comparison of the responsiveness to peripheral stimuli of pre-
      central cortical neurones in anaesthetized and conscious monkeys [proceedings]. J
      Physiol 260: 53P-54P, 1976a.
Lemon RN, and Porter R. Afferent input to movement-related precentral neurones in conscious
      monkeys. Proc R Soc Lond B Biol Sci 194: 313-339, 1976b.
Lemon RN, and Porter R. Proceedings: Natural afferent input to movement-related neurones in
      monkey pre-central cortex. J Physiol 258: 18P-19P, 1976c.
McAndrew RM, Lingo-VanGilder JL, Naufel SN, and Helms Tillery S. Individualized
      recording chambers for non-human primate neurophysiology.
Miocinovic S, Noecker AM, Maks CB, Butson CR, and McIntyre CC. Cicerone: stereotactic
      neurophysiological recording and deep brain stimulation electrode placement software
      system. Acta Neurochir Suppl 97: 561-567, 2007.
Mott FW, and Sherrington CS. Experiments upon the influence of sensory nerves upon
      movement and nutrition of the limbs. Preliminary communication. Proc Roy Soc Lon 57:
      481-488, 1895.
Mountcastle VB, Steinmetz MA, and Romo R. Frequency discrimination in the sense of
      flutter: psychophysical measurements correlated with postcentral events in behaving
      monkeys. J Neurosci 10: 3032-3044, 1990.
Nawrot M, Aertsen A, and Rotter S. Single-trial estimation of neuronal firing rates: from
      single-neuron spike trains to population activity. J Neurosci Methods 94: 81-92, 1999.
O'Doherty JE, Lebedev MA, Ifft PJ, Zhuang KZ, Shokur S, Bleuler H, and Nicolelis MA.
       Active tactile exploration using a brain-machine-brain interface. Nature 479: 228-231, 2011.
Pei YC, Hsiao SS, Craig JC, and Bensmaia SJ. Neural mechanisms of tactile motion
      integration in somatosensory cortex. Neuron 69: 536-547, 2011.
Pei YC, Hsiao SS, Craig JC, and Bensmaia SJ. Shape invariant coding of motion direction in
        somatosensory cortex. PLoS Biol 8: e1000305, 2010.
Prud'homme MJL, and Kalaska JF. Proprioceptive activity in primate primary somatosensory
        cortex during active arm reaching movements. J Neurophysiol 72: 2280-2301, 1994.
Rincon-Gonzalez L, Warren JP, Meller DM, and Tillery SH. Haptic interaction of touch and
        proprioception: implications for neuroprosthetics. IEEE Trans Neural Syst Rehabil Eng
        19: 490-500, 2011.
Serruya MD, Hatsopoulos NG, Paninski L, Fellows MR, and Donoghue JP. Brain-machine
        interface: Instant neural control of a movement signal. Nature 416: 141-142, 2002.
Sinclair RJ, and Burton H. Neuronal activity in the second somatosensory cortex of monkeys
        (Macaca mulatta) during active touch of gratings. J Neurophysiol 70: 331-350, 1993.
Sinclair RJ, and Burton H. Neuronal activity in the primary somatosensory cortex in monkeys
        (Macaca mulatta) during active touch of textured surface gratings: responses to groove
        width, applied force, and velocity of motion. J Neurophysiol 66: 153-169, 1991.
Soso MJ, and Fetz EE. Responses of identified cells in postcentral cortex of awake monkeys
        during comparable active and passive joint movements. J Neurophysiol 43: 1090-1110,
        1980.
Strick PL, and Preston JB. Two representations of the hand in area 4 of a primate. II.
        Somatosensory input organization. J Neurophysiol 48: 150-159, 1982.
Taylor DM, Tillery SI, and Schwartz AB. Direct cortical control of 3D neuroprosthetic
        devices. Science 296: 1829-1832, 2002.
Taylor DM, Tillery SI, and Schwartz AB. Information conveyed through brain-control: cursor
        versus robot. IEEE Trans Neural Syst Rehabil Eng 11: 195-199, 2003.
Tillery SI, Soechting JF, and Ebner TJ. Somatosensory cortical activity in relation to arm
        posture: nonuniform spatial tuning. J Neurophysiol 76: 2423-2438, 1996.
Twitchell TE. Sensory factors in purposive movement. J Neurophysiol 17: 239-252, 1954.
Velliste M, Perel S, Spalding MC, Whitford AS, and Schwartz AB. Cortical control of a
        prosthetic arm for self-feeding. Nature 453: 1098-1101, 2008.
Warren JP, Santello M, and Helms Tillery S. Effects of fusion between tactile and
        proprioceptive input on tactile perception. PLoS ONE 6: 2011.
Wise SP, and Tanji J. Neuronal responses in sensorimotor cortex to ramp displacements and
        maintained positions imposed on hindlimb of the unanesthetized monkey. J Neurophysiol
        45: 482-500, 1981.

The stimulus coding properties of many channels in the somatosensory system have been carefully measured and quantified. These experiments provide good detail about the physical parameters that drive the responses of each channel. For cutaneous sensory receptors, a long experimental tradition has analyzed the spatiotemporal response properties of sensory channels to stimulation with punctate probes, oriented bars, textures, and gratings (Bensmaia et al. 2008a; Bensmaia et al. 2008b; Sinclair and Burton 1993; Sinclair and Burton 1991).
For example, a high-density tactile display has recently been used to characterize the fine details of responses from neurons in somatosensory cortex (Pei et al. 2011; 2010). For deep systems, characterization has included correlation with changes in joint angle or arm configuration, applied force or perturbation force, and responses to sinusoidal or random force application. Proprioceptive parameters have also been decoded from neurons with tactile responses. For example, neurons in SI with cutaneous receptive fields on the skin encode the static posture of the arm expressed in terms of shoulder and elbow joint angles (Cohen et al. 1994; Kalaska et al. 1990; Prud'homme and Kalaska 1994; Tillery et al. 1996). This finding is not surprising given that skin strains manually introduced at digital joints were sufficient to induce proprioceptive psychophysical illusions (Cordo et al. 2011; Edin 1992; Edin and Abbs 1991; Edin and Johansson 1995). In principle, it should be possible to decode the characteristics of a variety of peripheral somatic stimuli from their associated responses in sensory cortex. This would echo the successes of decoding volitional movement direction from the
activity of neurons in primary motor cortex (Carmena et al. 2003; Georgopoulos et al. 1983; Georgopoulos et al. 1982; Georgopoulos et al. 1984; Georgopoulos et al. 1986; Serruya et al. 2002; Taylor et al. 2002; 2003; Velliste et al. 2008). Decoding grasped objects from neural firing patterns represents a crucial first step in encoding somatic sensation for neuroprosthetics. Thus, the goal of this work is to disentangle the various elements of external stimuli that drive somatosensory neural firing during naturalistic reach-to-grasp movements. This goal is not trivial: cortical recording experiments have revealed a daunting complexity in the neural response to a variety of experimental tasks. From some of the earliest recordings in somatosensory and motor cortical areas in non-human primates, investigators have reported substantial overlap in response properties between these two areas. Neurons in primary motor cortex (MI) naturally encode movement, but a portion of neurons in SI also alter their firing rates prior to movement and appear to have movement-related activity that is similar to activity in MI neurons (Fromm and Evarts 1982; 1977; Fromm et al. 1984; Soso and Fetz 1980). At the same time, a portion of MI neurons, particularly in the rostral bank of the central sulcus, have been shown to have tactile responsiveness (Evarts and Fromm 1977; Strick and Preston 1982). One might argue that the motor-like responses in SI follow from the slowly-adapting type II (SAII) responses described by Edin et al., but it might also be the case that there is an element of efference copy in the activity of these neurons. Likewise, sensory responses in MI might participate in cutaneous reflex loops, but might also enter into the perception of tactile events.
Our primary approach to this question has been to attribute consistent and repeatable patterns of neural activity to specific elements of the sensory input that occurs in natural haptic manipulations (Rincon-Gonzalez et al. 2011). We identified cortical neurons with cutaneous receptive fields on the hand and, by comparing firing rates during movements with similar hand motion but very different tactile experiences, were able to show that many cutaneous neurons are significantly and repeatably modulated not only by direct contact, but also by other stimuli such as the movement of the hand itself. We recorded the responses of 285 single units in SI with cutaneous receptive fields on the dorsal or volar surfaces of the palm and digits in monkeys (Macaca mulatta) trained to reach for and grasp real (physical) objects presented in the workspace, while visual-only trials (object presented only in the visual display) were randomly inserted into the task. This allowed us to unambiguously distinguish neural activity resulting from tactile interaction (“contact-driven”) from activity that was not driven by contact with the grasp object (“movement-driven”). We identified neurons that encoded only contact-driven stimuli, neurons that encoded only movement-driven stimuli, and neurons that encoded a combination of the two. We also found that signals arising from these two modalities appeared to combine linearly in the firing of SI neurons.
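The comparison logic described above can be sketched as follows. This is a minimal illustration only: the function name, the per-phase rate representation, and the decision rule are assumptions, and `is_modulated` stands in for whatever statistical test is actually applied (e.g., a rank-sum test at a chosen significance level), not the paper's procedure.

```python
def classify_unit(grasp_rates_physical, grasp_rates_visual, hold_rates,
                  is_modulated):
    """Toy unit classification from per-trial firing rates (Hz).

    grasp_rates_physical / grasp_rates_visual: Grasp-phase rates on
    physical and visual-only trials; hold_rates: Hold-phase baseline.
    is_modulated(a, b) is a placeholder for a proper statistical test.
    """
    # Contact-driven component: firing differs between physical and
    # visual-only grasps, which share kinematics but not touch.
    contact = is_modulated(grasp_rates_physical, grasp_rates_visual)
    # Movement-driven component: visual-only grasps (no contact) still
    # differ from the stationary Hold baseline.
    movement = is_modulated(grasp_rates_visual, hold_rates)
    if contact and movement:
        return "mixed"
    if contact:
        return "contact-driven"
    if movement:
        return "movement-driven"
    return "unclassified"
```

With a crude mean-difference criterion substituted for the statistical test, a unit firing strongly only when the physical object is touched comes out contact-driven, while one firing equally in both grasp conditions but above baseline comes out movement-driven.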
METHODS

The key element in the experiments described here is the separation of somatosensory signals arising from self-movement from those arising from contact. To achieve this, we used an immersive virtual environment.

The experiments described here were carried out on two male rhesus macaques (Macaca mulatta; monkey F, 11 kg; monkey I, 6.4 kg). One of the animals reported here (F) has had stimulation arrays implanted into the hand area of SI mapped in these experiments and is currently involved in somatosensory stimulation experiments. The second animal (I) suffered a clinical event and was euthanized according to approved methods. For animal I, MR images were obtained at Barrow Neurological Institute, and the data were imported into Monkey Cicerone© (Miocinovic et al. 2007) for surgical planning and penetration analysis. All experimental protocols were approved and monitored by the Arizona State University Institutional Animal Care and Use Committee and conformed to the “Guide for the Care and Use of Laboratory Animals” (National Research Council, 1996).

Behavioral Task

Experimental Setup

Fig. 1 illustrates the overall experimental setup used in this study. Two male rhesus macaques were trained to perform a behavioral task while seated in a restraining chair with the head fixed. The left arm was restrained throughout the task. A mirror was located 4 inches in front of the monkey at a 45-degree angle to reflect the screen image from a 3-dimensional (3D) monitor (SeeReal Technologies) mounted horizontally and directly above the seating area. The monitor displayed the robot-enhanced virtual reality environment (reVRE) and provided all visual cues during the task, including a virtual hold pad, hand, and grasp objects. Hand position and kinematics were captured using an active-marker motion capture system (Phasespace, Inc.). Markers were placed at the distal phalange of each finger, along with a triad on the back of the hand, and these were sampled at 100 Hz.
At no time could the animal see its hand or the objects being presented in the workspace immediately behind the mirror. Grasp objects were presented at a single location, near the body midline and at shoulder height, by a 6-axis robotic arm (VS-6556-G, Denso Robotics) fitted with a pneumatic tool changer (QC-11, ATI Industrial Automation) on the end effector. A 6 degree-of-freedom (DOF) force and torque sensor (Mini85, ATI Industrial Automation), also mounted on the robot’s end effector, sensed contact events with the grasp objects.

Task Time Line

A trial began when a subject placed its right hand on a 4-in square hold pad located at mid-abdominal height. A small touch sensor (Touch Mini v1.2, Infusion Systems) mounted directly beneath the pad monitored hand contact. The subjects always operated in the virtual reality environment, and were trained
to recognize cues on the screen, which included a virtual hold pad, object, and hand. The latter was visible and tracked at all times during the experiment. The virtual hold pad appeared in the reVRE after an inter-trial interval of one second, but trials were self-paced and no explicit instruction was given to initiate a trial repetition. Contact with the physical hold pad, indicated by a color change from red to green, was required for a randomized period of 1500-2000 ms. The trial was immediately stopped if the subject failed to maintain constant contact with the hold pad at any time during the hold period. Once the hold period had ended, the virtual hold pad disappeared and an audible go cue was played, signaling that the subject was free to reach for the object. No maximum time limit was set for the subject to react to the go cue. The reach portion of the trial began at the moment the hand left the hold pad (Hold Pad Release, HPR), and the subject was then required to complete the trial within 5 seconds of HPR. Successful completion of a trial was signaled by an audible success cue and removal of both the physical and virtual objects. A juice reward was delivered approximately 500 ms after these cues. Failed trials were signaled by a distinct audible cue and immediate removal of both the physical and virtual objects. No juice reward was delivered for failed trials.

Grasp Objects

Two objects were used in this task to elicit distinct hand postures during grasping. The small object required a grasp aperture (linear distance between the distal phalanges of the thumb and index finger) of 1.5 in, while the large object required an aperture of 3.0 in. Subjects were trained to grasp the objects using a precision grip, with strict requirements on finger placement to encourage maximum engagement of the glabrous digital surface with the object.
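The hold-period rule above (randomized 1500-2000 ms requirement, immediate abort on any break in hold pad contact) can be sketched as follows. The function names and the sample format of the touch-sensor stream are assumptions for illustration, not part of the actual task software.

```python
import random

def draw_hold_duration_ms(rng=random):
    """Randomized hold requirement, uniform over 1500-2000 ms."""
    return rng.uniform(1500, 2000)

def hold_satisfied(contact_samples, hold_ms):
    """contact_samples: (t_ms, in_contact) pairs sorted by time, with
    t = 0 at initial hold pad contact. Returns True once contact has
    been maintained continuously past hold_ms; returns False as soon
    as a sample shows broken contact (the trial is stopped)."""
    for t_ms, in_contact in contact_samples:
        if not in_contact:
            return False        # contact broken during the hold period
        if t_ms >= hold_ms:
            return True         # hold completed -> go cue can play
    return False                # stream ended before the hold elapsed
```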
Each object was equipped with three small (∅0.75 in), thin (0.04 in) touch sensors (Touch Mini v1.2, Infusion Systems) mounted to the surface at the desired contact locations of the thumb, index, and middle fingers. Successful completion of the physical task required simultaneous contact with at least the thumb and index sensors for a period of 250 ms. In visual-only trials, the animal was required to intersect a virtual rendering of the object using only visual cues. Successful completion of a visual-only trial required the virtual hand to make virtual contact with the virtual object in the reVRE. The visual-only task was not modeled in sufficient detail to require contact of specific virtual digits with virtual touch sensors; rather, an object collision event (a virtual model mesh collision detected by the VR software) was required for a period of 250 ms. Virtual contact was signaled by a change in the color of the virtual object from white to green.
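The success criterion for physical trials (simultaneous thumb and index sensor contact sustained for 250 ms) can be sketched as below. This assumes, for illustration only, that the two sensor streams share timestamps on a common clock; the function name and data format are hypothetical.

```python
def grasp_success(thumb, index, required_ms=250):
    """thumb, index: (t_ms, in_contact) samples from the object-mounted
    touch sensors, sorted by time and sampled on a shared clock.
    Success requires simultaneous contact sustained for required_ms."""
    onset = None                         # start of current joint contact
    for (t, th), (_, ix) in zip(thumb, index):
        if th and ix:
            if onset is None:
                onset = t
            if t - onset >= required_ms:
                return True              # 250 ms of joint contact reached
        else:
            onset = None                 # any break restarts the clock
    return False
```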
Block Design

Object presentations were either visual-only (a virtual rendering of the physical object) or physical (a physical object was also presented, in register with the virtual object, within reach in the workspace), with visual-only trials inserted randomly into blocks of object size in a ratio of 60:40 (physical:visual-only). Blocks were limited to 10-20 trials in order to maximize the possibility that all task conditions would be covered while maintaining isolation of the unit being recorded. For each unit analyzed, at least 60 successful task repetitions were completed, which guaranteed at least 15 trials in each of the four conditions: large physical, large visual-only, small physical, small visual-only. We took two measures to ensure that the animal did not know until the end of the reach whether a trial was physical or virtual. First, in every trial we ran the robot through a sequence of movements similar to the sequence in the physical trials, but in the visual-only trials the sequence ended with the object well out of the animal’s reach. Second, we broadcast audible white noise throughout the experiment to mask extraneous auditory cues. An analysis of the reach kinematics showed that the overall hand trajectory was indistinguishable between physical and visual-only trials up to the moment of object contact, with the exception of more corrective movements in the visual-only trials (see Results).

Definition of Task Phases

Four task phases were defined for every repetition of the behavioral task: Hold, Reach, Contact, and Grasp (see Fig. 1B). The Hold phase was defined as the interval [-500, -50] ms with respect to HPR, and was therefore the same duration for all trials. The short offset from HPR (50 ms) excluded neural activity caused by slight pre-reach anticipatory movements or by tactile responses caused by changes in contact with the hold pad itself.
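The randomized block structure described under Block Design (blocks of 10-20 trials at one object size, physical and visual-only trials mixed 60:40 in random order) can be sketched as follows; the function name and the trial representation are assumptions for illustration.

```python
import random

def make_block(object_size, rng=random):
    """One block of 10-20 trials for a single object size ('small' or
    'large'), with physical and visual-only presentations interleaved
    randomly in an approximately 60:40 ratio."""
    n = rng.randint(10, 20)              # block length, 10-20 trials
    n_physical = round(0.6 * n)          # ~60% physical presentations
    trials = ([(object_size, "physical")] * n_physical
              + [(object_size, "visual-only")] * (n - n_physical))
    rng.shuffle(trials)                  # random insertion of visual-only
    return trials
```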
Definition of the remaining phases depended on the task and the analysis. For single-trial results, the task phases were tied directly to events, whereas in the grouped-trial analyses, phases were defined by the distribution of events across blocks of trials. These definitions diverged after the Hold phase. For single-trial analyses, the Reach phase extended from 50 ms after HPR until the first object contact (FOC) event. The Grasp phase was defined as the interval from FOC until the trial end. In physical trials, the FOC event was defined as the moment that the torque on the wrist sensor exceeded 0.2 N-m. In visual-only trials, FOC was the moment of collision between the hand mesh and the object mesh. Since this event relied on the virtual environment, which was updated at the frame rate (67 frames/s), these events had a temporal resolution no finer than about 15 ms. Thus, the FOC event for visual-only trials was temporally uncertain at the single-trial level.
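The single-trial event logic above can be sketched as follows. This is a minimal illustration, not the authors' code: the torque threshold (0.2 N-m) and the fixed Hold window come from the text, while function names, the sampling layout, and the trial-end time are assumptions.

```python
import numpy as np

TORQUE_THRESHOLD_NM = 0.2  # wrist-torque criterion for first object contact (FOC)

def first_object_contact(torque_nm, sample_times_ms):
    """Return the time (ms after hold-pad release) of the first torque sample
    exceeding 0.2 N-m, or None if the threshold is never crossed.
    Visual-only trials would use the mesh-collision event instead,
    at roughly 15 ms resolution."""
    torque_nm = np.asarray(torque_nm)
    idx = int(np.argmax(torque_nm > TORQUE_THRESHOLD_NM))
    if torque_nm[idx] <= TORQUE_THRESHOLD_NM:  # argmax of all-False is 0
        return None
    return float(sample_times_ms[idx])

def single_trial_phases(foc_ms, trial_end_ms):
    """Phase boundaries for one trial, per the single-trial definitions:
    Hold is fixed relative to HPR; Reach runs from HPR + 50 ms to FOC;
    Grasp runs from FOC to the trial end."""
    return {
        "Hold": (-500.0, -50.0),
        "Reach": (50.0, foc_ms),
        "Grasp": (foc_ms, trial_end_ms),
    }
```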
For the analysis of blocked data, we defined task phases in a way that allowed them to be readily extracted from blocks of trials. Specifically, task phases were defined by the distribution of events in a group of related trials. For the Reach phase, the duration was bounded on one end by the end of the Hold phase (HPR + 50 ms) and on the other by the shortest single-trial Reach phase (earliest first object contact) of the trial group. This definition ensured that only non-tactile task activity (reaching, hand shaping) would fall in the Reach phase since, by definition, the subject’s hand had left the hold pad but had not yet contacted the grasp object. The Contact phase for a given unit was defined as the interval containing the FOC events of all of the physical trials (earliest FOC to latest FOC). Individual trials having a Reach phase shorter than 300 ms or longer than 700 ms were discarded in order to remove outliers from the Contact phase. Rather than relying on the limited time resolution of the object contact event in visual-only trials, we assumed that the distribution of contact times was similar to that of the physical trials, since the visual-only trials were inserted randomly during a block of trials for a given object size, and thus used the same bounds determined from the physical trials. Finally, the Grasp phase for both physical and visual-only trials began at the end of the Contact phase (last FOC event) and lasted until the time of the success cue, for a total trial length of 1200 ms.
Surgical Procedures and Recording
Early in the process of behavioral training, three titanium head-holding pedestals (Thomas Recording GmbH) were surgically fixed to the skull. A period of at least 2 months was allowed before restraining the head to allow for sufficient healing and osseointegration of the bone screws.
Once a monkey had reached an acceptable level of competency in task performance (minimum 2 hours of work at 85% correct), a recording chamber was surgically implanted over the primary sensory cortex contralateral to the working hand. In monkey F, the rectangular chamber was fabricated from titanium, with inner dimensions 30 mm x 20 mm. The long axis of the chamber was oriented at an angle of approximately 15 degrees counterclockwise with respect to the rostral-caudal axis, and the stereotaxic location of the chamber center was approximately 18.1 mm anterior to interaural zero and 18.8 mm lateral to the midline. In monkey I, the chamber was fabricated from a medical grade, biocompatible polyetheretherketone (PEEK) polymer (PEEK-OPTIMA®, Invibio™) to allow a more sophisticated design and to facilitate fabrication (McAndrew et al.). The inner wall of this chamber had a circular cross-section (∅20 mm) and the stereotaxic location of the chamber center was approximately 17.0 mm anterior to interaural zero and 18.4 mm lateral to the midline. In both cases, the chambers were oriented such that the penetration direction was aligned with stereotaxic vertical. Parylene-coated tungsten microelectrodes (1.0 MΩ, Harvard Apparatus) were driven into the cortex using a microdrive (NAN-CMS, NAN Instruments Ltd.) mounted to the chamber. The time of occurrence of action
potentials from isolated units was recorded, and the instantaneous firing rate was calculated using binned time intervals of 20 ms, smoothed with a triangular convolution kernel (Nawrot et al. 1999). Recording location (stereotaxic coordinates, recording depth), cutaneous receptive field location and real-time neural response attributes were used to guide the selection of recording sites to the hand representation of SI, similar to the procedure described by Mountcastle et al. (Mountcastle et al. 1990). Recording began in posterior SI, where unit responses to passive joint flexion and extension were common and cutaneous responses were rarely observed. Motor responses were not observed in this region, and any electrode penetration deeper than 3.0 mm invariably encountered white matter. Subsequent recording sites were advanced rostrally until motor responses were consistently observed, thus approximating the location of the rostral bank of the central sulcus. In addition, for monkey I, MR imaging data were analyzed using the Monkey Cicerone© software package (Miocinovic et al. 2007) to reconstruct the chamber location and track electrode penetrations. Post-mortem examination verified that the bulk of the penetrations were in the hand area of the post-central gyrus, in agreement with the reconstruction using Monkey Cicerone (see Fig. 2). Finally, intracortical microstimulation (ICMS) experiments were carried out in both subjects. We observed several instances of withdrawal reactions of the hand on stimulation, but only with high-amplitude ICMS (> 90 µA) did we observe any movement twitches. These twitches occurred while stimulating in the rostral section of the recording sites in monkey I (see Fig. 2C). The initial search for task-related units began in the caudal chamber and proceeded rostrally until motor responses were encountered.
These were readily identified from audio and visual feedback of neural activity correlated with subjects’ volitional arm and hand movements; motor responses were identified using simple observational criteria. Upon isolating a task-related unit response, the subject’s hand and distal limb were searched for cutaneous or deep receptive fields using mechanical stimuli applied to the skin (gentle pressure, von Frey hairs, light brushing) or by passive flexion and extension of the digits and wrist. The subjects were accustomed to the procedure and remained passive throughout the process. In many cases, no manner of cutaneous stimulation or joint manipulation produced a significant or correlated neural response. Such units were disregarded. Some units had vigorous responses during movement. Verification of a motor response came when the subject’s hand was released, at which time vigorous and sustained neural activity correlated with volitional hand or arm movement resumed. Units with responses during movement but no clear responses to somatic stimulation were not analyzed in this study. When neurons with these characteristics were consistently encountered in a particular electrode penetration, that chamber location was disregarded
for the remainder of the experiment and its location was not revisited in subsequent recording sessions. After establishing the approximate boundary separating primary motor and sensory cortices, subsequent recording penetrations were made 1-2 mm caudal to the boundary. Lateral and medial boundaries of the SI hand representation were established by identifying sensory cutaneous receptive fields on the face and distal forearm, respectively.
Sensory Receptive Field Classification
All units retained and analyzed for this study had clearly identifiable cutaneous receptive fields on the volar or dorsal surface of the hand or digits. Cutaneous receptive fields were identified using mechanical stimuli and passive joint manipulation, as described above. Units whose only perceptible response was to passive joint movement were discarded. Receptive fields were classified according to size (area) or extent (joint crossings) into six categories, with sizes ranging from less than a single digital phalange (class 1) to fields covering most of the palm and digits (class 6). Receptive fields extending across a joint were often assigned to the next highest category even if their actual area did not cover the entirety of the phalanges involved. Units were not distinguished on the basis of slowly or rapidly adapting responses.
Firing Rate Analysis
Individual task trials were grouped into one of four primary task conditions based on the type of object presentation: Small Physical (SP), Large Physical (LP), Small Visual-Only (SV) and Large Visual-Only (LV). The instantaneous firing rate for individual trials was calculated as described above, and the individual trial firing rates for a given task condition were averaged into a single mean firing rate.
Unit Response Classification
A unit response was considered task-related if the mean firing rate during any single task phase was significantly different from the mean rate during any other task phase.
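A minimal sketch of this analysis pipeline, binned 20 ms firing rates smoothed with a triangular kernel, then an unbalanced one-way ANOVA on rate bins grouped by task phase, might look as follows. The kernel width and all names are illustrative assumptions, not values from the study:

```python
import numpy as np
from scipy.stats import f_oneway

def binned_rate(spike_times_s, t_start, t_stop, bin_s=0.020, kernel_bins=5):
    """Bin spike times into 20 ms intervals and smooth with a unit-area
    triangular kernel (after Nawrot et al. 1999). kernel_bins is an
    illustrative choice, not a parameter reported in the study."""
    edges = np.arange(t_start, t_stop + bin_s, bin_s)
    counts, _ = np.histogram(spike_times_s, bins=edges)
    tri = np.bartlett(kernel_bins)
    return np.convolve(counts / bin_s, tri / tri.sum(), mode="same")

def is_task_related(rate_bins_by_phase, alpha=0.05):
    """Unbalanced one-way ANOVA on firing-rate bins grouped by task phase:
    the unit counts as task-related if any phase mean differs significantly."""
    _, p = f_oneway(*rate_bins_by_phase.values())
    return bool(p < alpha)
```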
Statistical significance (α = .05) was assessed using an unbalanced ANOVA test of mean firing rate bins grouped by task phase. Only task-related units were analyzed in this study. Task-related units were assigned primary and secondary response traits based on patterns of statistically significant firing rate modulation. Neurons that had responses occurring during the contact interval in Physical trials were labeled contact-driven. Neurons with responses during the movement or contact intervals that were similar in the Physical and Visual-Only trials were defined as movement-driven. Neurons that had only movement-driven or contact-driven responses were called Simple. Neurons in
which the responses had both movement- and contact-driven characteristics were labeled Mixed. In the case of neurons with Mixed responses, an attempt was made to discern whether the contact-driven or movement-driven response involved a larger change in firing rate, and that response was then termed the Primary response. Primary response types are indicated here with a capital letter, either C for Contact-driven or M for Movement-driven. The secondary response type is indicated with a lower-case letter.
Statistical Methods
Statistical comparisons of grouped data throughout the study were evaluated using ANOVA at the 95% confidence level (α = 0.05). Multiple comparisons of grouped data used Tukey's Honestly Significant Difference criterion (Tukey-Kramer) to provide an upper bound on the probability that any single comparison would be incorrectly found significant.
RESULTS
Neural Population Analyzed
A total of 371 single units with tactile receptive fields on the hand were recorded in SI in two male rhesus monkeys (monkey F: 63%, monkey I: 37%). Of these, 285 units (77%) exhibited statistically significant task-related activity and were included in the analysis (monkey F: 59%, monkey I: 41%). The remaining 86 units were not significantly modulated by the experimental task and were excluded. We evaluated recording locations by a combination of physiological, imaging, and post-mortem anatomic criteria. A reconstruction of the recording sites for monkey I is shown in Fig. 2. The recording sites determined by physiological criteria were confirmed by both the stereotaxic imaging shown in Fig. 2A and the clear location of penetrations on the surface of the hemisphere, shown in Fig. 2B. These penetrations in monkey I came from a restricted portion of the chamber spanning 6 mm medial-lateral and 3-4 mm rostral-caudal, which agrees with the dimpling observed on the surface of the cortex.
Neurons from monkey F came from a similarly restricted region of the chamber, spanning 3-4 mm rostral-caudal and 2-3 mm medial-lateral. In both cases, anatomic and physiological indicators showed that the bulk of the neurons reported here came from primary somatosensory cortex. It is likely that we recorded some neurons in each of areas 4 and 5, but we found no trends relating the classifications of the neurons given below to their distribution within the recording chambers of either animal. Thus, we are confident that our observations capture properties of SI.
Kinematics of the Hand and Arm
Analysis of a subset of trials showed that subjects’ hand and arm kinematics up to the moment of object contact were independent of whether the trial was physical or visual-only. That is, hand and arm movement kinematics were highly stereotyped during both physical and visual-only tasks. The analysis further showed that subjects generally stopped moving shortly after the collision event with little or no additional movement, although enough trials had corrective movements that, on averaging, a second velocity peak was clearly visible in the visual-only trials. Examples of trajectories to the small physical (SP) and small visual-only (SV) targets are shown schematically in Fig. 3A and B. The trajectory shown in B is an example of a worst-case correction occurring in a visual-only trial when the animal does not encounter a physical object. When examining the mean trajectories over a day’s recordings (panels C and D), a secondary velocity peak does occur in the visual-only trials, corresponding to some degree of corrective movement in those trials, although this secondary peak is typically minor. Importantly, the movements for the two tasks are comparable up to the moment of contact.
Simple Responses
Contact-Driven Responses
Figure 4A shows the responses of a unit with a cutaneous receptive field on the volar surface of the distal phalange of the thumb that appears to be driven by tactile interaction with the grasp objects. During physical task conditions (panels 1, 3, and 5) the firing rate was unchanged through the Hold and Reach phases, but this neuron responded briskly at object contact and throughout the haptic task phases (Contact, Grasp). The tactile response observed during physical trials was absent during the visual-only trials (panels 2, 4, and 6), during which this neuron exhibited no significant modulation of neural activity during any task phase.
Thus, firing rates were statistically indistinguishable from the non-haptic phases of the physical tasks (panel 6). Based on these observations, the primary response trait for this unit was classified as contact-driven, and no secondary response trait was assigned. Although this particular unit did not demonstrate significant differential modulation of activity with respect to object size (panels 5 and 6), 91% (81/89) of Simple contact-driven responses did show modulation with respect to object size. Simple contact-driven responses accounted for 31% (89/285) of all task-related unit responses recorded in SI.
Movement-Driven Responses
Figure 4B illustrates a response complementary to that of the contact-driven unit of Fig. 4A. The receptive field of this neuron spanned the full volar surface of the intermediate and distal phalanges of the index and middle digits. The response to physical and visual trials was identical for a given object size (panels 7 and 8). Background firing rate during Hold was nearly zero (< 1 spike/s), yet within 50 ms of hand departure from the hold pad the unit fired with increasing frequency during the Reach phase before abruptly silencing approximately 100 ms prior to object contact. Neural activity during Contact and Grasp then returned to the background rate of the Hold phase in all task conditions. Based on these observations, the primary response trait for this unit was classified as movement-driven, and no secondary response trait was assigned. This unit demonstrated significant size-dependent modulation of activity, which was observed in 34% (11/32) of all Simple movement-driven responses. The difference is evident when comparing firing rates across object sizes within a common presentation type (e.g., Small Physical vs. Large Physical, panel 5). In this case, the SP firing rate was greater than that of the LP task, but both positive and negative correlations of object size and firing rate were observed in the neural population. Out of 285 task-related neurons, 32 (11%) responses recorded in SI were classified as Simple movement-driven.
Mixed Responses
Contact-Movement Responses
Figure 5A shows the response of a unit with a cutaneous receptive field on the volar surface of the proximal phalange of the index finger. The response is similar to that of the contact-driven unit discussed above, but with the addition of significant pre-contact neural activity in the SP task spike rasters (panel 1). This activity is also evident in the mean firing rate during Physical trials (both LP and SP) (panel 5) and was effectively isolated in the SV task (panel 6).
The response of this unit was primarily driven by haptic interaction with the grasp objects and was therefore classified as contact-driven. The smaller pre-contact activity evident in both physical and visual trials was classified as a secondary movement-driven response trait, indicating that this component of the overall firing rate was driven by a non-contact stimulus. A slight majority (52%, 149/285) of all task-related unit responses were Mixed, and 24% (69/285) were classified as Contact-movement. The Contact-movement unit in Fig. 5A also demonstrated significant size-dependent modulation of activity, which was observed in 29% (20/69) of all Contact-movement responses. The difference is evident when comparing firing rates during the Reach phase across object sizes within a common presentation type (e.g.,
SP vs. LP, panel 5), and is isolated in the response to visual-only tasks (panel 6). The unit response to the Small Physical task was significantly greater than the response to the Large Physical task. In the comparison of visual-only task responses (panel 6) we also observed a small but statistically significant “late” movement-driven response in the LV task that was masked in the mean firing rate of the LP task (panel 5). The response was late only with respect to the SP task, a commonly observed feature of the unit responses analyzed in this study. Both monkeys took slightly longer (50–100 ms) to reach for and contact the large object. This observation was true for both physical and visual task conditions and might be explained by the fact that successfully grasping the large object required the wrist to undergo greater extension and the grasp aperture to widen considerably more than for the small object. Given the timing of this response, beginning near the end of the Reach phase, this firing may be related to closing the hand around the smaller object.
Movement-Contact Responses
Figure 5B shows the response of a unit with a cutaneous receptive field spanning the volar surface of the distal and intermediate phalanges of digits two through four. In both physical and visual trials a strong pre-contact response was observed. Physical trials featured a strong tonic response during Contact and Grasp (panels 1 and 3). In visual-only trials, the tonic response during the haptic task phases was absent and the firing rate was essentially zero (< 1 spike/s) during Hold, Contact and Grasp (panels 2 and 4). The larger pre-contact response was assigned as the primary response trait and designated as movement-driven. The comparatively smaller magnitude response during the haptic task phases was assigned as the secondary response trait and designated as contact-driven. Movement-contact responses accounted for 28% (80/285) of all task-related responses.
The Movement-contact (Mc) unit in Fig. 5B also demonstrated significant size-dependent modulation of activity, which was observed in 40% (32/80) of all Movement-contact responses. The difference is evident in the primary movement-driven component of the response (panels 5 and 6), where the response during the Reach phase of the SP task was again significantly greater than the response in the LP task. Also notable in the response is the previously discussed delay of the large object tasks during the Reach phase.
Odd Responses
Finally, a small group of neurons (5%, 15/285) exhibited a clear modulation of firing rate during the task but did not fit the contact- or movement-driven response classifications described above. We simply label these as Odd responses. Figure 6A shows the response of an Odd unit with a cutaneous receptive field on
the volar surface of the distal phalange of the middle finger. The response of this unit was preferentially elicited for the small object, with almost no modulation during large object tasks. The pre-contact activity during small object tasks began immediately after the hand left the hold pad and quickly ceased at object contact (panels 1, 2, and 7). Leaving aside the lack of response to the large object, such a response would normally have been classified as Simple movement-driven if not for a sudden transient increase of approximately 40% in the SV firing rate (with respect to the SP rate) at the moment of expected contact with the object. The firing rate then declined at a rate exceeding that of the SP task, undershooting the SP response before silencing completely near the end of the Grasp phase. Further analysis showed that the transient response occurred when a physical trial was followed by a virtual trial, but not when a virtual trial followed another virtual trial. Figure 6B shows the analysis of another Odd unit with a cutaneous receptive field on the volar face of the intermediate phalange of the middle finger. Spike rasters for the small and large physical task conditions show an almost complete lack of neural activity over the course of 80 trials (panels 1, 3, and 5). This appears to indicate that the unit response is driven by neither contact- nor movement-related stimuli, but recall that all units recorded and analyzed for this study exhibited unambiguous responses to cutaneous stimulation on the hand. In contrast, the response is strongly modulated during visual-only trials early in the Contact phase (panels 2, 4, and 6), but silenced before the start of Grasp. Note also that the response is not differentially modulated with object size. Odd responses were considerably more varied than the comparatively stereotyped contact-driven and movement-driven classes into which 95% of all task-related units were assigned.
Still, a common characteristic of these responses was the appearance of significant neural activity with no apparent causal stimulus.
Response Traits and Classes Summary
Table 1 details the distribution of response traits and classes observed in the neural population in SI. A slight majority of units (52%, 149/285) had Mixed responses, divided nearly equally between primarily contact-driven (Cm, 46%) and primarily movement-driven (Mc, 54%) response classes. Simple unit responses (42%, 121/285) were mostly contact-driven (74% C vs. 26% M). Odd responses comprised 5% (15/285) of all task-related responses. Fifty-five percent (158/285) of all primary response traits were contact-driven, and 46% (69/149) of all secondary response traits were contact-driven. The distribution of primary response traits was skewed toward contact-driven responses in monkey I (78% contact vs. 17% movement), but the two traits were
more equally represented in the sample from monkey F (40% contact vs. 54% movement). The distribution of secondary response traits in monkey I was also skewed, but in favor of movement-driven responses (26% contact vs. 74% movement). Secondary traits were more equally distributed in monkey F (65% contact vs. 35% movement). We expect that these differences are largely due to slight differences in the distribution of recording sites between the two monkeys, and discuss this further below.
Receptive Field Characterization
Figure 7 details the location, size classification and recording depth of all task-related units recorded in SI. Panel 7A shows the location of receptive fields for all units mapped prior to recording in the behavioral task. The vast majority of cutaneous receptive fields (81%, 231/285) were located on the volar surface of the palm and digits, most densely concentrated on the distal phalanges and metacarpophalangeal (MCP) joints of digits one through three. Receptive fields on the dorsal hand surface (8%, 24/285) were concentrated along the ulnar face of the thumb and radial face of the first digit, especially near the MCP joint. Receptive fields classified as both (11%, 30/285) occupied both volar and dorsal skin surfaces (see also panel 7C). Panel 7B shows the distribution of receptive field size classes in order of increasing area. A majority of cutaneous receptive fields (85.3%, 243/285) were small (classes 1 and 2), occupying no more than two digital phalanges or an equivalent area on the palm. Large receptive fields extending across more than two digits (classes 5 and 6) were uncommon (7%, 21/285) and rarely demonstrated clear or significant task tuning. Panel 7D shows the distribution of recording depths of all task-related units recorded in SI. Depths were distributed about two primary modes with approximate means of 1.27 ± 0.57 mm (75%, 215/285) and 4.51 ± 0.79 mm (25%, 70/285).
Interpretation of these numbers is difficult because electrode penetrations were oriented in stereotaxic vertical, rather than perpendicular to the cortical surface. While the same general trend of small receptive field size in the complete sample was true for all individual response classes (Fig. 8, A-D, center panels), a few statistically significant (α = .05) differences were observed. Among individual classes, the receptive fields of Movement-contact cells were somewhat larger than those of both Contact (∆meanSizeClass = 0.29, p = .015) and Contact-movement units (∆meanSizeClass = 0.42, p < .001). More generally, units with primary movement-driven response traits had slightly larger receptive fields than those with contact-driven primary response traits (∆meanSizeClass = 0.30, p < .001).
Several significant differences in mean recording depth (Fig. 8, A-D, right panels) were observed among individual response classes. Contact units were recorded at comparatively shallower depths than both Contact-movement units (∆meandepth = 0.74 mm, p < .001) and Movement-contact units (∆meandepth = 1.16 mm, p < .001). Movement units were encountered at comparatively shallower depths than Movement-contact units (∆meandepth = 1.0 mm, p = .002). The general trend is further clarified by a comparison of primary response traits: units with contact-driven primary responses were recorded at comparatively shallower depths than units with movement-driven primary responses (∆meandepth = 1.60 mm, p < .001). These data indicate that contact-driven responses were characteristic of neurons located at relatively shallow cortical depths, while movement-driven and, deeper still, Odd responses were more common with increasing depth.
Recording Grid Coordinates
Grid coordinate refers to the alignment grid of the neural recording drive platform used to position and support electrode guide tubes during neurophysiological recording experiments. Alignment holes were spaced at 1.0 mm intervals in a grid pattern and assigned integer coordinates along orthogonal axes corresponding roughly to the caudal-rostral (CR) and medial-lateral (ML) directions. The unique recording chamber geometry in each experimental subject required separate analyses of the statistical dependence of response characteristics on grid coordinates. In neither monkey was the CR or ML coordinate a significant predictor of response class (Monkey F: pCR = .401, pML = .111; Monkey I: pCR = .228, pML = .864).
DISCUSSION
Single units in SI encode multiple sensory phenomena
The major finding of this study is that single units in the hand representation of SI appear to simultaneously encode multiple distinct sensory phenomena in components of their overall firing rate. Contact-driven and movement-driven responses to a stereotyped behavioral task were distinguished and quantified with the aid of a virtual reality simulation by removing the correspondence between the expected and actual task outcome. The results from our sample population show that a majority of single units in the hand representation of SI with cutaneous receptive fields encoded information about both contact-driven and movement-driven sensory modalities. The clear distinction between and quantification of multiple, simultaneous information streams present in the activity of SI cortical neurons has not been previously demonstrated. Although firing rate responses were grouped into a limited number of discrete classes for the purpose of group statistical analysis, the great variety of individual unit responses observed in our sample of the neural population suggests a broad distribution of response types in SI with varying combinations of contact-driven and movement-driven response traits. Additionally, a small percentage of units could not be classified using contact-driven and movement-driven response traits. These units may encode yet another class of sensory information that is actually more widely represented in the neural population but was not adequately captured by the experimental methods of this study.
Limitations of Interpretation
Recording location
Three distinct sensory representations of the body are found within primary somatosensory cortex itself (areas 3b, 1 and 2), each driven by inputs originating from cutaneous and/or deep receptors in varying proportion.
It is also well established that area 4 of primary motor cortex receives significant input from both cutaneous and deep receptors and contains at least two complete sensory representations of the body: a caudal motor area receiving mostly cutaneous inputs and a rostral motor area receiving mostly deep inputs (Lemon and Porter 1976a; b; c; Strick and Preston 1982; Wise and Tanji 1981). Despite these considerations, several lines of evidence strongly support the claim that the overwhelming majority of neurons described in this work were recorded from areas 3b, 1 and 2 of primary somatosensory cortex. As described in the Methods, the criteria used by Mountcastle were used to empirically identify the hand representation of SI during initial recording experiments (Mountcastle et al. 1990). Subsequently, neurophysiological recordings in the same region encountered almost exclusively
sensory responses with small receptive fields on the hand and distal forearm. Additional post-hoc work reinforced this conclusion. Three-dimensional recording chamber reconstructions using MRI verified that SI was the primary recording zone, and ICMS experiments conducted in the recording regions in both subjects produced movement twitches only at large amplitudes of electrical current. We also observed on the cortical surface of monkey I a clear region of penetrations (see Fig. 2). Despite these measures, it is possible that a small proportion of the units analyzed in this study, including some movement-driven neurons, might have been recorded in MI; however, anatomical analyses such as that shown in Fig. 2C indicate that neural response classes were evenly distributed throughout the cortical recording region. Thus, even if some small proportion of the neurons analyzed in this study were indeed recorded in motor cortex, the general conclusion still holds: neurons in SI encode multiple relevant parameters during naturalistic reach-to-grasp movements.
Origin of Responses
Considerable care was taken throughout this study to include data only from single cortical units with cutaneous receptive fields. Despite these measures, the possibility remains that some responses were influenced by inputs from deep receptors, especially considering the extensive pre-cortical divergence and convergence of sensory afferents in brainstem and thalamic nuclei. For example, it is estimated that a single primary afferent from the hand may project to ~1,700 cuneate neurons, while a single cuneate neuron receives projections from ~300 sensory afferents (Johansson and Flanagan 2009). This was also a factor in the decision not to characterize unit responses with respect to peripheral receptor types (depth or adaptation rate).
Such designations are best suited to describing the response of individual receptors, or that of several closely associated receptors of the same type recorded from peripheral afferents. They may have little relevance to cortical responses driven by highly convergent inputs that might represent multiple sensory modalities. Another possibility is that movement-driven response traits (primary or secondary) might actually represent an efference copy of motor commands originating in frontal motor cortical areas (Flanagan et al. 2003). Recent fMRI data suggest that frontal cortex has at least modulatory access to primary somatosensory cortex (Christensen et al. 2007). This explanation cannot be ruled out, although certain aspects of the observed neural responses do not support it. First, in the case of Simple movement-driven responses, we did not observe significant neural activity that could be attributed to secondary movements such as hand trajectory corrections during visual-only trials or hand withdrawal in the late Grasp phase of both physical and visual-only trials. Second, recall that every unit analyzed in this study, even those with Simple movement-driven responses, had a cutaneous receptive field on the hand. Attributing all movement-driven activity to a delayed copy of motor output cannot account for the absence of contact-driven responses during Contact and Grasp for Simple movement-driven units with
cutaneous receptive fields in physical contact with the grasp objects. In the case of Simple contact-driven units, responses were clearly driven by tactile stimuli during Contact and Grasp. Evidence of efference copy might be expected during Reach, or even the late Grasp phase when the hand was removed from the object and returned to the hold pad, but none was observed. Such activity would have been especially evident during visual-only trials, when contact-driven feedback was not present to mask it, but again, none was observed. Nor was such activity observed in Mixed responses. From these observations, one must either conclude that a delayed copy of motor output was actively silenced in just one of the four major response classes (Simple contact-driven) or that the non-contact-driven neural activity observed in the SI population was at least partially attributable to sources other than efference copy.

Simple Responses

Contact-driven responses were the most common trait observed in the neural population (primary: 55%; primary + secondary: 84%). In a certain sense, exteroception was present in exactly 100% of the population sample, since cutaneous receptive fields were identified for all units. Movement-driven traits were also common (primary: 39%; primary + secondary: 64%), indicating that a significant proportion of somatic sensory feedback, as represented in cortex, is multimodal. Perhaps the most unexpected result was the discovery of apparently contact-driven units (those with a clearly identifiable receptive field) exhibiting exclusively movement-driven (Simple movement-driven) responses. It is possible that our cutaneous stimulation induced subtle joint movements, but we specifically selected neurons that did not have a clearly differentiated response to overt joint movements.
Instead, this might imply some form of neural gating, since the cutaneous response so clearly evident when mapping the receptive field was absent in the task response. An alternative explanation is that the mechanical stimulus provided by object contact during the behavioral task was insufficient, or sufficiently different from the stimuli used to identify the receptive field in the first place. However, this explanation does not account for the fact that most of the cutaneous responses identified for use in this study demonstrated extreme sensitivity to even the lightest stimuli, often firing at rates in excess of 100 spikes per second. Thus, any contact with the grasp objects was normally sufficient to evoke a strong neural response.

Mixed Responses

Mixed responses recorded in SI were the rule rather than the exception, comprising 52% of unit responses. Mixed responses with movement-driven traits (primary or secondary) also provide evidence of response gating, since movement-driven responses evident during Reach (physical and visual-only trials) were absent during the haptic task phases of visual-only trials, when subjects made brief kinematic adjustments to complete the transition from the expected physical task to the visual task. As an example, consider the strong
movement-driven response shown in Fig. 5B. A comparison of physical and visual-only trials clearly shows that the movement-driven portion of the overall response occurred before contact with the object. Evidence of response gating is provided by the complete lack of a movement-driven response during the visual-only tasks. During the Reach phase of the task, subjects rapidly extended the arm toward the target object, with an approach time typically in the range of 300 – 400 ms. In visual-only trials the hand would often overshoot the expected object location, after which subjects would quickly move to the appropriate position in space to complete the visual task (see e.g. Fig. 3). Despite the strong movement-driven response during Reach, no evidence of a similar response was observed while adjusting to the visual-only task. The distribution of primary and secondary response traits was highly skewed in monkey I, where 78% of primary and 26% of secondary traits were contact-driven, with 5% of traits classified as "Odd". The distribution in monkey F was somewhat more balanced, where 40% of primary and 65% of secondary response traits were contact-driven, again with 5% of traits classified as "Odd". The cause of this apparent difference is unclear and may simply reflect differences among subjects. We note, however, that the recording region in monkey I was significantly more focused (~12 mm2) than that in monkey F (~24 mm2); the larger region in monkey F may have contributed to a greater diversity of response characteristics. It was also observed that movement-driven response traits were more prevalent with increasing cortical depth. An analysis of recording depth by subject revealed that the mean (±σ) unit depth in monkey F was 2.7 ± 0.1 mm, while the mean unit depth in monkey I was 1.1 ± 0.1 mm (Δ = 1.6 mm, p < .001). On average, units were recorded at greater depths in monkey F, which may account for the greater occurrence of movement-driven response traits.
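The depth comparison above is the kind of two-sample test that can be sketched as a Welch's t statistic. The per-unit depths below are synthetic values drawn to match the reported means; the spread and sample sizes are invented for illustration and are not the study's data.

```python
import numpy as np

# Synthetic per-unit recording depths (mm); means chosen to match the
# reported values (monkey F: ~2.7 mm, monkey I: ~1.1 mm). The standard
# deviation and sample sizes here are assumptions, not the actual data.
rng = np.random.default_rng(0)
depths_f = rng.normal(2.7, 0.5, size=60)
depths_i = rng.normal(1.1, 0.5, size=60)

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    se2 = a.var(ddof=1) / a.size + b.var(ddof=1) / b.size
    return (a.mean() - b.mean()) / np.sqrt(se2)

t = welch_t(depths_f, depths_i)
delta = depths_f.mean() - depths_i.mean()
```

With these synthetic samples, `delta` lands near the reported Δ = 1.6 mm and the t statistic is large, consistent with a p-value below .001.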
Odd Responses

Odd responses were rarely encountered and difficult to classify in terms of contact-driven or movement-driven response traits. One example is shown in Fig. 6A, where a unit with a fingertip receptive field responded only during small-object tasks and suddenly increased its firing rate by 40% during visual-only trials at the moment of expected object contact. This brief phasic response might be explained by the receptive field contacting the palm of the hand during grasp, or by a comparatively greater range of digit flexion when the small physical object was not encountered in visual-only trials. Another example is shown in Fig. 6B, where the only significant response came at the moment of expected object contact, with both small and large objects, and exclusively in visual-only trials. This response cannot be explained by differential activation of sensory receptors with object size. Odd units might represent a distinct type of sensory information, perhaps a complex integration of sensory inputs that defies simple classification by contact-driven and movement-driven response traits. It is possible that these responses are much more
widely present in SI but were inadequately isolated or characterized by the experimental techniques of this study.

Group Analysis

Group analysis confirmed the major differences between response classes and revealed trends not apparent from inspection of individual single-unit responses. The most significant of these was a subtle but revealing difference between Simple and Mixed responses across the Reach and Contact task phases. During the Reach phase, M units were active at levels far above the task mean and were more modulated than Mc units, while C units were active at levels far below the task mean and were less modulated than Cm units. This relationship changed during the Contact phase, when all four response classes were modulated at levels far above the task mean and the Mixed classes exceeded the activity of their Simple counterparts. This is significant because the Contact phase (especially its leading edge) is the first time in the task that contact-driven and movement-driven stimuli coincide. At this moment, units encoding both contact-driven and movement-driven stimuli consistently demonstrated a greater overall depth of modulation than units encoding a single modality, suggesting that not only different but quantitatively more information was relayed by Mixed response types when both types of stimuli were present.

Significance of Results

A key difficulty in somatosensory research is bridging the gap between understanding the detailed structure of SI receptive fields and understanding how neurons behave in naturalistic tasks. It is widely acknowledged that a central problem in somatosensory neurophysiology is that the medium in which the receptors are embedded, the skin, is subject to a wide range of stimulation and deformation as an animal moves. In particular, relative to the kinds of deformations that occur when the skin is directly impinged upon by an outside object, the stretch and overlap that accompany arm movements are likely to be relatively subtle.
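The phase-by-phase modulation measure discussed in the group analysis above can be sketched as follows. The firing rates are invented for illustration, and normalizing each phase to the unit's task-wide mean rate is an assumed stand-in for the study's actual normalization.

```python
import numpy as np

# Invented firing rates (spikes/s) per task phase for two hypothetical
# units, one Simple movement-driven (M) and one Mixed (Cm). Values and
# the normalization scheme are illustrative assumptions only.
phases = ["Reach", "Contact", "Grasp"]
rates = {
    "M":  np.array([40.0, 30.0, 12.0]),   # Simple movement-driven
    "Cm": np.array([12.0, 85.0, 55.0]),   # Mixed, primarily contact-driven
}

def modulation(r):
    """Per-phase activity relative to the unit's task-wide mean rate."""
    return (r - r.mean()) / r.mean()

for name, r in rates.items():
    print(name, dict(zip(phases, np.round(modulation(r), 2))))
```

In this toy example the M unit sits above its task mean during Reach while the Cm unit sits below it, and the Cm unit shows the deeper modulation at Contact, mirroring the pattern described for the population.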
It is also known that cortical neurons with cutaneous receptive fields on the proximal arm can fire in relation to arm movement, but it remains unclear how that firing relates to the deformation of the skin, or how, or even whether, that modulation is used by the central nervous system. Compared to the firing rates elicited when the skin is directly probed, the depth of modulation of SI neurons with respect to arm movements is likewise subtle. Many of these mechanical subtleties become much more dramatic with hand movements. Finger joints can, and often do, move over a span of 90 degrees, introducing substantial deformations and responses in
primary afferents of both the hairy (Edin 1992; Edin and Abbs 1991) and glabrous (Burke et al. 1988; Goodwin and Wheat 2004) skin of the hand. Our concern in designing the experiments reported here was to separate that element of modulation from the modulation driven specifically by contact between the skin and objects in the environment. Our finding of robust activity during movement in neurons with cutaneous receptive fields on the hand suggests that movement-related modulation is a constitutive part of coding in these neurons. We have shown that contact and movement-related signals, when multiplexed, can be teased apart experimentally. This is an important step towards the extraction of parameterized sensory information from cortex, and eventually towards providing sensory information back to the cortex. However, the results also suggest that these neurons are coding for something more substantial than contact with the skin. To perform haptic tasks, it is necessary to combine traditionally tactile signaling (what are the shape, texture, temperature, etc. of the surface) with proprioceptive signaling (how these receptive fields are organized with respect to one another spatially). Similar skin strain patterns can mean dramatically different things depending on whether the fingers are in a line or in a circle. Indeed, tactile illusions can be very sensitive to the posture of the fingers (Warren et al. 2011). It is possible that SI is already carrying out the computations necessary for stereognosis, and several lines of evidence now suggest how those computations may be structured (Rincon-Gonzalez et al. 2011). All of this is likely to be important for bidirectional neuroprosthetics (O'Doherty et al.). In the area of motor control, much experimental work has been carried out to extract useful information from the activity of neural populations in motor planning and output areas of cortex.
These efforts have led to the successful development of computational techniques for extracting kinematic parameters of volitional movement in both monkeys and humans, based largely on the insight that the direction of intended movement is represented by the directionally sensitive tuning of single units in motor cortex (Georgopoulos et al. 1982). The ability to generate volitional motor output is a necessary, but ultimately insufficient, criterion for realizing the full potential of the hand as an exquisitely capable motor and sensory organ, a fact convincingly demonstrated when Mott and Sherrington (1895), in work later replicated by Twitchell (1954), removed all sensory feedback from the hands of monkeys and observed a complete lack of use. Efforts to decode the flood of sensory information arriving in cortex are crucial not only from a purely scientific perspective, but also for efforts to restore lost or impaired sensory function in humans. The results of this study advance our understanding of the particular strategies by which sensory information is represented in primary cortical areas and represent a step towards the extraction of parameterized sensory information that can be utilized in engineered neuroprosthetic systems.
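The directional-tuning insight cited above can be illustrated with a minimal population-vector decode in the spirit of Georgopoulos et al. (1982). Every parameter here, including the unit count, tuning depth, baseline rate, and noise level, is an invented assumption, not a value from that work or this study.

```python
import numpy as np

# Minimal population-vector sketch: cosine-tuned units whose weighted
# preferred-direction vectors sum to approximate the movement direction.
rng = np.random.default_rng(1)
n_units = 200
pref = rng.uniform(0, 2 * np.pi, n_units)   # preferred directions (rad)
theta = np.deg2rad(60.0)                    # true movement direction

# Cosine tuning: baseline + depth * cos(angle between movement and pref),
# plus Gaussian noise; all parameters invented for illustration.
rates = 20.0 + 15.0 * np.cos(theta - pref) + rng.normal(0, 2.0, n_units)

# Vector sum of preferred directions weighted by mean-subtracted rates
w = rates - rates.mean()
decoded = np.arctan2((w * np.sin(pref)).sum(), (w * np.cos(pref)).sum())
decoded_deg = np.rad2deg(decoded)  # close to the true 60-degree direction
```

The decoded angle recovers the true direction to within a few degrees even with noisy, randomly distributed preferred directions, which is the property that made population-level decoding of movement intent tractable.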
CONCLUSIONS

The results of this study demonstrate that single units in primary somatosensory cortex with cutaneous receptive fields on the hand encode both contact-driven and movement-driven sensory information in their firing rates. A majority of units simultaneously encoded both sensory phenomena during normal hand use. An analysis of normalized firing rate modulation among the different neural response classes also demonstrated that Mixed responses encode not only different sensory information, but quantifiably more such information. Unit responses also demonstrated evidence that sensory feedback from the hand is gated in a task-dependent manner. We hypothesize that the considerable variety of responses observed in the SI neural sample represents a continuous distribution of response types, with varying combinations of contact-driven and movement-driven response traits. Future neuroprosthetic systems must distinguish and decode the multiple layers of sensory information present in the activity of single cortical units to correctly interpret peripheral sensory events.

ACKNOWLEDGEMENTS

The authors thank Michelle Armenta Salas for invaluable assistance in analyzing the kinematics of this task and Rachele McAndrew for her work with the animals. This study was supported by National Institutes of Health Grant R01 NS-S050256-03.
FIGURES AND TABLES

Table 1. Response traits and classes for all task-related units.
Figure 1. A. The robot-enhanced virtual reality environment. The animal sits in a primate chair and views an immersive virtual environment through a mirror onto which the 3D display is projected. The animal's motions are captured by a camera-based motion capture system and visualized in the display along with movement targets. Targets are displayed in register with an instrumented grasp object, which is presented on some trials by a small robotic arm. B. Time course of a trial in the Reach-to-Grasp task.
Figure 2. Recording locations from monkey I. A. A screenshot from Monkey Cicerone© showing an MR section through the hand area of somatosensory cortex and the chamber placement in monkey I. B. A view of the surface of the left hemisphere in monkey I showing the section of post-central gyrus, with penetrations indicated by the red line. C. The 2-D chamber map for the recordings in monkey I, with a tentative indication of the location of the central sulcus in the chamber, along with markers showing penetrations, the cell classifications found in each penetration, and the location of the single ICMS site in our recording area that elicited a twitch, at over 90 µA.
Figure 3. Kinematic analysis. A. An example trajectory for a small physical trial. B. An example trajectory for a small visual-only trial. C. A day's average index finger speed for small physical and small visual-only trials. D. A day's average index finger speed for large physical and large visual-only trials. See text for details.
Figure 4. Simple response type cells. A. Contact-driven neuron. B. Movement-driven neuron. Rasters (panels 1-4) for this and the subsequent figures are broken into trials for each of the four conditions: large (L) and small (S) objects in physical (P) and visual-only (V) trials. Panels 5 and 6 show overlaid histograms comparing small (S) vs. large (L) trials. Panels 7 and 8 show histograms for P vs. V trials.
Figure 5. Mixed response type cells. A. Primarily contact-driven neuron, with a small additional component related to movement toward the smaller of the objects (panel 7). B. Primarily movement-driven neuron, with additional firing during contact with the physical objects (panel 5). Format as in Fig. 4.
Figure 6. Odd units: neurons that were clearly task-related but nonetheless difficult to classify. A. A neuron that fired only during reach towards the small object, but which fired at a significantly higher rate for interactions with the virtual object than with the physical object (panel 3). B. A neuron that responded only when the animal interacted with a virtual object (panels 2, 4, and 6). Format as in Fig. 4.
Figure 7. Summary of receptive field types and recording locations. A. Locations of the receptive fields for all of the neurons reported here. B. Classification according to the size of the receptive fields. C. Breakdown of cell properties based on the location of the receptive fields. D. Depth relative to first cortical contact for the neurons reported here.
Figure 8. Breakdown of cells by classification type. Receptive field locations, sizes, and recording depths for each of the four classes of neurons. Format as in Fig. 7.
REFERENCES

Bensmaia SJ, Denchev PV, Dammann JF 3rd, Craig JC, and Hsiao SS. The representation of stimulus orientation in the early stages of somatosensory processing. J Neurosci 28: 776-786, 2008a.
Bensmaia SJ, Hsiao SS, Denchev PV, Killebrew JH, and Craig JC. The tactile perception of stimulus orientation. Somatosens Mot Res 25: 49-59, 2008b.
Burke D, Gandevia SC, and Macefield G. Responses to passive movement of receptors in joint, skin and muscle of the human hand. J Physiol (Lond) 402: 347-361, 1988.
Carmena JM, Lebedev MA, Crist RE, O'Doherty JE, Santucci DM, Dimitrov DF, Patil PG, Henriquez CS, and Nicolelis MA. Learning to control a brain-machine interface for reaching and grasping by primates. PLoS Biol 1: E42, 2003.
Christensen MS, Lundbye-Jensen J, Geertsen SS, Petersen TH, Paulson OB, and Nielsen JB. Premotor cortex modulates somatosensory cortex during voluntary movements without proprioceptive feedback. Nat Neurosci 10: 417-419, 2007.
Cohen DAD, Prud'homme MJL, and Kalaska JF. Tactile activity in primate somatosensory cortex during active arm movements: correlation with receptive field properties. J Neurophysiol 71: 161-172, 1994.
Cordo PJ, Horn JL, Kunster D, Cherry A, Bratt A, and Gurfinkel V. Contributions of skin and muscle afferent input to movement sense in the human hand. J Neurophysiol 105: 1879-1888, 2011.
Edin BB. Quantitative analysis of static strain sensitivity in human mechanoreceptors from hairy skin. J Neurophysiol 67: 1105-1113, 1992.
Edin BB, and Abbs JH. Finger movement responses of cutaneous mechanoreceptors in the dorsal skin of the human hand. J Neurophysiol 65: 657-670, 1991.
Edin BB, and Johansson N. Skin strain patterns provide kinaesthetic information to the human central nervous system. J Physiol (Lond) 487: 243-251, 1995.
Evarts EV, and Fromm C. Sensory responses in motor cortex neurons during precise motor control. Neurosci Lett 5: 267-272, 1977.
Flanagan JR, Vetter P, Johansson RS, and Wolpert DM. Prediction precedes control in motor learning. Curr Biol 13: 146-150, 2003.
Fromm C, and Evarts EV. Pyramidal tract neurons in somatosensory cortex: central and peripheral inputs during voluntary movement. Brain Res 238: 186-191, 1982.
Fromm C, and Evarts EV. Relation of motor cortex neurons to precisely controlled and ballistic movements. Neurosci Lett 5: 259-265, 1977.
Fromm C, Wise SP, and Evarts EV. Sensory response properties of pyramidal tract neurons in the precentral motor cortex and postcentral gyrus of the rhesus monkey. Exp Brain Res 54: 177-185, 1984.
Georgopoulos AP, Caminiti R, Kalaska JF, and Massey JT. Spatial coding of movement: a hypothesis concerning the coding of movement direction by motor cortical populations. Exp Brain Res Suppl 7: 327-336, 1983.
Georgopoulos AP, Kalaska JF, Caminiti R, and Massey JT. On the relations between the direction of two-dimensional arm movements and cell discharge in primate motor cortex. J Neurosci 2: 1527-1537, 1982.
Georgopoulos AP, Kalaska JF, Crutcher MD, Caminiti R, and Massey JT. The representation of movement direction in the motor cortex: single cell and population studies. In: Dynamic Aspects of Neocortical Function, edited by Edelman GM, Gall WE, and Cowan WM, 1984, p. 501-524.
Georgopoulos AP, Schwartz AB, and Kettner RE. Neuronal population coding of movement direction. Science 233: 1416-1419, 1986.
Goodwin AW, and Wheat HE. Sensory signals in neural populations underlying tactile perception and manipulation. Annu Rev Neurosci 27: 53-77, 2004.
Johansson RS, and Flanagan JR. Coding and use of tactile signals from the fingertips in object manipulation tasks. Nat Rev Neurosci 10: 345-359, 2009.
Kalaska JF, Cohen DAD, Prud'homme M, and Hyde ML. Parietal area 5 neuronal activity encodes movement kinematics, not movement dynamics. Exp Brain Res 80: 351-364, 1990.
Lawson R, and Bracken S. Haptic object recognition: how important are depth cues and plane orientation? Perception 40: 576-597, 2011.
Lemon RN, and Porter R. A comparison of the responsiveness to peripheral stimuli of precentral cortical neurones in anaesthetized and conscious monkeys [proceedings]. J Physiol 260: 53P-54P, 1976a.
Lemon RN, and Porter R. Afferent input to movement-related precentral neurones in conscious monkeys. Proc R Soc Lond B Biol Sci 194: 313-339, 1976b.
Lemon RN, and Porter R. Proceedings: Natural afferent input to movement-related neurones in monkey pre-central cortex. J Physiol 258: 18P-19P, 1976c.
McAndrew RM, Lingo-VanGilder JL, Naufel SN, and Helms Tillery S. Individualized recording chambers for non-human primate neurophysiology.
Miocinovic S, Noecker AM, Maks CB, Butson CR, and McIntyre CC.
Cicerone: stereotactic neurophysiological recording and deep brain stimulation electrode placement software system. Acta Neurochir Suppl 97: 561-567, 2007.
Mott FW, and Sherrington CS. Experiments upon the influence of sensory nerves upon movement and nutrition of the limbs. Preliminary communication. Proc R Soc Lond 57: 481-488, 1895.
Mountcastle VB, Steinmetz MA, and Romo R. Frequency discrimination in the sense of flutter: psychophysical measurements correlated with postcentral events in behaving monkeys. J Neurosci 10: 3032-3044, 1990.
Nawrot M, Aertsen A, and Rotter S. Single-trial estimation of neuronal firing rates: from single-neuron spike trains to population activity. J Neurosci Methods 94: 81-92, 1999.
O'Doherty JE, Lebedev MA, Ifft PJ, Zhuang KZ, Shokur S, Bleuler H, and Nicolelis MA. Active tactile exploration using a brain-machine-brain interface. Nature 479: 228-231.
Pei YC, Hsiao SS, Craig JC, and Bensmaia SJ. Neural mechanisms of tactile motion integration in somatosensory cortex. Neuron 69: 536-547, 2011.
Pei YC, Hsiao SS, Craig JC, and Bensmaia SJ. Shape invariant coding of motion direction in somatosensory cortex. PLoS Biol 8: e1000305, 2010.
Prud'homme MJL, and Kalaska JF. Proprioceptive activity in primate primary somatosensory cortex during active arm reaching movements. J Neurophysiol 72: 2280-2301, 1994.
Rincon-Gonzalez L, Warren JP, Meller DM, and Tillery SH. Haptic interaction of touch and proprioception: implications for neuroprosthetics. IEEE Trans Neural Syst Rehabil Eng 19: 490-500, 2011.
Serruya MD, Hatsopoulos NG, Paninski L, Fellows MR, and Donoghue JP. Brain-machine interface: Instant neural control of a movement signal. Nature 416: 141-142, 2002.
Sinclair RJ, and Burton H. Neuronal activity in the second somatosensory cortex of monkeys (Macaca mulatta) during active touch of gratings. J Neurophysiol 70: 331-350, 1993.
Sinclair RJ, and Burton H. Neuronal activity in the primary somatosensory cortex in monkeys (Macaca mulatta) during active touch of textured surface gratings: responses to groove width, applied force, and velocity of motion. J Neurophysiol 66: 153-169, 1991.
Soso MJ, and Fetz EE. Responses of identified cells in postcentral cortex of awake monkeys during comparable active and passive joint movements. J Neurophysiol 43: 1090-1110, 1980.
Strick PL, and Preston JB. Two representations of the hand in area 4 of a primate. II. Somatosensory input organization. J Neurophysiol 48: 150-159, 1982.
Taylor DM, Tillery SI, and Schwartz AB. Direct cortical control of 3D neuroprosthetic devices. Science 296: 1829-1832, 2002.
Taylor DM, Tillery SI, and Schwartz AB. Information conveyed through brain-control: cursor versus robot. IEEE Trans Neural Syst Rehabil Eng 11: 195-199, 2003.
Tillery SI, Soechting JF, and Ebner TJ. Somatosensory cortical activity in relation to arm posture: nonuniform spatial tuning. J Neurophysiol 76: 2423-2438, 1996.
Twitchell TE. Sensory factors in purposive movement. J Neurophysiol 17: 239-252, 1954.
Velliste M, Perel S, Spalding MC, Whitford AS, and Schwartz AB. Cortical control of a prosthetic arm for self-feeding. Nature, 2008.
Warren JP, Santello M, and Helms Tillery S. Effects of fusion between tactile and proprioceptive input on tactile perception. PLoS ONE 6, 2011.
Wise SP, and Tanji J. Neuronal responses in sensorimotor cortex to ramp displacements and maintained positions imposed on hindlimb of the unanesthetized monkey. J Neurophysiol 45: 482-500, 1981.