Working with Brain Scans, Part 2

In the latest issue of Social Studies of Science, 38(4), Morana Alac adds a new dimension to the history of visualization provided by the recent special issue of JHN (at AHP here). She explains:

A significant part of functional magnetic resonance imaging (fMRI) practice in neuroscience is spent in front of computer screens. To investigate the brain, neuroscientists work with digital images. This paper recovers practical dealings with brain scans in fMRI laboratories to focus on the achievement of seeing in the digital realm. While looking at brain images, neuroscientists gesture and manipulate digital displays to manage and make sense of their experimental data. Their gestural engagements are seen as dynamical phenomenal objects enacted at the junction between the digital world of technology and the world of embodied action.

This latest essay builds on previous work published in, among other places, the Journal of Cognition and Culture and Social Epistemology.

Additional readings on the role of gesture in meaning-making are provided below the fold.


Historical Readings.

  • Knowlson, J. R. (1965). The Idea of Gesture as a Universal Language in the XVIIth and XVIIIth Centuries.  Journal of the History of Ideas, 26(4), 495-508. The idea that someone unable to speak or understand another person’s language might nonetheless communicate with him by the use of gesture is one that has occurred fairly often from classical times to the present day. Personal experience has usually been enough to show that gestures may sometimes succeed when words have failed. On the other hand, one has only to think of the misunderstandings and frustrations that can result from efforts to express by means of gesture anything in the least complex or abstract to realize very clearly the limitations of this mode of communication….  It is clearly impossible in a short article to trace the entire history of this particular idea. Our intention is therefore to examine here its emergence in the XVIIth century and to show particularly how, in the XVIIth and XVIIIth centuries, it was related to the development of gesture as a method of teaching the deaf.


Psychological Readings.

  • Garber, P., Alibali, M. W., & Goldin-Meadow, S. (1998). Knowledge Conveyed in Gesture Is Not Tied to the Hands. Child Development, 69(1), 75-84. Children frequently gesture when they explain what they know, and their gestures sometimes convey different information than their speech does. In this study, we investigate whether children’s gestures convey knowledge that the children themselves can recognize in another context. We asked fourth-grade children to explain their solutions to a set of math problems and identified the solution procedures each child conveyed only in gesture (and not in speech) during the explanations. We then examined whether those procedures could be accessed by the same child on a rating task that did not involve gesture at all. Children rated solutions derived from procedures they conveyed uniquely in gesture higher than solutions derived from procedures they did not convey at all. Thus, gesture is indeed a vehicle through which children express their knowledge. The knowledge children express uniquely in gesture is accessible on other tasks, and in this sense, is not tied to the hands.
  • Morsella, E. & Krauss, R. M. (2004). The Role of Gestures in Spatial Working Memory and Speech. American Journal of Psychology, 117(3), 411-424. Co-speech gestures traditionally have been considered communicative, but they may also serve other functions. For example, hand-arm movements seem to facilitate both spatial working memory and speech production. It has been proposed that gestures facilitate speech indirectly by sustaining spatial representations in working memory. Alternatively, gestures may affect speech production directly by activating embodied semantic representations involved in lexical search. Consistent with the first hypothesis, we found participants gestured more when describing visual objects from memory and when describing objects that were difficult to remember and encode verbally. However, they also gestured when describing a visually accessible object, and gesture restriction produced dysfluent speech even when spatial memory was untaxed, suggesting that gestures can directly affect both spatial memory and lexical retrieval.


See also:

  • Rebaglia, A. (2006). If you can manipulate them, must they be real? The epistemological role of instruments in nanotechnological research. In C. Garola, A. Rossi, & S. Sozzo (Eds.), The Foundations of Quantum Mechanics (pp. 281-292). Singapore: World Scientific Publishing. “So far as I’m concerned, if you can spray them then they are real” (Hacking, 1983, p. 23). This statement embodies a well-known key point in Ian Hacking’s contemporary reading of scientific realism: scientific instruments assume a fundamental role in characterizing the ontological scenario to believe in. This paper focuses on the challenges of nanotechnology to this standpoint. Scanning tunneling microscopy, as opposed to traditional microscopy (from optical to electron microscope), is not an imaging but a “touching and rearranging” technique. It requires a deep appraisal of epistemological ideas such as “representing” and “intervening,” “knowing” “natural” entities and “creating” “artificial” ones.

-JTB.

About Jeremy Burman

Jeremy Trevelyan Burman is a senior doctoral student in York University’s Department of Psychology, specializing in the history of developmental psychology and its theory (especially that pertaining to Jean Piaget). Prior to returning to academia, he was a producer at the Canadian Broadcasting Corporation.