Cerebral Control of Limb Movement and Coordination
The range and complexity of the processes involved in motor control make it a fundamental challenge in neuroscience. There are various ways to examine this wide field. The aim of the current research proposal is to obtain a comprehensive understanding of the neuronal networks and interactions that underlie several aspects of movement control, with implications for motor learning and rehabilitation. We plan to combine a variety of novel technologies, including virtual reality (VR) devices, electroencephalography (EEG), functional magnetic resonance imaging (fMRI), and electrocorticography (ECoG) in patients with intractable epilepsy, along with machine learning techniques, to study four domains: (A) the facilitation of motor learning transfer by visual feedback: we will decouple motor movements from visual sensory feedback to create a novel learning-transfer effect driven by the level of congruency between execution and observation; (B) revealing the neurophysiological features that correspond to asymmetrical coordination interference: understanding this mechanism is the first step toward developing novel bimanual learning techniques; (C) studying the modulation of movement dynamics by sensory feedback: perception of movement relies heavily on sensory feedback, and we will characterize the correlation between neural activity and movement dynamics by manipulating the visual and auditory feedback accompanying the same movement; and (D) identifying the neural ‘building blocks’ of human complex movements: recent studies have succeeded in extracting muscle synergies and decomposing point-to-point movements into sub-movements, yet further mathematical and electrophysiological tools are needed to unravel the neural representations used for encoding complex movements, as well as the mechanisms used for combining primitive movements.
We will argue that the EEG signal recorded during complex movements can be represented as a combination of the signals generated during ‘building-block’ movements, which might allow integration with more sophisticated Brain-Machine Interfaces (BMIs). The proposed research could serve as a point of departure for exploring observation-based learning techniques and could have key consequences for real-world learning, such as motor skill acquisition in children, high-level sport skills, and patient rehabilitation.
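The decomposition idea above can be illustrated with a minimal sketch. Assuming (hypothetically) that a library of single-channel EEG templates recorded during building-block movements is available as the columns of a matrix, the combination weights for a complex-movement signal can be estimated by ordinary least squares; all variable names and the simulated data below are illustrative, not part of the proposal's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical template library: each column of B holds the EEG time course
# (one channel, T samples) recorded during one 'building-block' movement.
T, n_primitives = 500, 4
B = rng.standard_normal((T, n_primitives))

# Simulate a 'complex movement' signal as a weighted mix of the primitives
# plus measurement noise; true_w plays the role of the neural combination
# weights the proposal aims to recover.
true_w = np.array([0.8, 0.0, 1.5, -0.6])
complex_signal = B @ true_w + 0.05 * rng.standard_normal(T)

# Least-squares estimate of the combination weights.
w_hat, *_ = np.linalg.lstsq(B, complex_signal, rcond=None)

# Goodness of fit: fraction of signal variance explained by the primitive basis.
residual = complex_signal - B @ w_hat
r2 = 1.0 - residual.var() / complex_signal.var()
print(np.round(w_hat, 2), round(float(r2), 3))
```

If the hypothesis holds, the explained variance stays high and the recovered weights track the underlying combination; a poor fit would argue against a simple linear superposition of primitives for that movement.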
Ossmy, O., Fried, I., Mukamel, R. (2015). Decoding Speech Perception from Single Cell Activity in Humans. NeuroImage, 117, 151-159.
Reznik, D., Ossmy, O., Mukamel, R. (2015). Enhanced Auditory Evoked Activity to Self-Generated Sounds Is Mediated by Primary and Supplementary Motor Cortices. The Journal of Neuroscience, 35(5), 2173-2180.
Ossmy, O., Ben-Shachar, M., Mukamel, R. (2014). Decoding Letter Position in Word Reading. Cortex, 59, 74-83.
Ossmy, O., Moran, R., Pfeffer, T., Tsetsos, K., Usher, M., Donner, T. H. (2013). The Timescale of Perceptual Evidence Integration Can Be Adapted to the Environment. Current Biology, 23(11), 981-986.
Ossmy, O., Tam, O., Puzis, R., Rokach, L., Inbar, O., Elovici, Y. (2011). MindDesktop - Computer Accessibility for Severely Handicapped. ICEIS (4), pp. 316-320.
Research Categories: Cognitive neuroscience