Multi-modal sensory decision making in bats
How sensory information acquired by one sense is translated into a brain representation available to another sense, and how input from different sensory modalities is weighted and integrated, are fundamental questions in neuroscience. Bats are ideal animal models for studying such questions because of their reliance on two sensory systems, vision and echolocation, both of which can provide a high-spatial-resolution representation of the distal environment. In this work, we focus on Rousettus aegyptiacus (the Egyptian fruit bat), which relies to a large degree on both echolocation and vision to build a sensory representation of the world.
In the first project, we examined how visual information, regulated by altering the ambient light level, influences biosonar sampling. We found that the bats increased their echolocation click rate and intensity at lower light levels, where visual information was limited. These findings demonstrate how sensory information from one modality (vision) can influence sensory sampling by another (biosonar). Additionally, the bats increased their click rate prior to landing, leading us to hypothesize that Egyptian fruit bats use echolocation to complement vision for accurate estimation of distance.
In the second project, we study cross-modal object recognition between vision and echolocation by training bats to discriminate between objects using one modality and then testing them with the other. In addition, we exploit the fact that echolocation requires the bat to emit sound in order to perceive its environment. This allows us to tap into the bats' sensory acquisition by recording their echolocation clicks and thus to assess the information the bats collect before making a decision.
The third project is dedicated to studying the sensory properties that define an object. Here, we independently manipulate two sensory properties of an object (its surface area and its acoustic reflectivity) and show that the two must coincide to allow perception of the object. In this study, we compare two bat species that differ in their sensory systems: Rousettus aegyptiacus and Pipistrellus kuhlii.
Finally, in the fourth project we test how bats weight information from vision and echolocation when performing different tasks. To do this, we designed two behavioral tasks (orienting vs. landing) in which visual and echolocation-based information conflict. The bats' behavior in each case will indicate the modality in use. We hypothesize that bats will trust visual information over acoustic information when orienting, a task that requires high azimuthal resolution, and will prefer echolocation when landing, a task that requires high ranging resolution.
Danilovich S. & Yovel Y. Multi-sensory and multi-dimensional sensory perception in bats. Sensory Ecology International Course, Lund, Sweden, 2016 (poster).
Danilovich S. & Yovel Y. Multi-sensory and multi-dimensional sensory perception in bats. International Congress of Neuroethology, Montevideo, Uruguay, 2016 (talk).
Danilovich S. & Yovel Y. Multi-dimensional sensory perception in bats. 2nd Sagol School of Neuroscience Retreat, Ma’ale HaHamisha, Israel, 2016 (blitz presentation).
Danilovich S., Bar Yossef M., Lewin T. & Yovel Y. Multi-modal perception and decision making in the Egyptian fruit bat. Israel Society for Neuroscience, Eilat, Israel, 2013 (talk).
Danilovich S., Bar Yossef M., Lewin T. & Yovel Y. Multi-modal perception and decision making in the Egyptian fruit bat. International Multisensory Research Forum, Jerusalem, Israel, 2013 (talk).
Danilovich S., Krishnan A., Lee W.-J., Borissov I., Eitan O., Kosa G., Moss C.F. & Yovel Y. (2015). Bats regulate biosonar based on the availability of visual information. Current Biology, 25(23), R1124-R1125.
Trozky Foundation, 2016, Tel-Aviv University.
Sagol School of Neuroscience award for best student presentation, 2013, 2015, Tel-Aviv University.
Sieratzki prize for advances in neuroscience, 2014, Tel-Aviv University.
Research Categories: Animal Behavior, Behavioral Neuroscience