Zoom registration link: https://harvard.zoom.us/meeting/register/tJcoceGtqzMrHtM2S2qcQ8fh-Giyjr02gSfp#/registration

How does the brain represent dynamic sensory information from the natural environment? How are sensory signals and motor commands coordinated to direct actions in 3D space? The echolocating bat presents a powerful animal model to address these questions, as it produces the very acoustic signals that guide its behaviors. Importantly, the echolocating bat adapts its echolocation signal design with changes in behavioral state, providing an experimental window to quantify on a millisecond time scale the information an animal has processed and the information it is seeking. My talk will review a series of experiments that probe the echolocating big brown bat’s sonar target tracking behaviors, both in flight and from a stationary perch. These experiments employed high-speed stereo IR video to reconstruct the bat’s flight trajectories and microphone array recordings to measure its sonar beam aim (acoustic gaze) as it steered around obstacles and prepared to intercept prey. Some experiments combined behavioral studies with neural telemetry recordings, which revealed dynamic remapping of echo response areas with shifts in the bat’s sonar-guided attention. Collectively, results from these experiments provide evidence that the echolocating big brown bat 1) actively controls its acoustic gaze to inspect and track objects, 2) couples its acoustic gaze and locomotor plan, and 3) relies on internal models of target motion to enable tracking under occlusion and to predict its point of interception with prey. These findings show parallels with visual tracking in humans and other animals.
