Neural dynamics in superior colliculus of freely moving mice
Shelby L. Sharp, Jhoseph Shin, Dylan M. Martins

et al.

bioRxiv (Cold Spring Harbor Laboratory), Journal year: 2025, Issue: unknown

Published: April 19, 2025

Abstract: Vision is an active process that depends on head and eye movements to explore the visual environment. The superior colliculus (SC) is known for its role in generating these movements, as well as in processing visual information, but it has not been studied extensively during free movement through complex environments. To determine the impact of movement on vision, we recorded neural activity across the depth of the SC while simultaneously recording eye and head position. We find that superficial SC (sSC) neurons respond to visual input following gaze-shifting saccadic movements, whereas deep SC (dSC) neurons respond to the movements themselves, as demonstrated by their sustained responses in darkness. Additionally, motor responses in the dSC are more closely correlated with gaze shifts than with other movements. Furthermore, we compared sSC gaze-shift responses with those in primary visual cortex (V1), finding similarities in key response types, although the temporal sequences of responses to gaze shifts differ between the regions. Our results demonstrate distinct differences between the SC and V1, highlighting the various roles the SC plays in active vision.

Highlights
• Recorded across the depths of the superior colliculus in freely moving mice while measuring eye and head position
• Neurons in mouse SC respond strongly to gaze shifts across layers
• Superficial SC responses are primarily visual, following a movement
• Deep SC responses generally represent the movement itself, independent of visual input
• While SC and V1 share key response types, their distinct temporal profiles suggest different roles in active vision
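As an illustration of the kind of analysis such recordings imply (not the authors' code), the following minimal sketch computes an event-triggered average firing rate around gaze-shift onsets; the function name, window, bin size, and the commented usage are assumptions made for the example.

import numpy as np

def gaze_shift_psth(spike_times, shift_times, window=(-0.25, 0.5), bin_size=0.01):
    """Event-triggered average firing rate (spikes/s) around gaze-shift onsets.

    spike_times : 1-D array of spike times (s) for one unit
    shift_times : 1-D array of gaze-shift onset times (s)
    window      : (start, stop) of the peri-event window (s)
    bin_size    : PSTH bin width (s)
    """
    edges = np.arange(window[0], window[1] + bin_size, bin_size)
    counts = np.zeros(len(edges) - 1)
    for t0 in shift_times:
        # align spikes to this gaze shift and histogram them
        aligned = spike_times[(spike_times >= t0 + window[0]) &
                              (spike_times <  t0 + window[1])] - t0
        counts += np.histogram(aligned, bins=edges)[0]
    # convert summed counts to a mean rate across events
    return edges[:-1] + bin_size / 2, counts / (len(shift_times) * bin_size)

# Hypothetical usage: compare superficial vs. deep units in light and in darkness
# t, rate_dSC_dark = gaze_shift_psth(dSC_unit_spikes, shift_onsets_in_darkness)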

Language: English

Sound elicits stereotyped facial movements that provide a sensitive index of hearing abilities in mice
Kameron K. Clayton, Kamryn S. Stecyk, Anna Guo

et al.

Current Biology, Journal year: 2024, Issue: 34(8), pp. 1605 - 1620.e5

Published: March 15, 2024

Language: English

Cited by

19

Eye saccades align optic flow with retinal specializations during object pursuit in freely moving ferrets
Damian J. Wallace, Kay-Michael Voit, Danuza de Oliveira Machado

et al.

Current Biology, Journal year: 2025, Issue: unknown

Published: Feb. 1, 2025

Highlights
• Saccades during target pursuit align the area centralis with the intended direction of travel
• Saccades simultaneously also align the retinal pattern of optic flow
• Post-saccade eye and head rotations reduce image blur and limit information loss
• Tree shrews, mice, and rats show the same coordinated saccade kinetics

Summary: During prey pursuit, how eye rotations, such as saccades, enable continuous tracking of erratically moving targets while also enabling an animal to navigate through its environment is unknown. To better understand this, we measured eye and head rotations in freely running ferrets during pursuit behavior. By tracking all environmental features, we reconstructed the animals' visual fields and their relationship to environmental structures. In the visual fields, target position clustered on and around the high-acuity retinal location, the area centralis, but surprisingly, this cluster was not significantly shifted by digital removal of saccades, either those elicited exclusively when the animals made turns or those that were tightly synchronized with turns. Here, we show that saccades did not serve to fixate the target; they instead aligned the area centralis with the direction of travel. This also aligned features of the optic flow pattern, including the focus of expansion, which is used for navigation by many species. While saccades initially rotated the eyes into the turn, they were followed by counter-rotations opposing the ongoing head rotation, which reduced image blur and limited information loss across the visual field during turns. As rotational eye movements in tree shrews, mice, and rats show the same coordinated kinetics, we suggest that these counter-rotations are a generalized mechanism by which mammals navigate complex environments during pursuit.
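Because the summary hinges on aligning the area centralis with the focus of expansion of optic flow, here is a minimal sketch, not taken from the paper, of estimating a focus of expansion from flow vectors by least squares under the assumption of a purely translational flow field; the function name and the synthetic check are illustrative.

import numpy as np

def focus_of_expansion(x, y, u, v):
    """Least-squares focus of expansion (FOE) from sparse optic-flow vectors.

    Assumes a purely translational flow field, so every flow vector (u, v)
    at image point (x, y) points radially away from the FOE. That gives one
    linear constraint per point:  v*fx - u*fy = v*x - u*y.
    """
    A = np.column_stack([v, -u])
    b = v * x - u * y
    foe, *_ = np.linalg.lstsq(A, b, rcond=None)
    return foe  # (fx, fy) in the same image coordinates as x, y

# Synthetic check: flow radiating from (3, -2) should recover that point as the FOE
rng = np.random.default_rng(0)
pts = rng.uniform(-10, 10, size=(200, 2))
flow = pts - np.array([3.0, -2.0])          # radial expansion about (3, -2)
print(focus_of_expansion(pts[:, 0], pts[:, 1], flow[:, 0], flow[:, 1]))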

Language: English

Cited by

2

Active vision in freely moving marmosets using head-mounted eye tracking
Vikram Pal Singh, Jingwen Li, Kyle Dawson

et al.

Proceedings of the National Academy of Sciences, Journal year: 2025, Issue: 122(6)

Published: Feb. 3, 2025

Our understanding of how vision functions as primates actively navigate the real world is remarkably sparse. As most data have been limited to chaired and typically head-restrained animals, the synergistic interactions of the different motor actions/plans inherent to active sensing (e.g., of the eyes, head, posture, movement, etc.) on visual perception are largely unknown. To address this considerable gap in knowledge, we developed an innovative wireless head-mounted eye-tracking system that performs Chair-free Eye-Recording using Backpack mounted micROcontrollers (CEREBRO) for small mammals, such as marmoset monkeys. Because eye illumination and environment lighting change continuously in natural contexts, we used a segmentation artificial neural network to perform robust pupil tracking under these conditions. Leveraging this system to investigate active vision, we demonstrate that although freely moving marmosets exhibit frequent compensatory eye movements equivalent to those of other primates, including humans, the predictability of their behavior (gaze) is higher when the animals are freely moving relative to when they are head-fixed. Moreover, despite increases in eye/head motion during locomotion, gaze stabilization remains steady because of an increase in vestibulo-ocular reflex gain during locomotion. These results demonstrate the efficient, dynamic visuo-motor mechanisms and related behaviors that enable stable, high-resolution foveal vision as primates explore the natural world.
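As a rough illustration of the gaze-stabilization measure mentioned above (not the authors' pipeline), the sketch below estimates vestibulo-ocular reflex gain as the regression slope of eye velocity on head velocity; the function name and the commented epoch comparison are assumptions.

import numpy as np

def vor_gain(head_velocity, eye_velocity):
    """Vestibulo-ocular reflex (VOR) gain as the regression slope of eye
    velocity against head velocity (deg/s). A gain near -1 means eye
    rotation fully compensates head rotation; the sign depends on the
    convention used for the two signals.
    """
    head = head_velocity - head_velocity.mean()
    eye = eye_velocity - eye_velocity.mean()
    return np.dot(head, eye) / np.dot(head, head)

# Hypothetical usage: compare stationary vs. locomotion epochs
# gain_rest = vor_gain(head_vel[rest_mask], eye_vel[rest_mask])
# gain_move = vor_gain(head_vel[moving_mask], eye_vel[moving_mask])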

Language: English

Cited by

1

Interactions between rodent visual and spatial systems during navigation
Aman B. Saleem, Laura Busse

Nature Reviews Neuroscience, Journal year: 2023, Issue: 24(8), pp. 487 - 501

Published: June 28, 2023

Language: English

Cited by

21

Detailed characterization of neural selectivity in free viewing primates
Jacob L. Yates, Shanna Coop, Gabriel Sarch

et al.

Nature Communications, Journal year: 2023, Issue: 14(1)

Published: June 20, 2023

Abstract: Fixation constraints in visual tasks are ubiquitous in visual and cognitive neuroscience. Despite its widespread use, fixation requires trained subjects, is limited by the accuracy of fixational eye movements, and ignores the role of eye movements in shaping visual input. To overcome these limitations, we developed a suite of hardware and software tools to study vision during natural behavior in untrained subjects. We measured receptive fields and tuning properties from multiple cortical areas of marmoset monkeys who freely viewed full-field noise stimuli. The resulting tuning curves in primary visual cortex (V1) and area MT match the selectivity reported in the literature, which was measured using conventional approaches. We then combined free viewing with high-resolution eye tracking to make the first detailed 2D spatiotemporal measurements of foveal receptive fields in V1. These findings demonstrate the power of this approach to characterize neural responses in untrained animals while simultaneously studying the dynamics of natural behavior.
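One standard way to map receptive fields from free viewing of noise stimuli, sketched here as an illustration rather than the authors' implementation, is a spike-triggered average computed in gaze-centered coordinates; the array shapes, function name, and patch size below are assumptions.

import numpy as np

def gaze_corrected_sta(stimulus, gaze_xy, spike_counts, patch=15):
    """Spike-triggered average (STA) of a noise stimulus in gaze-centered
    coordinates: for each frame, a patch of the stimulus is extracted around
    the current gaze position and averaged weighted by the spike count.

    stimulus     : (n_frames, H, W) noise frames
    gaze_xy      : (n_frames, 2) gaze position in pixel coordinates (x, y)
    spike_counts : (n_frames,) spikes per frame for one unit
    patch        : half-width of the gaze-centered window in pixels
    """
    n_frames, H, W = stimulus.shape
    sta = np.zeros((2 * patch + 1, 2 * patch + 1))
    total = 0.0
    for f in range(n_frames):
        x, y = np.round(gaze_xy[f]).astype(int)
        # skip frames where the gaze-centered patch falls outside the frame
        if x - patch < 0 or y - patch < 0 or x + patch + 1 > W or y + patch + 1 > H:
            continue
        sta += spike_counts[f] * stimulus[f, y - patch:y + patch + 1,
                                             x - patch:x + patch + 1]
        total += spike_counts[f]
    return sta / total if total > 0 else sta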

Language: English

Cited by

20

A dynamic sequence of visual processing initiated by gaze shifts
Philip R. L. Parker, Dylan M. Martins, Emmalyn S. P. Leonard

et al.

Nature Neuroscience, Journal year: 2023, Issue: 26(12), pp. 2192 - 2202

Published: Nov. 23, 2023

Language: English

Cited by

20

Fast prediction in marmoset reach-to-grasp movements for dynamic prey
Luke Shaw, Kuan Hong Wang, Jude F. Mitchell

et al.

Current Biology, Journal year: 2023, Issue: 33(12), pp. 2557 - 2565.e4

Published: June 1, 2023

Language: English

Cited by

16

Coding of latent variables in sensory, parietal, and frontal cortices during closed-loop virtual navigation
Jean‐Paul Noel, Edoardo Balzani, Eric Avila

et al.

eLife, Journal year: 2022, Issue: 11

Published: Oct. 25, 2022

We do not understand how neural nodes operate and coordinate within the recurrent action-perception loops that characterize naturalistic self-environment interactions. Here, we record single-unit spiking activity and local field potentials (LFPs) simultaneously from the dorsomedial superior temporal area (MSTd), parietal area 7a, and dorsolateral prefrontal cortex (dlPFC) as monkeys navigate in virtual reality to ‘catch fireflies’. This task requires animals to actively sample from a closed-loop environment while concurrently computing continuous latent variables: (i) the distance and angle travelled (i.e., path integration) and (ii) the memorized firefly location (i.e., a hidden spatial goal). We observed patterned mixed selectivity, with the prefrontal cortex most prominently coding for the latent variables and the sensorimotor areas MSTd and 7a often coding for eye movements. However, even areas traditionally considered sensory (e.g., MSTd) tracked the latent variables, demonstrating integration of the vector of travel toward hidden goals. Further, global encoding profiles and unit-to-unit coupling (noise correlations) suggested a functional subnetwork composed of MSTd and dlPFC, beyond what the anatomy between these areas would suggest. We show that the greater this coupling, the more the animals' gaze position was indicative of the ongoing spatial goal. We suggest that this MSTd-dlPFC coupling reflects the monkeys' natural and adaptive strategy wherein they continuously gaze toward the (invisible) target. Together, these results highlight the distributed nature of neural coding during closed action-perception loops and suggest that fine-grain functional subnetworks may be dynamically established to subserve (embodied) behavioral strategies.
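To make the unit-to-unit coupling measure concrete, here is a minimal sketch (assumed names and array shapes, not the study's code) of pairwise noise correlations computed by z-scoring each unit's responses within condition before correlating units, which removes condition-driven (signal) covariance.

import numpy as np

def noise_correlations(responses, conditions):
    """Pairwise noise correlations between simultaneously recorded units.

    responses  : (n_trials, n_units) spike counts per trial
    conditions : (n_trials,) condition label for each trial
    """
    resid = np.empty_like(responses, dtype=float)
    for c in np.unique(conditions):
        idx = conditions == c
        r = responses[idx].astype(float)
        # z-score within condition to isolate trial-to-trial fluctuations
        resid[idx] = (r - r.mean(axis=0)) / (r.std(axis=0) + 1e-12)
    return np.corrcoef(resid.T)  # (n_units, n_units) correlation matrix

# Hypothetical usage: compare coupling for MSTd-dlPFC pairs vs. other pairs
# rho = noise_correlations(trial_counts, trial_condition_labels)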

Language: English

Cited by

21

Cortical Integration of Vestibular and Visual Cues for Navigation, Visual Processing, and Perception
Sepiedeh Keshavarzi, Mateo Vélez‐Fort, Troy W. Margrie

et al.

Annual Review of Neuroscience, Journal year: 2023, Issue: 46(1), pp. 301 - 320

Published: July 10, 2023

Despite increasing evidence of its involvement in several key functions of the cerebral cortex, the vestibular sense rarely enters our consciousness. Indeed, the extent to which these internal signals are incorporated within the cortical sensory representation, and how they might be relied upon for sensory-driven decision-making during, for example, spatial navigation, is yet to be understood. Recent novel experimental approaches in rodents have probed both the physiological and behavioral significance of vestibular signals and indicate that their widespread integration with vision improves the perceptual accuracy of self-motion and orientation. Here, we summarize these recent findings with a focus on the circuits involved in visual perception and navigation, and we highlight the major remaining knowledge gaps. We suggest that vestibulo-visual integration reflects a process of constant updating regarding the status of self-motion, and that access to such information by the sensory cortex can be used for predictions that may be implemented for rapid, navigation-related decision-making.

Language: English

Cited by

12

How ‘visual’ is the visual cortex? The interactions between the visual cortex and other sensory, motivational and motor systems as enabling factors for visual perception
Cyriel M. A. Pennartz, Matthijs N. Oude Lohuis, Umberto Olcese

et al.

Philosophical Transactions of the Royal Society B: Biological Sciences, Journal year: 2023, Issue: 378(1886)

Published: Aug. 7, 2023

The definition of the visual cortex is primarily based on the evidence that lesions of this area impair visual perception. However, this does not exclude that the visual cortex may process more information than that of retinal origin alone, or that other brain structures may contribute to vision. Indeed, research across the past decades has shown that non-visual information, such as neural activity related to reward expectation and value, locomotion, working memory, and other sensory modalities, can modulate primary visual cortical responses to retinal inputs. Nevertheless, the function of this modulation is poorly understood. Here we review recent evidence, coming from studies in rodents, arguing that non-visual and motor effects play a role in visual processing itself, for instance by disentangling direct auditory effects on visual cortex from those of sound-evoked orofacial movement. These findings are placed in a broader framework casting vision in terms of predictive processing under the control of frontal, reward- and motor-related systems. In contrast to the prevalent notion that vision is exclusively constructed by the visual system, we propose that percepts are generated by a larger network, the extended visual system, spanning other sensory cortices, supramodal areas, and frontal systems. This article is part of the theme issue 'Decision and control processes in multisensory perception'.

Language: English

Cited by

12