Fast prediction in marmoset reach-to-grasp movements for dynamic prey
Luke Shaw, Kuan Hong Wang, Jude F. Mitchell

et al.

bioRxiv (Cold Spring Harbor Laboratory), Journal Year: 2022, Volume and Issue: unknown

Published: Oct. 10, 2022

Summary: Primates have evolved sophisticated visually guided reaching behaviors for interacting with dynamic objects, such as insects during foraging (P. S. Archambault, Ferrari-Toniolo, & Battaglia-Mayer, 2011; Bicca-Marques, 1999; Ngo et al., 2022; Smith & Smith, 2013; Sustaita et al., 2013). Reaching control in natural conditions requires active prediction of the target's future position to compensate for visuo-motor processing delays and to enhance online movement adjustments (Catania, 2009; Desmurget & Grafton, 2000; Fujioka, Aihara, Sumiya, & Hiryu, 2016; Merchant & Georgopoulos, 2006; Mischiati et al., 2015; R. Shadmehr & Krakauer, 2010; Wolpert & Kawato, 1998). Past research in non-human primates has mainly focused on seated subjects engaged in repeated ballistic arm movements toward either stationary targets or targets that instantaneously change position during the movement (Philippe, Caminiti, Battaglia-Mayer, Dickey, Amit, Hatsopoulos, Kalaska, Massey, 1983; 1981). However, those approaches impose task constraints that limit the dynamics of reaching. A recent field study of marmoset monkeys highlights predictive aspects of visually guided insect prey capture among wild monkeys (Ngo et al., 2022). To examine similar behavior within a complementary laboratory context, we developed an ecologically motivated, unrestrained reach-to-grasp task involving live crickets. We used multiple high-speed video cameras to film marmosets and crickets stereoscopically and applied machine vision algorithms for marker-free object and hand tracking. Contrary to estimates under traditional constrained paradigms, we find that visually guided reaching can operate at incredibly short delays, around 80 milliseconds, rivaling the speeds typical of oculomotor systems during closed-loop visual pursuit (Cloherty, Yates, Graf, DeAngelis, & Mitchell, 2020). Multivariate linear regression modeling of the kinematic relationships between hand and cricket velocity revealed that predictions of the expected future location of the prey underlie these fast adjustments. These results suggest a critical role for prediction in facilitating online adjustments to dynamic prey.
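As a concrete illustration of the lagged-regression logic described in this summary, the sketch below fits hand velocity from cricket position and velocity at a range of visuo-motor lags and reports the lag with the best fit. It is a minimal, hypothetical example on synthetic traces; the array names, sampling rate, and lag grid are assumptions, not the authors' analysis code.

```python
import numpy as np

# Synthetic stand-ins for the tracked kinematics (assumed 240 Hz sampling).
fs = 240.0
T = 2000
rng = np.random.default_rng(0)
cricket_pos = np.cumsum(rng.normal(size=(T, 3)), axis=0)          # cricket position (a.u.)
cricket_vel = np.gradient(cricket_pos, 1.0 / fs, axis=0)          # cricket velocity
hand_vel = np.roll(cricket_vel, 19, axis=0)                        # hand follows cricket ~80 ms later
hand_vel += rng.normal(scale=0.1 * hand_vel.std(), size=hand_vel.shape)

def lagged_r2(lag):
    """R^2 of a multivariate linear fit of hand velocity at time t
    from cricket position and velocity at time t - lag (lag in samples)."""
    X = np.hstack([cricket_pos[:T - lag], cricket_vel[:T - lag], np.ones((T - lag, 1))])
    y = hand_vel[lag:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

lag_samples = np.arange(0, 49)                                     # 0-200 ms at 240 Hz
r2 = [lagged_r2(k) for k in lag_samples]
best_ms = 1000 * lag_samples[int(np.argmax(r2))] / fs
print(f"best-fitting visuo-motor lag ≈ {best_ms:.0f} ms")
```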

Language: English

Active vision in freely moving marmosets using head-mounted eye tracking
Vikram Pal Singh, Jingwen Li, Kyle Dawson

et al.

Proceedings of the National Academy of Sciences, Journal Year: 2025, Volume and Issue: 122(6)

Published: Feb. 3, 2025

Our understanding of how vision functions as primates actively navigate the real world is remarkably sparse. Because most data have been limited to chaired and typically head-restrained animals, the synergistic interactions of the different motor actions/plans inherent to active sensing (e.g., eye, head, posture, and body movements) with visual perception are largely unknown. To address this considerable gap in knowledge, we developed an innovative wireless head-mounted eye-tracking system that performs Chair-free Eye-Recording using Backpack mounted micROcontrollers (CEREBRO) for small mammals, such as marmoset monkeys. Because eye illumination and environmental lighting change continuously in natural contexts, we used a segmentation artificial neural network to perform robust pupil tracking under these conditions. Leveraging this system to investigate active vision, we demonstrate that, although freely moving marmosets exhibit frequent compensatory eye movements equivalent to those of other primates, including humans, the predictability of their visual behavior (gaze) is higher when animals are freely moving than when they are head-fixed. Moreover, despite increases in eye/head motion during locomotion, gaze stabilization remains steady because of an increase in vestibulo-ocular reflex gain during locomotion. These results demonstrate the efficient, dynamic visuo-motor mechanisms and related behaviors that enable primates to use stable, high-resolution foveal vision to explore the world.
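The gaze-stabilization claim hinges on vestibulo-ocular reflex (VOR) gain, i.e., how much compensatory eye rotation offsets head rotation. A minimal sketch of how such a gain could be estimated from synchronized eye and head angular-velocity traces is shown below; the synthetic data, sampling rate, and variable names are assumptions and do not reflect the CEREBRO pipeline itself.

```python
import numpy as np

# Hypothetical angular-velocity traces in deg/s (e.g., eye from the tracker,
# head from an inertial sensor), both resampled to a common 200 Hz clock.
fs = 200.0
t = np.arange(0.0, 10.0, 1.0 / fs)
head_vel = 40.0 * np.sin(2.0 * np.pi * 2.0 * t)                    # 2 Hz head oscillation
eye_vel = -0.9 * head_vel + np.random.default_rng(1).normal(scale=2.0, size=t.size)

def vor_gain(eye_vel, head_vel):
    """VOR gain as the magnitude of the regression slope of eye velocity on
    head velocity; a gain near 1 means eye motion fully cancels head motion."""
    slope = np.polyfit(head_vel, eye_vel, 1)[0]
    return abs(slope)

print(f"estimated VOR gain ≈ {vor_gain(eye_vel, head_vel):.2f}")   # ≈ 0.90 here
```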

Language: English

Citations: 1

Comparing eye–hand coordination between controller-mediated virtual reality, and a real-world object interaction task

Ewen B. Lavoie, Jacqueline S. Hebert, Craig S. Chapman

et al.

Journal of Vision, Journal Year: 2024, Volume and Issue: 24(2), P. 9 - 9

Published: Feb. 23, 2024

Virtual reality (VR) technology has advanced significantly in recent years, with many potential applications. However, it is unclear how well VR simulations mimic real-world experiences, particularly in terms of eye-hand coordination. This study compares eye-hand coordination from a previously validated real-world object interaction task to the same task re-created in controller-mediated VR. We recorded eye and body movements and segmented participants' gaze data using the movement data. In the real-world condition, participants wore a head-mounted eye tracker and motion capture markers and moved a pasta box into and out of a set of shelves. In the VR condition, participants wore a VR headset and manipulated the box with virtual handheld controllers. Unsurprisingly, participants took longer to complete the task in VR. Before picking up or dropping off the box, participants in the real world visually fixated the box about half a second before their hand arrived at the area of action. This 500-ms minimum fixation time was preserved in VR. Real-world participants disengaged their eyes from the box almost immediately after their hand initiated or terminated the interaction, but in VR the eyes stayed on the box for much longer after it was picked up or dropped off. We speculate that the limited haptic feedback during object interactions in VR forces users to maintain visual attention on objects longer than in the real world, altering eye-hand coordination. These findings suggest that current VR technology does not fully replicate the real-world experience of object interaction.
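The ~500-ms eye lead reported above can be quantified by comparing when the gaze and the hand first enter a region of interest around the object. The toy example below illustrates that computation on synthetic 2D trajectories; the trajectories, radius, and function names are illustrative assumptions rather than the study's processing code.

```python
import numpy as np

def arrival_time(positions, times, roi_center, roi_radius):
    """Return the first time a 2D trajectory enters a circular region of interest."""
    inside = np.linalg.norm(positions - roi_center, axis=1) < roi_radius
    return times[np.argmax(inside)] if inside.any() else None

# Synthetic gaze and hand paths approaching a pick-up location at the origin.
times = np.arange(0.0, 3.0, 0.01)
roi_center, roi_radius = np.array([0.0, 0.0]), 0.05
gaze_xy = np.stack([1.0 - times, np.zeros_like(times)], axis=1)    # gaze arrives ~1.0 s
hand_xy = np.stack([1.5 - times, np.zeros_like(times)], axis=1)    # hand arrives ~1.5 s

t_gaze = arrival_time(gaze_xy, times, roi_center, roi_radius)
t_hand = arrival_time(hand_xy, times, roi_center, roi_radius)
print(f"eye leads hand by ≈ {(t_hand - t_gaze) * 1000:.0f} ms")    # ≈ 500 ms
```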

Language: English

Citations: 6

Computational cross‐species views of the hippocampal formation
Seren L. Zhu, Kaushik J. Lakshminarasimhan, Dora E. Angelaki

et al.

Hippocampus, Journal Year: 2023, Volume and Issue: 33(5), P. 586 - 599

Published: April 11, 2023

Abstract: The discovery of place cells and head direction cells in the hippocampal formation of freely foraging rodents has led to an emphasis on its role in encoding allocentric spatial relationships. In contrast, studies in head-fixed primates have additionally found representations of spatial views. We review recent experiments in freely moving monkeys that expand upon these findings and show that postural variables such as eye/head movements strongly influence neural activity in the hippocampal formation, suggesting that the function of the hippocampus depends on where the animal looks. We interpret these results in light of studies of humans performing challenging navigation tasks, which suggest that, depending on the context, the hippocampus may serve one of two roles: gathering information about the structure of the environment (active sensing) or externalizing the contents of internal beliefs/deliberation (embodied cognition). These findings prompt future experimental investigations into the information carried by signals flowing between the hippocampal formation and the brain regions controlling postural variables, and they constitute a basis for updating computational theories of the hippocampal system to accommodate the influence of eye/head movements.

Language: English

Citations: 16

Fast prediction in marmoset reach-to-grasp movements for dynamic prey
Luke Shaw, Kuan Hong Wang, Jude F. Mitchell

et al.

Current Biology, Journal Year: 2023, Volume and Issue: 33(12), P. 2557 - 2565.e4

Published: June 1, 2023

Language: English

Citations: 16

The neurobiology of vocal communication in marmosets
Dori M. Grijseels, Brendan Prendergast, Julia C. Gorman

et al.

Annals of the New York Academy of Sciences, Journal Year: 2023, Volume and Issue: 1528(1), P. 13 - 28

Published: Aug. 24, 2023

Abstract: An increasingly popular animal model for studying the neural basis of social behavior, cognition, and communication is the common marmoset (Callithrix jacchus). Interest in this New World primate across neuroscience is now being driven by their proclivity for prosociality, their rich vocal repertoire and high volubility, their rapid development, as well as their amenability to naturalistic testing paradigms and to freely moving recording and imaging technologies. The complement of these characteristics sets marmosets up to be a powerful model of the social brain in the years to come. Here, we focus on vocal communication because it is the area that has both made the most progress and best illustrates the prodigious potential of this species. We review the current state of the field, covering the various brain areas and networks involved in vocal perception and production, and comparing findings with those from other animals, including humans.

Language: English

Citations: 15

Dynamic modulation of social gaze by sex and familiarity in marmoset dyads

Feng Xing, Alec G. Sheffield, Monika P. Jadi

et al.

Published: March 17, 2025

Social communication relies on the ability to perceive and interpret the direction of others' attention, which is commonly conveyed through head orientation and gaze direction in humans and nonhuman primates. However, traditional social gaze experiments in primates require restraining head movements, significantly limiting the animals' natural behavioral repertoire. Here, we developed a novel framework for accurately tracking facial features and three-dimensional head orientations of multiple freely moving common marmosets (Callithrix jacchus). By combining deep learning-based computer vision tools with triangulation algorithms, we were able to track marmoset dyads within an open arena. This method effectively generates dynamic 3D geometrical facial frames while overcoming challenges such as occlusion. To detect gaze direction, we constructed a virtual gaze cone oriented perpendicular to the facial frame. Using this pipeline, we quantified different types of interactive gaze events, including partner-directed gaze, joint gaze, and gaze at a shared spatial location. We observed clear effects of sex and familiarity on both interpersonal distance and gaze dynamics in marmoset dyads. Unfamiliar pairs exhibited more stereotyped patterns of arena occupancy, sustained levels of social gaze across interpersonal distances, and increased social gaze monitoring. On the other hand, familiar pairs exhibited higher levels of joint gazes. Moreover, males displayed elevated levels of gazes toward females' faces and surrounding regions, irrespective of familiarity. Our study reveals the importance of two key factors driving gaze behaviors in a prosocial primate species and lays the groundwork for rigorous quantification of primate behaviors in naturalistic settings.
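The virtual gaze cone described above amounts to an angular test: a partner is counted as the target of gaze when the vector from the animal's head to the partner lies within a fixed angle of the head-direction (facial-frame normal) vector. A minimal sketch of that test follows; the 15° half-angle, function name, and example coordinates are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def is_partner_directed(head_pos, head_dir, partner_pos, half_angle_deg=15.0):
    """True if the partner falls inside a virtual cone around the head-direction
    vector (the cone half-angle is an assumed, illustrative parameter)."""
    to_partner = partner_pos - head_pos
    cos_angle = np.dot(head_dir, to_partner) / (
        np.linalg.norm(head_dir) * np.linalg.norm(to_partner)
    )
    angle_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return angle_deg <= half_angle_deg

# Example: animal A faces along +x; its partner sits slightly (~9 deg) off that axis.
print(is_partner_directed(np.array([0.0, 0.0, 0.0]),
                          np.array([1.0, 0.0, 0.0]),
                          np.array([2.0, 0.3, 0.1])))              # True
```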

Language: English

Citations: 0

Ultra-high field fMRI identifies an action-observation network in the common marmoset
Alessandro Zanini, Audrey Dureux, Janahan Selvanayagam

et al.

Communications Biology, Journal Year: 2023, Volume and Issue: 6(1)

Published: May 22, 2023

The observation of others' actions activates a network of temporal, parietal, and premotor/prefrontal areas in macaque monkeys and humans. This action-observation network (AON) has been shown to play important roles in social action monitoring, learning by imitation, and social cognition in both species. It is unclear whether a similar network exists in New World primates, which separated from Old World primates ~35 million years ago. Here we used ultra-high field fMRI at 9.4 T in awake common marmosets (Callithrix jacchus) while they watched videos depicting goal-directed (grasping food) or non-goal-directed actions. The observation of goal-directed actions activated a temporo-parieto-frontal network, including areas 6 and 45 in premotor/prefrontal cortices, PGa-IPa, FST, and TE in the occipito-temporal region, and V6A, MIP, LIP, and PG in the occipito-parietal cortex. These results show overlap with the AON of humans and macaques, demonstrating the existence of an evolutionarily conserved network that likely predates the separation of Old World and New World primates.

Language: English

Citations: 9

Dynamics of eye-hand coordination are flexibly preserved in eye-cursor coordination during an online, digital, object interaction task
Jennifer K. Bertrand, Craig S. Chapman

Published: April 19, 2023

Do patterns of eye-hand coordination observed during real-world object interactions apply to digital, screen-based object interactions? We adapted a real-world object interaction task (physically transferring cups in sequence about a tabletop) into a two-dimensional screen-based task (dragging-and-dropping circles in sequence with a cursor). We collected gaze (with webcam eye-tracking) and cursor position data from 51 fully-remote, crowd-sourced participants who performed the task on their own computer. We applied time-series segmentation strategies to resolve the self-paced movement phases of the task and rigorously cleaned the eye-tracking data. In this preliminary investigation, we found that: 1) real-world patterns of eye-hand coordination persist and adapt in the digital context, and 2) remote, online cursor-tracking and webcam eye-tracking are useful tools for capturing visuomotor behaviours during an ecologically valid human-computer interaction task. We discuss how these findings might inform design principles and further investigations of natural behaviours in digital environments.
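The time-series segmentation mentioned here is often implemented by thresholding speed to find contiguous movement phases. Below is a minimal sketch of that strategy applied to a synthetic cursor trace; the threshold, minimum duration, and names are illustrative assumptions rather than the paper's parameters.

```python
import numpy as np

def segment_movements(xy, fs, speed_thresh=0.5, min_dur=0.05):
    """Return (start_s, end_s) pairs for contiguous runs where cursor speed
    exceeds speed_thresh; runs shorter than min_dur seconds are discarded."""
    speed = np.linalg.norm(np.gradient(xy, 1.0 / fs, axis=0), axis=1)
    moving = np.r_[False, speed > speed_thresh, False]             # pad for edge detection
    onsets = np.flatnonzero(~moving[:-1] & moving[1:])
    offsets = np.flatnonzero(moving[:-1] & ~moving[1:])
    return [(s / fs, e / fs) for s, e in zip(onsets, offsets) if (e - s) / fs >= min_dur]

# Synthetic trace: cursor idle for 2 s, then a 1-s drag, then idle again.
fs = 60.0
t = np.arange(0.0, 4.0, 1.0 / fs)
x = np.where(t < 2.0, 0.0, np.minimum(t - 2.0, 1.0))
xy = np.stack([x, np.zeros_like(x)], axis=1)
print(segment_movements(xy, fs))                                   # roughly one phase, ~2 s to ~3 s
```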

Language: English

Citations: 7

An ethologically motivated neurobiology of primate visually-guided reach-to-grasp behavior
Jude F. Mitchell, Kuan Hong Wang, Aaron P. Batista

et al.

Current Opinion in Neurobiology, Journal Year: 2024, Volume and Issue: 86, P. 102872 - 102872

Published: April 1, 2024

The precision of primate visually guided reaching likely evolved to meet the many challenges faced by living in arboreal environments, yet much of what we know about the underlying brain organization derives from a set of highly constrained experimental paradigms. Here we review the role of vision in guiding natural reach-to-grasp movements during marmoset monkey prey capture to illustrate the breadth and diversity of these behaviors in ethological contexts and their fast, predictive nature [1,2], as well as the advantages of this particular model for investigating the underlying neural mechanisms in more naturalistic contexts [3]. In addition to their amenability to freely-moving recording methods for investigating the neural basis of dynamic behavior [4,5], marmosets have a smooth neocortical surface that facilitates imaging and array recordings [6,7] across all areas of the fronto-parietal network [8,9]. Together, this model organism offers novel opportunities to study the real-world interplay between vision and reaching dynamics using ethologically motivated neuroscientific designs.

Language: English

Citations: 2