New paper out in @cognitionjournal.bsky.social by Ombline Rérolle and Leila Selimbegovic. How do we perceive dehumanized others?
www.sciencedirect.com/science/arti...
Deepest thanks to the Fondation Ostad Elahi – éthique et solidarité humaine for funding this work!
Posts by Cognition
From spotting predators to recognizing other people, this system may lay the foundation for perceiving life across terrestrial species.
More strikingly, watching pigeon movements also biases animacy perception for human motion (Exp. 5).
This suggests a “life detection” system in the human brain attuned to kinematic patterns common among vertebrates.
Motion matters, not form.
Adapting to local foot motions carrying diagnostic kinematic cues is enough to change how animate a walker looks (Exp. 3), but static poses alone have no effect (Exp. 2).
Prolonged viewing of human walking biases subsequent perception of ambiguous walkers toward appearing less animate (Exps. 1 & 4). Just by adapting your brain, the same motion can feel less “alive”!
“Seeing Life in Motion: Animacy Perception across Species Revealed by Adaptation”
New paper from: Mei Huang, Xinlin Yang, Geqing Yang, Li Shen, Zhaoqi Hu, Ying Wang, & Yi Jiang
www.sciencedirect.com/science/arti...
How does the brain tell something is “alive” just by the way it moves? Using visual adaptation, this study suggests the existence of a neural mechanism for perceiving “life” from the movements of other people and even animals 👇.
Great collaboration with Artyom Zinchenko, Ananya Mandal, Heinrich Liesefeld, Daniel Weinert, & Thomas Geyer!
doi.org/10.1016/j.co...
#Cognition #VR #Psychology #Neuroscience
In sum: Using immersive VR + head tracking, we show that contextual cueing in large-scale visual search relies on two mechanisms: display-specific long-term memory for repeated layouts, and procedural scanning routines generalized across displays.
Using Dynamic Time Warping (DTW), we show that consistent scanning behavior predicts faster search — suggesting that contextual cueing is not just memory-based, but also a learnable motor skill.
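For readers unfamiliar with DTW: it aligns two time series that may unfold at different speeds and returns a distance, so similar head-scanning paths get low scores. A minimal sketch (the head-yaw traces below are hypothetical, not data from the paper):

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-programming DTW distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Each cell extends the cheapest of: insertion, deletion, match
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Two hypothetical head-yaw traces (degrees over time) from repeated searches
scan1 = [0, 10, 25, 40, 30, 15]
scan2 = [0, 12, 27, 38, 28, 14]
print(dtw_distance(scan1, scan2))  # low distance = consistent scanning
```

In this framing, "consistent scanning behavior" corresponds to low DTW distances between a participant's scan paths across trials.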
Even more: classic 2D lab findings generalize to complex 3D environments.
Two key insights:
• We form memory templates for specific environments
• We develop general "head-scanning" routines that transfer across scenes
How do we search in the real world?
In our new paper in Cognition, we used immersive VR to study visual search beyond standard 2D screens — and found that people learn not just where to look, but how to look.
"The differential contribution of implicit and explicit priors during motion extrapolation"
📢New paper from: Giuseppe Di Dona, Sara Stottmeier, Alessia Santoni, Klara Hemmerich, & Luca Ronconi
www.sciencedirect.com/science/arti...
Not all prior knowledge influences what we see in the same way. Studying motion extrapolation, i.e., the ability to infer an object's position from its past trajectory, this work highlights that implicit knowledge reshapes perception itself, while explicit knowledge adjusts how much we trust our senses.
Faces behind us can be perceived as emotionally more intense, particularly for negative emotions: egocentric spatial position biases emotion perception even when the observer never turns around. This may reflect a bias prioritizing potential threats located behind the observer.
www.sciencedirect.com/science/arti...
Thrilled to share that our new paper is now out in @cognitionjournal.bsky.social: "Who knows what? Bayesian Competence Inference guides Knowledge Attribution and Information Search," with @oliviermorin.bsky.social , @hugoreasoning.bsky.social & @tadegquillien.bsky.social!
Link: tinyurl.com/ykyhxcc6
Our results show that Whorfian views are insufficient on their own to explain cognitive diversity in spatial frames of reference.
Read more here: www.sciencedirect.com/science/arti...
This variation is often considered a classic Whorfian effect of language.
Studying the Hai||om in Namibia, we find a shift toward egocentric spatial thinking over time—despite no change in spatial language.
Human cultures vary in how they talk and think about space: some rely on body-centered (egocentric) reference frames, others on environment-centered (geocentric) ones.
A must-read for researchers in cognitive neuroscience, linguistics, and computational neuroscience! #CognitiveScience #Neurolinguistics #fMRIResearch #SyntacticProcessing
www.sciencedirect.com/science/arti...
The findings also support the neurobiological plausibility of the dependency length minimization hypothesis, offering critical evidence for the brain’s efficient allocation of neural resources for abstract syntactic complexity.
2. Links neural activation similarity (dependency length vs. syntactic structures) to integration necessity and length variability;
3. Proposes a novel XGBoost-SHAP + Double Machine Learning framework for fine-grained syntactic parsing in natural narratives (ecological validity + statistical rigor).
This study advances syntactic processing research to the sub-property level with three key contributions:
1. Identifies robust, distributed neural correlates for dependency length processing in the human frontotemporal-parietal language network;
We’re excited to feature new research uncovering the neural underpinnings of dependency length processing—a core quantitative syntactic feature—during natural language comprehension!
Reaction times for correct responses across the three test phases in monkeys and crows. While monkeys slowed down with increasing task demands, crows responded rapidly and largely independently of numerosity (animal silhouettes available from phylopic.org under a CC license).
These results suggest that similar numerical performance can arise from different cognitive strategies, highlighting that different brains may solve the same problem in different ways.
Both species exhibited classic Weber-like accuracy patterns. However, their reaction times differed strikingly: the monkeys slowed down as the task became more demanding, whereas the crows responded much faster.
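Weber-like performance means discrimination accuracy depends on the ratio of two numerosities, not on their absolute difference. A minimal sketch of such a ratio-dependent model (the function, the Weber fraction w, and the numbers are illustrative assumptions, not the paper's fitted model):

```python
import math

def weber_accuracy(n1, n2, w=0.25):
    """Predicted probability of correctly discriminating numerosities n1 and n2
    under a Weber-like (ratio-dependent) model with Weber fraction w."""
    # Discriminability scales with the log-ratio of the two numerosities
    d = abs(math.log(n1 / n2)) / (w * math.sqrt(2))
    # Gaussian cumulative probability of a correct choice
    return 0.5 * (1 + math.erf(d / math.sqrt(2)))

print(weber_accuracy(2, 4))  # easy 1:2 ratio, accuracy near ceiling
print(weber_accuracy(4, 5))  # hard 4:5 ratio, accuracy closer to chance
```

The Weber signature is that pairs with the same ratio (e.g. 2 vs. 4 and 4 vs. 8) yield the same predicted accuracy, which is what both species showed despite their different reaction-time profiles.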
Same task - but different cognitive strategies.
How do different brains process numbers? To test whether they rely on similar cognitive mechanisms, we compared monkeys and crows performing the same sequential delayed match-to-numerosity task with up to 3 test images.
Monkeys solve an abstract-category delayed match-to-sample task through slow, capacity-limited, sequential checking, whereas crows rely on fast, parallel, anticipatory processing with reduced working-memory load.
www.sciencedirect.com/science/arti...
🎉Our paper was just accepted in Cognition!🎉 Title: Training “Zero” in Preschoolers: Fast Referential Learning, Slow Relational Integration
Authors: Yanfei Yu, Marianna Thorne, David Barner
@cognitionjournal.bsky.social @drbarner.bsky.social