
Posts by AttentionLab Utrecht University


We had an exciting seminar talk today from Stefan van der Stigchel, head of @attentionlab.bsky.social, titled "When participants aren’t human: Rethinking online behavioral research"

Thank you to those who attended and to Stefan for the great talk and discussion on AI/bots in research!

1 week ago
Snip & Stitch: a simple and accessible correction for the pupil foreshortening error

For those interested in (re)analyzing (video-based) pupil data and looking for a solution to the foreshortening error (resulting from changes in the angle between camera and eye), check out our open-access Behavior Research Methods paper! rdcu.be/faNse
link.springer.com/article/10.3...
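The geometry behind the foreshortening error is easy to sketch: a circular pupil viewed off-axis projects as an ellipse, so the recorded area shrinks roughly with the cosine of the angle between the camera-eye axis and the gaze direction. A minimal illustration of that first-order cosine account (this is the textbook correction the paper improves on, not the Snip & Stitch method itself; the function name is ours):

```python
import math

def cosine_corrected_area(measured_area, gaze_angle_deg):
    """First-order correction for pupil foreshortening: a circular pupil
    viewed at angle theta off the camera axis projects as an ellipse whose
    area is reduced by a factor of cos(theta), so dividing the measured
    area by cos(theta) approximates the true pupil area."""
    theta = math.radians(gaze_angle_deg)
    return measured_area / math.cos(theta)

# A pupil of area 100 (arbitrary units) viewed at 60 degrees is recorded
# as roughly 100 * cos(60°) = 50; the correction recovers ~100.
print(cosine_corrected_area(50.0, 60.0))  # ≈ 100.0
```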

3 weeks ago

We recently warned of bots in online behavioral research. @achetverikov.bsky.social showed there is no evidence for that in our @joinprolific.bsky.social data - but that doesn't mean we're safe. Agentic AI can do behavioral tasks through prompting alone. Reply & videos: osf.io/3cztr/overview

1 month ago

We show that synesthesia is sensory and automatic in nature: the pupil scales with the brightness of experienced synesthetic colors. doi.org/10.7554/eLif...
Now in its new dress @elife.bsky.social (convincing & valuable in round 1).
If anyone wants to pick up the method, happy to share & explain!

1 month ago
PNAS

@attentionlab.bsky.social identified (what look like) #AI bots in an online response-time task (Posner cuing). Giveaways are normally distributed RTs and a lack of serial-dependence effects. Pretty troubling. www.pnas.org/doi/10.1073/...
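The two give-aways lend themselves to simple screening checks: human RTs are typically right-skewed and serially dependent across trials, so a series that is roughly symmetric and trial-independent is suspicious. A hedged numpy sketch (the function name, thresholds, and simulated data are ours, not the paper's analysis):

```python
import numpy as np

def looks_bot_like(rts, skew_cut=0.4, lag1_cut=0.15):
    """Flag a response-time series showing BOTH give-aways:
    (1) near-zero skewness (human RTs are right-skewed), and
    (2) near-zero lag-1 autocorrelation (human RTs show serial
    dependence). Thresholds are illustrative, not calibrated."""
    rts = np.asarray(rts, dtype=float)
    z = (rts - rts.mean()) / rts.std()
    skew = np.mean(z ** 3)                        # sample skewness
    lag1 = np.corrcoef(rts[:-1], rts[1:])[0, 1]   # serial dependence
    return abs(skew) < skew_cut and abs(lag1) < lag1_cut

rng = np.random.default_rng(0)
n = 500

# Human-like: right-skewed (log-normal) RTs with slow trial-to-trial drift
latent = np.zeros(n)
noise = rng.normal(0.0, 0.3, n)
for t in range(1, n):
    latent[t] = 0.6 * latent[t - 1] + noise[t]
human_rts = 250.0 + np.exp(5.0 + latent)

# Bot-like: independent, normally distributed RTs
bot_rts = rng.normal(450.0, 50.0, n)

print(looks_bot_like(human_rts), looks_bot_like(bot_rts))
```

In this toy setup only the independent, symmetric series trips both flags; real screening would of course need calibrated cutoffs.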

1 month ago

Recent work has shown how vulnerable online survey research is to LLMs. Motivated by this, we examined our online Posner cueing data from Prolific. It's concerning. We now must carefully consider when (or whether?) online behavioral data can be trusted.
see our comment:
www.pnas.org/doi/10.1073/...

2 months ago

Synesthetes claim sensory experiences, such as seeing color when reading or hearing a (black) number. 
But how genuine are these reports and sensations? We introduce a rather direct measure of synesthetic perception: synesthetes’ pupils respond to evoked color as if it were real color #vision! 👁️🎨🧪

4 months ago

Our commentary @stigchel.bsky.social on Ruth Rosenholtz' Visual Attention in Crisis paper is now available:
doi.org/10.1017/S014...

We argue that effort must be considered when aiming to quantify capacity limits or a task's complexity.

4 months ago
Preview
Numerosity adaptation suppresses early visual responses - Communications Biology

Numerosity adaptation suppresses monotonic neural responses to numerosity displays in the early visual cortex, with more suppression for higher numerosity adaptors. Therefore, numerosity adaptation effects begin in early sensory stages of processing.

www.nature.com/articles/s42...

4 months ago

Planning on running a RIFT study? In a new manuscript, we put together the RIFT know-how accumulated over the years by multiple labs (@lindadrijvers.bsky.social, @schota.bsky.social, @eelkespaak.bsky.social, with Cecília Hustá and others).

Preprint: osf.io/preprints/ps...

5 months ago

Filled with a bunch of extra analyses, this is now accepted in The Journal of Neuroscience @sfn.org! You can have a sneak peek here: www.biorxiv.org/content/10.1...

5 months ago

Spatial attention and working memory are popularly thought to be tightly coupled. Yet, distinct neural activity tracks attentional breadth and WM load.

In a new paper @jocn.bsky.social, we show that pupil size independently tracks breadth and load.

doi.org/10.1162/JOCN...

6 months ago

Very happy to see this preprint out! The amazing @danwang7.bsky.social was on fire sharing this work at #ECVP2025, gathering loads of attention, and here you can find the whole thing!
Using RIFT we reveal how the competition between top-down goals and bottom-up saliency unfolds within visual cortex.

7 months ago

I'll show some (I think) cool stuff about how we can measure the phenomenology of synesthesia in a physiological way at #ECVP - Color II, atrium maximum, 9:15, Thursday.

say hi and show your colleagues that you're one of the dedicated ones by getting up early on the last day!

7 months ago
Preview
Dynamic competition between bottom-up saliency and top-down goals in early visual cortex

🧠 Excited to share that our new preprint is out!🧠
In this work, we investigate the dynamic competition between bottom-up saliency and top-down goals in the early visual cortex using rapid invisible frequency tagging (RIFT).

📄 Check it out on bioRxiv: www.biorxiv.org/cgi/content/...

7 months ago

And now without bluesky making the background black...

7 months ago

#ECVP2025 starts with a fully packed room!

I'll show data, demonstrating that synesthetic perception is perceptual, automatic, and effortless.
Join my talk (Thursday, early morning, Color II) to learn how the qualia of synesthesia can be inferred from pupil size.
Join and (or) say hi!

7 months ago

Excited to give a talk at #ECVP2025 (Tuesday morning, Attention II) on how spatially biased attention during VWM does not boost excitability the same way it does when attending the external world, using Rapid Invisible Frequency Tagging (RIFT). @attentionlab.bsky.social @ecvp.bsky.social

7 months ago

Excited to share that I’ll be presenting my poster at #ECVP2025 on August 26th (afternoon session)!

🧠✨ Our work focuses on the dynamic competition between bottom-up saliency and top-down goals in early visual cortex, using Rapid Invisible Frequency Tagging (RIFT)

@attentionlab.bsky.social @ecvp.bsky.social

7 months ago
Preview
Anticipated Relevance Prepares Visual Processing for Efficient Memory-Guided Selection

Excited to present at #ECVP2025 - Monday afternoon, Learning & Memory - about how anticipating relevant visual events prepares visual processing for efficient memory-guided visual selection! 🧠🥳
@attentionlab.bsky.social @ecvp.bsky.social

Preprint for more details: www.biorxiv.org/content/10.1...

7 months ago
Data saturation for gaze heatmaps: initially, each additional participant brings NSS or AUC (measures of heatmap similarity) considerably closer to the full-sample value, but the returns diminish at higher n.

Gaze heatmaps are popular, especially among eye-tracking beginners and in many applied domains. How many participants should be tested?
It depends, of course, but our guidelines help you navigate this in an informed way.

Out now in BRM (free) doi.org/10.3758/s134...
@psychonomicsociety.bsky.social
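For readers unfamiliar with the similarity measures mentioned above: NSS (Normalized Scanpath Saliency) z-scores a heatmap and averages its values at the observed fixation locations, so it grows when fixations land on the map's hot spots. A toy numpy sketch (function name and data are ours, not the paper's pipeline):

```python
import numpy as np

def nss(heatmap, fixations):
    """Normalized Scanpath Saliency: z-score the heatmap, then average
    its values at the fixated pixels. Higher values mean the fixations
    fall on hotter regions of the (e.g., group-level) heatmap."""
    z = (heatmap - heatmap.mean()) / heatmap.std()
    rows, cols = zip(*fixations)
    return z[list(rows), list(cols)].mean()

rng = np.random.default_rng(1)
heatmap = rng.random((50, 50))
heatmap[20:30, 20:30] += 2.0            # one salient hotspot

on_hotspot = [(25, 25), (22, 27)]       # fixations inside the hotspot
off_hotspot = [(5, 5), (40, 10)]        # fixations elsewhere
print(nss(heatmap, on_hotspot) > nss(heatmap, off_hotspot))  # True
```

Saturation curves like those in the paper compare such scores for heatmaps built from growing subsamples against the full-sample heatmap.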

8 months ago

Thrilled to share that I successfully defended my PhD dissertation on Monday June 16th!

The dissertation is available here: doi.org/10.33540/2960

10 months ago

Now published in Attention, Perception & Psychophysics @psychonomicsociety.bsky.social

Open Access link: doi.org/10.3758/s134...

10 months ago

And Monday morning:

@suryagayet.bsky.social
has a poster (pavilion) on:
Feature Integration Theory revisited: attention is not needed to bind stimulus features, but prevents them from falling apart.

Happy @vssmtg.bsky.social #VSS2025 everyone, enjoy the meeting and the very nice coffee mugs!

11 months ago

@vssmtg.bsky.social
presentations today!

R2, 15:00
@chrispaffen.bsky.social:
Functional processing asymmetries between nasal and temporal hemifields during interocular conflict

R1, 17:15
@dkoevoet.bsky.social:
Sharper Spatially-Tuned Neural Activity in Preparatory Overt than in Covert Attention

11 months ago

and tomorrow, Monday:

Surya Gayet in the Pavilion in the morning session:
Feature Integration Theory revisited: attention is not needed to bind stimulus features, but prevents them from falling apart.

Enjoy VSS everyone!

11 months ago

We previously showed that affordable eye movements are preferred over costly ones. What happens when salience comes into play?

In our new paper, we show that even when salience attracts gaze, costs remain a driver of saccade selection.

OA paper here:
doi.org/10.3758/s134...

11 months ago

Preparing overt eye movements and directing covert attention are neurally coupled. Yet, this coupling breaks down at the single-cell level. What about populations of neurons?

We show: EEG decoding dissociates preparatory overt from covert attention at the population level:
doi.org/10.1101/2025...

11 months ago

In our latest paper @elife.bsky.social we show that we choose to move our eyes based on effort minimization. Put simply, we prefer affordable over more costly eye movements.

eLife's digest:
elifesciences.org/digests/9776...

The paper:
elifesciences.org/articles/97760

#VisionScience

1 year ago
Heat map of gaze locations overlaid on top of a feature-rich collage image. There is a seascape with a kitesurfer, mermaid, turtle, and more.


New preprint!

We present two very large eye-tracking datasets of museum visitors (4-81 y.o.!) who free-viewed (n=1248) or searched for a +/x (n=2827) in a single feature-rich image.

We invite you to (re)use the dataset and provide suggestions for future versions 📋

osf.io/preprints/os...

1 year ago