We had an exciting Seminar chat today from Stefan van der Stigchel, head of @attentionlab.bsky.social, titled "When participants aren’t human: Rethinking online behavioral research"
Thank you to those who attended and to Stefan for the great talk and discussion on AI/bots in research!
Posts by AttentionLab Utrecht University
For those interested in (re)analyzing (video-based) pupil data and looking into a solution for the foreshortening error (resulting from changes in the angle between camera and eye), check out our open-access Behavior Research Methods paper! rdcu.be/faNse
link.springer.com/article/10.3...
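For readers new to the issue: the pupil size recorded by a camera shrinks as the angle between the camera axis and the gaze direction grows. Below is a minimal, purely illustrative Python sketch of the general first-order idea (a cosine correction); the function and argument names are made up, and this is not the specific procedure from the paper.

```python
import numpy as np

def correct_foreshortening(apparent_size, eye_xyz, gaze_xyz, camera_xyz):
    """Toy first-order correction for pupil foreshortening.

    The pupil is roughly a disc. When the angle between the eye-to-camera
    axis and the gaze direction grows, the camera views the disc at a slant
    and the recorded pupil size shrinks roughly with the cosine of that
    angle; dividing by the cosine undoes this first-order effect.
    All argument names are hypothetical; positions are 3D coordinates.
    """
    gaze_dir = np.asarray(gaze_xyz, float) - np.asarray(eye_xyz, float)
    cam_dir = np.asarray(camera_xyz, float) - np.asarray(eye_xyz, float)
    gaze_dir /= np.linalg.norm(gaze_dir)
    cam_dir /= np.linalg.norm(cam_dir)
    cos_angle = np.clip(np.dot(gaze_dir, cam_dir), 1e-6, 1.0)
    return apparent_size / cos_angle
```

For the validated correction, including how to estimate the relevant geometry from video-based trackers, see the paper itself.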
We recently warned of bots in online behavioral research. @achetverikov.bsky.social showed there is no evidence for that in our @joinprolific.bsky.social data - but that doesn't mean we're safe. Agentic AI can do behavioral tasks through prompting alone. Reply & videos: osf.io/3cztr/overview
We show that synesthesia is sensory and automatic in nature: the pupil scales with the brightness of experienced synesthetic colors. doi.org/10.7554/eLif...
Now in its new dress at @elife.bsky.social (assessed as convincing & valuable in round 1).
If anyone wants to pick up the method, happy to share & explain!
@attentionlab.bsky.social identified (what look like) #AI bots in an online response-time task (Posner cuing). Giveaways are normally distributed RTs and a lack of serial-dependence effects. Pretty troubling. www.pnas.org/doi/10.1073/...
Recent work has shown how vulnerable online survey research is to LLMs. Motivated by this, we examined our online Posner cueing data from Prolific. It's concerning. We now must carefully consider when (or whether?) online behavioral data can be trusted.
See our comment:
www.pnas.org/doi/10.1073/...
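To make the two giveaways mentioned above concrete, here is a minimal, hypothetical sketch of how one might screen a participant's RT series for them. The test choices and thresholds are illustrative assumptions, not the analysis reported in the comment.

```python
import numpy as np
from scipy import stats

def bot_flags(rts, alpha=0.05):
    """Flag two statistical signatures described in the post.

    Human RT distributions are typically right-skewed, and RTs on
    consecutive trials tend to be correlated (serial dependence). Data that
    look perfectly Gaussian and show no lag-1 autocorrelation are suspicious.
    Thresholds here are arbitrary illustrations, not validated cutoffs.
    """
    rts = np.asarray(rts, dtype=float)
    # 1) Is the RT distribution suspiciously normal (not right-skewed)?
    _, p_normal = stats.shapiro(rts)
    looks_gaussian = (p_normal > alpha) and (stats.skew(rts) < 0.2)
    # 2) Is there any lag-1 serial dependence between consecutive RTs?
    r_lag1, p_lag1 = stats.pearsonr(rts[:-1], rts[1:])
    no_serial_dependence = p_lag1 > alpha
    return {"looks_gaussian": looks_gaussian,
            "lag1_r": r_lag1,
            "no_serial_dependence": no_serial_dependence}
```

In real human data one would expect right-skewed RTs and at least some trial-to-trial dependence, so a dataset that fails both checks at once warrants a closer look.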
Synesthetes claim sensory experiences, such as seeing color when reading or hearing a (black) number. But how genuine are these reports and sensations? We introduce a rather direct measure of synesthetic perception: synesthetes’ pupils respond to evoked color as if it were real color #vision! 👁️🎨🧪
Our commentary @stigchel.bsky.social on Ruth Rosenholtz's "Visual Attention in Crisis" paper is now available:
doi.org/10.1017/S014...
We argue that effort must be considered when aiming to quantify capacity limits or a task's complexity.
Numerosity adaptation suppresses monotonic neural responses to numerosity displays in the early visual cortex, with more suppression for higher numerosity adaptors. Therefore, numerosity adaptation effects begin in early sensory stages of processing.
www.nature.com/articles/s42...
Planning on running a RIFT study? In a new manuscript, we put together the RIFT know-how accumulated over the years by multiple labs (@lindadrijvers.bsky.social, @schota.bsky.social, @eelkespaak.bsky.social, with Cecília Hustá and others).
Preprint: osf.io/preprints/ps...
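For those unfamiliar with RIFT: stimuli are flickered at frequencies too high to be consciously perceived, and the neural response at the tagging frequency indexes processing of the tagged stimulus. A toy, purely illustrative sketch of quantifying such a response is below; real RIFT pipelines typically use coherence on MEG source-level data, and the function name and parameters here are assumptions.

```python
import numpy as np

def tagging_snr(signal, fs, tag_freq, neighborhood=2.0):
    """Toy estimate of the response at a RIFT tagging frequency.

    Power at the tagging frequency divided by the mean power at nearby
    frequencies (excluding the tag itself) gives a simple SNR-like measure.
    This is only a schematic illustration, not a full RIFT analysis.
    """
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    power = np.abs(np.fft.rfft(signal * np.hanning(len(signal)))) ** 2
    at_tag = power[np.argmin(np.abs(freqs - tag_freq))]
    neighbors = power[(np.abs(freqs - tag_freq) > 0.5) &
                      (np.abs(freqs - tag_freq) < neighborhood)]
    return at_tag / neighbors.mean()
```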
Filled with a bunch of extra analyses, this is now accepted in The Journal of Neuroscience @sfn.org! You can have a sneak peek here: www.biorxiv.org/content/10.1...
Spatial attention and working memory are popularly thought to be tightly coupled. Yet, distinct neural activity tracks attentional breadth and WM load.
In a new paper @jocn.bsky.social, we show that pupil size independently tracks breadth and load.
doi.org/10.1162/JOCN...
Very happy to see this preprint out! The amazing @danwang7.bsky.social was on fire sharing this work at #ECVP2025, gathering loads of attention, and here you can find the whole thing!
Using RIFT we reveal how the competition between top-down goals and bottom-up saliency unfolds within visual cortex.
I'll show some (I think) cool stuff about how we can measure the phenomenology of synesthesia in a physiological way at #ECVP - Color II, Atrium Maximum, 9:15, Thursday.
Say hi and show your colleagues that you're one of the dedicated ones by getting up early on the last day!
🧠 Excited to share that our new preprint is out!🧠
In this work, we investigate the dynamic competition between bottom-up saliency and top-down goals in the early visual cortex using rapid invisible frequency tagging (RIFT).
📄 Check it out on bioRxiv: www.biorxiv.org/cgi/content/...
And now without Bluesky making the background black...
#ECVP2025 starts with a fully packed room!
I'll show data demonstrating that synesthetic perception is perceptual, automatic, and effortless.
Join my talk (Thursday, early morning..., Color II) to learn how the qualia of synesthesia can be inferred from pupil size.
Join and/or say hi!
Excited to give a talk at #ECVP2025 (Tuesday morning, Attention II) on how spatially biased attention during VWM does not boost excitability the same way it does when attending to the external world, using Rapid Invisible Frequency Tagging (RIFT). @attentionlab.bsky.social @ecvp.bsky.social
Excited to share that I’ll be presenting my poster at #ECVP2025 on August 26th (afternoon session)!
🧠✨ Our work focuses on the dynamic competition between bottom-up saliency and top-down goals in early visual cortex, using Rapid Invisible Frequency Tagging (RIFT).
@attentionlab.bsky.social @ecvp.bsky.social
Excited to present at #ECVP2025 - Monday afternoon, Learning & Memory - about how anticipating relevant visual events prepares visual processing for efficient memory-guided visual selection! 🧠🥳
@attentionlab.bsky.social @ecvp.bsky.social
Preprint for more details: www.biorxiv.org/content/10.1...
Data saturation for gaze heatmaps: initially, each additional participant brings the heatmap similarity to the full sample (measured as NSS or AUC) considerably closer; however, the returns diminish increasingly at higher n.
Gaze heatmaps are popular, especially among eye-tracking beginners and in many applied domains. How many participants should be tested?
It depends, of course, but our guidelines help you navigate this in an informed way.
Out now in BRM (free) doi.org/10.3758/s134...
@psychonomicsociety.bsky.social
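As a rough, hypothetical illustration of the saturation analysis described above: build heatmaps from growing subsamples of participants and compare each to the full sample, e.g., with NSS. The resolution, smoothing, and exact NSS variant below are assumptions for illustration, not the procedure from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def heatmap(fixations, shape=(768, 1024), sigma=30):
    """Fixation density map: histogram of (x, y) fixations, Gaussian-smoothed."""
    h = np.zeros(shape)
    for x, y in fixations:
        h[int(y), int(x)] += 1
    return gaussian_filter(h, sigma)

def nss(saliency_map, fixations):
    """Normalized Scanpath Saliency: mean z-scored map value at fixated pixels."""
    z = (saliency_map - saliency_map.mean()) / (saliency_map.std() + 1e-12)
    return np.mean([z[int(y), int(x)] for x, y in fixations])

def saturation_curve(fix_per_participant, step=5, seed=0):
    """NSS of subsample heatmaps against all fixations, for growing n."""
    rng = np.random.default_rng(seed)
    all_fix = [f for p in fix_per_participant for f in p]
    order = rng.permutation(len(fix_per_participant))
    curve = []
    for n in range(step, len(fix_per_participant) + 1, step):
        sub = [f for i in order[:n] for f in fix_per_participant[i]]
        curve.append((n, nss(heatmap(sub), all_fix)))
    return curve
```

Plotting the resulting (n, NSS) pairs shows the diminishing returns: the curve climbs steeply at small n and flattens as the subsample approaches the full dataset.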
Thrilled to share that I successfully defended my PhD dissertation on Monday June 16th!
The dissertation is available here: doi.org/10.33540/2960
Now published in Attention, Perception & Psychophysics @psychonomicsociety.bsky.social
Open Access link: doi.org/10.3758/s134...
And Monday morning:
@suryagayet.bsky.social
has a poster (Pavilion) on:
Feature Integration Theory revisited: attention is not needed to bind stimulus features, but prevents them from falling apart.
Happy @vssmtg.bsky.social #VSS2025 everyone, enjoy the meeting and the very nice coffee mugs!
@vssmtg.bsky.social
presentations today!
R2, 15:00
@chrispaffen.bsky.social:
Functional processing asymmetries between nasal and temporal hemifields during interocular conflict
R1, 17:15
@dkoevoet.bsky.social:
Sharper Spatially-Tuned Neural Activity in Preparatory Overt than in Covert Attention
and tomorrow, Monday:
Surya Gayet in the Pavilion in the morning session:
Feature Integration Theory revisited: attention is not needed to bind stimulus features, but prevents them from falling apart.
Enjoy VSS everyone!
We previously showed that affordable eye movements are preferred over costly ones. What happens when salience comes into play?
In our new paper, we show that even when salience attracts gaze, costs remain a driver of saccade selection.
OA paper here:
doi.org/10.3758/s134...
Preparing overt eye movements and directing covert attention are neurally coupled. Yet, this coupling breaks down at the single-cell level. What about populations of neurons?
We show that EEG decoding dissociates preparatory overt from covert attention at the population level:
doi.org/10.1101/2025...
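For readers curious what population-level decoding looks like in practice, here is a generic, minimal sketch of per-timepoint classification of two conditions from EEG sensor patterns (scikit-learn based; the classifier and preprocessing are illustrative assumptions, not necessarily what the preprint used).

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def decode_over_time(epochs, labels, cv=5):
    """Per-timepoint decoding of condition (e.g., overt vs covert) from EEG.

    epochs: array (n_trials, n_channels, n_times); labels: (n_trials,).
    Returns cross-validated accuracy at each timepoint; above-chance
    accuracy indicates the two conditions are separable at the
    population (multi-channel) level.
    """
    clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
    n_times = epochs.shape[2]
    return np.array([
        cross_val_score(clf, epochs[:, :, t], labels, cv=cv).mean()
        for t in range(n_times)
    ])
```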
In our latest paper @elife.bsky.social we show that we choose to move our eyes based on effort minimization. Put simply, we prefer affordable over more costly eye movements.
eLife's digest:
elifesciences.org/digests/9776...
The paper:
elifesciences.org/articles/97760
#VisionScience
Heat map of gaze locations overlaid on top of a feature-rich collage image. There is a seascape with a kitesurfer, mermaid, turtle, and more.
New preprint!
We present two very large eye-tracking datasets of museum visitors (4-81 y.o.!) who free-viewed (n=1248) or searched for a +/x (n=2827) in a single feature-rich image.
We invite you to (re)use the dataset and provide suggestions for future versions 📋
osf.io/preprints/os...