JOB ALERT: PhD opening in my lab!
@cimecunitrento.bsky.social
in Italy, as part of an Italian FIS3 starting grant.
The project will use advanced MEG analysis methods to investigate how the naturalistic hierarchical structure of our world facilitates predictive neural processing.
Posts by Dorottya Hetenyi
happy to announce the official release of my first R package on CRAN! 🎉
building on the HMeta-d toolbox by @smfleming.bsky.social, the hmetad package allows users to fit the meta-d' model of confidence ratings using a familiar brms/lme4-style formula syntax
Excited to share that our MEG project is now out in Current Biology! We show how visual content codes relate to motor oscillations in telling time.
Huge thanks to Quirin Gehmacher, Peter Kok, Matt Davis and Clare Press (bsky links below).🧵
authors.elsevier.com/sd/article/S...
Another excellent paper from @benjyb.bsky.social showing how we align our perceptual judgements with others. Read it and love it (though I trust you're already aware of it).
Out now in @nconsc.bsky.social 🧠😶🌫️
academic.oup.com/nc/article/2...
So excited to see this lovely paper with @benjyb.bsky.social, @matanmazor.bsky.social and @giuliacabbai.bsky.social published in @nconsc.bsky.social!
academic.oup.com/nc/article/2...
Job alert 🚨 Fully funded PhD position available in our Maastricht lab! Are you interested in the relationship between memory and prediction, and have a track record of neuroimaging/decoding? Please apply! #NeuroJobs
www.academictransfer.com/nl/jobs/3576...
Passionate about women's mental health?
Interested in brain stimulation?
Excited by cutting-edge neurotech?
Come do a PhD with me!
www.findaphd.com/phds/project...
(thread)
High-level visual surprise is rapidly integrated during perceptual inference!
🚨 New paper 🚨 out now in @cp-iscience.bsky.social with @paulapena.bsky.social and @mruz.bsky.social
www.cell.com/iscience/ful...
Summary 🧵 below 👇
I am very excited and grateful to have been awarded a Consolidator grant by @erc.europa.eu. We will use it to investigate the role of memory in perception, focused on the hippocampus. Thank you to all the colleagues in my team and the department for their support in making this possible!
And it's out now in Cortex: www.sciencedirect.com/science/arti...
Summary below 🧵
I’m excited to share the first preprint from my PhD project!
Together with Daniel Kaiser (@dkaiserlab.bsky.social), we investigated how internal models shape inter-individual differences in the perception and neural processing of natural scenes.
Preprint: osf.io/preprints/ps...
1/n
I am very excited to share our new preprint, spearheaded by the brilliant @lunahuestegge.bsky.social, w/ @peterkok.bsky.social and others: ‘An attempt to push mental imagery over the reality threshold using non-invasive brain stimulation’
doi.org/10.31234/osf...
Such a perfectly fitting recognition, congraaaats! 🫶
New BBS article w/ @lauragwilliams.bsky.social and Hinze Hogendoorn, just accepted! We respond to a thought-provoking article by @smfleming.bsky.social & @matthiasmichel.bsky.social, and argue that it's premature to conclude that conscious perception is delayed by 350-450ms: bit.ly/4nYNTlb
A ✨bittersweet✨ moment – after 5 years at UCL, my final first-author project with @smfleming.bsky.social is ready to read as a preprint! 🥲
I said it before and I'll say it again: Cognition is rhythmic
Contents of visual predictions oscillate at alpha frequencies
www.jneurosci.org/content/earl...
#neuroscience
Thanks Ben!! 😊
thanks so much Pete!😊
Super happy to share my very first first-author paper, out now in @sfnjournals.bsky.social! We show content-specific predictions are represented in an alpha rhythm. It’s been a beautiful, inspiring, yet challenging journey.
Huge thanks to everyone, especially @peterkok.bsky.social @jhaarsma.bsky.social
From line drawings to scene perception — our new review argues for moving beyond experimenter-driven manipulations toward participant-driven approaches to reveal what’s in our internal models of the visual world. 👁️✍️🛋
royalsocietypublishing.org/doi/10.1098/...
Hellohello #ICON2025! Please come and have a chat with me today at 10:45am about some content-specific alpha fluctuations! 🤓
At long last, the preprint of our MEG + RIFT study, and the final paper from my PhD with @olejensen.bsky.social. We show that strong pre-search alpha oscillations are associated with faster responses in visual search: www.biorxiv.org/content/10.1...
@thechbh.bsky.social #neuroskyence
Very proud to share this one🥹! We show that personalized signatures of brain activity are heritable and relate to the expression of specific genes. That means my brain-fingerprint is very similar to my twin brother's! #ResearchIsMeSearch🧠 🧬 ♊️
very cool stuff from brilliant people, an absolute must-read!!🫶
Hellohello again! Tomorrow at 5:15pm I’m giving a short talk on our latest MEG study about our very well-loved oscillating perceptual predictions @meguki2025.bsky.social. Come by and talk brains!
There are amazing talks and very cool science happening all around! MEGUKI 2025👌
Out now @cp-trendscognsci.bsky.social, w/ @akalt.bsky.social & @drmattdavis.bsky.social.
Are sensory sampling rhythms fixed by intrinsically determined processes, or do they couple to external structure? Here we highlight the incompatibility between these accounts and propose a resolution [1/6]
NEW DEADLINE: Friday 20th June 🚨
MEG‑UKI 2025 lands in London (16–18 July)! A 3-day deep dive into the brain—naturalistic neuroscience, OP-MEG, cutting-edge methods, and real-world impact. Keynotes by Dominik Bach & Jamie Ward. Art, abstracts, and more!
Register here: meguk.ac.uk/registration/
Finally out in @commsbio.nature.com!
Using MEG and Rapid Invisible Frequency Tagging (RIFT) in a classic visual search paradigm, we show that neuronal excitability in V1 is modulated in line with a priority-map-based mechanism to boost targets and suppress distractors!
Now published in Attention, Perception & Psychophysics @psychonomicsociety.bsky.social
Open Access link: doi.org/10.3758/s134...