
Posts by Adam Zaidel


Preprint alertπŸ“’ (collab with Baolin Li and co)
Different types of prior experience affect different stages of subsequent perceptual processing:
- sensory carryover -> affects earlier neural processing
- decisional carryover -> affects later neural processing
doi.org/10.31234/osf...

4 weeks ago

#SensorySubstitution #NeuroEnhancement #Multisensory

New paper out in @cp-iscience.bsky.social by Roie Karni (PhD candidate in my lab).

We created a synthetic sound "motion" cue πŸ”Š , and found that people use it to improve perception of vestibular self-motion in space!
www.cell.com/iscience/ful...

4 weeks ago

Just out in @cognitionjournal.bsky.social!
Cross-modal #serialdependence in audio-visual temporal perception.
We found that the effect of prior choices (attractive in unimodal conditions) can 'flip' - and become repulsive when switching between modalities.
authors.elsevier.com/c/1lg2l2Hx2-...

7 months ago
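Serial dependence like this is typically quantified by relating the current trial's response error to the previous trial's stimulus. Here is a minimal, hypothetical sketch of that kind of lagged-regression analysis (not the paper's actual pipeline; the stimulus range, noise level, and bias strength are all made up for illustration):

```python
import numpy as np

# Toy sketch: measure serial dependence as the slope of current-trial
# response error against the previous-trial stimulus relative to the
# current one. Positive slope = attractive bias; negative = repulsive.

rng = np.random.default_rng(0)
n = 5000
stim = rng.uniform(-1, 1, n)                  # stimulus values, arbitrary units
noise = rng.normal(0, 0.1, n)

alpha = 0.15                                  # simulated attractive pull toward trial t-1
resp = stim + noise
resp[1:] += alpha * (stim[:-1] - stim[1:])    # bias responses toward previous stimulus

err = (resp - stim)[1:]                       # response error on trials 2..n
delta = stim[:-1] - stim[1:]                  # previous minus current stimulus
slope = np.polyfit(delta, err, 1)[0]          # recovered bias, close to alpha
print(round(slope, 2))
```

The recovered slope approximates the simulated `alpha`; in real data a negative slope under modality switching would correspond to the repulsive flip described in the post.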

Now I really have FOMO. Gotta make it to the next #IMRF2026

9 months ago
Preview: "An open-source Modular Online Psychophysics Platform (MOPP)". In recent years, there has been a growing need and opportunity to use online platforms for psychophysics research. Online experiments make it possible to evaluate large and diverse populations remotely and ...

Looking for an open-source tool to run psychophysics experiments online?
Try this... developed in the lab by PhD student Yuval Samoilov-Katz & co
arxiv.org/abs/2505.23137

9 months ago
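For readers unfamiliar with the kind of procedure such platforms run, here is a generic 1-up/2-down adaptive staircase, a textbook psychophysics method that converges near ~70.7% correct. This is a self-contained sketch, not MOPP's actual API (which isn't shown in the post):

```python
# Generic 1-up/2-down staircase: two correct responses in a row make the
# task harder (lower stimulus level); one error makes it easier.

def staircase(responses, start=10.0, step=1.0):
    """Return the stimulus level presented on each trial, given a
    sequence of correct (True) / incorrect (False) responses."""
    level, correct_streak, levels = start, 0, []
    for correct in responses:
        levels.append(level)
        if correct:
            correct_streak += 1
            if correct_streak == 2:            # 2 correct in a row -> harder
                level = max(level - step, 0.0)
                correct_streak = 0
        else:                                  # 1 error -> easier
            level += step
            correct_streak = 0
    return levels

print(staircase([True, True, False, True, True], start=10.0))
# -> [10.0, 10.0, 9.0, 10.0, 10.0]
```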
Preview: "Augmentation of self-motion perception with synthetic auditory cues". People who suffer from vestibular loss or damage have difficulty maintaining balance and perceiving their own motion in space (self-motion). Sensory augmentation of vestibular information, via other s...

New preprint from the lab πŸ“£
Sensory augmentation of vestibular perception using a synthetic auditory πŸ”Š cue
Well done to PhD student Roie Karni πŸ‘
www.biorxiv.org/content/10.1...

9 months ago

by @gdrori.bsky.social @pazbartal.bsky.social @aberling.bsky.social @urihertz.bsky.social @adamzaidel.bsky.social @royesal.bsky.social et al.; more information in the thread bsky.app/profile/roye...

1 year ago

If unclear... I'd be happy to explain via Zoom or similar

1 year ago

Cause this

1 year ago

To its own percept. So this is really strange. E.g., visual perception shifts rightward but the neuronal responses shift leftward. So I'm wondering if some interaction between the visual and vestibular receptive fields (supposedly the inputs from earlier layers), which shift in opposite directions, can

1 year ago
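The puzzle in this thread (tuning shifts opposite to the perceptual shift) has a simple toy analogue: if the tuning curves of a population shift one way while a fixed readout keeps its original labels, the decoded estimate shifts the other way. This is only a sketch under assumed Gaussian tuning and population-vector decoding, not the model from the eLife paper or the thread:

```python
import numpy as np

# Toy population of heading-tuned neurons with Gaussian tuning curves.
# Shifting every tuning curve by +delta while decoding with FIXED labels
# makes the decoded heading move by roughly -delta.

prefs = np.linspace(-40.0, 40.0, 81)   # preferred headings (deg), hypothetical
sigma = 10.0                           # tuning width (deg), hypothetical

def responses(stim, shift=0.0):
    """Gaussian tuning; `shift` moves every tuning curve rightward."""
    return np.exp(-(stim - (prefs + shift)) ** 2 / (2 * sigma ** 2))

def decode(r):
    """Population-vector readout, unaware of any tuning shift."""
    return np.sum(r * prefs) / np.sum(r)

stim = 5.0
before = decode(responses(stim))             # decoded heading ~ 5 deg
after = decode(responses(stim, shift=4.0))   # decoded heading ~ 1 deg
print(before, after)
```

The point of the sketch is only directional: a rightward receptive-field shift in the inputs can coexist with a leftward shift in the downstream estimate, depending on how the readout is wired.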

Not slow...just partial information :). In our paper the visual and vestibular perception shift in opposite directions (in response to a systematic heading discrepancy). In VIP (unlike other areas) the neurons of one cue (vest) shift with the vest percept. But the other cue (visual) shifts opposite

1 year ago

Ok, I get that the model itself doesn't model adaptation. But my thought (perhaps naive) was that a shift in the receptive fields of the different sensory inputs (to the trained, static, model) could elicit this effect

1 year ago

Hi Gunnar,
looks interesting!
I'm wondering if your model can explain another strange phenomenon that we saw in VIP neurons - they recalibrate to multisensory cues counterintuitively - sometimes opposite to behavior (elifesciences.org/articles/828...)?
Can this be explained by the gain modulation?

1 year ago

Thanks!

1 year ago

Thank you!

1 year ago

Cool - thanks!

1 year ago

Hey, how do I find a starter pack (in general - and this one you mention in particular)?

1 year ago

Thanks John!
I'm new here - so this is exactly what I need :)

1 year ago