
Posts by Gal Chen


With @deouell.bsky.social and Ran Hassin. Read more in the open-access version of the paper:
journals.sagepub.com/doi/10.1177/...

2 weeks ago

Negative words are detected *less* often when you focus on another task, regardless of task difficulty, the specific word set, or potentially confounding features. The decision of the system to disengage from a primary (visual, in our case) task might not conform to our conscious intuition.

2 weeks ago

We found that, while low-level/phonetic features and pronunciation intelligibility play a large role in determining awareness, word valence plays a role too, suggesting semantic information is prioritized before conscious awareness. And not in the direction you'd think!

2 weeks ago
Paradigm description (see ms fig 1)

We designed a dual task in which dozens of spoken words evade consciousness, but without degrading the words - if you knew they were coming, you would've heard them. Which features mitigate this "inattentional deafness" that is so familiar in daily life?

2 weeks ago
Results of exp 2 (lower awareness rate for negative words)

New in Psychological Science! We've been studying visual non-conscious prioritization processes for decades, but not a lot is known about non-conscious speech processing. Does it conform to the same principles, despite the very different way it functions?
#consciousness #psychscisky #PsychSky

2 weeks ago

An extraordinary experience!! Huge thanks to all mentors and organizers.

4 months ago

First paper is now out in Cortex! We find behavioral and neural evidence for non-conscious speech processing, using a new dual-task paradigm that creates repetitive occurrences of inattentional deafness without masking or degrading stimuli. @deouell.bsky.social
www.sciencedirect.com/science/arti...

5 months ago

Sometimes when diving into a new topic, you don't even know the right keywords to look for. In that sense, LLMs can be helpful right from the start, as long as you go into the papers yourself.

7 months ago

If you mean logistic regression-like for proportion data, then yes, beta regression. But note that on the logistic scale, 50 to 40 is not like 5 to 4: the latter is actually a smaller effect size in log odds ratio (which makes sense, because the corresponding change from 95 to 96 is tiny).
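The log-odds point is easy to verify numerically. A minimal sketch (the `log_odds` helper here is just the logit function, written out rather than taken from any package):

```python
import math

def log_odds(p):
    """Logit: log odds of a proportion p."""
    return math.log(p / (1 - p))

# A 10-point drop near the middle vs. near the floor of the scale:
d_mid = log_odds(0.50) - log_odds(0.40)  # ~0.41 in log odds
d_low = log_odds(0.05) - log_odds(0.04)  # ~0.23 -- the smaller effect

print(d_mid, d_low)
```

The same asymmetry is why a linear model on raw proportions can misrank effect sizes near the boundaries of the scale.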

8 months ago
Sensory responses of visual cortical neurons are not prediction errors
Predictive coding is theorized to be a ubiquitous cortical process to explain sensory responses. It asserts that the brain continuously predicts sensory information and imposes those predictions on lo...

1/3) This may be a very important paper; it suggests that there are no prediction-error-encoding neurons in sensory areas of cortex:

www.biorxiv.org/content/10.1...

I personally am a big fan of the idea that cortical regions (allo and neo) are doing sequence prediction.

But...

🧠📈 🧪

9 months ago

When we listen to speech, we do it while constantly predicting upcoming contents. Is this prediction associated with the subjective experience of engaged, conscious listening? What happens when we fail to listen? Come take a look at my poster (P116) tomorrow at @assc28.bsky.social 16:30

9 months ago

The conclusions from 5 EEG and behavior studies draw a coherent picture: when speech is task-relevant and supraliminal, inattentional deafness might not be absolute: we can process speech contents non-consciously, and use their meaning to prioritize information for consciousness.

9 months ago

These conditions, without masking, allowed repeated cases of inattentional deafness. We then asked (1) which words are detected more often (the answer will surprise you!) and (2) what happens in the brain when we miss a word, and is it goal-dependent?

9 months ago

We are often too busy to listen to things we need to notice. How can we study the very frequent case of "hearing without listening" without maybe-too-aggressive masking? We developed a new dual task that requires noticing unexpected spoken words during visual task performance.

9 months ago

Opening a new window into auditory awareness with two new preprints!
Conscious Detection of Spoken Words Depends on Their Valence
osf.io/preprints/ps...
Neural Markers of Speech Processing During Inattentional Deafness
osf.io/preprints/ps...

9 months ago