
Posts by Christoph Rühlemann

Frontiers | Do frequency and frequency-related measures signal turn completion? An exploratory corpus study Speakers in conversation have access to word frequency information stored in the mental lexicon. This article examines whether word frequencies play a role a...

Can changes in word frequencies signal turn completion/continuation? I’m trying to answer this question here: www.frontiersin.org/articles/10....

4 months ago 1 1 0 0
Speech planning depends on next-speaker selection: evidence from pupillometry in question–answer sequences in naturalistic triadic conversation Next-speaker selection, which controls who should speak next, is fundamental to turn taking. While it is central in Conversation Analysis, little is known about its cognitive repercussions. We draw...

Happy to see this out in Discourse Processes, after more than 2 years' hard work:

our proposal to extend the current consensus model of speech planning by factoring in next-speaker selection:

www.tandfonline.com/doi/full/10....

7 months ago 2 0 0 0
Post image

Still fascinated by how we can visualize meaningful patterns in linguistic data and, at the same time, create aesthetically pleasing graphics. Here: mutual gaze in 'open-floor' QA sequences (i.e., where both recipients are licensed to answer, e.g., "where are you guys from?")

7 months ago 1 0 0 0
Frontiers | The effect of gesture expressivity on emotional resonance in storytelling interaction The key function of storytelling is a meeting of hearts: a resonance in the recipient(s) of the story narrator’s emotion toward the story events. This paper ...

Interested in gesture research, and how gestures impact emotions in storytelling? Check out our Frontiers paper (w/ @jamestrujillo.bsky.social), which just reached a new milestone with 500+ downloads:
www.frontiersin.org/article/1477...

8 months ago 3 0 0 0
Post image

News from the multimodal transcription front ;)

Why not include in transcripts not only data on observable behavior, such as gaze movements (cf. A-gaze, etc.) and phases of mutual gaze, but also imperceptible data such as pupil-size changes (since they index internal cognitive effort)?

10 months ago 1 0 0 0
South Africa Collides Head-On With Trump’s Claims of White Victimhood

If you've thought it merely possible, it's now time to acknowledge it as a fact: the majority of the 70+ million people who voted for Trump despite his glaring deficits did so because he wants to save white supremacy.

South Africa Collides Head-On With Trump’s Claims of White Victimhood www.nytimes.com/2025/05/22/w...

11 months ago 0 0 0 0
Mobile Eye Tracking | John Benjamins Situated within the flourishing domain of pragmatics, this volume explores the crucial role of gaze in human interaction, with a particular focus on the potential of mobile eye tracking to advance our...

Happy to be a (double) contributor (once with Elisabeth Zima & Peter Auer @UniFreiburg, once with Mathias Barthel @IDS Mannheim) to this completely OA volume on eye tracking:

www.jbe-platform.com/content/book...

11 months ago 1 0 0 0
Post image

Glad to see my "Regex in R for Multimodal Analysis" workshop on the IPrA website:

pragmatics.international/page/PreConf...

You painstakingly elaborate transcripts & annotations in ELAN but have no clue how to synthesize, aggregate, large-scale analyze & visualize these data? This workshop might be for you.

1 year ago 3 2 0 0

Also possible to read in & structure *several (e.g., hundreds of)* #CA transcripts with a *single piece of code (i.e., in one go)*

Here's one way to do this for 3 transcripts from Stivers 2021; gaps & pauses are extracted into separate columns, showing the simple transformations possible once the data are in R
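The post describes an R workflow; as an illustrative sketch of the same idea, here is a Python version that loops over several (hypothetical, made-up) transcripts at once and pulls standalone silences into their own column. The file names, example lines, and regexes are all assumptions, not the actual Stivers 2021 data.

```python
import re

# Hypothetical Jeffersonian-style transcript lines: speaker-labeled talk,
# and silences such as "(0.4)" on their own line.
transcripts = {
    "extract_01.txt": [
        "A:  where are you guys from?",
        "(0.4)",
        "B:  we're from Texas,",
    ],
    "extract_02.txt": [
        "C:  did you see that?",
        "(1.2)",
        "D:  .hh yeah::,",
    ],
}

GAP = re.compile(r"^\((\d+\.\d+)\)$")          # a silence on its own line
SPEAKER = re.compile(r"^([A-Z]\w*):\s*(.*)$")  # "A:  talk"

rows = []
for fname, lines in transcripts.items():       # all transcripts in one go
    for line in lines:
        line = line.strip()
        if m := GAP.match(line):
            # the gap goes into its own column, not the 'talk' column
            rows.append({"file": fname, "speaker": None,
                         "talk": None, "gap": float(m.group(1))})
        elif m := SPEAKER.match(line):
            rows.append({"file": fname, "speaker": m.group(1),
                         "talk": m.group(2), "gap": None})
```

In R the same shape would come from `lapply()` over the files plus `regmatches()`, binding the results into one data frame.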

1 year ago 1 0 0 0
Post image

Version #1

Version #2, with Gaps extracted into separate column

Anybody interested in converting #CA transcripts into machine-readable data frames? It can be done with regular expressions in R.

Here's an example from Stivers (2021), extract (4):
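The post's example is in R; a comparable sketch in Python shows the kind of transformation involved: line-numbered transcript lines are parsed into a table, with inline silences pulled out of the talk into a separate gap column (the "Version #1 → Version #2" step). The sample lines are invented for illustration, not the actual Stivers (2021) extract (4).

```python
import re

# Hypothetical lines loosely in Jeffersonian style: line number,
# optional speaker label, talk; silences like "(0.2)" may occur inline.
lines = [
    "01  A:  so where are you guys from? (0.2) originally,",
    "02      (0.5)",
    "03  B:  uh we're both from Ohio,",
]

LINE = re.compile(r"^(?P<no>\d+)\s+(?:(?P<spk>[A-Z]):\s+)?(?P<talk>.*)$")
GAP = re.compile(r"\((\d+\.\d+)\)")

table = []
for raw in lines:
    m = LINE.match(raw)
    talk = m.group("talk").strip()
    gaps = GAP.findall(talk)                       # pull silences out...
    talk = re.sub(r"\s{2,}", " ", GAP.sub("", talk)).strip()  # ...and drop them
    table.append({"line": int(m.group("no")),
                  "speaker": m.group("spk"),
                  "talk": talk or None,
                  "gap": float(gaps[0]) if gaps else None})
```

Once the transcript sits in a data frame like this, filtering, aggregating, and visualizing gaps and turns becomes ordinary table work.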

1 year ago 1 0 0 1
Post image

On my way, via Frankfurt, to Marburg to teach “Regex in R”

1 year ago 2 0 0 0
Frontiers | The effect of gesture expressivity on emotional resonance in storytelling interaction

Happy to see this published (with @jamestrujillo.bsky.social):

www.frontiersin.org/articles/10.....

1 year ago 2 0 0 0

Yes, very exciting to have our paper coming online soon! www.frontiersin.org/journals/psy...
We looked at how "gesture expressivity" develops over the course of a storytelling, and how this gesture expressivity is associated with emotional resonance between storyteller and listener.

1 year ago 7 1 0 0
Post image

So I’ll start my Bluesky career with some happy news @jamestrujillo.bsky.social

1 year ago 4 0 1 1

Hi, I’m on Bluesky! Got fed up with the dirt on the X ground

1 year ago 2 0 0 0