
Posts by Antonin Fourcade

🎉 Excited to share the fruits of an awesome collaboration with @aleksanpi! Check out our new preprint!
Thread below 🧵👇

1 month ago 6 0 0 0

Wondering how to go from raw peripheral physio signals (ECG, PPG, respiration) to anything meaningful? Feeling lost in the labyrinth of analysis workflows?

BBSIG is here for you! 🫀🫁🧠
👉 www.bbsig.de

We provide easy-to-use pipelines with recommended steps & benchmarked functions

#neuroskyence
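For a flavor of what such a pipeline automates, here is a deliberately naive R-peak and heart-rate sketch in plain NumPy. This is not BBSIG code — the threshold, the peak picker, and the synthetic signal are all illustrative:

```python
import numpy as np

def detect_r_peaks(ecg, fs, threshold=0.5):
    """Naive R-peak picker: local maxima above `threshold`."""
    peaks = []
    for i in range(1, len(ecg) - 1):
        if ecg[i] > threshold and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]:
            peaks.append(i)
    return np.array(peaks)

def mean_heart_rate(peaks, fs):
    """Mean heart rate (bpm) from R-peak sample indices."""
    rr = np.diff(peaks) / fs          # RR intervals in seconds
    return 60.0 / rr.mean()

# Synthetic "ECG": one sharp spike per second at fs = 250 Hz (i.e. 60 bpm)
fs = 250
ecg = np.zeros(10 * fs)
ecg[::fs] = 1.0
peaks = detect_r_peaks(ecg, fs)
print(round(mean_heart_rate(peaks, fs)))  # 60
```

A real pipeline would add filtering, benchmarked peak detectors, and artifact handling — which is exactly the labyrinth the recommended steps are meant to guide you through.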

1 month ago 26 8 3 0
Preview
Your Input for AffectTracker 2D Wanna know what your participants really feel when watching movies? Current affective science often relies on static summary ratings that miss the rich temporal dynamics of our emotions. To bridge th...

Glad to hear! Too bad you’re not there this year, but you can still tell us how you’d like to use it right here:
docs.google.com/forms/d/e/1F...
That would be super valuable to us!

1 month ago 1 0 0 0
Post image

@martager.bsky.social
📍Special Talk
📆 Mar 11, 13:00

To conclude, meet the Brain-Body Analysis Special Interest Group (BBSIG): a collaborative initiative to benchmark & standardize peripheral physio signals (ECG, PPG, RESP) preprocessing and analysis. Check our pipelines from v0.0.1! 🫀🫁🧠
👉 bbsig.de

1 month ago 9 2 0 2

Another poster at the #MindBrainBody Symposium. We need your input!

1 month ago 5 1 1 0

Come check out our work at the #MindBrainBody Symposium!

1 month ago 4 1 0 0
Preview
From Body to Brain and Back: Multimodal Evidence for Interoceptive Alterations in Schizophrenia Spectrum Disorders When the brain and body misalign, emotional experience and sense of reality can be disrupted. Although such atypical experiences are central to schizophrenia spectrum disorders (SSD), interoception, p...

How the brain listens to the body matters.
Our new preprint investigates interoceptive processing in schizophrenia spectrum disorders across phenomenology, behavior, and heartbeat-evoked brain responses. 🧠🫀DOI: doi.org/10.64898/202...
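At their core, heartbeat-evoked responses are EEG averages time-locked to R-peaks. A toy NumPy sketch (not the preprint's analysis — the window, channel count, and synthetic data are made up):

```python
import numpy as np

def heartbeat_evoked_average(eeg, r_peaks, fs, tmin=-0.1, tmax=0.5):
    """Average single-channel EEG epochs time-locked to R-peaks (toy HEP).

    tmin/tmax: epoch window in seconds relative to each R-peak.
    """
    pre, post = int(-tmin * fs), int(tmax * fs)
    epochs = [eeg[p - pre:p + post] for p in r_peaks
              if p - pre >= 0 and p + post <= len(eeg)]
    return np.mean(epochs, axis=0)

# Toy data: a deflection 200 ms after every R-peak, buried in noise
fs = 250
rng = np.random.default_rng(0)
eeg = rng.normal(0, 0.1, 60 * fs)
r_peaks = np.arange(fs, len(eeg) - fs, fs)        # one "heartbeat" per second
for p in r_peaks:
    eeg[p + int(0.2 * fs)] += 1.0                  # evoked response at +200 ms
hep = heartbeat_evoked_average(eeg, r_peaks, fs)
peak_latency_s = (np.argmax(hep) - int(0.1 * fs)) / fs
print(peak_latency_s)  # 0.2
```

Averaging across heartbeats cancels the noise, and the response locked to the heartbeat emerges at its true latency.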

3 months ago 15 8 0 1

Trying to build an experiment in Unity and slowly losing your patience?
Spoiler: that’s completely normal!

Meet EDIA - a modular framework for building studies in Unity.
🧩 Reusable modules
📊 Data sync
🕶️ Multi-headset support

And yes, a “Find Waldo” demo is included - because science should be fun!

5 months ago 6 3 0 0

Symposium 1.1, here we go!

To read more about AffectTracker, check out our latest publication: doi.org/10.3389/frvi...

@toninfrc.bsky.social @therealspr.bsky.social

6 months ago 10 5 0 0
Post image

Happy also to chat about our Brain-Body Analysis Special Interest Group (BBSIG) pipelines for preprocessing and analysing ECG, PPG and respiration (soon), openly available and ready-to-use with BIDS data as Jupyter Notebooks 🫀🫁

Work of 20+ wonderful collaborators! ✨

📑 Documentation: www.bbsig.de
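On the BIDS side, a minimal reader for `*_physio.tsv.gz` recordings might look like this — a sketch following the BIDS physio convention (column names live in the JSON sidecar, the TSV has no header row); the file names and values below are invented:

```python
import gzip
import json
import pathlib
import tempfile

def read_bids_physio(tsv_gz, sidecar_json):
    """Read a BIDS physiological recording into a dict of column -> samples."""
    meta = json.loads(pathlib.Path(sidecar_json).read_text())
    with gzip.open(tsv_gz, "rt") as f:
        rows = [[float(v) for v in line.split("\t")] for line in f]
    data = {name: [r[i] for r in rows] for i, name in enumerate(meta["Columns"])}
    return data, meta["SamplingFrequency"]

# Minimal round-trip with made-up file names and values
d = pathlib.Path(tempfile.mkdtemp())
(d / "sub-01_task-rest_physio.json").write_text(json.dumps(
    {"SamplingFrequency": 250, "StartTime": 0,
     "Columns": ["cardiac", "respiratory"]}))
with gzip.open(d / "sub-01_task-rest_physio.tsv.gz", "wt") as f:
    f.write("0.1\t0.5\n0.2\t0.4\n")
data, fs = read_bids_physio(d / "sub-01_task-rest_physio.tsv.gz",
                            d / "sub-01_task-rest_physio.json")
print(fs, data["cardiac"])  # 250 [0.1, 0.2]
```

The notebooks take it from there: once data are in this shape, the preprocessing steps can run the same way for every dataset.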

6 months ago 16 6 1 0
Villringer et al. Figure 1. Conceptual framework for brain–body states

Villringer et al. Figure 2. Brain–body micro-, meso-, and macro-states can be distinguished on the basis of their duration and reversibility

'Brain–body states as a link between cardiovascular and mental health'

by Arno Villringer, Vadim Nikulin & Michael Gaebler @mbe-lab.bsky.social @michaelgaebler.com @mpicbs.bsky.social

www.cell.com/trends/neuro...

6 months ago 36 13 1 2

Check out our new article for young readers (ages 8-15) on heart-brain interactions and interoception! 🧠🫀

I had so much fun co-writing this with @agatapatyczek.bsky.social @el-rei.bsky.social with the support of @michaelgaebler.com ✍️

👉 Share it widely with curious young minds
Yay for #scicomm

7 months ago 32 11 1 1
GitHub - afourcade/AffectTracker: real-time continuous rating of affective experience in immersive virtual reality

Our studies confirmed AffectTracker is reliable, with high user experience and low interference. It opens new avenues for linking subjective experience to physiological dynamics. The tool is open-source and available on GitHub!
#OpenScience

6 months ago 13 2 1 1
Video

AffectTracker allows users to continuously rate their valence and arousal during VR experiences. It features customizable feedback options, including a simplified affect grid and a novel abstract shape ("Flubber"), designed to be intuitive and minimally interfering.
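Conceptually, the mapping could be sketched like this in Python — the actual tool is a Unity prefab, and `flubber_pulse_hz` with its parameters is hypothetical, not the published feedback mapping:

```python
def touchpad_to_affect(x, y):
    """Map a VR controller touchpad position (x, y in [-1, 1]) to a
    (valence, arousal) pair, clamping out-of-range input."""
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(x), clamp(y)   # x -> valence, y -> arousal

def flubber_pulse_hz(arousal, f_min=0.5, f_max=3.0):
    """Hypothetical feedback rule: pulsation frequency grows linearly with
    arousal (f_min at arousal = -1, f_max at arousal = +1)."""
    return f_min + (arousal + 1) / 2 * (f_max - f_min)

valence, arousal = touchpad_to_affect(0.8, 0.5)
print(valence, arousal, flubber_pulse_hz(arousal))  # 0.8 0.5 2.375
```

The point of the abstract shape is exactly this kind of indirection: the participant sees a pulsating blob rather than numbers, keeping the rating intuitive and low-interference.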

6 months ago 8 3 1 0
Video

👥An amazing team effort by:
@fra-malandrone.bsky.social
@lucyroe.bsky.social
A. Ciston
@thefirstfloor.bsky.social
A. Villringer
S. Carletto
@michaelgaebler.com

#neuroskyence #vr #emotion #affect #selfreports

6 months ago 12 3 1 2
Preview
Frontiers | AffectTracker: real-time continuous rating of affective experience in immersive virtual reality Subjective experience is key to understanding affective states, characterized by valence and arousal. Traditional experiments using post-stimulus summary rat...

📢Our peer-reviewed article about the AffectTracker is finally out! 😲🕹️📈
Traditional methods for rating emotion often miss the dynamic, moment-to-moment nature of feelings. We designed a tool to capture this continuous affective experience in real-time during dynamic emotional stimulation.

6 months ago 43 14 2 2

📣 We're at the #MindBrainBody Symposium in Berlin, starting today! Looking forward to connecting with everyone and sharing our latest research 🧠

Our group has an exciting lineup of posters - come chat with us! 💬 Check out the previews below to see where and when to meet us 📌

#MBBS24 #neuroskyence

1 year ago 18 6 1 0
Preview
Tools and Software

We centralized our open-science contributions in a new "Tools & Software" section on our website; check out

- open stimuli (e.g. 3D objects)
- open data (e.g. MindBrainBody)
- tools (e.g. excite-o-meter, AffectTracker)
- analysis scripts
- & more

www.cbs.mpg.de/departments/...

#researchtransparency

1 year ago 37 16 1 2

The 1-min videos in study 1 are monoscopic, chosen as intermediate stimuli between static images and long videos to extend the classical short event-related stimulus approach. Finding suitable free videos was also challenging. Study 2's 23-min video is stereoscopic, a step further in stimulus type.

1 year ago 2 0 0 0

3️⃣The tool offers a novel way to study affective dynamics with minimal interference, effectively capturing the nuances of subjective experience. It opens new research opportunities to link affective states with physiological dynamics
🌟Stay tuned for the full paper & we welcome feedback & discussions! 💭

1 year ago 1 0 1 0

2️⃣Empirically evaluated in 2 studies at 2 sites (Berlin & Torino; N = 134) with both shorter 1-min 360° videos (low affective variability [AV] 〰️) and longer more dynamic 23-min stimulus (high AV 📈)
Both Grid & Flubber ➡️ high user experience 😃 & low interference with the affective experience itself

1 year ago 1 0 1 0
Video

1️⃣Participants can rate in real-time and continuously, using the touchpad or joystick of a VR controller 🎮(here: HTC Vive Pro). It comprises three customizable feedback options: a simplified affect grid (Grid), an abstract pulsating variant (Flubber), and no visual feedback (Proprioceptive)

1 year ago 1 0 1 0
GitHub - afourcade/AffectTracker

👥Together with F.Malandrone @lucyroe.bsky.social A.Ciston @thefirstfloor.bsky.social A.Villringer S.Carletto @michaelgaebler.com
🛠️ Unity prefab: github.com/afourcade/Af...

1 year ago 4 1 1 0
Video

🚀 Preprint out! doi.org/10.31234/osf...
We developed, empirically evaluated and openly share **AffectTracker**, a new tool to collect continuous ratings of two-dimensional (valence and arousal) affective experience **during** dynamic emotional stimulation (e.g., 360° videos) in immersive VR! 🥽🧠🟦

1 year ago 27 13 1 1

I thought it could be nice to connect the community of researchers exploring body-brain interactions on bsky, so here is the Body-Brain Interactions Starter Pack! 🫀🫁👀🧠 #neuroskyence #academicsky

Let me know if you would like to be added or know someone to add. Enjoy and share!

go.bsky.app/Fwqeu32

1 year ago 119 64 44 6
Post image

Title: Real-time continuous rating of affective experience in immersive Virtual Reality

P.361 (Session 1)
@toninfrc.bsky.social

In collaboration with Torino University, we developed a fun and intuitive new tool to record moment-by-moment feelings!

1 year ago 10 3 1 0
Post image

We are coming to Psychologie und Gehirn 2024 (PuG) in Hamburg! Come chat with us! See some teasers in the comments 💬 #PuG2024

1 year ago 12 6 1 3

Thank you!

2 years ago 0 0 0 0

The picture was made using the AI image generator DALL-E3

2 years ago 0 0 2 0

We contribute to shedding light on the complex relationship between emotions & the nervous systems (or MindBrainBody coupling) under naturalistic stimulation. 🌟 Stay tuned for the full paper & we welcome feedback and discussions! 💭
(Illustration: DALL-E3)

2 years ago 3 0 1 0