
Posts by Amin Nejatbakhsh

I'm very excited about this work, which can open up new ways of analyzing neural data. If you're at NeurIPS, consider checking out Victor's poster.

4 months ago

Our work establishes a framework for modeling neural data under interventions. Our paper and code are available here:

Paper: openreview.net/pdf?id=n7qKt...
Code: github.com/amin-nejat/i...

Finally, a huge thanks to my co-author Yixin Wang for her contributions (n/n).

9 months ago

In electrophysiological recordings from the monkey prefrontal cortex during electrical micro-stimulation, we show that iSSM generalizes to unseen test interventions, an important property of identifiable models (8/n).

9 months ago

We apply iSSM to calcium recordings with photo-stimulation from the mouse ALM during a short-term memory task. The latent variables inferred by iSSM distinguish correct from incorrect trials, demonstrating their behavioral relevance (7/n).

9 months ago

In linear models of persistent activity in working memory, we show that iSSM can recover the true connectivity matrix with high precision. For partial observations, this identification problem was originally posed as an open problem in the literature (Qian et al., 2024) (6/n).

9 months ago

In models of motor cortex dynamics with linear dynamics and nonlinear observations, we show that iSSM can identify the underlying dynamics and emissions, and recover the true latent variables. The accuracy of this recovery improves when more interventions are applied (5/n).
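A minimal sketch of this generative family, assuming rotational linear latent dynamics and a tanh emission nonlinearity (both illustrative choices, not the paper's exact setup):

```python
import numpy as np

rng = np.random.default_rng(1)
T, d_latent, d_obs = 200, 2, 10

A = np.array([[0.95, 0.2], [-0.2, 0.95]])   # stable, rotational linear dynamics
B = rng.standard_normal((d_obs, d_latent))  # mixing weights for the emission

X = np.zeros((T, d_latent))  # latent trajectories
Y = np.zeros((T, d_obs))     # observations
x = rng.standard_normal(d_latent)
for t in range(T):
    x = A @ x + 0.1 * rng.standard_normal(d_latent)            # dynamics noise
    Y[t] = np.tanh(B @ x) + 0.05 * rng.standard_normal(d_obs)  # nonlinear emission
    X[t] = x
```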

9 months ago

Under some assumptions (bounded completeness of the observation noise, injectivity of the mixing function, and faithfulness) we prove that, under sufficiently diverse interventions, iSSM is able to recover the true latents, dynamics, emissions, and noise parameters (4/n).

9 months ago

We propose interventional state space models (iSSM), a statistical framework for jointly modeling observational and interventional data. Unlike a standard SSM, iSSM models interventions causally: an intervention decouples a node from its causal parents (3/n).
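As a rough illustration of this decoupling (names and details are assumptions, not the paper's implementation), an intervened latent coordinate can be clamped to the stimulation value, severing its dependence on the previous state:

```python
import numpy as np

rng = np.random.default_rng(0)

def issm_step(x, A, rng, intervened=None, value=0.0, noise_scale=0.1):
    """One latent transition; intervened coordinates are decoupled
    from their causal parents and clamped to the stimulation value."""
    x_next = A @ x + noise_scale * rng.standard_normal(x.shape)
    if intervened is not None:
        x_next[intervened] = value  # sever dependence on the previous state
    return x_next

A = np.array([[0.9, 0.3], [-0.3, 0.9]])
x = np.array([1.0, 0.0])
x = issm_step(x, A, rng, intervened=0, value=2.0)  # stimulate unit 0
```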

9 months ago

Can we use interventional data to identify the dynamics? Intuitively, interventions kick the state of the system outside of its attractor manifold, allowing for the exploration of the state space (Jazayeri et al 2017) (2/n).

9 months ago

In neuroscience, we often ask: which dynamical system generated the data? However, our ability to distinguish between dynamical hypotheses is hindered by model non-identifiability. For example, the two systems below are indistinguishable using observational data (1/n).
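A standard way to see this non-identifiability (an illustrative example, not necessarily the paper's): two linear state-space models related by an invertible change of latent coordinates S produce identical observations:

```python
import numpy as np

# System 1: a latent linear dynamical system with a linear readout
A = np.array([[0.9, -0.2], [0.1, 0.8]])   # dynamics
C = np.array([[1.0, 0.5]])                # readout

# System 2: the same system after an invertible change of latent basis S
S = np.array([[2.0, 1.0], [0.0, 1.0]])
A2 = S @ A @ np.linalg.inv(S)
C2 = C @ np.linalg.inv(S)

# Trajectories from matched initial conditions are observationally
# identical, so observational data cannot tell the systems apart
x, x2 = np.array([1.0, -1.0]), S @ np.array([1.0, -1.0])
for _ in range(20):
    x, x2 = A @ x, A2 @ x2
    assert np.allclose(C @ x, C2 @ x2)
```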

9 months ago

Pleased to announce that our paper "Identifying Neural Dynamics Using Interventional State Space Models" has been selected for a poster presentation at #ICML2025. Please check the thread for paper details (0/n).

Presentation info: icml.cc/virtual/2025....

9 months ago
GitHub - amin-nejat/netrep: Some methods for comparing network representations in deep learning and neuroscience.

Huge thanks to my co-authors @vgeadah.bsky.social, @itsneuronal.bsky.social, and @lipshutz.bsky.social for their contributions. This work was funded by the Simons Foundation (n/n).

Code: github.com/amin-nejat/n...
Paper: arxiv.org/pdf/2412.14421

11 months ago

We compute pairwise distances for 2 pretrained models and 10 input prompts. Our results suggest that Causal OT (and SSD) mainly depend on the prompt, regardless of the model. This is reflected in the similar pattern in the four quadrants of the distance matrices (11/n).

11 months ago

Our final result is on latent text-to-image diffusion models. We took pretrained models, generated text-conditional samples, and decoded the mean trajectories into images. The lack of structure in the means suggests that stochasticity and dynamics are critical for image generation (10/n).

11 months ago

We show that Causal OT can exploit across-time correlations to distinguish between the three systems (9/n).

11 months ago

Our second example focuses on distinguishing between flow fields. We generated data from three dynamical systems (saddle, point attractor, and line attractor) and adversarially tuned the parameters such that the marginal distributions from all these systems become the same (8/n).

11 months ago

We apply this intuition to the leading model of preparatory dynamics in the motor cortex. We show Causal OT can distinguish between the readout (i.e. muscle activity) and motor dynamics even when preparatory activity lies in the low-variance dimensions of population dynamics (7/n).

11 months ago

Our first experiment on a toy 1-d example builds an important intuition that Causal OT can distinguish between systems where the past can be more or less predictive of the future, even when the marginal statistics are exactly the same (6/n).
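A hedged reconstruction of this kind of toy example: an i.i.d. Gaussian sequence and an AR(1) process with matched marginal variance have the same marginal statistics, but only the AR(1) past is predictive of its future:

```python
import numpy as np

rng = np.random.default_rng(0)
T, n, phi = 100, 5000, 0.9

# System A: i.i.d. Gaussian, unit marginal variance
xa = rng.standard_normal((n, T))

# System B: AR(1) with innovations scaled so the marginal variance is also 1
xb = np.zeros((n, T))
xb[:, 0] = rng.standard_normal(n)
for t in range(1, T):
    xb[:, t] = phi * xb[:, t - 1] + np.sqrt(1 - phi**2) * rng.standard_normal(n)

# Marginal variances match, but lag-1 correlations differ (~0 vs ~0.9)
print(xa.var(), xb.var())
print(np.corrcoef(xa[:, :-1].ravel(), xa[:, 1:].ravel())[0, 1],
      np.corrcoef(xb[:, :-1].ravel(), xb[:, 1:].ravel())[0, 1])
```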

11 months ago

We then introduce Causal OT, a distance metric that respects both stochasticity and dynamics. Causal OT admits a closed-form solution for Gaussian processes and, importantly, respects time causality, a property useful for processes with varying predictability (5/n).
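For context (this is the classical non-causal baseline, not the paper's Causal OT, which additionally constrains the coupling to be adapted in time), the 2-Wasserstein distance between Gaussians has a closed form:

```python
import numpy as np
from scipy.linalg import sqrtm

def w2_gaussian(m1, C1, m2, C2):
    """Closed-form (Bures) 2-Wasserstein distance between two Gaussians.
    This unconstrained version ignores time causality; Causal OT further
    restricts the coupling to be adapted to the past."""
    s1 = sqrtm(C1)
    cross = sqrtm(s1 @ C2 @ s1)
    d2 = np.sum((m1 - m2) ** 2) + np.trace(C1 + C2 - 2 * cross)
    return np.sqrt(max(np.real(d2), 0.0))

# Two zero-mean Gaussian processes over T=3 time steps with equal
# marginal variances but different temporal correlation
T = 3
C_iid = np.eye(T)
C_ar = np.array([[1.0, 0.9, 0.81], [0.9, 1.0, 0.9], [0.81, 0.9, 1.0]])
print(w2_gaussian(np.zeros(T), C_iid, np.zeros(T), C_ar))
```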

11 months ago

We argue that neither stochasticity nor dynamics alone is sufficient to capture similarities in noisy dynamical systems. This is important because many recent models have both of these components (e.g. diffusion models and biological systems) (4/n).

11 months ago

Several distance metrics have been proposed to measure representational similarity, covering a range of assumptions. Most assume deterministic responses to inputs, while more recent methods account for stochastic or dynamic responses (3/n).

11 months ago

This paradigm allows us to analyze the shape space, a space where each point corresponds to a network and distances reflect the (dis)similarity between representations. Williams et al (2021) showed that analyzing shape space helps us understand the variability of representations across models (2/n).

11 months ago

A central question in AI is to understand how hidden representations are shaped in models. A useful paradigm is to define metric spaces that quantify differences in representations across networks. Given two networks, the goal is to compare the high-dimensional responses to the same inputs (1/n).
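As a concrete illustration of this paradigm (a Procrustes shape distance, one common choice in this literature, not this paper's contribution):

```python
import numpy as np

def procrustes_distance(X, Y):
    """Distance between two response matrices (stimuli x units) after
    centering, normalization, and optimal orthogonal alignment."""
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    X = X / np.linalg.norm(X)
    Y = Y / np.linalg.norm(Y)
    # Optimal rotation comes from the SVD of the cross-covariance
    U, s, Vt = np.linalg.svd(X.T @ Y)
    return np.sqrt(max(2 - 2 * s.sum(), 0.0))

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))                       # network 1 responses
Y = X @ np.linalg.qr(rng.standard_normal((20, 20)))[0]   # rotated copy
print(procrustes_distance(X, Y))  # ~0: same shape up to rotation
```

In the shape-space view, each network is a point and such pairwise distances define the geometry of the space.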

11 months ago
ICLR 2025 (Oral): Comparing noisy neural population dynamics using optimal transport distances

Excited to announce that our paper "Comparing noisy neural population dynamics using optimal transport distances" has been selected for an oral presentation at #ICLR2025 (top 1.8% of papers). Check the thread for paper details (0/n).

Presentation info: iclr.cc/virtual/2025....

11 months ago