
Seminars

January 9, 2024: (Special Time: 10AM, Physics Fellow Candidate) – Dominic Skinner, Northwestern University
Statistical Physics Of Embryonic Transcriptomes Reveals Map Of Cellular Interactions.
Host: Eric Siggia
Starting from one totipotent cell, complex organisms form through a series of differentiation events, resulting in a multitude of cell types. In the ascidian embryo, differentiation happens early. For instance, by the 32-cell stage there are at least 10 transcriptomically distinct cell states. Moreover, cells coordinate within the embryo to differentiate in an extremely precise spatial pattern. Using recent single-cell sequencing data of early ascidian embryos, we leverage natural variation together with techniques from statistical physics to investigate development at the level of a complete interconnected embryo. After robustly identifying distinct transcriptomic states or cell types, a statistical analysis reveals correlations within embryos and across cell types beyond mean expression levels. From these intra-embryo correlations, we infer minimal networks of cell-cell interactions using regularization and spin glass-like models of interacting systems, revealing spatial connections that are of key importance in development.
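The general idea of recovering a minimal interaction network from cross-sample correlations can be illustrated on a toy scale. The sketch below is an illustrative assumption, not the speaker's method (which uses regularization and spin-glass-like models): it draws correlated "embryo" samples from a known sparse precision matrix and recovers the interacting pairs by thresholding the estimated inverse covariance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: 5 cell types, two true interactions encoded as
# off-diagonal entries of a sparse precision (inverse covariance) matrix.
P = np.eye(5)
P[0, 1] = P[1, 0] = 0.4
P[2, 3] = P[3, 2] = -0.35
cov = np.linalg.inv(P)

# Each row is one "embryo": a vector of expression deviations from the mean.
X = rng.multivariate_normal(np.zeros(5), cov, size=20000)

# Invert the empirical covariance and threshold small entries to keep
# a minimal network of direct interactions.
P_hat = np.linalg.inv(np.cov(X, rowvar=False))
network = np.abs(P_hat) > 0.2
np.fill_diagonal(network, False)
print(np.argwhere(np.triu(network)))  # recovered interacting pairs
```

Thresholding the precision matrix, rather than the correlation matrix, distinguishes direct interactions from correlations merely induced through intermediaries, which is the point of inferring "minimal" networks.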
January 10, 2024: (Special Time: 12:00PM) – Uri Alon, Weizmann Institute of Science
Mathematical Essence Of Aging.
Host: Stan Leibler
Aging shows nearly universal quantitative patterns. We explain them using a stochastic ODE for damage production and removal, deduced from experiments on damage dynamics in mice and in individual bacteria, the latter done by us. This simple model explains a wide range of phenomena in human aging and age-related diseases, as well as in model organisms. It pinpoints core molecular and cellular drivers of aging, and suggests interventions that, at least in mice, can compress the relative sick span (fraction of lifespan that an individual is disabled).
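The flavor of such a damage model can be caricatured with a simulated stochastic ODE. The functional forms, parameters, and death threshold below are assumptions for the sketch, not the speaker's fitted model: production rises with age, removal saturates, and noise drives individual variability.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative damage dynamics (all forms and parameters are assumptions):
# production of damage rises linearly with age, removal saturates,
# and noise generates variability between individuals.
eta, beta, kappa, sigma = 0.002, 0.15, 0.5, 0.05
Xc = 1.0                 # death when damage X crosses this threshold
dt, t_max, n = 0.05, 200.0, 200

lifespans = []
for _ in range(n):
    X, t = 0.0, 0.0
    while t < t_max:
        drift = eta * t - beta * X / (kappa + X)  # production - saturating removal
        X = max(X + drift * dt + sigma * np.sqrt(dt) * rng.normal(), 0.0)
        t += dt
        if X >= Xc:
            break
    lifespans.append(t)

lifespans = np.array(lifespans)
print(f"median lifespan: {np.median(lifespans):.1f}")
```

Because removal saturates while production keeps rising, damage eventually escapes at late ages, producing the steep late-life increase in risk that models of this class aim to capture.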
January 11, 2024: (Special Time: 10AM, Physics Fellow Candidate) – Daniel Barabasi, Harvard University
Nature Over Nurture: How Complex Computations Emerge From Developmental Priors.
Host: Eric Siggia
An important challenge of modern neuroscience is to unveil the mechanisms shaping the wiring of the connectome, answering the difficult question of how the brain wires itself. Neuronal systems display a high degree of wiring reproducibility, such that multiple circuits and architectural features appear to be identical within a species, invariants that network models are unable to explain. This is because the architectures of neural circuits aren’t fully learned, as recent advances in systems neuroscience and AI would have us believe, nor is our wiring directly determined by our DNA; instead, our genome provides assembly rules that cells use to self-organize into a functional brain, much like how the simple rules of cellular automata generate complex emergent behavior. To illustrate how complex computations can emerge from seemingly noisy, self-assembling processes, I derive a neurodevelopmental encoding of artificial neural networks that considers the weight matrix of a neural network to be emergent from well-studied rules of neuronal compatibility. Rather than updating the network’s weights directly, we improve task fitness by updating the neurons’ wiring rules, thereby mirroring evolutionary selection on brain development. We find that our model (1) provides sufficient representational power for high accuracy on ML benchmarks while also compressing parameter count, and (2) can act as a regularizer, selecting simple circuits that provide stable and adaptive performance on meta-learning tasks. Finally, I will discuss how such physical models of directed, self-assembling systems can both (1) advance developmental understanding and (2) provide a fresh perspective on how evolution balances nature and nurture in neural circuits.
January 16, 2024: (Special Time: 10AM, Physics Fellow Candidate) – Francois Bourassa, McGill University
Theory Of Antigen Encoding And Cross-Receptor Interactions In T Cell Immunotherapy.
Host: Eric Siggia
The mechanisms connecting early T cell receptor (TCR) activation to complex T cell responses in the immune system have not been fully elucidated. Understanding these processes quantitatively is, however, crucial for fine-tuning immunotherapy treatments against cancer. To systematically map out T cell activation, the lab of Grégoire Altan-Bonnet has developed a robotic platform which tracks over days the dynamics of messenger proteins, called cytokines, produced by T cells to communicate with other cells. We found a low-dimensional representation of high-dimensional cytokine dynamics in which trajectories are ordered according to antigen strength. We termed this property “antigen encoding” and quantified it using information theory and nonlinear dynamical equations. We then leveraged these insights to disentangle cross-receptor interactions in chimeric antigen receptor (CAR) T cells used in cancer immunotherapy. In particular, we developed an adaptive model of receptor proofreading to explain antagonism (i.e. inhibition) of CAR activation by weak TCR stimulation. Our model predictions quantitatively matched experimental data, enabling us to engineer antagonism to reduce CAR T cell toxicity against healthy tissues.
January 18, 2024: (Special Time: 10AM, Physics Fellow Candidate) – Jialong Jiang, California Institute of Technology
Revealing Regulatory Network Organization Through Single-Cell Perturbation Profiling And Maximum Entropy Models.
Host: Eric Siggia
Gene regulatory networks control cellular information processing and response to signals and environmental changes. Perturbations are widely used in genetics to decode gene interactions, yet how to extract regulatory network models from large-scale single-cell perturbation profiling remains a significant challenge. We developed a framework, D-SPIN, that constructs regulatory network models from single-cell data collected across thousands of perturbations. Using the maximum entropy principle, D-SPIN identifies a unified regulatory network model of the single-cell population and the effects of each perturbation at the gene-program level. D-SPIN enables accurate network reconstruction and provides a reasoning framework for how cell states are constructed through pairwise interactions between gene programs. Using genome-wide Perturb-seq data, D-SPIN reveals different strategies of homeostasis regulation in the cancer cell stress response. Using drug profiling data, D-SPIN dissects responses in heterogeneous cell populations to elucidate how drug combinations induce novel cell states through additive recruitment of gene programs. Moreover, D-SPIN facilitates perturbation design by finding sloppy model parameters and informative perturbations to pin down these parameters. The framework extends to a wide range of applications, including signaling responses of immune populations and identifying cell-cell interaction networks from spatial transcriptomics profiling. In conclusion, D-SPIN provides a computational framework for constructing interpretable models of gene regulatory networks, revealing principles of cellular information processing and physiological control, and strategies for designing perturbations for efficient network inference.
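The maximum entropy principle behind this kind of model can be shown at toy scale. The sketch below is a schematic of the principle only, not D-SPIN's implementation: it fits a pairwise (Ising-like) model over three binary "gene programs" by gradient ascent, using exact enumeration to match the model's means and correlations to data moments.

```python
import itertools
import numpy as np

# All 2^3 spin states of three binary "gene programs" (+1 = on, -1 = off).
states = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

def moments(h, J):
    """Exact model means <s_i> and correlations <s_i s_j> by enumeration."""
    E = states @ h + np.einsum('ki,ij,kj->k', states, J, states) / 2
    p = np.exp(E)
    p /= p.sum()
    m = p @ states
    C = (states.T * p) @ states
    return m, C

# Synthetic "data" moments generated from a known model.
h_true = np.array([0.3, -0.2, 0.1])
J_true = np.zeros((3, 3))
J_true[0, 1] = J_true[1, 0] = 0.8
m_data, C_data = moments(h_true, J_true)

# Gradient ascent on the log-likelihood: the maximum entropy solution is
# the unique pairwise model whose moments match the data moments.
h, J = np.zeros(3), np.zeros((3, 3))
for _ in range(2000):
    m, C = moments(h, J)
    h += 0.2 * (m_data - m)
    dJ = 0.2 * (C_data - C)
    np.fill_diagonal(dJ, 0.0)
    J += dJ

print(np.round(J[0, 1], 2))  # approaches the true coupling 0.8
```

Exact enumeration only works for a handful of variables; at genome scale the same moment-matching logic requires sampling or mean-field approximations, which is where the real modeling effort lies.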
February 15, 2024: (Special Time: 10AM, Physics Fellow Candidate) – Nikolas Schonsheck, University of Delaware
Detecting And Learning Cyclic Structures In Neural Population Coding.
Host: Eric Siggia
Cyclic structures are a class of mesoscale features ubiquitous in both experimental stimuli and the activity of neural populations encoding them. Important examples include the encoding of head direction, grid cells in spatial navigation, and orientation tuning in visual cortex. While cyclic structures are difficult to detect and analyze with classical methods, tools from the mathematical field of algebraic topology have proven particularly effective in understanding them. Recent work by Yoon et al. develops a topological framework to match cyclic coding patterns in distinct populations that encode the same information. We leverage this framework to study the efficacy of Hebbian learning rules in propagating cyclic structures through neural systems. Our primary results are: 1) feedforward networks with connections drawn from inhibitory-biased random distributions do not reliably propagate cyclic features of neural coding; 2) updating network connections with a biologically realistic Hebbian learning rule modeling spike-timing-dependent plasticity robustly constructs networks that do propagate cyclic features; and 3) under biologically plausible parameter choices, the inhibition and propagation of such features can be modulated by the size of the output neuron population.
February 27, 2024: (4PM) – Noam Shental, Open University of Israel
High-Resolution Microbial Profiling Of Novel Niches And A Pan-Microbiome Knowledge Base.
Host: Orli Snir
I will present two computational tools for microbiome research recently developed by my group – the first allows high-resolution microbial profiling of novel niches (https://www.biorxiv.org/content/10.1101/2023.09.03.556087v1, under review), and the second is a bacterial knowledge base that has collected more than 1.5 million sequence-to-phenotype associations and allows extracting pan-microbiome biological insights (https://academic.oup.com/nar/article/51/13/6593/7199329).
March 5, 2024: (4PM) – Liat Shenhav, New York University
It’s About Time: Ecological And Eco-Evolutionary Dynamics Across The Scales.
Host: Bertrand Ottino-Loffler
Complex microbial communities play a vital role across many domains of life, from the female reproductive tract, through the oceans, to the plant rhizosphere. The study of these communities offers great opportunities for biological discovery, due to the ease of their measurement, the ability to perturb them, and their rapidly evolving nature. Yet, their complex composition, dynamic nature, and intricate interactions with multiple other systems, make it difficult to extract robust and reproducible patterns from these ecosystems. To uncover their latent properties, I develop models that combine longitudinal data analysis and statistical learning, and which draw from principles of community ecology, complexity theory and evolution. I will briefly present methods for decomposition of microbial dynamics at an ecological scale (Shenhav et al., Nature Methods; Martino & Shenhav et al., Nature Biotechnology). Using these methods we found significant differences in the trajectories of the infant microbiome in the first years of life as a function of early life exposures, namely mode of delivery and breastfeeding. I will then show how incorporating eco-evolutionary considerations allowed us to detect signals of purifying selection across ecosystems. I will demonstrate how interactions between evolution and ecology played a vital role in shaping microbial communities and the standard genetic code (Shenhav & Zeevi, Science; Liao & Shenhav, Nature Comm.). Inspired by these discoveries, I am expanding the scope beyond the microbiome, modeling multi-layered data on human milk composition. I will present results from an ongoing study in which I am building integrative models of nasal, gut and milk microbiota, combined with human milk components, to predict infant respiratory health. I found that the temporal dynamics of microbiota in the first year of life, mediated by milk composition, predict the development of chronic respiratory disease later in childhood.
These models, designed to identify robust spatiotemporal patterns, would help us better understand the nature and impact of complex ecosystems like the microbiome and human milk from the time of formation and throughout life.
March 19, 2024: (4PM) – Mason Porter, University of California, Los Angeles
Topological Data Analysis Of Spatial Systems.
Host: Bertrand Ottino-Loffler
I will discuss topological data analysis (TDA), which uses ideas from topology to quantify the “shape” of data. I will focus in particular on persistent homology (PH), which one can use to find “holes” of different dimensions in data sets. I will start by introducing these ideas and will discuss a series of examples of TDA of spatial systems. The examples that I’ll discuss include voting data, the locations of polling sites, the spread of COVID-19, and the webs of spiders under the influence of various drugs.
March 26, 2024: (4PM) – Stefano Di Talia, Duke University
Encoding Tissue Size And Shape During Vertebrate Regeneration.
Host: Woonyung Hur
Some animals have a remarkable ability to regenerate appendages and other damaged organs. I will focus on our attempts to reveal novel quantitative principles for the control of regeneration in zebrafish. I will describe how signaling waves and long-range gradients are used to control tissue growth and ensure that tissues grow back to their correct size and shape.
April 2, 2024: (4PM) – Terry Hwa, University of California, San Diego
Quantitative Rules Govern Protein Expression And Activity Across The Bacterial Phylogeny.
Host: Eric Siggia
Distinct bacterial species thrive under distinct growth conditions. Even species sharing similar optimal conditions can grow at vastly different rates; e.g., Vibrio natriegens grows more than 50% faster than E. coli and B. subtilis in the same common growth media at 37°C. What do the super-fast growers do differently? Quantitative proteomics reveal surprisingly rigid programs of proteome allocation for bacteria irrespective of the phylogeny, distinguished mainly by the speed of their enzymes and by their metabolic orientations.
April 16, 2024: (4PM) – Michail Tsodyks, Institute for Advanced Study
Studying Human Memory for Random and Meaningful Material: A Comparative Study.
Host: Merav Stern
We consider recognition and recall experiments on random lists of words vs. meaningful narratives. A mathematical model based on a specific recall algorithm for random lists established a universal relation between the number of words retained in memory and the number that can, on average, be recalled, characterized by a square-root scaling. This relation is expressed by an analytical expression with no free parameters and was confirmed experimentally to surprising precision in online experiments. To extend this research to meaningful narratives, we took advantage of recently developed large language models that can generate meaningful text and respond to instructions in plain English with no additional training necessary. We developed a pipeline for designing large-scale memory experiments and analyzing the obtained results. We performed online memory experiments with a large number of participants and collected recognition and recall data for narratives of different lengths. We found that both recall and recognition performance scale linearly with narrative length. Furthermore, to investigate the role of narrative comprehension in memory, we repeated these experiments using scrambled versions of the presented stories. We found that even though recall performance declined significantly, recognition remained largely unaffected. Interestingly, recalls in this condition seem to follow the original narrative order rather than the scrambled presentation, pointing to a contextual reconstruction of the story in memory.
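The square-root scaling can be illustrated with one simple instance of such a recall algorithm (whether this matches the speaker's exact model is an assumption): recall is a deterministic walk on a random word-similarity matrix, where each recalled word triggers its most similar word, avoiding the one just recalled, until the trajectory enters a cycle.

```python
import numpy as np

rng = np.random.default_rng(3)

def n_recalled(M, rng):
    """Number of distinct words recalled from a list of M words, modeled
    as a deterministic walk on a random symmetric similarity matrix."""
    S = rng.random((M, M))
    S = (S + S.T) / 2
    np.fill_diagonal(S, -np.inf)
    prev, cur = -1, 0
    visited, seen_pairs = set(), set()
    while (prev, cur) not in seen_pairs:   # stop once the walk cycles
        seen_pairs.add((prev, cur))
        visited.add(cur)
        order = np.argsort(S[cur])[::-1]   # most similar first
        nxt = int(order[0]) if order[0] != prev else int(order[1])
        prev, cur = cur, nxt
    return len(visited)

res = {M: np.mean([n_recalled(M, rng) for _ in range(200)])
       for M in (16, 64, 256)}
for M, R in res.items():
    print(M, round(R, 1), round(R / np.sqrt(M), 2))  # ratio roughly constant
```

Quadrupling the list length roughly doubles the mean number of recalled words, the sublinear signature the abstract refers to.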
April 23, 2024: (4PM) – Danny Abrams, Northwestern University
Careful Or Colorful? The Evolution Of Animal Ornaments.
Host: Bertrand Ottino-Loffler
Extravagant and costly ornaments (e.g., deer antlers or peacock feathers) are found throughout the animal kingdom. Charles Darwin was the first to suggest that female courtship preferences drive ornament development through sexual selection. In this talk I will describe a minimal mathematical model for the evolution of animal ornaments and will show that even a greatly simplified model makes nontrivial predictions for the types of ornaments we expect to find in nature.
April 30, 2024: (4PM) – Vikram Gadagkar, Columbia University
Neural Mechanisms Of Performance Evaluation In Singing Birds.
Host: Philip Kidd
Many behaviors are learned through trial and error by matching performance to internal goals, yet neural mechanisms of performance evaluation remain poorly understood. We recorded basal ganglia–projecting dopamine neurons in singing zebra finches as we controlled perceived song quality with distorted auditory feedback. Dopamine activity was suppressed after distorted syllables, consistent with worse-than-predicted performance, and activated when a predicted distortion did not occur, consistent with better-than-predicted performance. Thus, dopaminergic error signals can evaluate behaviors that are learned, not for reward, but by matching performance to internal goals. We then developed new computational methods to show that spontaneous dopamine activity correlated with natural song variations, demonstrating that dopamine can evaluate natural behavior unperturbed by experimental events such as cues, distortions, or rewards. Attending to mistakes during practicing alone provides opportunities for learning, but self-evaluation during audience-directed performance could distract from ongoing execution. It remains unknown how animals switch between, and process errors during, practice and performance modes. When male zebra finches transitioned from singing alone to singing female-directed courtship song, singing-related error signals were reduced or gated off and dopamine neurons were instead activated by female calls. Dopamine neurons can thus dynamically re-tune from self-evaluation to social feedback during courtship.
May 9, 2024: (4PM) – Arjun Karuvally, University of Massachusetts-Amherst
Hidden Traveling Waves in Artificial Recurrent Neural Networks Encode Working Memory.
Host: Marcelo Magnasco
Traveling waves are integral to brain function and are hypothesized to be crucial for short-term information storage. This study introduces a theoretical model based on traveling wave dynamics within a lattice structure to simulate neural working memory. We theoretically analyze the model’s capacity to represent state and temporal information, which is vital for encoding the recent history in history-dependent dynamical systems. In addition to enabling robust short-term memory storage, our analysis reveals that these dynamics can alleviate the vanishing gradient problem, which poses a significant challenge in the practical training of recurrent neural architectures. We explore the model’s application under two boundary conditions: linear and non-linear, the latter driven by self-attention mechanisms. Experimental findings show that randomly initialized and backpropagation-trained Recurrent Neural Networks (RNNs) naturally exhibit linear traveling wave dynamics, suggesting a potential working memory mechanism within these networks. This mechanism remains concealed within the high-dimensional state space of the RNN and becomes apparent through a specific basis transformation proposed by our model. In contrast, the non-linear scenario aligns with autoregressive loops in attention-based transformers, which drive the AI revolution. The results highlight the profound impact of traveling waves on artificial intelligence, improving our understanding of existing black-box neural computation and offering a foundational theory for future enhancements in neural network design.
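The lattice traveling-wave picture can be made concrete with a minimal linear caricature (an illustrative assumption, not the paper's full model): a recurrent weight matrix that shifts activity one site per step acts as a delay line, so the hidden state encodes the recent input history.

```python
import numpy as np

# Minimal sketch: a linear RNN whose recurrent weights form a shift
# (traveling wave) on a lattice of k sites, acting as a delay line.
k = 5                    # lattice length = memory horizon (assumed)
W = np.eye(k, k=-1)      # W[i, i-1] = 1: activity travels one site per step

def step(h, x):
    h = W @ h            # the wave advances along the lattice
    h[0] = x             # new input enters at the source site
    return h

h = np.zeros(k)
for x in [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0]:
    h = step(h, x)

print(h)  # holds the last k inputs, newest first: [2. 9. 5. 1. 4.]
```

In a trained RNN the same mechanism would appear only after an appropriate basis transformation mixing the lattice sites, which is the sense in which the wave is "hidden" in a generic-looking weight matrix.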
May 14, 2024: (4PM) – Jorn Dunkel, Massachusetts Institute of Technology
Quantitative Model Inference For Living Matter.
Host: Bertrand Ottino-Loffler
Recent advances in live-imaging techniques provide dynamical data ranging from the cellular to the organism scale. Notwithstanding such experimental progress, quantitative theoretical models often remain lacking, even for moderately complex classes of biological systems. Here, I will summarize our ongoing efforts to implement computational frameworks for inferring predictive dynamical equations from multi-scale imaging data. As specific examples, we will consider models for cell locomotion, neural dynamics, mosquito flight behavior, and collective animal swarming.
May 21, 2024: (4PM) – Alfredo Fontanini, Stony Brook University
To Be Announced.
Host: Merav Stern
To Come.
Past Seminars

Click here for past seminars from the Center for Studies in Physics and Biology.



Contact Us

Center for Studies in Physics and Biology
The Rockefeller University
1230 York Avenue
New York, NY 10065