Papers by Leenoy Meshulam
Nature Methods
Nature
While sleeping, many vertebrate groups alternate between at least two sleep stages: rapid eye movement and slow wave sleep [1–4], in part characterized by wake-like and synchronous brain activity, respectively. Here we delineate neural and behavioural correlates of two stages of sleep in octopuses, marine invertebrates that evolutionarily diverged from vertebrates roughly 550 million years ago (ref. 5) and have independently evolved large brains and behavioural sophistication. ‘Quiet’ sleep in octopuses is rhythmically interrupted by approximately 60-s bouts of pronounced body movements and rapid changes in skin patterning and texture [6]. We show that these bouts are homeostatically regulated, rapidly reversible and come with increased arousal threshold, representing a distinct ‘active’ sleep stage. Computational analysis of active sleep skin patterning reveals diverse dynamics through a set of patterns conserved across octopuses and strongly resembling those seen while awake. High-densi...
Acknowledgments: I would like to thank Prof. Ari Barzilai and Prof. Amir Ayali for their time, assistance, knowledge and constructive comments, which were instrumental in directing this research towards its ultimate goal. To all my co-researchers, and mainly friends, from Prof. Eshel Ben Jacob's lab: thank you for the conversations, brainstorming, advice and good company, which have kept me happily going all these years. Notably I would like to thank Maurizio

Abstract: Network oscillation and reverberation are phenomena which play a critical role in learning and memory. These have been explored widely; however, they have not been linked together. In tandem, research on the importance of electrical synapses (gap junctions) in the brain is gaining momentum, in both the experimental and theoretical domains. This research aims to explore these two research vectors in unison. By developing novel computational models, it becomes possible to study the contribution of electrical synap...
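The electrical synapses (gap junctions) named in the abstract contribute a current proportional to the voltage difference between coupled cells. The following is a minimal sketch of how that coupling term might enter a pair of leaky integrate-and-fire neurons; all parameter values and the model itself are illustrative assumptions, not details taken from the thesis.

```python
import numpy as np

def simulate_gap_junction_pair(g_gap=0.1, t_max=200.0, dt=0.1):
    """Two leaky integrate-and-fire neurons coupled by a gap junction.

    The electrical synapse contributes a current proportional to the
    voltage difference between the coupled cells:
        I_gap,i = g_gap * (V_j - V_i)
    All parameter values here are illustrative, not from the thesis.
    """
    tau, v_thresh, v_reset = 10.0, 1.0, 0.0
    v = np.array([0.0, 0.5])              # initial membrane potentials
    i_ext = np.array([1.2, 0.0])          # only neuron 0 is driven
    spike_times = [[], []]
    for step in range(int(t_max / dt)):
        i_gap = g_gap * (v[::-1] - v)     # current from the partner cell
        v = v + dt / tau * (-v + i_ext + i_gap)
        for i in range(2):
            if v[i] >= v_thresh:
                v[i] = v_reset
                spike_times[i].append(step * dt)
    return spike_times

spike_times = simulate_gap_junction_pair()
print(len(spike_times[0]), len(spike_times[1]))
```

With this weak coupling the gap junction passively depolarizes the undriven cell without making it fire; the voltage-difference term also acts to pull the subthreshold dynamics of the two cells together.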
Leenoy Meshulam, Jeffrey L. Gauthier, Carlos D. Brody, David W. Tank, and William Bialek
Center for Computational Neuroscience, and Department of Applied Mathematics, University of Washington, Seattle, Washington 98195
Princeton Neuroscience Institute, Joseph Henry Laboratories of Physics, Lewis–Sigler Institute for Integrative Genomics, Department of Molecular Biology, and Howard Hughes Medical Institute, Princeton University, Princeton, NJ 08544
Department of Biology, Swarthmore College, Swarthmore, Pennsylvania 19081
Initiative for the Theoretical Sciences, The Graduate Center, City University of New York, 365 Fifth Ave., New York, NY 10016
(Dated: December 30, 2021)
Frontiers in Pharmacology, 2012
Neuron, Jan 6, 2017
Discussions of the hippocampus often focus on place cells, but many neurons are not place cells in any given environment. Here we describe the collective activity in such mixed populations, treating place and non-place cells on the same footing. We start with optical imaging experiments on CA1 in mice as they run along a virtual linear track and use maximum entropy methods to approximate the distribution of patterns of activity in the population, matching the correlations between pairs of cells but otherwise assuming as little structure as possible. We find that these simple models accurately predict the activity of each neuron from the state of all the other neurons in the network, regardless of how well that neuron codes for position. Our results suggest that understanding the neural activity may require not only knowledge of the external variables modulating it but also of the internal network state.
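The pairwise maximum entropy model described in this abstract takes the Boltzmann form P(s) ∝ exp(Σᵢ hᵢsᵢ + Σᵢ<ⱼ Jᵢⱼsᵢsⱼ), with the fields h and couplings J chosen so the model reproduces the measured mean activities and pairwise correlations. A minimal sketch of the fit, using exact enumeration on synthetic data (feasible only for a handful of cells; real populations of this size require sampling methods, and the data below is a stand-in, not the CA1 recordings):

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary activity for a tiny population (a stand-in for the
# CA1 imaging data, which is not reproduced here).
N, T = 5, 2000
data = (rng.random((T, N)) < 0.2).astype(float)
# make cells 0 and 1 co-active more often than chance
copy_mask = rng.random(T) < 0.5
data[copy_mask, 1] = data[copy_mask, 0]

# All 2^N states, enumerable only because N is small.
states = np.array(list(itertools.product([0, 1], repeat=N)), dtype=float)

def model_stats(h, J):
    """Mean activity and pairwise co-activity under P(s) ~ exp(h.s + s.J.s),
    with J strictly upper triangular."""
    energy = states @ h + np.einsum('ti,ij,tj->t', states, J, states)
    p = np.exp(energy - energy.max())
    p /= p.sum()
    mean = p @ states
    corr = np.einsum('t,ti,tj->ij', p, states, states)
    return mean, corr

emp_mean = data.mean(axis=0)          # statistics the model must match
emp_corr = data.T @ data / T

h = np.zeros(N)
J = np.zeros((N, N))                  # strictly upper triangular couplings
for _ in range(5000):                 # gradient ascent on the log-likelihood
    m, c = model_stats(h, J)
    h += 0.2 * (emp_mean - m)
    J += 0.2 * np.triu(emp_corr - c, k=1)

m, c = model_stats(h, J)
print(np.abs(m - emp_mean).max(), np.abs(c - emp_corr).max())
```

The update rule is the exact likelihood gradient: each field moves toward matching its cell's mean rate, each coupling toward matching its pair's co-activity, and nothing else about the distribution is constrained.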
Bulletin of the American Physical Society, 2019
Bulletin of the American Physical Society, 2019
Recent technological breakthroughs in large-scale neural recordings enable us to monitor simultaneously the activity of thousands of neurons. To shed light on the collective nature of the activity in these large populations of cells, we seek theoretical approaches that will help us simplify the rich dynamics they exhibit. We focus on optical imaging experiments of dorsal hippocampus in mice as they run along a virtual linear track. First, we build minimal models to capture the activity in populations of ∼80 neurons. About half the neurons in these networks are place cells – neurons that become active only when the animal enters a particular location in its environment. However, many of the neurons are not place cells in any given environment. We use maximum entropy models which approximate the distribution of activity patterns in these mixed populations, by matching the correlations between pairs of cells but otherwise assuming as little structure as possible. Despite their simplic...
Bulletin of the American Physical Society, 2016
arXiv: Biological Physics, 2018
In many systems we can describe emergent macroscopic behaviors, quantitatively, using models that are much simpler than the underlying microscopic interactions; we understand the success of this simplification through the renormalization group. Could similar simplifications succeed in complex biological systems? We develop explicit coarse-graining procedures that we apply to experimental data on the electrical activity in large populations of neurons in the mouse hippocampus. Probability distributions of coarse-grained variables seem to approach a fixed non-Gaussian form, and we see evidence of power-law dependencies in both static and dynamic quantities as we vary the coarse-graining scale over two decades. Taken together, these results suggest that the collective behavior of the network is described by a non-trivial fixed point.
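One way to make such a coarse-graining procedure concrete is to greedily pair the most correlated variables, sum each pair into a single coarse-grained variable, and repeat across scales. The sketch below applies this pairing rule to synthetic correlated data; the data and the exact rule are assumptions for illustration, not the paper's analysis of the hippocampal recordings.

```python
import numpy as np

rng = np.random.default_rng(1)

def coarse_grain_step(x):
    """One pairing step: greedily merge the most correlated remaining
    pair of variables by summing them, until none are left."""
    c = np.corrcoef(x)
    np.fill_diagonal(c, -np.inf)          # never pair a variable with itself
    pairs = zip(*np.unravel_index(np.argsort(c, axis=None)[::-1], c.shape))
    free = set(range(x.shape[0]))
    merged = []
    for i, j in pairs:
        if i in free and j in free:
            merged.append(x[i] + x[j])
            free -= {i, j}
    return np.array(merged)

# Correlated synthetic "activity": a shared latent plus private noise
# (a stand-in for the neural data, not a model of it).
T, N = 5000, 64
latent = rng.standard_normal(T)
x = 0.5 * latent + rng.standard_normal((N, T))

levels = [x]
while levels[-1].shape[0] >= 2:
    levels.append(coarse_grain_step(levels[-1]))

# Track how the variance of a coarse-grained variable grows with cluster
# size K: independent variables give ~K scaling; shared structure, faster.
for k, level in enumerate(levels):
    print(f"K = {2**k:3d}: mean variance {level.var(axis=1).mean():.2f}")
```

Watching how moments of the coarse-grained variables scale with cluster size, and whether their distributions settle into a fixed shape, is the kind of diagnostic the abstract describes for locating a non-trivial fixed point.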
We study how recurrent neural networks (RNNs) solve a hierarchical inference task involving two latent variables and disparate timescales separated by 1–2 orders of magnitude. The task is of interest to the International Brain Laboratory, a global collaboration of experimental and theoretical neuroscientists studying how the mammalian brain generates behavior. We make four discoveries. First, RNNs learn behavior that is quantitatively similar to ideal Bayesian baselines. Second, RNNs perform inference by learning a two-dimensional subspace defining beliefs about the latent variables. Third, the geometry of RNN dynamics reflects an induced coupling between the two separate inference processes necessary to solve the task. Fourth, we perform model compression through a novel form of knowledge distillation on hidden representations – Representations and Dynamics Distillation (RADD) – to reduce the RNN dynamics to a low-dimensional, highly interpretable model. This technique promises a us...
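The ideal Bayesian baseline mentioned in the abstract can be sketched for a generic two-timescale task: a slow latent that switches rarely, and a fast per-trial latent biased by it. The task structure and every parameter below are illustrative assumptions, not the actual task studied; the baseline is standard forward (HMM) filtering.

```python
import numpy as np

rng = np.random.default_rng(2)

# A hypothetical two-timescale task: a slow "block" latent switches
# rarely and biases which side the fast per-trial latent appears on.
p_switch = 0.02            # slow latent: switch probability per trial
p_side = 0.8               # P(side matches the current block)
n_trials = 2000

block = np.zeros(n_trials, dtype=int)
for t in range(1, n_trials):
    flip = rng.random() < p_switch
    block[t] = 1 - block[t - 1] if flip else block[t - 1]
side = np.where(rng.random(n_trials) < p_side, block, 1 - block)

# Ideal Bayesian baseline: forward (HMM) filtering over the slow latent,
# observing the fast latent (the side) on every trial.
belief = np.zeros(n_trials)    # P(block = 1 | sides observed so far)
b = 0.5
for t in range(n_trials):
    b = b * (1 - p_switch) + (1 - b) * p_switch    # predict a possible switch
    lik1 = p_side if side[t] == 1 else 1 - p_side  # P(side_t | block = 1)
    lik0 = 1 - p_side if side[t] == 1 else p_side  # P(side_t | block = 0)
    b = b * lik1 / (b * lik1 + (1 - b) * lik0)     # Bayes update
    belief[t] = b

accuracy = np.mean((belief > 0.5) == block)
print(round(float(accuracy), 3))
```

A trained RNN whose behavior matches this filter is what the abstract means by "quantitatively similar to ideal Bayesian baselines": the coupling between the slow and fast inference processes appears here as the prediction step feeding the per-trial update.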
Physical Review Letters