# Point process models for sequence detection in high-dimensional neural spike trains

NeurIPS 2020

Abstract

Sparse sequences of neural spikes are posited to underlie aspects of working memory, motor production, and learning. Discovering these sequences in an unsupervised manner is a longstanding problem in statistical neuroscience. Promising recent work utilized a convolutive nonnegative matrix factorization model to tackle this challenge. Ho...

Introduction

- Identifying interpretable patterns in multi-electrode recordings is a longstanding and increasingly pressing challenge in neuroscience.
- Neural sequences are an important example of high-dimensional structure: if N neurons fire sequentially with no overlap, the resulting dynamics are N -dimensional and cannot be efficiently summarized by PCA or other linear dimensionality reduction methods [4].
- Such sequences underlie current theories of working memory [1, 25], motor production [2], and memory replay [10].
- Sequence type probabilities: the conditional distribution of the sequence-type probability vector $\pi$ is $p(\pi \mid \{(r_k, \tau_k, A_k)\}_{k=1}^{K}, \xi) \propto \mathrm{Dir}(\pi \mid \gamma \mathbf{1}_R) \prod_{k=1}^{K} \mathrm{Cat}(r_k \mid \pi)$
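The conditional above is conjugate: a Dirichlet prior combined with categorical sequence-type assignments yields a Dirichlet posterior with per-type counts added to the concentration. A minimal sketch of this Gibbs update (NumPy; the function and variable names are illustrative, not from the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)

def resample_pi(r, R, gamma):
    """Gibbs update for the sequence-type probability vector pi.

    r     : array of sequence-type assignments r_k in {0, ..., R-1}
    R     : number of sequence types
    gamma : symmetric Dirichlet concentration parameter
    """
    counts = np.bincount(r, minlength=R)   # sum_k I[r_k = each type]
    return rng.dirichlet(gamma + counts)   # Dir(gamma * 1_R + counts)

pi = resample_pi(np.array([0, 0, 1, 2, 1]), R=3, gamma=1.0)
```

Types with more assigned sequences receive proportionally more posterior mass, while `gamma` smooths toward the uniform distribution.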

Highlights

- Identifying interpretable patterns in multi-electrode recordings is a longstanding and increasingly pressing challenge in neuroscience
- We develop a Bayesian point process generalization of convolutive nonnegative matrix factorization, which was recently used by Peter et al. [8] and Mackevicius et al. [4] to model neural sequences
- Our Contributions We propose a point process model for neural sequences (PP-Seq) which extends and generalizes convolutive nonnegative matrix factorization (convNMF) to continuous time and uses a fully probabilistic Bayesian framework
- We treat held-out spikes as missing data and sample them as part of the Markov chain Monte Carlo (MCMC) algorithm. (Their conditional distribution is given by the PP-Seq generative model.) This approach, involving a speckled holdout pattern and multiple imputation of missing data, may be viewed as a continuous-time extension of the methods proposed by Mackevicius et al. [27] for convNMF
- We proposed a point process model (PP-Seq) inspired by convolutive NMF [4, 8, 9] to identify neural sequences
- PP-Seq is formulated in a probabilistic framework that better quantifies uncertainty and handles low firing-rate regimes (see fig. …)
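The speckled holdout pattern mentioned above censors randomly chosen (neuron, time-window) blocks rather than whole neurons or whole time ranges, so every neuron and every epoch contributes to both training and testing. A sketch of constructing such a mask (the block layout and holdout fraction are assumptions for illustration, not the paper's exact scheme):

```python
import numpy as np

rng = np.random.default_rng(1)

def speckled_mask(n_neurons, n_windows, holdout_frac=0.2):
    """Boolean mask over (neuron, time-window) blocks.

    True = training data; False = held out. Spikes falling in a
    held-out block are treated as missing and imputed during MCMC.
    """
    return rng.random((n_neurons, n_windows)) >= holdout_frac

mask = speckled_mask(100, 50)
```

Because held-out blocks are scattered ("speckled") across the neuron-by-time grid, the model can impute missing spikes from the surrounding observed context.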

Methods

- The authors evaluate model performance by computing the log-likelihood assigned to held-out data.
- The likelihood of the train and test sets improves over the course of MCMC sampling and can be used as a metric for model comparison. In agreement with the ground truth, test performance plateaus for models containing more than R = 2 sequence types
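For a Poisson process with intensity $\lambda(t)$ on $[0, T]$, the log-likelihood of spike times $\{t_s\}$ is $\sum_s \log \lambda(t_s) - \int_0^T \lambda(t)\,dt$; held-out spikes are scored the same way under the imputed model. A toy sketch with a piecewise-constant intensity (an illustration of the general formula, not the paper's implementation):

```python
import numpy as np

def poisson_loglik(spikes, rate, bin_edges):
    """Log-likelihood of spike times under a piecewise-constant intensity.

    spikes    : 1-D array of spike times within [bin_edges[0], bin_edges[-1]]
    rate      : intensity value in each bin (length len(bin_edges) - 1)
    bin_edges : increasing bin boundaries
    """
    # Point term: sum_s log lambda(t_s), via the bin index of each spike.
    idx = np.searchsorted(bin_edges, spikes, side="right") - 1
    point_term = np.sum(np.log(rate[idx]))
    # Integral of lambda over [0, T] is exact for a piecewise-constant rate.
    integral = np.sum(rate * np.diff(bin_edges))
    return point_term - integral

ll = poisson_loglik(np.array([0.5, 1.5]), np.array([1.0, 2.0]),
                    np.array([0.0, 1.0, 2.0]))
# Here the point term is log(1) + log(2) and the integral is 3.
```

Summing this quantity over held-out blocks gives the model-comparison metric described above.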

Results

- The authors further improve performance by interspersing “split-merge” Metropolis-Hastings updates [51, 52] between Gibbs sweeps.
- The authors can improve performance substantially by parallelizing the computation [53]

Conclusion

- The authors proposed a point process model (PP-Seq) inspired by convolutive NMF [4, 8, 9] to identify neural sequences.
- The introduction of time warping, as well as other possibilities like truncated sequences and “clusterless” observations [57], could be explored in future work
- Despite these benefits, fitting PP-Seq involves tackling a challenging trans-dimensional inference problem inherent to Neyman-Scott point processes.
- These innovations are sufficient to fit PP-Seq on datasets containing hundreds of thousands of spikes in just a few minutes on a modern laptop

Funding

- A.H.W. received funding support from the National Institutes of Health BRAIN initiative (1F32MH122998-01), and the Wu Tsai Stanford Neurosciences Institute Interdisciplinary Scholar Program
- S.W.L. was supported by grants from the Simons Collaboration on the Global Brain (SCGB 697092) and the NIH BRAIN Initiative (U19NS113201 and R01NS113119)

References

- Mark S Goldman. “Memory without feedback in a neural network”. Neuron 61.4 (2009), pp. 621–634.
- Richard H R Hahnloser, Alexay A Kozhevnikov, and Michale S Fee. “An ultra-sparse code underlies the generation of neural sequences in a songbird”. Nature 419.6902 (2002), pp. 65– 70.
- Howard Eichenbaum. “Time cells in the hippocampus: a new dimension for mapping memories”. Nat. Rev. Neurosci. 15.11 (2014), pp. 732–744.
- Emily L Mackevicius, Andrew H Bahle, Alex H Williams, Shijie Gu, Natalia I Denisenko, Mark S Goldman, and Michale S Fee. “Unsupervised discovery of temporal sequences in high-dimensional datasets, with applications to neuroscience”. Elife 8 (2019).
- Moshe Abeles and Itay Gat. “Detecting precise firing sequences in experimental data”. J. Neurosci. Methods 107.1-2 (2001), pp. 141–154.
- Eleonora Russo and Daniel Durstewitz. “Cell assemblies at multiple time scales with arbitrary lag constellations”. Elife 6 (2017).
- Pietro Quaglio, Vahid Rostami, Emiliano Torre, and Sonja Grün. “Methods for identification of spike patterns in massively parallel spike trains”. Biol. Cybern. 112.1-2 (2018), pp. 57–80.
- Sven Peter, Elke Kirschbaum, Martin Both, Lee Campbell, Brandon Harvey, Conor Heins, Daniel Durstewitz, Ferran Diego, and Fred A Hamprecht. “Sparse convolutional coding for neuronal assembly detection”. Advances in Neural Information Processing Systems 30. Ed. by I Guyon, U V Luxburg, S Bengio, H Wallach, R Fergus, S Vishwanathan, and R Garnett. Curran Associates, Inc., 2017, pp. 3675–3685.
- Paris Smaragdis. “Convolutive speech bases and their application to supervised speech separation”. IEEE Trans. Audio Speech Lang. Processing (2006).
- Thomas J Davidson, Fabian Kloosterman, and Matthew A Wilson. “Hippocampal replay of extended experience”. Neuron 63.4 (2009), pp. 497–507.
- Anne C Smith and Emery N Brown. “Estimating a state-space model from point process observations”. Neural Computation 15.5 (2003), pp. 965–991.
- K L Briggman, H D I Abarbanel, and W B Kristan Jr. “Optical imaging of neuronal populations during decision-making”. Science 307.5711 (2005), pp. 896–901.
- Byron M Yu, John P Cunningham, Gopal Santhanam, Stephen I Ryu, Krishna V Shenoy, and Maneesh Sahani. “Gaussian-process factor analysis for low-dimensional single-trial analysis of neural population activity”. J. Neurophysiol. 102.1 (2009), pp. 614–635.
- Liam Paninski, Yashar Ahmadian, Daniel Gil Ferreira, Shinsuke Koyama, Kamiar Rahnama Rad, Michael Vidne, Joshua Vogelstein, and Wei Wu. “A new look at state-space models for neural data”. Journal of Computational Neuroscience 29.1-2 (2010), pp. 107–126.
- Jakob H Macke, Lars Buesing, John P Cunningham, M Yu Byron, Krishna V Shenoy, and Maneesh Sahani. “Empirical models of spiking in neural populations”. Advances in Neural Information Processing Systems. 2011, pp. 1350–1358.
- David Pfau, Eftychios A Pnevmatikakis, and Liam Paninski. “Robust learning of low-dimensional dynamics from large neural ensembles”. Advances in Neural Information Processing Systems. 2013, pp. 2391–2399.
- Peiran Gao and Surya Ganguli. “On simplicity and complexity in the brave new world of large-scale neuroscience”. Curr. Opin. Neurobiol. 32 (2015), pp. 148–155.
- Yuanjun Gao, Evan Archer, Liam Paninski, and John P Cunningham. “Linear dynamical neural population models through nonlinear embeddings” (2016). arXiv: 1605.08454 [q-bio.NC].
- Yuan Zhao and Il Memming Park. “Variational Latent Gaussian Process for Recovering Single-Trial Dynamics from Population Spike Trains”. Neural Comput. 29.5 (2017), pp. 1293–1316.
- Anqi Wu, Stan Pashkovski, Sandeep R Datta, and Jonathan W Pillow. “Learning a latent manifold of odor representations from neural responses in piriform cortex”. Advances in Neural Information Processing Systems 31. Ed. by S Bengio, H Wallach, H Larochelle, K Grauman, N Cesa-Bianchi, and R Garnett. Curran Associates, Inc., 2018, pp. 5378–5388.
- Alex H Williams, Tony Hyun Kim, Forea Wang, Saurabh Vyas, Stephen I Ryu, Krishna V Shenoy, Mark Schnitzer, Tamara G Kolda, and Surya Ganguli. “Unsupervised discovery of demixed, low-dimensional neural dynamics across multiple timescales through tensor component analysis”. Neuron 98.6 (2018), 1099–1115.e8.
- Scott Linderman, Annika Nichols, David Blei, Manuel Zimmer, and Liam Paninski. “Hierarchical recurrent state space models reveal discrete and continuous dynamics of neural activity in C. elegans”. 2019.
- Lea Duncker, Gergo Bohner, Julien Boussard, and Maneesh Sahani. “Learning interpretable continuous-time models of latent stochastic dynamical systems”. Proceedings of the 36th International Conference on Machine Learning. Ed. by Kamalika Chaudhuri and Ruslan Salakhutdinov. Vol. 97. Proceedings of Machine Learning Research. Long Beach, California, USA: PMLR, 2019, pp. 1726–1734.
- Carsen Stringer, Marius Pachitariu, Nicholas Steinmetz, Matteo Carandini, and Kenneth D Harris. “High-dimensional geometry of population responses in visual cortex”. Nature 571.7765 (2019), pp. 361–365.
- Christopher D Harvey, Philip Coen, and David W Tank. “Choice-specific sequences in parietal cortex during a virtual-navigation decision task”. Nature 484.7392 (2012), pp. 62–68.
- Mikhail I Rabinovich, Ramón Huerta, Pablo Varona, and Valentin S Afraimovich. “Generation and reshaping of sequences in neural systems”. Biol. Cybern. 95.6 (2006), pp. 519–536.
- Emily Lambert Mackevicius and Michale Sean Fee. “Building a state space for song learning”. Curr. Opin. Neurobiol. 49 (2018), pp. 59–68.
- György Buzsáki and David Tingley. “Space and time: the hippocampus as a sequence generator”. Trends Cogn. Sci. 22.10 (2018), pp. 853–869.
- Eva Pastalkova, Vladimir Itskov, Asohan Amarasingham, and György Buzsáki. “Internally generated cell assembly sequences in the rat hippocampus”. Science 321.5894 (2008), pp. 1322– 1327.
- Daoyun Ji and Matthew A Wilson. “Coordinated memory replay in the visual cortex and hippocampus during sleep”. Nat. Neurosci. 10.1 (2007), pp. 100–107.
- Hannah R Joo and Loren M Frank. “The hippocampal sharp wave-ripple in memory retrieval for immediate use and consolidation”. Nat. Rev. Neurosci. 19.12 (2018), pp. 744–757.
- Wei Xu, Felipe de Carvalho, and Andrew Jackson. “Sequential neural activity in primary motor cortex during sleep”. J. Neurosci. 39.19 (2019), pp. 3698–3712.
- David Tingley and Adrien Peyrache. “On the methods for reactivation and replay analysis”. Philos. Trans. R. Soc. Lond. B Biol. Sci. 375.1799 (2020), p. 20190231.
- Kourosh Maboudi, Etienne Ackermann, Laurel Watkins de Jong, Brad E Pfeiffer, David Foster, Kamran Diba, and Caleb Kemere. “Uncovering temporal structure in hippocampal output patterns”. Elife 7 (2018).
- Lukas Grossberger, Francesco P Battaglia, and Martin Vinck. “Unsupervised clustering of temporal patterns in high-dimensional neuronal ensembles using a novel dissimilarity measure”. PLoS Comput. Biol. 14.7 (2018), e1006283.
- Roemer van der Meij and Bradley Voytek. “Uncovering neuronal networks defined by consistent between-neuron spike timing from neuronal spike recordings”. eNeuro 5.3 (2018).
- Daniel D Lee and H Sebastian Seung. “Learning the parts of objects by non-negative matrix factorization”. Nature 401.6755 (1999), pp. 788–791.
- Jesper Møller and Rasmus Plenge Waagepetersen. Statistical Inference and Simulation for Spatial Point Processes. Taylor & Francis, 2003.
- Manuel Gomez Rodriguez and Isabel Valera. “Learning with Temporal Point Processes”. Tutorial at ICML 2018.
- Jerzy Neyman and Elizabeth L Scott. “Statistical approach to problems of cosmology”. J. R. Stat. Soc. Series B Stat. Methodol. 20.1 (1958), pp. 1–29.
- Jeffrey W Miller and Matthew T Harrison. “Mixture models with a prior on the number of components”. J. Am. Stat. Assoc. 113.521 (2018), pp. 340–356.
- John Frank Charles Kingman. Poisson processes. Clarendon Press, 2002.
- Lea Duncker and Maneesh Sahani. “Temporal alignment and latent Gaussian process factor inference in population spike trains”. Advances in Neural Information Processing Systems 31. Ed. by S Bengio, H Wallach, H Larochelle, K Grauman, N Cesa-Bianchi, and R Garnett. Curran Associates, Inc., 2018, pp. 10445–10455.
- Alex H Williams, Ben Poole, Niru Maheswaranathan, Ashesh K Dhawale, Tucker Fisher, Christopher D Wilson, David H Brann, Eric M Trautmann, Stephen Ryu, Roman Shusterman, Dmitry Rinberg, Bence P Ölveczky, Krishna V Shenoy, and Surya Ganguli. “Discovering precise temporal patterns in large-scale neural recordings through robust and interpretable time warping”. Neuron 105.2 (2020), 246–259.e8.
- Ushio Tanaka, Yosihiko Ogata, and Dietrich Stoyan. “Parameter estimation and model selection for Neyman-Scott point processes”. Biometrical Journal: Journal of Mathematical Methods in Biosciences 50.1 (2008), pp. 43–57.
- Ushio Tanaka and Yosihiko Ogata. “Identification and estimation of superposed Neyman–Scott spatial cluster processes”. Ann. Inst. Stat. Math. 66.4 (2014), pp. 687–702.
- Jirí Kopecký and Tomáš Mrkvicka. “On the Bayesian estimation for the stationary Neyman-Scott point processes”. Appl. Math. 61.4 (2016), pp. 503–514.
- Yosihiko Ogata. “Cluster analysis of spatial point patterns: posterior distribution of parents inferred from offspring”. Japanese Journal of Statistics and Data Science (2019).
- Radford M Neal. “Markov chain sampling methods for Dirichlet process mixture models”. J. Comput. Graph. Stat. 9.2 (2000), pp. 249–265.
- Jun S Liu, Wing Hung Wong, and Augustine Kong. “Covariance structure of the Gibbs sampler with applications to the comparisons of estimators and augmentation schemes”. Biometrika 81.1 (1994), pp. 27–40.
- Sonia Jain and Radford M Neal. “A split-merge Markov chain Monte Carlo procedure for the Dirichlet process mixture model”. J. Comput. Graph. Stat. 13.1 (2004), pp. 158–182.
- Sonia Jain and Radford M Neal. “Splitting and merging components of a nonconjugate Dirichlet process mixture model”. Bayesian Anal. 2.3 (2007), pp. 445–472.
- Elaine Angelino, Matthew James Johnson, and Ryan P Adams. “Patterns of scalable Bayesian inference”. Foundations and Trends® in Machine Learning 9.2-3 (2016), pp. 119–247.
- Svante Wold. “Cross-validatory estimation of the number of components in factor and principal components models”. Technometrics 20.4 (1978), pp. 397–405.
- Jeff Bezanson, Alan Edelman, Stefan Karpinski, and Viral B Shah. “Julia: A fresh approach to numerical computing”. SIAM review 59.1 (2017), pp. 65–98.
- Andres D Grosmark and György Buzsáki. “Diversity in neural firing dynamics supports both rigid and learned hippocampal sequences”. Science 351.6280 (2016), pp. 1440–1443.
- Xinyi Deng, Daniel F Liu, Kenneth Kay, Loren M Frank, and Uri T Eden. “Clusterless decoding of position from multiunit activity using a marked point process filter”. Neural computation 27.7 (2015), pp. 1438–1460.
- Resample global parameters: (a) sample $\lambda^{\varnothing}_n \sim \mathrm{Ga}\big(\alpha^{\varnothing} + \sum_{x_s \in X_0} \mathbb{I}[n_s = n],\ \beta^{\varnothing} + T\big)$ for $n = 1, \dots, N$; (b) sample $\pi \sim \mathrm{Dir}\big(\big[\gamma + \sum_k \mathbb{I}[r_k = 1], \dots, \gamma + \sum_k \mathbb{I}[r_k = R]\big]\big)$; (c) sample $a_r \sim$ …
- In this paper, we set $w_f = w_F^{-1 + 2(f-1)/(F-1)}$ for $f = 1, \dots, F$, so the warp factors are log-spaced between $1/w_F$ and $w_F$.
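Step (a) above is a conjugate Gamma-Poisson update: each neuron's background rate has posterior $\mathrm{Ga}(\alpha^{\varnothing} + \text{count}_n, \beta^{\varnothing} + T)$, where $\text{count}_n$ is the number of that neuron's spikes currently attributed to the background process. A minimal sketch (NumPy; names are illustrative, not from the paper's code):

```python
import numpy as np

rng = np.random.default_rng(2)

def resample_background_rates(bg_counts, alpha0, beta0, T):
    """Gibbs update for per-neuron background rates lambda^0_n.

    bg_counts     : spikes per neuron currently assigned to the background
    alpha0, beta0 : Gamma prior shape and rate parameters
    T             : duration of the recording

    Posterior is Ga(alpha0 + count_n, beta0 + T), the conjugate
    update for a homogeneous Poisson background on [0, T].
    """
    shape = alpha0 + bg_counts
    rate = beta0 + T
    return rng.gamma(shape, 1.0 / rate)  # NumPy parameterizes by scale = 1/rate

lam = resample_background_rates(np.array([3, 0, 12]), alpha0=1.0,
                                beta0=1.0, T=100.0)
```

Neurons with many background-assigned spikes get higher sampled rates, while the prior keeps rates positive even for neurons with zero background spikes.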
- D J Daley and D Vere-Jones. An Introduction to the Theory of Point Processes: Volume I: Elementary Theory and Methods. Springer, New York, NY, 2003.
- Andrew Gelman, John B Carlin, Hal S Stern, David B Dunson, Aki Vehtari, and Donald B Rubin. Bayesian data analysis. 3rd ed. CRC press, 2013.
- Kevin P Murphy. Machine learning: a probabilistic perspective. MIT press, 2012.
