Development of large-scale spiking and non-spiking neural networks for brain simulations and data analytics applications
The project, which involves the development and application of large-scale neural network models, has two major methodological aspects. First, it deals with simulations of biological spiking neural network models to study perceptual and cognitive memory-related phenomena in silico. Second, more abstract non-spiking network implementations of our brain models will be developed, and the resulting brain-like computing architectures will be tested on benchmark as well as real-world data analytics applications. In the following, the two aspects of the proposed project are described in detail.

In the realm of biologically plausible simulations of the brain's systems, we have over the last years developed a biophysically realistic model of cortical working memory in the parallel pNeuron simulator and validated several alternative implementations. Currently, we increasingly rely on, and continuously develop, network models using a simplified spiking unit description in the NEST simulator. One of the major recent milestones in this work has been the incorporation of a Hebbian plasticity rule. In this project we intend to perform a systematic evaluation of the plasticity rule in our working memory model and to test a larger cognitive architecture that includes long-term memory storage. This will pave the way for future investigations into functional and dynamical aspects of the brain's cortical memory system.

Our brain-like computing algorithms have been shown to be applicable to pattern recognition problems. In the past we examined static neuroimaging data such as fMRI and PET. In this project we intend to study the proposed methodology in the context of temporal (dynamic) pattern processing. To this end, we have started a systematic evaluation of the capacity of our memory model to encode and store not only static patterns but also sequences of patterns.
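To illustrate the kind of memory function under study, the following is a minimal NumPy sketch of a Hopfield-style associative memory with Hebbian outer-product learning. It is not the project's actual NEST-based spiking model; the network size, pattern count, and sequence-replay rule are illustrative assumptions chosen to show how Hebbian weights can store both static patterns and pattern sequences.

```python
import numpy as np

rng = np.random.default_rng(0)
n_units, n_patterns = 100, 5  # illustrative sizes, not the project's model

# Random bipolar (+1/-1) patterns to be memorised.
patterns = rng.choice([-1, 1], size=(n_patterns, n_units))

# Symmetric Hebbian outer-product learning: units active together
# are wired together, turning each pattern into an attractor.
W = patterns.T @ patterns / n_units
np.fill_diagonal(W, 0.0)

def recall(cue, steps=10):
    """Iterate the network state until it settles into an attractor."""
    s = cue.copy()
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1, -1)
    return s

# Corrupt 10% of a stored pattern; the attractor dynamics clean it up.
cue = patterns[0].copy()
flip = rng.choice(n_units, size=10, replace=False)
cue[flip] *= -1
restored = recall(cue)
overlap = restored @ patterns[0] / n_units  # close to 1.0 on success

# Asymmetric Hebbian weights link each pattern to its successor, so a
# single update step replays the next element of a stored sequence.
W_seq = patterns[1:].T @ patterns[:-1] / n_units
next_state = np.where(W_seq @ patterns[0] >= 0, 1, -1)
seq_overlap = next_state @ patterns[1] / n_units
```

The symmetric weights give pattern-completion of static memories, while the asymmetric variant sketches the sequence storage and replay that the project evaluates at much larger scale in spiking networks.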
In addition, we continue the development of brain-like feature extraction algorithms that, in an unsupervised mode, identify latent correlation structure in high-dimensional data. In relation to this area of our research, we also work towards a proof-of-concept connectionist approach to the problem of predicting wind power production, within a project funded by Energimyndigheten in collaboration with two Swedish energy companies, Expektra and Scandem. The focus is on brain-inspired network architectures that enable forecasting at multiple time scales, which is of particular importance for renewable sources like wind power, where weather-related factors come into play. A large-scale parallel computing infrastructure is needed to identify the most salient predictive features and to validate the proposed neural network-based methods on large data sets. Due to the multi-scale nature of the predictions to be made, the original input has to be provided at numerous temporal lags, which can result in high-dimensional data representations. Most simulation cases rely on a large number of cores with a relatively low memory load per core. In selected instances, simulations may take up to 6-7 hours. Finally, for the applications we will rely on data storage to train the models.
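The growth of input dimensionality with temporal lags can be made concrete with a small sketch. The helper below is hypothetical (not part of the project's code base) and simply stacks a univariate series at several lags into a design matrix; with hourly data, lags at 1, 24, and 168 hours would capture hour-, day-, and week-scale dependencies.

```python
import numpy as np

def make_lagged(series, lags):
    """Build a design matrix whose columns are the series at the given lags.

    Each added lag appends one column, so multi-scale inputs quickly
    produce high-dimensional representations.
    """
    lags = sorted(lags)
    max_lag = max(lags)
    n = len(series) - max_lag  # rows for which all lags are available
    X = np.column_stack(
        [series[max_lag - lag : max_lag - lag + n] for lag in lags]
    )
    y = series[max_lag:]  # targets aligned with the lagged inputs
    return X, y

# Toy hourly series; lags at 1 h, 24 h (daily) and 168 h (weekly).
series = np.arange(200.0)
X, y = make_lagged(series, lags=[1, 24, 168])
```

Here each target `y[i]` is paired with the input values 1, 24, and 168 steps before it; real forecasting inputs would additionally include weather-related variables at similar lags.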