Pycabnn: Efficient and Extensible Software to Construct an Anatomical Basis for a Physiologically Realistic Neural Network Model
Published in: | Frontiers in Neuroinformatics, 2020-07, Vol. 14, p. 31 |
Main authors: | , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Full text |
Abstract: | Physiologically detailed models of neural networks are an important tool for studying how biophysical mechanisms impact neural information processing. An important, fundamental step in constructing such a model is determining where neurons are placed and how they connect to each other, based on known anatomical properties and constraints given by experimental data. Here we present an open-source software tool, pycabnn, that is dedicated to generating an anatomical model, which serves as the basis of a full network model. In pycabnn, we implemented efficient algorithms for generating physiologically realistic cell positions and for determining connectivity based on extended geometrical structures such as axonal and dendritic morphology. We demonstrate the capabilities and performance of pycabnn using an example: a network model of the cerebellar granular layer, which requires generating more than half a million cells and computing their mutual connectivity. We show that pycabnn is efficient enough to carry out all the required tasks on a laptop computer within a reasonable runtime, although it can also run in a parallel computing environment. Written purely in Python with limited external dependencies, pycabnn is easy to use and extend, and it can be a useful tool for computational neural network studies in the future. |
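The abstract describes two computational steps: placing cells under anatomical constraints and deriving their connectivity from geometry. Below is a minimal, illustrative Python sketch of that general workflow (distance-constrained soma placement followed by radius-based connectivity). It is not pycabnn's actual API; every function name, parameter, and unit here is an assumption for illustration, and the naive rejection sampling shown is far less efficient than the algorithms the paper describes, which also work with extended axonal and dendritic morphologies rather than point somata.

```python
# Illustrative sketch only -- NOT pycabnn's API. Names, parameters, and units
# are assumptions; pycabnn itself implements more efficient algorithms and uses
# extended axonal/dendritic geometry, not just soma-to-soma distances.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)

def place_cells(n_cells, box_size, min_dist, max_trials=200_000):
    """Rejection-sample 3D positions so no two somata lie closer than min_dist."""
    positions = []
    for _ in range(max_trials):
        if len(positions) >= n_cells:
            break
        candidate = rng.uniform(0.0, box_size, size=3)
        # Accept the candidate only if its nearest accepted neighbor is far enough away.
        if not positions or cKDTree(positions).query(candidate)[0] >= min_dist:
            positions.append(candidate)
    return np.asarray(positions)

def connect_by_distance(pre, post, radius):
    """Return (pre_index, post_index) pairs whose somata lie within `radius`."""
    tree = cKDTree(post)
    return [(i, j) for i, p in enumerate(pre) for j in tree.query_ball_point(p, radius)]

# Toy example: two small populations in a 100 x 100 x 100 um box (units assumed).
granule = place_cells(n_cells=500, box_size=100.0, min_dist=6.0)
golgi = place_cells(n_cells=20, box_size=100.0, min_dist=25.0)
edges = connect_by_distance(golgi, granule, radius=20.0)
print(f"{len(granule)} granule cells, {len(golgi)} Golgi cells, {len(edges)} connections")
```

In a real anatomical model, placement would additionally respect layer boundaries and cell-type-specific densities, and connectivity would be evaluated against axon and dendrite geometry rather than soma-to-soma distance, which is the scale of problem (over half a million cells) the paper reports pycabnn handles on a laptop.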
ISSN: | 1662-5196 |
DOI: | 10.3389/fninf.2020.00031 |