A simple phenomenological neuronal model with inhibitory and excitatory synapses

  • Kerstin Lenk*
  • *Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

5 Citations (Scopus)

Abstract

We develop a simple model that simulates neuronal activity as observed in a neuronal network cultured on a multielectrode array neurochip. The model is based on an inhomogeneous Poisson process to simulate neurons that are active without external input or stimulus, as observed in neurochip experiments. Spike train statistics are applied to validate the resulting spike data. Features calculated from spikes and bursts, together with the spike train statistics, show that the presented model has the potential to simulate neuronal activity.
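The abstract's core mechanism, generating spontaneous spike times from an inhomogeneous Poisson process, can be sketched generically. The following is a minimal illustration using the standard Lewis–Shedler thinning algorithm, not the authors' implementation; the rate function, parameter values, and function names are assumptions for the sake of example.

```python
import math
import random

def inhomogeneous_poisson_spikes(rate_fn, t_end, rate_max, seed=0):
    """Draw spike times on [0, t_end) from a time-varying rate rate_fn
    (in Hz) via thinning: generate candidate events from a homogeneous
    Poisson process at rate_max, and keep each candidate at time t with
    probability rate_fn(t) / rate_max. Requires rate_fn(t) <= rate_max."""
    rng = random.Random(seed)
    spikes, t = [], 0.0
    while True:
        t += rng.expovariate(rate_max)  # waiting time to next candidate
        if t >= t_end:
            break
        if rng.random() < rate_fn(t) / rate_max:
            spikes.append(t)
    return spikes

# Hypothetical example: a sinusoidally modulated firing rate between 1 and 9 Hz
rate = lambda t: 5.0 + 4.0 * math.sin(2.0 * math.pi * t)
spike_times = inhomogeneous_poisson_spikes(rate, t_end=10.0, rate_max=9.0)
```

Thinning is a convenient choice here because it needs only an upper bound on the rate, not a closed-form integral of it, so arbitrary rate modulations (e.g. excitatory/inhibitory synaptic influences on a neuron's base rate) can be plugged in as `rate_fn`.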

Original language: English
Title of host publication: Advances in Nonlinear Speech Processing - 5th International Conference on Nonlinear Speech Processing, NOLISP 2011, Proceedings
Pages: 232-238
Number of pages: 7
DOIs
Publication status: Published - 2011
Externally published: Yes
Publication type: A4 Article in conference proceedings
Event: 5th International Conference on Nonlinear Speech Processing, NOLISP 2011 - Las Palmas de Gran Canaria, Spain
Duration: 7 Nov 2011 – 9 Nov 2011

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 7015 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 5th International Conference on Nonlinear Speech Processing, NOLISP 2011
Country/Territory: Spain
City: Las Palmas de Gran Canaria
Period: 7/11/11 – 9/11/11

Keywords

  • multielectrode array neurochips
  • neuronal network model
  • spike train statistics

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science
