Abstract
Finding interdependency relations between time series provides valuable knowledge about the processes that generated the signals. Information theory sets a natural framework for important classes of statistical dependencies. However, a reliable estimation of information-theoretic functionals is hampered when the dependency to be assessed is brief or evolves in time. Here, we show that these limitations can be partly alleviated when we have access to an ensemble of independent repetitions of the time series. In particular, we gear a data-efficient estimator of probability densities to make use of the full structure of trial-based measures. By doing so, we can obtain time-resolved estimates for a family of entropy combinations (including mutual information, transfer entropy and their conditional counterparts), which are more accurate than the simple average of individual estimates over trials. We show with simulated and real data generated by coupled electronic circuits that the proposed approach allows one to recover the time-resolved dynamics of the coupling between different subsystems.
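The entry reproduces only the abstract, not the estimator itself. As an illustration of the ensemble idea, the sketch below assumes a Kraskov-Stögbauer-Grassberger-style nearest-neighbour estimator of mutual information in which, at each time point, densities are estimated from samples pooled across the independent trials rather than across time; this is one plausible reading of the time-resolved, trial-based estimates the abstract describes, not the authors' exact method. The function names and parameters (`ksg_mutual_information`, `time_resolved_mi`, `k`) are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma


def ksg_mutual_information(x, y, k=4):
    """KSG (algorithm 1) mutual-information estimate for paired scalar samples."""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    y = np.asarray(y, dtype=float).reshape(-1, 1)
    n = len(x)
    joint = np.hstack([x, y])
    # Distance to the k-th nearest neighbour in the joint space (max-norm);
    # k + 1 because the query point itself is returned at distance 0.
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]
    tree_x, tree_y = cKDTree(x), cKDTree(y)
    # Count marginal neighbours strictly closer than eps, excluding the point itself.
    nx = np.array([len(tree_x.query_ball_point(x[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])
    ny = np.array([len(tree_y.query_ball_point(y[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))


def time_resolved_mi(x_trials, y_trials, k=4):
    """MI(t): at each time point, pool samples across trials (the ensemble)
    instead of across time. x_trials, y_trials have shape (n_trials, n_times)."""
    n_times = x_trials.shape[1]
    return np.array([ksg_mutual_information(x_trials[:, t], y_trials[:, t], k=k)
                     for t in range(n_times)])


# Toy usage: a coupling from x to y that is switched on only in a brief window.
rng = np.random.default_rng(0)
n_trials, n_times = 200, 100
x = rng.standard_normal((n_trials, n_times))
y = rng.standard_normal((n_trials, n_times))
y[:, 40:60] += 1.5 * x[:, 40:60]   # brief dependency
mi_t = time_resolved_mi(x, y)      # estimate peaks inside samples 40-59
```

Pooling over trials keeps the estimator data-efficient at each time point, which is what makes a time-resolved estimate feasible when the dependency is brief; the same pooling idea carries over to transfer entropy and the other entropy combinations mentioned in the abstract.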
| Original language | English |
| --- | --- |
| Pages (from-to) | 1958-1970 |
| Number of pages | 13 |
| Journal | Entropy |
| Volume | 17 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - 2015 |
| Publication type | A1 Journal article-refereed |
Keywords
- Ensemble
- Entropy
- Estimator
- Time series
- Transfer entropy
- Trial
Publication forum classification
- Publication forum level 0
ASJC Scopus subject areas
- General Physics and Astronomy