Information Processing in Neural Networks: Learning of Structural Connectivity and Dynamics of Functional Activation

Please use this identifier to cite or link to this item:
Title: Information Processing in Neural Networks: Learning of Structural Connectivity and Dynamics of Functional Activation
Authors: Finger, Holger Ewald
Thesis advisor: Prof. Dr. Peter König
Thesis referee: Prof. Dr. Claus C. Hilgetag
Prof. Dr. Gordon Pipa
Abstract: Adaptability and flexibility are among the most important human characteristics. Learning based on new experiences enables adaptation by changing the structural connectivity of the brain through plasticity mechanisms. But the human brain can also adapt to new tasks and situations in a matter of milliseconds through dynamic coordination of functional activation. To understand how this flexibility is achieved in the computations performed by neural networks, we have to understand how the relatively fixed structural backbone interacts with the functional dynamics. In this thesis, I will analyze the interactions between structural network connectivity and functional activations on different levels of abstraction and on different spatial and temporal scales. One of the big questions in neuroscience is how functional interactions in the brain can adapt instantly to different tasks while the brain structure remains almost static. To improve our knowledge of the neural mechanisms involved, I will first analyze how dynamics in functional brain activations can be simulated based on the structural brain connectivity obtained with diffusion tensor imaging. In particular, I will show that a dynamic model of functional connectivity in the human cortex is more predictive of empirically measured functional connectivity than a stationary model of functional dynamics. More specifically, the simulations of a coupled oscillator model predict 54% of the variance in the empirically measured EEG functional connectivity. Temporal coding has been proposed as a computational role of these dynamic oscillatory interactions on fast timescales. These oscillatory interactions play a role in the dynamic coordination between brain areas as well as between cortical columns or individual cells.
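The kind of simulation described above can be sketched with a Kuramoto-type coupled oscillator model driven by a structural connectivity matrix. This is a minimal illustrative sketch, not the thesis model: the toy coupling matrix, node count, coupling strength, and frequencies below are all assumptions (in practice the coupling would come from DTI-derived connectivity), and phase-locking values stand in for the EEG functional connectivity measure.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8          # number of brain regions (toy size, assumption)
K = 5.0        # global coupling strength (assumption)
dt = 1e-3      # Euler integration step in seconds
steps = 2000

# Symmetric stand-in for structural connectivity (DTI-derived in practice)
C = rng.random((n, n))
C = (C + C.T) / 2
np.fill_diagonal(C, 0)

omega = 2 * np.pi * rng.normal(10, 1, n)   # ~10 Hz intrinsic frequencies
theta = rng.uniform(0, 2 * np.pi, n)       # initial phases

phases = np.empty((steps, n))
for t in range(steps):
    # Euler step of d(theta_i)/dt = omega_i + K * sum_j C_ij * sin(theta_j - theta_i)
    diff = theta[None, :] - theta[:, None]          # diff[i, j] = theta_j - theta_i
    theta = theta + dt * (omega + K * (C * np.sin(diff)).sum(axis=1))
    phases[t] = theta

# "Functional connectivity" readout: pairwise phase-locking value (PLV)
dphi = phases[:, :, None] - phases[:, None, :]
plv = np.abs(np.exp(1j * dphi).mean(axis=0))       # n x n, values in [0, 1]
print(plv.shape)
```

Comparing such a simulated PLV matrix against an empirically measured functional connectivity matrix (e.g. by correlating their off-diagonal entries) is one common way to quantify how much variance the structural model explains.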
Here I will extend neural network models, which learn unsupervised from the statistics of natural stimuli, with phase variables that allow temporal coding in distributed representations. The analysis shows that synchronization of these phase variables provides a useful mechanism for binding of activated neurons, contextual coding, and figure-ground segregation. Importantly, these results could also provide new insights for improving deep learning methods on machine learning tasks. Dynamic coordination in neural networks also has a large influence on behavior and cognition. In a behavioral experiment, we analyzed multisensory integration between a native and an augmented sense. The participants were blindfolded and had to estimate their rotation angle based on their native vestibular input and the augmented information. Our results show that subjects alternate between these modalities, indicating that they dynamically coordinate the information transfer of the involved brain regions. Dynamic coordination is also highly relevant for the consolidation and retrieval of associative memories. In this regard, I investigated the beneficial effects of sleep on memory consolidation in an electroencephalography (EEG) study. Importantly, the results demonstrate that sleep leads to reduced event-related theta and gamma power in the cortical EEG during the retrieval of associative memories, which could indicate the consolidation of information from hippocampal to neocortical networks. This highlights that cognitive flexibility comprises both dynamic organization on fast timescales and structural changes on slow timescales. Overall, the computational and empirical experiments demonstrate how the brain evolved into a system that can flexibly adapt to new situations in a matter of milliseconds.
This flexibility in information processing is enabled by an effective interplay between the structure of the neural network, the functional activations, and the dynamic interactions on fast time scales.
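The binding-by-synchrony idea mentioned in the abstract can be illustrated by extending rate units with a phase variable, e.g. as complex-valued activations. The sketch below is a hedged toy example, not the thesis model: the group assignments, phases, and jitter level are assumptions chosen to show that neurons coding the same object can share a phase (bind) while remaining segregated in phase from another group.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
amp = rng.uniform(0.5, 1.0, n)             # firing rates (activation strength)
group = np.array([0, 0, 0, 1, 1, 1])       # two "objects" to segregate (assumed)

# Bound neurons share a phase; the two groups sit half a cycle apart,
# with small jitter to make the readout non-trivial
phase = np.where(group == 0, 0.0, np.pi) + rng.normal(0, 0.1, n)

z = amp * np.exp(1j * phase)               # complex activation: rate + timing

# Binding readout: cosine of pairwise phase differences
# (near +1 within a bound group, near -1 across groups)
sync = np.cos(phase[:, None] - phase[None, :])
same_group = sync[group[:, None] == group[None, :]]
other_group = sync[group[:, None] != group[None, :]]
print(same_group.min(), other_group.max())
```

A downstream readout that gates inputs by phase agreement can then treat each synchronized group as one object, which is the essence of figure-ground segregation by temporal coding.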
Subject Keywords: Neural Networks; Deep Learning; Binding by Synchrony; Dynamic Coordination; Structural Connectivity; Functional Connectivity; Temporal Coding; Memory Consolidation
Issue Date: 16-Mar-2017
License name: Attribution-NoDerivs 3.0 Unported
License url:
Type of publication: Dissertation or Habilitation [doctoralThesis]
Appears in Collections: FB08 - E-Dissertationen

Files in This Item:
File | Description | Size | Format
thesis_finger.pdf | Presentation format | 12.64 MB | Adobe PDF

This item is licensed under a Creative Commons License.