Statistical Inference of Information in Networks

Causality and Directed Information Graphs

Time: Friday 2021-10-22, 14:00

Location: F3, Lindstedtsvägen 26, Stockholm (English)

Doctoral student: Sina Molavipour, Information Science and Engineering

Opponent: Professor Deniz Gündüz, Imperial College London, London, United Kingdom

Supervisor: Mikael Skoglund, Information Science and Engineering

Abstract

Over the last decades, advances in the measurement, collection, and storage of data have made tremendous amounts of information available. It has therefore become crucial to extract valuable features and analyze the characteristics of data. As we study more complex systems (e.g., a network of sensors), the relationships between the information in different parts (e.g., measured signals) provide more insight into the characteristics of the system. Quantities such as entropy, mutual information, and directed information (DI) can be employed for this purpose. The main theme of this thesis is the study of causality between random processes in systems where the instantaneous samples may depend on the history of other processes. We justify using DI to quantify the extent of causal influence and provide appropriate tools to estimate this quantity. Additionally, we study properties of the directed information graph, a representation that models causal relationships in a network of processes. Although conventional estimation techniques for information-theoretic quantities may suit small systems with low-dimensional data, recent studies have shown that the performance of these methods can deteriorate when the data is high-dimensional. The estimation techniques proposed in this thesis tackle this issue with a novel approach based on neural networks. A major challenge in estimating DI with this approach is constructing appropriate data batches to train the neural network; we therefore propose a technique that uses the $k$ nearest neighbors ($k$-NN) algorithm to re-sample the original data. Since DI is characterized by conditional mutual information (CMI) terms, the convergence of our estimators is shown in two steps. First, we prove that the CMI estimate converges asymptotically to the true value when samples are independent and identically distributed (i.i.d.); the proof includes a concentration bound for our $k$-NN re-sampling technique. Second, the results are extended to samples that may be dependent in time, which enables the method to estimate DI, and we provide convergence results for the end-to-end estimation of DI in this scenario. The performance of the estimators is investigated in several experiments with both synthetic and real-world data.
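For reference, the DI from a process $X$ to a process $Y$ over $n$ time steps can be written in Massey's standard form, which makes the CMI characterization above explicit:

$$ I(X^n \to Y^n) = \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1}), $$

where each summand is a CMI term between the past and present of $X$ and the present of $Y$, conditioned on the past of $Y$.

To give a concrete picture of the batch-construction step, the sketch below shows one plausible form of the $k$-NN re-sampling idea: for each triple $(x_i, y_i, z_i)$, the $y$-coordinate is replaced by the $y$ of a randomly chosen neighbor among the $k$ nearest neighbors of $z_i$, so that the re-sampled triples approximately follow $p(z)\,p(x \mid z)\,p(y \mid z)$ while the original triples follow the joint $p(x, y, z)$. The function name, the scikit-learn dependency, and the toy data are illustrative assumptions, not the thesis's implementation.

# A minimal sketch (not the thesis's code) of k-NN re-sampling for
# building the batches used to train a neural CMI estimator.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def knn_resample(x, y, z, k=5, seed=None):
    """Return triples (x_i, y_j, z_i) with j a random k-NN of z_i in z-space.

    The original triples follow the joint p(x, y, z); the returned
    triples approximately follow p(z) p(x|z) p(y|z), so a network
    trained to distinguish the two sets can estimate I(X; Y | Z).
    """
    rng = np.random.default_rng(seed)
    # Ask for k + 1 neighbors because each point is its own nearest neighbor.
    nn = NearestNeighbors(n_neighbors=k + 1).fit(z)
    _, idx = nn.kneighbors(z)          # idx has shape (n, k + 1)
    neighbors = idx[:, 1:]             # drop the self-match
    pick = rng.integers(0, k, size=len(z))
    j = neighbors[np.arange(len(z)), pick]
    return x, y[j], z

# Toy usage: a chain Z -> X -> Y, so I(X; Y | Z) > 0.
n = 1000
rng = np.random.default_rng(0)
z = rng.standard_normal((n, 1))
x = z + 0.1 * rng.standard_normal((n, 1))
y = x + 0.1 * rng.standard_normal((n, 1))
x_r, y_r, z_r = knn_resample(x, y, z, k=5, seed=1)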

To detect a causal link in the system, a threshold test can be performed on the estimated DI. For more complex systems, where the directed information graph representation is adopted, all causal links must be detected correctly. We therefore study the performance of detecting the whole graph and show that, with the conventional empirical estimation method and a properly chosen test threshold, the type I and type II error probabilities tend to zero asymptotically. Finally, we outline a roadmap for extending these results to the finite-sample regime, in order to obtain explicit bounds on the false-alarm and detection probabilities.
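As a hypothetical reading of the detection step, the sketch below declares an edge $i \to j$ whenever the estimated DI from process $i$ to process $j$ exceeds a threshold. For brevity it tests each pair in isolation, without the causal conditioning on the remaining processes that a full directed information graph calls for, and estimate_di is a placeholder for any consistent DI estimator, not the thesis's implementation.

# Hypothetical sketch of graph detection by thresholding DI estimates.
# estimate_di is a placeholder for any consistent estimator (e.g. the
# neural estimator developed in the thesis); the threshold is a free
# parameter whose choice drives the type I / type II trade-off.
import numpy as np

def detect_graph(processes, estimate_di, threshold):
    """processes: list of 1-D time series, one per node.
    Returns a boolean adjacency matrix with edge i -> j if and only if
    the estimated DI from process i to process j exceeds the threshold."""
    m = len(processes)
    adj = np.zeros((m, m), dtype=bool)
    for i in range(m):
        for j in range(m):
            if i != j:
                adj[i, j] = estimate_di(processes[i], processes[j]) > threshold
    return adj

Detecting the whole graph correctly then requires every one of the $m(m-1)$ pairwise tests to decide correctly, which is why the choice of threshold governs the asymptotic type I and type II error probabilities studied above.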

urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-302730