Any physical system can be viewed from the perspective that information is implicitly represented in its state. However, quantifying this information in the case of complex networks has remained largely elusive. Networked units are ubiquitous, from interacting proteins in cells to neural connections in the human brain or social relationships on virtual platforms such as Twitter or Facebook. Networks provide a useful tool to model complex systems whose interacting constituents give rise to emergent behavior that cannot be produced by single units in isolation.
The concept of entropy, a measure of uncertainty that is crucial to understanding the physical properties of a system and its evolution, is a pillar of classical and quantum information theory that has nevertheless remained elusive in the case of networks. We argue that the way information diffuses from one unit to another can be used to measure the entropy of a networked system.
Inspired by quantum thermodynamics and quantum computing, this new measure of entropy enables unprecedented approaches to statistical inference, establishing a fundamental step towards a network information theory.
Potential applications include machine learning, systems biology and inference of quantum complex network models based on qubit entanglement.
In the realm of complex networks, a satisfactory definition of network entropy has been hindered by the lack of a tractable probability distribution describing an interconnected system as a whole. A candidate starting point has been classical information theory, which is built centrally on the quantification of information through entropy and has found vast interdisciplinary applications. This approach, however, is limited to applying entropy to a known network descriptor, yielding only an alternative means of analyzing a probability distribution. Its quantum-mechanical counterpart, introduced by von Neumann, provided the foundation of modern quantum information theory; yet applying this entropy directly to a normalized network Laplacian or adjacency matrix results in nonsensical additivity properties. We address this issue by introducing a probability distribution representing a complex network that satisfies the same properties as a density matrix in quantum mechanics, and from which we can recover the desired additivity properties. From this we build an information-theoretic framework that allows us to quantify the information content of complex networks and to define a likelihood based on the spectral properties of observed networks and their models, rather than relying on a subset of network descriptors such as mesoscale structure or the degree distribution.
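As a minimal numerical sketch of the idea above, one can build a density-matrix-like operator from a network and compute its von Neumann entropy S(ρ) = -Tr(ρ log ρ). The specific construction below, ρ = exp(-βL)/Tr exp(-βL) with L the combinatorial Laplacian, is one possible choice and an assumption of this sketch, not necessarily the exact definition used in the framework:

```python
import numpy as np

def density_matrix(A, beta=1.0):
    """One candidate density matrix for a network (an assumption here):
    rho = exp(-beta L) / Tr exp(-beta L), with L = D - A the Laplacian.
    It is symmetric, positive semidefinite, and has unit trace."""
    L = np.diag(A.sum(axis=1)) - A
    w, V = np.linalg.eigh(L)            # L is symmetric for undirected graphs
    e = np.exp(-beta * w)
    p = e / e.sum()                     # eigenvalues of rho, a probability vector
    return (V * p) @ V.T                # V diag(p) V^T

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues of rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]                    # drop numerical zeros
    return float(-(p * np.log(p)).sum())

# toy example: a path graph on 3 nodes
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
rho = density_matrix(A)
S = von_neumann_entropy(rho)
```

For an N-node network this entropy lies between 0 and log N, mirroring the bounds on the von Neumann entropy of an N-dimensional quantum state.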
To study complex networks systematically from an information-theoretic perspective, it is necessary to develop precise mathematical formulations of the classical tools. We have just started to develop these tools, such as information entropy, the Rényi q-entropy, and generalized Kullback-Leibler and Jensen-Shannon divergences, the latter allowing us to define a natural distance measure between complex networks.
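The Jensen-Shannon construction mentioned above can be sketched as follows. The quantum Jensen-Shannon divergence between two density matrices is JSD(ρ, σ) = S((ρ+σ)/2) - [S(ρ) + S(σ)]/2, and its square root can serve as a distance between networks. The density-matrix construction used here (ρ = exp(-βL)/Tr exp(-βL)) is an illustrative assumption, as is the choice of toy graphs:

```python
import numpy as np

def density_matrix(A, beta=1.0):
    """Assumed construction: rho = exp(-beta L)/Tr exp(-beta L), L = D - A."""
    L = np.diag(A.sum(axis=1)) - A
    w, V = np.linalg.eigh(L)
    e = np.exp(-beta * w)
    return (V * (e / e.sum())) @ V.T

def vn_entropy(rho):
    """Von Neumann entropy from the eigenvalues of rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-(p * np.log(p)).sum())

def js_divergence(rho, sigma):
    """Quantum Jensen-Shannon divergence:
    JSD = S((rho + sigma)/2) - (S(rho) + S(sigma))/2.
    It is symmetric, bounded by log 2, and zero iff rho == sigma."""
    mix = 0.5 * (rho + sigma)
    return vn_entropy(mix) - 0.5 * (vn_entropy(rho) + vn_entropy(sigma))

# compare a 3-node path with a 3-node triangle (hypothetical toy networks)
path = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
tri  = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
d = np.sqrt(js_divergence(density_matrix(path), density_matrix(tri)))
```

Because the divergence is computed purely from spectra, the resulting distance compares networks as wholes rather than through a handful of hand-picked descriptors.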
One of our achievements concerns the identification of a network likelihood and the definition of model-selection criteria within this new framework, both of which deserve further investigation.
We exploit this metric as a distance to compare the units of interconnected systems such as multilayer networks, e.g., the human microbiome. Spectral methods have proven fundamental for analyzing and understanding complex networks, and our framework introduces a set of spectral tools that will boost network statistical inference.
HERE you can find an updated list of my publications about complex networks in general.