-June 1, 2025: arrival and welcome reception.
-Plenary Speakers:
Panos M. Pardalos, University of Florida, USA
Title: Dynamics of Information Systems
Abstract:
In the first part of the lecture we will discuss the history and initial ideas used to establish the series of conferences on DIS.
In the second part of the lecture we will address diffusive processes in networks in the context of complexity.
Networks possess a diffusive potential that depends on their topological configuration, but diffusion also relies on the process and initial conditions. The lecture introduces the concept of Diffusion Capacity, a measure of a node’s potential to diffuse information that incorporates a distance distribution considering both geodesic and weighted shortest paths and the dynamic features of the diffusion process. This concept provides a comprehensive depiction of individual nodes’ roles during the diffusion process and can identify structural modifications that may improve diffusion mechanisms. The lecture also defines Diffusion Capacity for interconnected networks and introduces Relative Gain, a tool that compares a node’s performance in a single structure versus an interconnected one. To demonstrate the concept’s utility, we apply the methodology to a global climate network formed from surface air temperature data, revealing a significant shift in diffusion capacity around the year 2000. This suggests a decline in the planet’s diffusion capacity, which may contribute to the emergence of more frequent climatic events. Our goal is to gain a deeper understanding of the complexities of diffusive processes in networks and the potential applications of the Diffusion Capacity concept.
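As a toy illustration of the geodesic-distance ingredient of Diffusion Capacity (the full measure, defined in the Nature Communications paper cited below, also accounts for weighted shortest paths and the dynamics of the process), the distance distribution of each node in an unweighted graph can be sketched with breadth-first search:

```python
from collections import deque

def geodesic_distances(adj, source):
    """BFS shortest-path (geodesic) distances from `source` in an unweighted graph."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# Toy graph: a path 0-1-2-3 plus a chord between 0 and 2.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}

for node in adj:
    d = geodesic_distances(adj, node)
    others = sorted(d[v] for v in adj if v != node)
    print(node, others, sum(others) / len(others))
```

Nodes with a distance distribution concentrated at small values (here node 2) are, intuitively, better placed to spread information quickly.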
References:
Schieber, T.A., Carpi, L.C., Pardalos, P.M. et al. Diffusion capacity of
single and interconnected networks. Nat Commun 14, 2217 (2023).
https://doi.org/10.1038/s41467-023-37323-0 (see also supplementary
information)
Schieber, T., Carpi, L., Díaz-Guilera, A. et al. Quantification of network
structural dissimilarities. Nat Commun 8, 13928 (2017).
https://doi.org/10.1038/ncomms13928
Stratonovich, R.L. Theory of Information and Its Value (Belavkin, R.V., Pardalos, P.M. & Principe, J.C., eds). Springer (2020).
Jose C. Principe, University of Florida, USA
Title: Self-Organization with Information Dynamics
Abstract: Information theory is crucial in the optimal design of communication systems, but it can also play fundamental roles in learning theory and information dynamics. This talk will briefly review estimators of information-theoretic descriptors and their application to machine learning. We will focus on the lesser-known Principle of Entropy Minimization, which improves on the mean-square-error loss when estimating parameters under noisy conditions and leads to the unexpected result of noise-free estimation under quite mild conditions. The talk will also present the Principle of Relevant Information, a cost function for self-organized dynamics in which a fixed-point iteration, constrained by a cost built from entropy and mutual information, discovers statistical structure in a single data source.
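One concrete estimator of this kind from information-theoretic learning (an illustrative assumption here, not necessarily the talk's exact formulation) is the Parzen-window "information potential" of the error samples: the quadratic Renyi entropy is its negative logarithm, so minimizing error entropy amounts to maximizing the potential.

```python
import math

def information_potential(errors, sigma=1.0):
    """Parzen-window estimate V(e) = (1/N^2) * sum_ij G(e_i - e_j),
    where G is a Gaussian kernel of standard deviation sigma*sqrt(2).
    Quadratic Renyi entropy is H2(e) = -log V(e), so minimizing
    entropy maximizes the information potential."""
    n = len(errors)
    norm = 1.0 / (2.0 * sigma * math.sqrt(math.pi))
    total = 0.0
    for ei in errors:
        for ej in errors:
            total += norm * math.exp(-((ei - ej) ** 2) / (4.0 * sigma ** 2))
    return total / (n * n)

# Tightly clustered errors have higher potential (lower entropy)
# than spread-out errors of the same sample size.
tight = information_potential([0.05, -0.02, 0.01])
spread = information_potential([-2.0, 0.0, 2.0])
```

Concentrating the error distribution, rather than merely shrinking its second moment, is what distinguishes this criterion from plain mean-square error.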
R. Terry Rockafellar, University of Washington, USA
Title: Adapting Information to Different Quantifications of Risk
Abstract: Information and entropy, going back in concept to Shannon's appraisal of how much can be learned from one probability distribution relative to another, are fundamental in Bayesian statistics and the classics of means and variances. But in application to a random loss, whose distribution might be influenced by decision variables in an optimization problem, the expectation is merely risk-neutral. It works best when instances where the loss is greater than expected are balanced in the long run by instances where it is less. Many situations in finance and engineering, however, offer no long run, and the focus must then be on risk-averse appraisals. It has been found dangerous, for instance, to judge the value of a stock solely through its historical mean and variance; properties of the tail of the distribution associated with losses, especially high losses, matter more than the parts associated with gains. A broad theory has, for this reason, been developed around alternative quantifications of risk. A quantification that is averse and deemed coherent fits into a scheme that identifies a corresponding alternative to standard deviation and can trigger a tailored approach to regression beyond least squares. Such quantifications have moreover been shown to be closely tied to stochastic divergences beyond Kullback-Leibler. This suggests perhaps defining information and entropy differently for each of them and exploring the many consequences.
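A widely used coherent, averse quantification is Conditional Value-at-Risk (the superquantile); a minimal empirical sketch, assuming equally likely loss samples and a naive tail average rather than the exact minimization formula:

```python
def cvar(losses, alpha=0.90):
    """Empirical CVaR_alpha: the average of the worst (1 - alpha)
    fraction of loss samples (at least one sample is always kept)."""
    worst_first = sorted(losses, reverse=True)
    k = max(1, round(len(losses) * (1 - alpha)))
    return sum(worst_first[:k]) / k

losses = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
mean_loss = sum(losses) / len(losses)  # risk-neutral appraisal: 5.5
tail_loss = cvar(losses, alpha=0.8)    # risk-averse: average of the two worst losses
```

At alpha = 0 the tail covers all samples and CVaR reduces to the expectation, recovering the risk-neutral case; as alpha grows, the appraisal focuses on ever higher losses.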
Xin-She Yang, Middlesex University, UK
Title: Insights and Open Problems in Nature-Inspired Computing
Abstract: Nature-inspired computing techniques such as genetic algorithms, particle swarm optimization, and the firefly algorithm have been widely used to solve problems in optimization, data mining, and computational intelligence. The number of nature-inspired algorithms has grown significantly in recent years, yet in-depth mathematical analysis of these algorithms is still lacking. This talk highlights some of the challenges and open problems in nature-inspired computing.
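As an illustration of the family of methods the talk concerns, a generic particle swarm optimization sketch on a toy quadratic objective (this is a textbook variant, not any algorithm specific to the talk):

```python
import random

def pso_minimize(f, dim=2, n_particles=20, iters=100, lo=-5.0, hi=5.0, seed=1):
    """Minimal particle swarm optimization: each particle remembers its
    personal best position, the swarm shares a global best, and velocity
    updates blend inertia with random pulls toward both bests."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and attraction coefficients
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]
    pbest_val = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            val = f(xs[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = xs[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = xs[i][:], val
    return gbest, gbest_val

sphere = lambda x: sum(c * c for c in x)  # convex toy objective, minimum at 0
best_x, best_val = pso_minimize(sphere)
```

The lack of rigorous convergence guarantees for exactly this kind of stochastic update rule is one of the open problems the abstract alludes to.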