We are moving towards a highly connected world with an impressive amount of data around us.
Have a look at this picture, which shows a simple representation of just part of the systemic interdependencies among the socio-economic variables of the world we live in.
On this path towards a hyper-connected world, and considering the fast evolution of pervasive computing and networking “at the edge” (i.e., around the Users), there is a growing interest in finding ways to better understand (and steer, or control) the dynamic emergence of connected groups of entities (e.g., nodes/devices, things, Users).
This matters not only for designing the self-control of dynamic networks hooking together a sheer number of physical and virtual resources, but also for understanding the highly interconnected Users’ experience behind services such as those provided by Google, Facebook or Twitter… (i.e., involving any sort of relationship among people and data through social networking tools). Two sides of the same coin. Either way, we find systemic interdependencies.
Modelling “emergence” (and taming “butterfly effects”) is a well-known problem, which has already been analysed in several other contexts of Network Science (e.g., in medicine, biology and neuroscience). Emergence can be seen as the dynamic, spontaneous aggregation of “mesoscopic structures” in highly connected (collaborative-competitive) environments.
Needless to say, an inner “mathematics of emergence” (if any) may have interesting business implications for any Telecom or ICT Player wishing to gain a winning role in future networks and services ecosystems.
Interestingly, some studies from statistical physics, which model each node (or device, or ensemble of data) in a network as an energy level and each virtual link as a particle, show a perfect analogy between the mathematics of a network and the mathematics of a Bose gas: just have a look at this paper.
It even appears possible to argue that the “first-mover-advantage”, “fit-get-rich” and “winner-takes-all” behaviours observed in collaborative-competitive environments emerge from the underlying dynamic networks, in which a single node can capture a macroscopic fraction of the links. And the “temperature” (T) is the main controlling parameter.
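The network–Bose-gas mapping described above is usually set up as a fitness-driven growth model: each node carries an energy ε, its fitness is η = e^(−ε/T), and a newcomer links preferentially to nodes with high fitness-times-degree. Below is a minimal Python sketch of that kind of growth, just to make the “winner-takes-all at low T” intuition concrete; the function name, parameter values and the uniform energy distribution are my own illustrative choices, not anything prescribed by the paper.

```python
import math
import random

def simulate_fitness_network(n_nodes=2000, m_links=2, temperature=0.2, seed=42):
    """Grow a fitness-driven network (Bose-gas-style mapping, illustrative only).

    Each node i gets an 'energy' eps_i ~ Uniform(0, 1); its fitness is
    eta_i = exp(-eps_i / T).  Each new node attaches m_links links to
    existing nodes, chosen with probability proportional to eta_i * k_i
    (fitness times current degree).  At low T the fittest nodes tend to
    capture a macroscopic share of the links.  Returns the degree list.
    """
    rng = random.Random(seed)
    energies = [rng.random() for _ in range(n_nodes)]
    fitness = [math.exp(-e / temperature) for e in energies]
    degree = [0] * n_nodes

    # Seed with a small clique so the first nodes have nonzero degree.
    for i in range(m_links + 1):
        for j in range(i + 1, m_links + 1):
            degree[i] += 1
            degree[j] += 1

    # Growth: each newcomer picks m_links distinct targets,
    # weighted by fitness * degree.
    for new in range(m_links + 1, n_nodes):
        weights = [fitness[i] * degree[i] for i in range(new)]
        targets = set()
        while len(targets) < m_links:
            targets.add(rng.choices(range(new), weights=weights)[0])
        for t in targets:
            degree[t] += 1
            degree[new] += 1
    return degree

# Fraction of all link endpoints held by the single best-connected node:
deg = simulate_fitness_network(temperature=0.1)
share = max(deg) / sum(deg)
```

Lowering `temperature` sharpens the fitness differences (η spans many orders of magnitude), so the hub’s `share` grows toward a macroscopic fraction; raising it flattens the fitness landscape and the network behaves more like plain preferential attachment.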
It might even be possible to apply the same model to the human minds behind the nodes, which are, in turn, influenced by their interactions. Each single thought can be seen as an energy level (e.g., even the psychologist R. Assagioli argued something like that!) and the association between a pair of thoughts as a particle: well, this is just another level of abstraction, but the inner mathematics might indeed be the same. This is the universality of Nature’s laws.
Then, at the end of the day, we are talking about nested multi-scale networks creating collaborative-competitive environments, probably based on the same inner mathematics, e.g., the one governing the condensation of “giant waves” (or certain levels of coherency) under the right conditions (i.e., the right values of some controlling parameters).
In conclusion, I’m not arguing that the future Internet (or the human mind) should be modelled as a realistic Bose gas (even if this idea is very fascinating), but this reasoning brings to my mind another, lateral perspective: the systemic interdependencies of a highly connected world full of data may sometimes represent a risk; think, for example, about unwanted couplings or over-coupled relations… do we have the instruments to tame unwanted “condensations”? Can we “shield” Users from too many connections and too much data?