One of the scenarios addressed in the Europe 2050 group was that of data playing a major role in the future of Europe.
What may differ in thirty years' time is the way the global data architecture is seen. Rather than having data in a cloud (or several clouds), the vision is of a global data architecture encompassing the world, resulting in massive distributed databases with some sort of uniform access layer (much as the Internet today has uniform resource locators). In principle, data are then located where they are "generated" (not necessarily so, but you get the gist: the architectural approach is not to move data somewhere else to make them accessible). By being kept local, data can be better protected, and local policies can be enforced. Notice that by 2050 most objects, devices, and ambient environments will have an associated massive database. The "sensors" box in the graphic has to be understood in very general terms, as any means of generating data.
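The idea of a uniform access layer over locally held data can be sketched in a few lines. This is only an illustrative toy, not a real system: the `data://<location>` identifier scheme, the `LocalStore` and `UniformAccessLayer` classes, and the sample records are all invented here to show the principle that queries travel to the data rather than the data moving to a central cloud.

```python
class LocalStore:
    """Data stays where it is generated; queries travel to the data."""
    def __init__(self, location, records):
        self.location = location
        self._records = records  # held locally, never shipped in bulk

    def query(self, predicate):
        # Only the (small) matching result leaves the local store.
        return [r for r in self._records if predicate(r)]


class UniformAccessLayer:
    """Resolves a uniform identifier to the store holding the data,
    analogous to how a URL locates a resource on today's Internet."""
    def __init__(self):
        self._registry = {}

    def register(self, store):
        self._registry[f"data://{store.location}"] = store

    def query(self, identifier, predicate):
        # Dispatch the query to wherever the data actually lives.
        return self._registry[identifier].query(predicate)


# Usage: two stores in different places, one uniform way to reach both.
ual = UniformAccessLayer()
ual.register(LocalStore("milan/traffic", [{"speed": 42}, {"speed": 7}]))
ual.register(LocalStore("oslo/air", [{"pm25": 3}, {"pm25": 19}]))

slow = ual.query("data://milan/traffic", lambda r: r["speed"] < 10)
print(slow)  # only the matching records cross the boundary
```

Because each store answers its own queries, local policies (access control, filtering, anonymisation) could in principle be enforced at the store itself before any result leaves it.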
The other important idea is that this huge world of data will be impossible to understand unless we imagine that what is actually visible is a state resulting from the interaction between the global data and an individual brain, similar to what happens within our own brains: meaning is not a piece of information, nor a collection of information, but rather a state of the brain, which in turn is influenced by previous states.