Today there are plenty of open source software solutions that can be used to implement a fully open Cloud Computing environment; to mention just a few: libcloud, OpenStack, NiftyName, Juju, AppScale, SlapOS, buildout, supervisord, PyOCNI, etc. Imagine using such solutions to create an ICT environment that exploits end Users' idle resources (instead of the servers in traditional data centers) to provide computing and storage services. This is Fog Computing at the edge!
Fog Computing is about extending the Cloud Computing paradigm up to the edge of the network, by harnessing the sheer number of unused ICT resources found there. It is not just a new tech buzzword: it reflects the migration of processing power, storage capability and embedded communications towards the edge of the network, which means into the hands of the Users. Fog Computing can also provide storage for disaster recovery.
This is already becoming a reality. Symform is an example of a start-up offering disaster resilience as a "decentralized, distributed, virtual, and crowd-sourced" fog. Here is how it works. Some of Symform's Users also act as hosts by allocating some of their on-site unused storage for Symform's use: the price is 15 cents per gigabyte per month, but Users who contribute twice as much storage as the data they upload get their fog storage for free. When a User uploads a file to Symform's fog, the system replicates it for redundancy, shreds it into tiny pieces, encrypts each piece, and then distributes the pieces to other Symform Users. The system splits each 64-megabyte block of data into 96 fragments; only 64 of those fragments are needed to recreate the entire block.
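The 96-fragments-out-of-64 scheme is a classic erasure-coding trade-off. A minimal sketch of the arithmetic (illustrative only, not Symform's actual implementation; the variable names are my own) shows what those parameters buy:

```python
# Illustrative arithmetic for an (n=96, k=64) erasure-coding scheme,
# as described for Symform: a 64 MB block is split into 96 fragments,
# any 64 of which suffice to rebuild the original block.

BLOCK_MB = 64      # size of each data block
N_FRAGMENTS = 96   # total fragments distributed to peers in the fog
K_NEEDED = 64      # fragments required to reconstruct the block

# Each fragment carries 1/k of the block's information.
fragment_mb = BLOCK_MB / K_NEEDED            # size of one fragment
stored_mb = N_FRAGMENTS * fragment_mb        # total data stored in the fog
overhead = stored_mb / BLOCK_MB              # storage overhead factor
tolerated_losses = N_FRAGMENTS - K_NEEDED    # peers that may disappear

print(f"fragment size: {fragment_mb:.0f} MB")
print(f"total stored:  {stored_mb:.0f} MB ({overhead:.1f}x overhead)")
print(f"fragments that can be lost: {tolerated_losses}")
```

So the fog stores 96 MB for every 64 MB uploaded, a 1.5x overhead, yet the block survives even if up to 32 of the 96 hosting peers go offline, far cheaper than full replication with comparable resilience.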
One may wonder about the performance of Fog Computing. This brings me back to folding@home, a crowdsourcing initiative running compute-intensive simulations of protein folding and other kinds of molecular dynamics. Folding@home uses the idle processing resources of thousands of personal computers owned by volunteers. As of November 28, 2012, folding@home has 208,622 active CPU cores, 10,206 active GPUs, and 4,583 active PS3s, for a total of about 5 petaFLOPS (a petaFLOP is a quadrillion, i.e. 10^15, calculations per second). For comparison, Titan, currently the fastest supercomputer in the world, has reached a speed of 17.59 petaFLOPS.
Just imagine the variety of ICT services that could be executed and provided by orchestrating the idle computing and storage resources of millions of smart nodes at the edge. One may argue that not all types of services and applications can run entirely on the edge; however, there are several examples, such as disaster resilience, content aggregation and transformation, data collection and analytics, static databases, and many others (even at lower OSI layers), that can benefit from the fog.