First introduced some ten years ago, Cloud computing has become a universal solution for storing and processing data and for communicating between devices and applications. With more and more devices connected to the internet, the Cloud is becoming crowded, and a new concept, Fog computing, could soon take its place.
Although storing data in the Cloud is wildly popular globally, getting data into and out of the Cloud is harder than most engineers, or at least their managers, are often willing to admit. The problem is bandwidth and latency. The Cloud works well for companies that don't want to store data themselves and don't need to move it back and forth at high speed. But as soon as you need to move vast amounts of data between devices, the Cloud will probably not be adequate.
In a world where the number of devices connected to the internet is increasing rapidly, the importance of ubiquitous connectivity and the ability to exchange data is constantly growing. With the Internet of Things, we're getting to the point where there's simply too much data, and it keeps piling up. Take airplanes, for example: at the 2015 Paris Air Show, Bombardier showcased its C Series jetliner, whose engines are each fitted with 5,000 sensors. The sensors generate up to 10 GB of data per second, so a single twin-engine aircraft on an average 12-hour flight can produce up to 844 TB of data. In some cases, airplane parts send continuous streams of data about their status.
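The 844 TB figure can be reproduced with a back-of-envelope calculation. The sketch below assumes the 10 GB/s rate is per engine and that "TB" here means 1024 GB; both are assumptions about how the source figure was derived, not statements from the article.

```python
# Back-of-envelope check of the aircraft data volume quoted above.
# Assumptions: 10 GB/s per engine, two engines, a 12-hour flight,
# and 1 TB taken as 1024 GB.
gb_per_second_per_engine = 10
engines = 2
flight_hours = 12

total_gb = gb_per_second_per_engine * engines * flight_hours * 3600
total_tb = total_gb / 1024  # 1 TB = 1024 GB under this assumption

print(f"{total_gb:,} GB ≈ {total_tb:.0f} TB")  # 864,000 GB ≈ 844 TB
```

Under these assumptions the raw total is 864,000 GB, which rounds to roughly 844 TB, matching the figure quoted in the article.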
Or take General Electric, which as early as 2014 said it was capturing 50 million data points collected and communicated by 10 million sensors installed on a trillion dollars' worth of equipment, ranging from medical imaging systems to locomotives to jet engines.
Sander Soo, a Java Developer at Nortal, says that, “with the new emerging wave of IoT, our current Cloud computing approach will no longer be a viable option. The Fog computing paradigm introduces a new age to the computing world, similar to the advent of Cloud computing several years ago.”
Fog computing solves the problem by keeping data closer 'to the ground', so to speak, in local computers and devices, rather than routing everything through a central data center in the Cloud.
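The core idea can be sketched as a placement decision: send a task to the nearest compute node that satisfies its latency budget, preferring nearby fog nodes over a distant cloud data center. All node names and latency figures below are hypothetical, chosen only to illustrate the trade-off.

```python
# Minimal sketch of fog-style task placement: pick the lowest-latency
# node that fits the task's latency budget. Nearby fog nodes naturally
# win over a distant cloud region. All values here are illustrative.
nodes = {
    "fog-gateway-local": 5,   # ms round trip, e.g. same building
    "fog-edge-city": 20,      # ms, e.g. a nearby micro data center
    "cloud-region-eu": 120,   # ms, a distant cloud region
}

def place_task(nodes, max_latency_ms):
    """Return the lowest-latency node within budget, or None."""
    best = min(nodes, key=nodes.get)
    return best if nodes[best] <= max_latency_ms else None

print(place_task(nodes, max_latency_ms=50))  # fog-gateway-local
```

With a 50 ms budget the local fog gateway is chosen; if only the distant cloud region were available, no placement would satisfy the budget at all, which is exactly the bandwidth-and-latency problem described above.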
Soo recently received his Master's degree in Computer Science from the University of Tartu. His thesis proposed a proactive approach to Fog computing: using a Mobile Ad hoc Social Network in proximity to support proactive Fog service discovery and process migration, it enables Fog-assisted ubiquitous service provisioning nearby, without relying on distant Cloud services. The proactive approach was also applied to Fog service provisioning itself, to speed up task distribution in Mobile Fog use cases and to provide an optimization scheme based on run-time context information.
“Researchers have proposed the Fog computing model to utilize computational resources within the vicinity of the end-users,” Soo explains. “However, new issues arise regarding the support for the mobility of the users – there exists no central endpoint to query or send data to be processed anymore.”
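Without a central endpoint, a mobile client has to keep discovering which nearby nodes currently offer the service it needs. The sketch below is a hypothetical illustration of that proximity-based discovery, not the mechanism from Soo's thesis: node names, positions, and the advertisement format are all invented for the example.

```python
# Hypothetical sketch of proximity-based fog service discovery:
# with no central endpoint, a client filters service advertisements
# from nearby nodes and ranks them by distance. Names, coordinates,
# and the advertisement structure are illustrative assumptions.

def discover(client_pos, advertisements, service, range_m=100):
    """Return nodes advertising `service` within range, closest first."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    nearby = [
        (dist(client_pos, pos), node)
        for node, (pos, services) in advertisements.items()
        if service in services and dist(client_pos, pos) <= range_m
    ]
    return [node for _, node in sorted(nearby)]

ads = {
    "phone-A":  ((10, 0),  {"image-resize"}),
    "kiosk-B":  ((40, 30), {"image-resize", "ocr"}),
    "laptop-C": ((500, 0), {"ocr"}),  # out of range
}
print(discover((0, 0), ads, "image-resize"))  # ['phone-A', 'kiosk-B']
```

As the client moves, the result of `discover` changes, which is the mobility challenge Soo describes: the set of usable fog nodes must be re-evaluated continuously rather than resolved once against a fixed endpoint.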
Soo added that although the Fog computing architecture is promising, it still faces challenges in many areas, especially in supporting mobile users.
“Utilizing Fog for real-time mobile applications faces the new challenge of ensuring the seamless accessibility of fog services on the move,” Soo says. “Further, Fog computing also faces a challenge in terms of mobility when the tasks originate from mobile ubiquitous applications in which the data sources are moving objects.”
He added that because his research was conducted in the context of a completely new technology, it will take time for the industry, and the world in general, to adopt this new paradigm. "There still remains a lot of research to be conducted," Soo concludes.