
Bullets and Roses: Fog Computing

It’s supposedly the next big paradigm poised to take over Cloud Computing. Here’s what we need to know before the Mist-ery settles

Pratima Harigunani

INDIA: Enough, in fact more than enough, has been said and heard about Cloud computing’s revolutionary impact on the technology industry.


From allowing users to work from any hardware with the help of edgy software, to letting them work on someone else’s hardware, to pooling resources (black boxes and Internet bandwidth alike), to renting hardware or software, to the breakthrough economics of an altogether new model: this architectural disruption has almost redefined the word ‘cloud’ for dictionaries.

And now comes the Fog.

Fog computing, for the uninitiated, takes the Cloud magic a level further and ups the ante for how the underlying architecture is applied. With far more devices seated at the boundaries today than ever before, all these nodes at the edge of the network can be put to better use than serving as a mere pit stop.


They can go further than just collecting and relaying data. What if these umpteen devices or end-points could compute on the data too, process it for tasks that are anyway meant for this side of the network (so why waste time and bandwidth sending them to the centre), and apply the results for quick, real-time decisions that actually affect the edge after all?

This has started happening.

Understand it:


What makes this version of computing plausible is that fog nodes are geographically distributed and planted in proximity to wireless access points in areas of significant usage, so they may as well assume the stature of stand-alone servers or network devices packed with on-board computing capabilities.

This not only saves the time otherwise wasted in sending data all the way up to the queen bee, and the time spent waiting for instructions from the ivory towers, but also helps in chopping service latency and redundancy. Quality of service improves reasonably this way, but what is more impressive is the sheer speed of real-time, near-spot decision-making that fog makes possible.

Its relevance and usage could not have been better timed. The advent of the Internet of Things (IoT) has expanded the surface area of end-point nodes and, in turn, the need for real-time or predictable latency, faster processing and better visibility. More so as Fog can assist densely distributed data-collection points, acting as a fourth axis that amplifies the dimensions of Big Data, a.k.a. volume, variety, and velocity.


In short, Fog computing can be wrapped for a quick peek in these bullets:

1. It extends the idea of cloud and spreads it over the end-nodes, lying unused so far, all over the network.

2. It lets the guys on the perimeter do some data processing and decision-making of their own, especially as they are the ones being affected eventually. The ones on the radius or at the centre can take over later or when required.

3. Its applications become amplified in any scenario involving network-edge ingredients, like sensors in an industrial plant or IoT devices in a consumer setting.

There is a lot more going for this new paradigm as it continues to galvanise attention and what-ifs across many verticals.


Apply it:

A Cisco paper on the subject has even argued that extremely time-sensitive decisions should be made closer to the things producing and acting on the data.

As it observes, in industries such as manufacturing, oil and gas, utilities, transportation, mining, and the public sector, faster response time can improve output, boost service levels, and increase safety.


Its use gains further traction when one factors in a constraint of cloud servers: they communicate over existing IP, not the new legion of protocols used by IoT devices. That is where Fog becomes an instant answer, crunching and stirring most IoT data close to the very devices that produce and act on that data. With many industrial devices speaking protocols that do not map neatly onto the IP world of the cloud, Fog is a welcome architecture.

So when a temperature sensor on a critical machine relays readings that point to imminent failure on a factory floor, the expedient, emergency-flavoured decisions can be taken by the sensor-side pieces themselves. One does not have to wait for a server sitting somewhere else to process such crucial, time-critical data before dispatching a technician ahead of a shutdown.

Similarly, when sensors on oil pipelines register a pressure change in an oil-and-gas exploration setting, the pumps can automatically slow down, with fog jumping into play and averting a possible disaster.
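To make that pattern concrete, here is a minimal, purely illustrative sketch in Python. The sensor, the pressure limit and the slow_down_pump / notify_cloud helpers are all assumptions invented for this sketch rather than any real product API; the point is simply that the time-critical check and the corrective action happen on the fog node itself, while only compact summaries travel upstream.

```python
import random
import statistics
import time

# Illustrative threshold and cadence; real limits come from plant engineering.
PRESSURE_LIMIT = 85.0        # bar
REPORT_INTERVAL = 60         # seconds between summaries sent upstream


def read_pressure_sensor():
    """Stand-in for a local sensor read (e.g. over a fieldbus); simulated here."""
    return random.uniform(70.0, 95.0)


def slow_down_pump():
    """Stand-in for the local actuation command issued by the fog node."""
    print("Pump slowed down locally")


def notify_cloud(summary):
    """Stand-in for the occasional, non-urgent upload to the central cloud."""
    print("Summary sent upstream:", summary)


def fog_control_loop():
    readings, last_report = [], time.time()
    while True:
        value = read_pressure_sensor()
        readings.append(value)

        # Time-critical decision taken at the edge, with no round trip to the cloud.
        if value > PRESSURE_LIMIT:
            slow_down_pump()

        # Only compact, non-urgent summaries travel up to the data centre.
        if time.time() - last_report >= REPORT_INTERVAL:
            notify_cloud({
                "max": round(max(readings), 1),
                "mean": round(statistics.mean(readings), 1),
                "samples": len(readings),
            })
            readings, last_report = [], time.time()

        time.sleep(1)
```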


Analyzing IoT data close to where it is collected minimizes latency. It not only offloads gigabytes of network traffic from the core network but also keeps sensitive data inside the local network. It also becomes a force to reckon with once we accept that real-time data action in mission-critical, urgent industrial settings needs to be handled with razor-sharp agility.
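As a rough, back-of-the-envelope illustration of that offloading claim, the small calculation below uses assumed, purely illustrative numbers (1,000 sensors, one 200-byte reading per second, one compact summary per minute per sensor), not figures from the article or any vendor; it simply shows how quickly raw edge traffic adds up and how much of it a fog layer can keep local.

```python
# All figures below are assumed for illustration; real deployments vary widely.
SENSORS = 1_000              # devices at the edge
READING_BYTES = 200          # size of one raw reading
READINGS_PER_SEC = 1         # per sensor
SUMMARY_BYTES = 1_000        # size of one aggregated summary
SUMMARIES_PER_HOUR = 60      # per sensor (one per minute), sent by the fog node

SECONDS_PER_DAY = 24 * 60 * 60
HOURS_PER_DAY = 24

raw_per_day = SENSORS * READINGS_PER_SEC * READING_BYTES * SECONDS_PER_DAY
summarised_per_day = SENSORS * SUMMARIES_PER_HOUR * HOURS_PER_DAY * SUMMARY_BYTES

print(f"Raw readings shipped to the cloud:  {raw_per_day / 1e9:.1f} GB/day")
print(f"With a fog layer summarising first: {summarised_per_day / 1e9:.2f} GB/day")
```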

Whether it is cameras at remote field points, pay-as-you-drive vehicle insurance, lighting-as-a-service, Machine-as-a-Service (MaaS), or anything else that can leverage quick, small bites of data analytics, the fog model can offer numerous advantages.

Especially when moving data from the network edge to the data center for processing piles up time, latency or expenses that could easily be sliced away at the edge itself.

The sheer scale of these factors can be imagined when we see how data, in its many forms, will soon explode, adding complexity all around. A Wikibon survey expects the big data market to touch $92.2 billion in 2026, bringing in successive and overlapping waves of application patterns. Data Lakes, Intelligent Systems of Engagement, and Self-Tuning Systems of Intelligence will become talking points as constantly rising amounts of data generated by IoT sensors drive each application pattern.

Would fog then mean answers to new questions or be a source of new questions, 'ifs' and 'buts'?

Probe it:

Barbs and bullets accost every new technology shift, and Fog will surely not be spared.

For instance, one wonders: when Fog takes the lead over cloud, would it simultaneously sort out latency, unnecessary network bandwidth consumption, and cloud storage?

Or how can an enterprise ensure that security is not compromised and that edge devices take appropriate, policy-based action? In fact, how does the whole enchilada work: authentication, audit, accountability, and everything else about security?

Would it then need new cloud models and hardware investments? What happens to storage, data usage (since fog relies on transient storage), and archival? And of course, what about those recurring nightmares around Shadow IT?

There are also concerns over bandwidth capacity being sucked up when traffic from thousands of devices starts flitting about. And even where data is readily tappable, there may be circumstances in which industry regulations and privacy concerns prohibit offsite storage of certain types of data.

A few questions are making the fog denser: how would auditing and accountability work in Fog, how will access-control mechanisms and key management be sorted out, and, most importantly, how would privacy and data confidentiality condense into workable answers?

The questions take on new proportions when juxtaposed with the increasing cybersecurity fears around devices in an IoT age.

Fog can certainly look beautiful and poetic, but only when one has good fog-lights around.

Till then, maybe avoid those hairpin roads.
