Communications on Applied Electronics
Foundation of Computer Science (FCS), NY, USA
Volume 3 - Issue 5
Published: November 2015
Authors: Hathal Salamah A. Alwageed |
Hathal Salamah A. Alwageed. FOG Computing: The new Paradigm. Communications on Applied Electronics 3, 5 (November 2015), 21-27. DOI=10.5120/cae2015651946
@article{10.5120/cae2015651946,
  author    = {Hathal Salamah A. Alwageed},
  title     = {FOG Computing: The new Paradigm},
  journal   = {Communications on Applied Electronics},
  year      = {2015},
  volume    = {3},
  number    = {5},
  pages     = {21-27},
  doi       = {10.5120/cae2015651946},
  publisher = {Foundation of Computer Science (FCS), NY, USA}
}
%0 Journal Article %D 2015 %A Hathal Salamah A. Alwageed %T FOG Computing: The new Paradigm %J Communications on Applied Electronics %V 3 %N 5 %P 21-27 %R 10.5120/cae2015651946 %I Foundation of Computer Science (FCS), NY, USA
As the Internet of Everything (IoE) heats up, Cisco engineers have put forward a new networking, compute, and storage paradigm that extends to the edge of the network [http://newsroom.cisco]. Fog Computing is a paradigm that extends Cloud Computing and its services to the edge of the network. Like the Cloud, Fog provides data, compute, storage, and application services to end users. The distinguishing attributes of Fog are its proximity to end users, its dense geographical distribution, and its support for mobility. Services are hosted at the network edge or even on end devices such as set-top boxes or end points. As a result, Fog reduces service latency and improves QoS, yielding a superior user experience. Fog Computing supports emerging Internet of Everything (IoE) applications that demand real-time, predictable latency (industrial automation, transportation, sensor and actuator networks). Owing to its geographical distribution, the Fog paradigm is well positioned for real-time big data and analytics. Fog supports densely distributed data collection points, thereby adding a fourth axis to the commonly cited Big Data dimensions of volume, variety, and velocity.
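The latency argument in the abstract can be made concrete with a minimal toy model. This sketch is not from the paper; the round-trip and processing times below are hypothetical figures chosen only to illustrate why hosting a service at a nearby fog node, rather than a remote cloud data center, reduces total service latency for chatty IoE workloads.

```python
# Toy model (not from the paper): total latency for a stream of sensor
# readings, each requiring one request/response round trip plus processing.
# All latency figures are hypothetical, for illustration only.

CLOUD_RTT_MS = 80.0   # assumed sensor -> remote cloud data center round trip
FOG_RTT_MS = 5.0      # assumed sensor -> nearby fog node round trip
PROCESS_MS = 2.0      # assumed per-reading processing time (same either way)

def service_latency(rtt_ms: float, process_ms: float, readings: int) -> float:
    """Total time (ms) to handle `readings` samples, one round trip each."""
    return readings * (rtt_ms + process_ms)

readings = 100
cloud_ms = service_latency(CLOUD_RTT_MS, PROCESS_MS, readings)
fog_ms = service_latency(FOG_RTT_MS, PROCESS_MS, readings)
print(f"cloud: {cloud_ms:.0f} ms, fog: {fog_ms:.0f} ms")
# → cloud: 8200 ms, fog: 700 ms
```

The per-request cost is dominated by the network round trip, so moving the service close to the end device shrinks the total almost proportionally; this is the "predictable, real-time latency" benefit the abstract claims for industrial automation and sensor/actuator networks.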