Explaining Edge Computing

Captions
Welcome to another video from ExplainingComputers.com. This time I’m going to talk about edge computing. This places networked computing resources as close as possible to where data is created. As we’ll see, edge computing is associated with the Internet of Things, with mesh networks, and with the application of small computing devices like these. So, let’s go and delve more deeply into computing on the network edge.

To understand edge computing we need to reflect on the rise of the cloud. In recent years, cloud computing has been one of the biggest digital trends, and involves the delivery of computing resources over the Internet. In the early days, most of the devices that accessed cloud services were PCs and other end-user hardware. But increasingly, devices accessing cloud services are also Internet of Things, or IoT, appliances that transmit data for analysis online. Connecting cameras and other sensors to the Internet facilitates the creation of smart factories and smart homes.

However, transmitting an increasing volume of data for remote, centralized processing is becoming problematic. Not least, transmitting video from online cameras to cloud-based vision recognition services can overload available network capacity and result in a slow speed of response. And this is the reason for the rise of edge computing. Edge computing allows devices that would have relied on the cloud to process some of their own data. So, for example, a networked camera may perform local vision recognition. This can improve latency -- the time taken to generate a response from a data input -- as well as reducing both the cost of and the need for mass data transmission.

Staying with our previous example, let’s consider more deeply the application of artificial neural networks for vision recognition. Today, Amazon, Google, IBM and Microsoft all offer cloud vision recognition services that can receive a still image or video feed and return a cognitive response.
These cloud AI services rely on neural networks that have been pre-trained on data center servers. When an input is received, they perform inference -- again on a cloud data center server -- to determine what the camera is looking at.

Alternatively, in an edge computing scenario, a neural network is usually still trained on a data center server, as training requires a lot of computational power. So, for example, a neural network for use in a factory may be shown images of correctly produced and then defective products so that it can learn to distinguish between the two. But once training is complete, a copy of the neural network is deployed to a networked camera connected to edge computing hardware. This allows it to identify defective products without transmitting any video over the network. Latency is therefore improved and the demands on the network are decreased, as data only has to be reported back when defective products are identified.

This scenario of training a neural network centrally and deploying copies for execution at the edge has amazing potential. Here I’ve indicated how it could be used in vision recognition. But the same concept is equally applicable to the edge processing of audio, sensor data, and the local control of robots or other cyber-physical systems. In fact, edge hardware can be useful in any scenario where the roll-out of local computing power at the extremities of a network can reduce reliance on the cloud.

One of the challenges of both the Internet of Things and of edge computing is providing an adequate network connection to a vast number of cameras, sensors and other devices. Today, the majority of devices connected wirelessly to a local network communicate directly with a WiFi router. However, an alternative model is to create a mesh network in which individual nodes dynamically interconnect on an ad-hoc basis to facilitate data exchange.
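The train-centrally, infer-at-the-edge pattern described for the factory camera can be sketched in a few lines of Python. This is purely illustrative: the weights, feature vectors and threshold here are made-up stand-ins for a model that would really be trained in the data center and exported to an edge runtime.

```python
# Hypothetical weights standing in for a network trained centrally
# on a data center server; a real deployment would export the
# trained model to the edge device rather than hard-code values.
WEIGHTS = [0.8, -0.5, 1.2]
BIAS = -0.1

def infer_defect(features):
    """Run inference locally on the edge device: no video leaves the factory."""
    score = sum(w * x for w, x in zip(WEIGHTS, features)) + BIAS
    return score > 0.0  # True means the product looks defective

def process_frame(features, report):
    # Only defective items generate network traffic back to the cloud;
    # frames showing good products are discarded locally.
    if infer_defect(features):
        report(features)

reported = []
process_frame([0.1, 0.9, 0.0], reported.append)  # good product: stays local
process_frame([1.0, 0.0, 0.5], reported.append)  # defective: reported upstream
print(len(reported))  # 1
```

The point of the sketch is the last two calls: both frames are processed, but only one crosses the network, which is exactly how edge inference reduces bandwidth demands.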
Consider, for example, the placement of moisture and temperature sensors in a large industrial greenhouse. If all of these devices had to have a direct wired or wireless connection, then a lot of infrastructure would need to be put in place. But if the sensors can be connected to edge computing devices that establish a mesh network, then only one wired or wireless connection to the local network may be required.

Edge computing hardware is defined by its location, not its size, and so some edge devices may be very powerful local servers. This said, a lot of edge computing is destined to take place on small devices, such as single board computers. Here, for example, we have a LattePanda Alpha and a UDOO BOLT, both of which could be deployed to process data at the edge. Other potential edge devices include the Edge-V from Khadas, as we can see here -- this has even got “edge” in its name -- and it’s got multiple camera connectors, which is very useful for edge applications. And then over here we have a Jetson Nano SoM, a system-on-a-module, and this is a particularly interesting single board computer because it’s got a 128 CUDA core GPU. So it’s very good for vision recognition processing at the edge.

Another slightly different and very interesting device is this, the Intel Neural Compute Stick 2, or NCS2. This features a Movidius Myriad X vision processing unit, or VPU, and it’s a development kit for prototyping AI edge applications. And if I take off the end here you’ll see this is a cap, and this is actually a USB device. The idea is that you can plug it into a single board computer, such as a Raspberry Pi, to significantly increase that small board’s capability to run edge applications like vision recognition.

The exact definition of edge computing remains a little blurry. This said, all major players agree that it places networked computing resources as close as possible to where data is created.
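The greenhouse mesh idea can also be sketched briefly. In the toy topology below (node names and links are invented for illustration), only one node, the gateway, has an uplink to the local network; every other sensor relays its readings hop by hop, here found with a simple breadth-first search.

```python
from collections import deque

# Hypothetical ad-hoc links between greenhouse sensor nodes; only
# "gateway" has a wired or wireless uplink to the local network.
LINKS = {
    "gateway": {"s1", "s2"},
    "s1": {"gateway", "s3"},
    "s2": {"gateway", "s4"},
    "s3": {"s1"},
    "s4": {"s2"},
}

def route_to_gateway(node):
    """Breadth-first search for the shortest relay path to the gateway."""
    queue = deque([[node]])
    seen = {node}
    while queue:
        path = queue.popleft()
        if path[-1] == "gateway":
            return path
        for peer in LINKS[path[-1]] - seen:
            seen.add(peer)
            queue.append(path + [peer])
    return None  # node is cut off from the mesh

# s3 has no direct uplink, but reaches the network in two hops via s1.
print(route_to_gateway("s3"))  # ['s3', 's1', 'gateway']
```

Real mesh protocols (Zigbee, Thread, B.A.T.M.A.N. and so on) are far more sophisticated, but the shape of the saving is the same: one uplink serves the whole sensor field.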
To provide you with some more extensive definitions, IBM note that “Edge computing is an important emerging paradigm that can expand your operating model by virtualizing your cloud beyond a data center or cloud computing center. Edge computing moves application workloads from a centralized location to remote locations, such as factory floors, warehouses, distribution centers, retail stores, transportation centers, and more”.

Similarly, the Open Glossary of Edge Computing from the Linux Foundation defines edge computing as “The delivery of computing capabilities to the logical extremes of a network in order to improve the performance, operating cost and reliability of applications and services. By shortening the distances between devices and the cloud resources that serve them, and also reducing network hops, edge computing mitigates the latency and bandwidth constraints of today's Internet, ushering in new classes of applications”.

Cisco have also introduced the term “fog computing”, which they describe as “. . . a standard that defines how edge computing should work, and [which] facilitates the operation of compute, storage and networking services between end devices and cloud computing data centers”. What this means is that fog computing refers to resources that lie close to the metaphorical ground: between the edges of a network and the remote cloud. It may be, for example, that in a factory some edge sensors communicate with local fog resources, which in turn communicate as necessary with a cloud data center. It should be noted that the term “fog computing” is mainly used by Cisco, and is viewed by some as a marketing term rather than an entirely distinct paradigm from edge computing.

Edge computing is emerging for two reasons. The first is rising pressure on network capacity. The second is our growing demand for ever-faster responses from AI and related applications.
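The edge-fog-cloud layering described above amounts to placing each workload in the most centralized tier that still meets its latency needs. The sketch below illustrates that idea only; the tier names and round-trip figures are assumptions for the example, not drawn from Cisco's fog computing specification.

```python
# Illustrative three-tier hierarchy: latency figures are invented
# round-trip estimates, not measurements or specified values.
TIERS = [
    ("edge", 0.005),   # on-device processing, ~5 ms
    ("fog", 0.030),    # nearby fog node, ~30 ms
    ("cloud", 0.150),  # remote data center, ~150 ms
]

def place_workload(max_latency_s):
    """Pick the most centralized tier whose round-trip fits the latency
    budget; tight budgets force processing down toward the edge."""
    for tier, latency in reversed(TIERS):  # try cloud, then fog, then edge
        if latency <= max_latency_s:
            return tier
    return "edge"  # nothing fits the budget; process locally as best effort

print(place_workload(0.200))  # cloud
print(place_workload(0.050))  # fog
print(place_workload(0.010))  # edge
```

Preferring the most centralized tier that still fits reflects the trade-off in the transcript: the cloud offers the most computational power, so work only moves to fog or edge resources when latency or bandwidth demands require it.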
As a result, while for a decade we’ve been pushing computing power out to the cloud, increasingly we’re also pushing it in the opposite direction, to the local extremities of our networks. More information on a wide range of computing developments -- including AI, blockchain and quantum computing -- can be found here on the ExplainingComputers YouTube channel. But now that’s it for another video. If you’ve enjoyed what you’ve seen here, please press that like button. If you haven’t subscribed, please subscribe. And I hope to talk to you again very soon.
Info
Channel: ExplainingComputers
Views: 135,919
Rating: 4.9455371 out of 5
Keywords: Edge computing, Edge, cloud computing, edge and cloud, edge vs cloud, computing, edge computing cloud computing, Jetson Nano, LattePanda Alpha, fog computing, AI, AI edge, AI edge computing, edge and fog, edge vs fog, edge computing fog computing, edge fog cloud, edge computing definition, fog computing definition, Cisco, Cisco fog computing, IBM, IBM edge computing, Open Glossary of Edge Computing, Christopher Barnatt, Barnatt
Id: 0idvaOCnF9E
Length: 10min 25sec (625 seconds)
Published: Sun Oct 20 2019