Sensors generate far more data than can practically be sent to the cloud, which is why edge computing is becoming a mainstream trend.
According to Semiconductor Engineering, the original idea behind Internet of Things (IoT) devices was that simple sensors would send raw data through one or more gateways to be processed in the cloud. These gateways might sit in offices, homes, factories, or even cars. However, it has become increasingly apparent that there is simply too much data to process this way; the approach is not feasible.
Tien Shiah, who is responsible for HBM marketing at Samsung Electronics, said that a PC generates about 90 MB of data each day, a car will produce 4 TB per day, and a connected aircraft 50 TB. Most of it is useless data.
If preprocessing is done locally, the cloud has far less data to handle, so better performance can be achieved at lower cost and lower power, enabling the fast responses required by self-driving cars, drones, and even robots. This is why edge computing is suddenly getting so much attention: it moves computation closer to the data source. In a car, the final calculation may even be performed on the sensor itself.
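As a minimal illustration of the local-preprocessing idea, an edge node might reduce each batch of raw sensor readings to a small summary record before anything goes upstream. This is a hypothetical sketch; the function name and the summary fields are illustrative, not taken from any real product.

```python
# Hypothetical sketch: aggregate raw sensor samples at the edge and
# forward only a compact summary to the cloud instead of every reading.

def summarize(samples):
    """Reduce a batch of raw readings to a small summary record."""
    return {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": sum(samples) / len(samples),
    }

raw = [20.1, 20.2, 20.1, 20.3, 25.9, 20.2]  # one batch of raw readings
summary = summarize(raw)  # four numbers go upstream instead of the batch
```

Even this trivial reduction cuts the upstream payload from the full batch to a fixed-size record, which is the core bandwidth argument made above.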
This is also important for artificial intelligence (AI), machine learning (ML), and deep learning (DL) applications. The key for AI/ML/DL is the ability to run inference on the local device to improve security and performance. The bigger problem with inference, however, is memory throughput. Frank Ferro, senior director of product management at Rambus, said that memory has once again become a bottleneck: many emerging applications, whether AI or ADAS, require higher memory bandwidth.
In addition, most of these applications run on batteries or must live within a tightly constrained power budget, which makes developing such devices even more challenging.
One of the biggest problems with edge computing is that it is a transformative technology that will be defined as it develops. It is still impossible today to order purpose-built edge computing products that support a specific combination of IoT devices, infrastructure, and computing requirements.
NVIDIA announced in late March that it is collaborating with ARM to integrate the NVIDIA Deep Learning Accelerator architecture with ARM's Project Trillium machine learning platform, allowing chip makers to easily add machine learning capabilities to IoT devices. Intel also introduced 14 new Xeon processors in February.
Both the Intel and NVIDIA/ARM products can add processing power near the endpoints, but neither is ideal for sending data back to the cloud. According to Zeus Kerravala, chief analyst at ZK Research, the NVIDIA/ARM partnership and the edge processors announced by Intel are all baseline products designed for devices, gateways, and other equipment that need more processing capability.
The home IoT market may eventually exceed the industrial IoT (IIoT), but IIoT is setting the pace and the agenda. Julian Watson, an analyst at market research firm IHS Markit, said demand for IoT gateways with edge computing capabilities is growing, driven mainly by three specific needs: bridging low-power nodes that are not directly connected to the network, such as Bluetooth Low Energy (BLE) or Zigbee sensors; filtering traffic, that is, deciding which data should be processed at the edge and which sent to the cloud; and managing the security of these edge devices.
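The traffic-filtering role described here can be sketched as a simple routing rule at the gateway. The threshold and labels below are purely illustrative assumptions, not drawn from any real gateway product.

```python
# Hypothetical sketch of a gateway's traffic-filtering decision:
# routine readings stay at the edge, anomalies are forwarded to the cloud.

NORMAL_RANGE = (15.0, 30.0)  # illustrative acceptance band for a reading

def route(reading):
    """Decide where a single sensor reading should be processed."""
    lo, hi = NORMAL_RANGE
    if lo <= reading <= hi:
        return "edge"   # expected value: handle locally, save uplink bandwidth
    return "cloud"      # unusual value: escalate upstream for deeper analysis

decisions = [route(r) for r in (20.0, 99.0, 17.5)]
```

Real gateways apply far richer policies (rate limits, per-device rules, batching), but the edge-versus-cloud decision point is the same.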
Michael Howard, executive director at IHS Markit, believes IoT/edge gateways should at least be able to do the following: 1. Reduce the volume of raw data from IoT devices by consolidating duplicate data. 2. Convert the data into a format the upstream application can read. 3. Let the upstream application determine what data it will receive and from which devices. 4. Carry information on how the data is organized and optimized.
Howard said that if the gateway cannot distill raw data into something compact and useful, pushing it upstream just wastes time and bandwidth. Processing must happen where the data originates, possibly more than once along the way.
All the major system suppliers are eager to enter the market, and demand for gateways keeps growing. But the problem is more complex than collecting temperature readings from a few sensors. Especially in IIoT, the legacy SCADA and other automation systems in each vertical market are usually closed, proprietary, unfriendly to new communication technologies, and cannot be retired quickly.
Simon Segars, chief executive officer of ARM, said there are so many Next Big Things happening at once that it is hard to know where to start. New communications protocols, whether 5G, LoRa, NB-IoT, or other technologies, require a great deal of innovation in semiconductor devices. Currently AI is driving chips in the cloud; at the edge, inference is driving design innovation.
More:YIKESHU