Edge compute allows application developers and enterprises to enable large-scale, latency-sensitive use cases at the edge of the network. Two important examples of these initial edge compute use cases are autonomous vehicles and the Internet of Things.
Edge compute delivers computing resources closer to where they are needed. Instead of housing these critical resources in a large cloud data center that could be hundreds, or even thousands, of miles away from where the data is ultimately consumed, this new architecture places them at the edge of the network. In turn, edge compute is propelling the growth of autonomous vehicles and the Internet of Things.
Autonomous Vehicles – Edge Compute Use Case
Autonomous driving can be classified into six different levels, from traditional vehicles being Level 0 to fully autonomous vehicles being Level 5. Additionally, within this classification, Advanced Driver-Assistance Systems (ADAS) start at Level 1 and extend to Level 4. ADAS offers semi-autonomous features to the vehicle and is the first step towards fully autonomous vehicles.
Data Consumption by Autonomous Vehicles
At present, vehicles are not fully autonomous. However, as vehicles shift from Level 1 towards Level 5 autonomy, more decision-making capability will be given to the vehicle, and the data it consumes will increase accordingly, because it has to make more decisions. Specifically, Level 1 vehicles consume 3 gigabytes of data per hour, whereas Level 5 vehicles consume 50 gigabytes per hour. Indeed, Level 5 vehicles consume almost 17 times as much data as Level 1 vehicles (50 ÷ 3 ≈ 16.7).
Autonomous vehicles consume this data to i) analyze traffic patterns, ii) observe road conditions, and iii) help the driver make decisions. For example, Cruise, which is majority-owned by General Motors, is producing very sophisticated autonomous vehicles. Furthermore, Cruise plans to begin testing these vehicles in San Francisco in late 2020.
In effect, autonomous vehicles are driving computers, each functioning like a mini data center. Autonomous vehicles are constantly aggregating, creating, sending, and receiving data. Numerous applications run on a vehicle, and as a result, a significant amount of Internet of Things diagnostics data is sent from the vehicle.
Requirements for Constant Connectivity
Connectivity is critical for autonomous vehicles. However, even Level 4 or Level 5 autonomous vehicles do not need to be connected 24/7 with 99.999% uptime. This is because much of the mapping data that autonomous vehicles need to operate is already applied through artificial intelligence on the vehicle itself.
However, the autonomous vehicle does have to keep reading the conditions of the road and needs to keep transmitting data back and forth between itself and a data center. Keeping this communication in place with a data center allows the autonomous vehicle to learn and make decisions as it drives.
In contrast, vehicles with advanced driver-assistance systems (ADAS), or semi-autonomous vehicles, can rely on GPS and Internet of Things information to function.
Network and Digital Infrastructure for Autonomous Vehicles
Presently, on major highways and under stable conditions, autonomous vehicles can virtually drive themselves. This current autonomous vehicle functionality relies on significant Internet of Things infrastructure. In turn, the Internet of Things depends on mobile infrastructure, including towers. Furthermore, autonomous vehicles use artificial intelligence workloads, which require processing in data centers.
Ultimately, for autonomous vehicles to perform at their highest level, they need to run on a well-built 5G network. In turn, digital infrastructure supports this 5G network.
Towers and Autonomous Vehicles
Tower coverage is a critical component to ensure that autonomous vehicles can function at scale. Presently, towers cover ~99% of roads across the United States, including all major interstate highways.
However, secondary, state, and county roads are more difficult to cover from an economic standpoint. Therefore, it will take time for 100% of roads in the United States to gain tower coverage. Indeed, for autonomous vehicles to function fully, they need every road they travel to be covered by towers.
Edge Compute and Autonomous Vehicles
Edge compute is another critical component to ensure that autonomous vehicles can function at scale. Artificial intelligence processing cannot be done in the cloud and then sent back to the vehicle, because there is too much latency in that round-trip path. Therefore, this processing needs to be done at edge data center locations. These edge data centers mitigate the latency issues of the cloud by being physically closer to the vehicle.
In order to allow the autonomous vehicle to make decisions quickly, a low-latency environment is necessary. Low latency allows information to transmit back and forth between the vehicle and the server housed in the edge data center. In this environment, the server can iterate back and forth with the vehicle, and decisions can be made in milliseconds.
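To make the latency argument concrete, the sketch below is a rough back-of-the-envelope estimate, not from any vendor specification, of round-trip propagation delay based purely on distance. It assumes signals travel through fiber at roughly two-thirds the speed of light and ignores routing, queuing, and processing time; the distances used are hypothetical.

```python
# Rough illustration: why distance alone pushes cloud round-trips into the
# tens of milliseconds, while a nearby edge site stays well under a
# millisecond of propagation delay.

SPEED_OF_LIGHT_KM_PER_MS = 300.0   # speed of light in a vacuum, km per millisecond
FIBER_FACTOR = 2.0 / 3.0           # light in fiber travels at roughly 2/3 of that speed
KM_PER_MILE = 1.609

def round_trip_propagation_ms(distance_miles: float) -> float:
    """Round-trip propagation delay over fiber, ignoring routing and processing."""
    distance_km = distance_miles * KM_PER_MILE
    one_way_ms = distance_km / (SPEED_OF_LIGHT_KM_PER_MS * FIBER_FACTOR)
    return 2 * one_way_ms

# Hypothetical distances: a distant cloud region vs. a metro-edge data center.
for label, miles in [("cloud region (1,000 miles)", 1_000),
                     ("regional cloud (300 miles)", 300),
                     ("edge data center (20 miles)", 20)]:
    print(f"{label:30s} ~{round_trip_propagation_ms(miles):5.2f} ms round trip")
```

Even before any processing, a data center 1,000 miles away costs roughly 16 ms in propagation alone, whereas an edge site 20 miles away costs a fraction of a millisecond, which is why millisecond-level decision loops favor edge locations.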
As a result of this new edge compute architecture, base stations and applications can no longer just reside in the city center. Instead, base stations and applications will need to also reside in the suburbs and transportation corridors. Indeed, the suburbs and transportation corridors are where much of the driving by autonomous vehicles will occur.
Digital Infrastructure Providers
As an example, digital infrastructure provider Colony Capital is using its extensive portfolio, which comprises ~29k towers, 95+ data centers, >140k fiber route miles, and >40k small cell & distributed antenna system nodes, to help facilitate autonomous vehicles. Specifically, Colony has partnered in autonomous driving with major auto manufacturers, Uber, and semiconductor company Nvidia.
Internet of Things – Edge Compute Use Case
Another important use case driving growth in edge compute is the device-driven Internet of Things. Device-to-device connections are fueling efficiencies in consumer electronics and in the home. Edge compute is the intermediate step that allows the hyperscale cloud companies to access the Internet of Things networks that these devices run on.
Smart Refrigerators – Example #1
As an example, connected refrigerators provide real-time diagnostic information on the consumption patterns in the homeowner’s refrigerator. In turn, this information can allow companies like Amazon to determine when the next grocery delivery to a particular home should occur.
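As an illustrative sketch only, the snippet below shows what such a diagnostic payload and delivery decision might look like. The field names, item levels, and reorder thresholds are invented for illustration and are not part of any actual retailer or appliance API.

```python
# Hypothetical telemetry payload from a connected refrigerator, plus a naive
# rule for flagging items for the next grocery delivery. All values invented.

from datetime import datetime, timezone

telemetry = {
    "device_id": "fridge-0142",
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "items": {
        "milk":   {"remaining_pct": 12, "reorder_below_pct": 20},
        "eggs":   {"remaining_pct": 55, "reorder_below_pct": 25},
        "butter": {"remaining_pct": 18, "reorder_below_pct": 15},
    },
}

def items_to_reorder(payload: dict) -> list[str]:
    """Return items whose remaining level has fallen below their reorder threshold."""
    return [name for name, item in payload["items"].items()
            if item["remaining_pct"] < item["reorder_below_pct"]]

print(items_to_reorder(telemetry))  # ['milk'] -> candidate for the next delivery
```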
Smart Thermostats – Example #2
As an example, connected in-home thermostats allow for energy efficiency analysis. Diagnostic information can be taken from these thermostats to assess how energy use relates to the devices used in the home, the homeowner’s utility bill, and the impact of power consumed by multiple homes in the same area on the overall electrical grid.
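As a minimal sketch of the grid-impact idea, the snippet below aggregates hypothetical thermostat readings from homes in the same area to estimate their combined load. The readings and the assumed per-home HVAC power draw are invented numbers, not measured data.

```python
# Hypothetical smart-thermostat readings from one neighborhood, aggregated to
# estimate the combined demand those homes place on the local electrical grid.

HVAC_DRAW_KW = 3.5  # assumed average draw of one running HVAC system, in kW (invented)

# (home_id, hvac_running) samples reported by thermostats in the same area
readings = [
    ("home-01", True),
    ("home-02", False),
    ("home-03", True),
    ("home-04", True),
    ("home-05", False),
]

running = sum(1 for _, hvac_on in readings if hvac_on)
estimated_load_kw = running * HVAC_DRAW_KW

print(f"{running} of {len(readings)} homes running HVAC, "
      f"~{estimated_load_kw:.1f} kW of combined demand on the local grid")
```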
Both examples are opportunities for the cloud to manage data virtually. However, to process so much information effectively, there is a physical element that has to be close to end users. That physical element is edge compute.