Over the last two years, several cloud service providers, including Alibaba, Amazon, Facebook, Google, Huawei, and Tencent, have been designing their own in-house chipsets to handle Artificial Intelligence (AI) workloads in their data centers.

ABI Research (www.abiresearch.com), a global tech market advisory firm, estimates that cloud service providers commanded a 3.3% share of total cloud AI chipset shipments in the first half of 2019. These players will increasingly rely on their own in-house AI chips and are forecast to produce a total of 300,000 cloud AI chips by 2024, representing 18% of global cloud AI chipset shipments that year.

The increasing demand for intelligent services across many enterprise verticals is pushing cloud service providers to rapidly upgrade their data centers with AI capabilities, which has already created enormous demand for cloud AI chipsets in recent years. ABI Research expects revenues from these chipset shipments to increase significantly over the next five years, from US$4.2 billion in 2019 to US$10 billion in 2024.

Established chipset suppliers such as NVIDIA, Intel, and, to a certain extent, Xilinx will continue to dominate the market landscape, thanks to the robust developer ecosystems they have created around their AI chipsets.

However, these players will increasingly face intense competition from many new entrants and challengers, particularly their own clients: webscale companies such as Google, Alibaba, Amazon, and Huawei.

“The approach by webscale companies to develop in-house AI chips allows for better hardware-software integration and resources tailored to handle specific AI networks, which serves as a key differentiating point not only at the chipset level but also at the cloud AI service level,” said Lian Jye Su, principal analyst at ABI Research. “The success of these highly optimized processing units provides strong validation for the emergence of other cloud AI Application-Specific Integrated Circuit (ASIC) startups, such as Cerebras Systems, Graphcore, and Habana Labs.”

This trend, initiated by Google in 2017, has led many other webscale companies to follow suit. Baidu immediately followed with its own AI chipset, Kunlun, in 2018, and later that same year, Amazon introduced its Inferentia chip to support Amazon Web Services (AWS). AWS has strong influence in the AI industry due to the success of SageMaker, its machine learning development platform.

Huawei is another company that has moved toward using its in-house chips for its cloud services, in an attempt to reduce its reliance on Western chipset suppliers. The company launched the Ascend 310 and 910 in 2018 and has since expanded its product lineup into a series of cloud AI hardware, including an AI accelerator card and an AI system. Recently, Huawei launched Atlas 900, an AI training cluster featuring over 1,000 Ascend 910 chipsets that competes directly with NVIDIA’s DGX.

“This further expands the footprint of cloud AI service providers, as they are also competing with Intel and NVIDIA for the mindshare of developers. By offering end-to-end AI hardware solutions, Google, Amazon, and Huawei can ensure that their users enjoy ease of development and deployment, while creating an active and vibrant developer community around their chipset solutions and ultimately generating a large user base for their cloud AI services,” concluded Su.