A new report has surfaced stating that Google has made significant progress in its effort to develop its own data center chips. Google has two Arm server processors in development, one codenamed Maple and the other codenamed Cypress, and servers powered by them are likely to launch by 2025.

According to The Information, Google has just reached a key milestone that puts it on track to roll out server systems powered by the new chips starting in 2025.

The publication reports that Google Cloud is preparing to launch two Arm CPUs for its cloud service, following Amazon Web Services' successful launch of its Arm-based server chips. The new processors will aim to catch up to Amazon Web Services' Graviton chips, which have become a multibillion-dollar segment responsible for as much as 10 percent of Elastic Compute Cloud revenue in 2022.

Uri Frank, a former Intel engineering executive, is leading Google's server chip design team, which is currently working on the two Arm-based server processors. One of the processors, codenamed "Maple", is based on existing designs from Marvell Technology, which develops its own Arm chips. Google has sent the Maple designs to TSMC for trial production.

A team in Israel has developed an in-house design for the second processor, codenamed "Cypress", which is scheduled to be sent to TSMC for trial production in the second quarter. Both processors, Maple and Cypress, are built on a 5 nm process node, and mass production is expected in the second half of 2024.


This is not the first time Google has tried to develop and launch its own chips: the company has previously developed an ASIC (application-specific integrated circuit) for servers and an SoC (system-on-chip) for mobile devices. It began using its internally developed Tensor Processing Unit (TPU) as early as 2015. The TPU was an ASIC designed to accelerate AI and neural network machine learning, and it was also used in custom SSDs, network switches, and NICs. Although the TPU handled AI processing within the firm's TensorFlow framework, Google continued to use third-party CPUs and GPUs for other essential processing tasks. Google's TPU has now reached its fourth generation, and the company appears to want to expand the use of its own silicon in the server space.

Developing custom chips is a complex undertaking that demands substantial capital and time, and unexpected hurdles can always delay or even derail a project. Assuming everything remains on track and Google can successfully bring these custom chips to market by 2025, the move could give the company a significant competitive advantage in cloud computing. The chips would enable Google to offer faster and more efficient cloud services to its customers, which could help it win and retain business in a highly competitive market.
