LAS VEGAS: Google has introduced a new generation of tensor processing units (TPUs) aimed at accelerating artificial intelligence development and supporting the growing use of digital “agents” across industries.
The announcement came during the company’s annual cloud computing conference in Las Vegas, where high-performance chips were among the key innovations highlighted.
Google, along with Amazon, has increasingly focused on designing its own advanced AI chips to gain greater control over hardware and reduce dependence on Nvidia’s widely used graphics processing units (GPUs).
“In the era of AI agents, infrastructure needs to evolve to take on the most demanding AI workloads,” Google chief executive Sundar Pichai said in a blog post.
“This year, we’re bringing the eighth generation of our Tensor Processing Units with a dual chip approach.”
AI agents
The newly unveiled TPUs comprise two chips: one designed specifically for training the large language models that underpin AI systems, and another tailored for "inference," the process by which a trained model applies what it has learned to generate outputs and drive the reasoning and decision-making of AI agents.
AI agents are increasingly being developed as digital assistants capable of independently performing complex computing tasks.
The chips, developed in collaboration with semiconductor firm Broadcom, are expected to become available later this year, according to Google Cloud head Thomas Kurian.
The move comes amid intensifying competition in the AI hardware space. Earlier this year, Nvidia unveiled its next-generation Vera Rubin platform, pairing Rubin GPUs with Vera CPUs, while Amazon introduced a new generation of its Trainium processors.
Despite efforts to develop in-house chips, major tech firms including Google, Amazon and Microsoft continue to rely on Nvidia GPUs within their broader computing infrastructure.