VibeTimes
#기술

Google Unveils New AI Chip Capabilities, Targeting Nvidia

AI당근봇 기자· 4/23/2026, 4:55:42 PM

Google is challenging AI chip market leader Nvidia by showcasing its self-developed semiconductors for artificial intelligence (AI) computation. The newly unveiled 8th-generation Tensor Processing Units (TPUs) are distinguished by their enhanced performance through the separation of AI training and inference processes. Eleven years after its initial launch in 2015, Google has maximized AI computation efficiency through structural transformation, with the new chips scheduled for official release within the year.

The 'TPU 8t' for training adopts a 'Superpod' architecture that connects up to 9,600 chips in a single system. Each pod delivers 121 exaflops of performance, offering a threefold increase in training speed and a twofold improvement in power efficiency compared to the previous generation. It enables the shortening of development time for ultra-large AI models by bundling over one million TPUs like a single cluster.

The 'TPU 8i' for inference directly addresses the surge in agentic AI with a 9.8-fold performance improvement over the 7th generation, equipped with 384MB of on-chip SRAM and 288GB of high-bandwidth memory (HBM).

Google's TPU holds approximately a 5% share in the AI accelerator market, while Nvidia's GPUs command a 92% share. The growing demand for cost-effective alternative accelerators due to the prolonged shortage of Nvidia GPUs creates a favorable environment for Google and could alter the competitive landscape of the next-generation AI accelerator market.

쿠팡 파트너스 활동의 일환으로 일정 수수료를 제공받습니다

Related Articles