Google reportedly taps MediaTek to build its next-gen AI chips

Tech behemoth Google is reportedly planning to collaborate with MediaTek to produce advanced AI chips next year. The Alphabet-owned search giant is teaming up with Taiwan’s leading semiconductor firm to develop the next generation of its Tensor Processing Units (TPUs), according to a report by The Information. Notably, TPUs are specialized chips designed to accelerate artificial intelligence (AI) computations.

The report suggests that MediaTek offers more competitive per-chip pricing than Broadcom, potentially reducing Google’s production costs. MediaTek’s established relationship with the Taiwan Semiconductor Manufacturing Company (TSMC) also ensures access to the advanced fabrication processes crucial for producing modern AI chips.

By partnering with MediaTek, Google aims to strengthen its position in the AI industry, enhancing its internal research capabilities and offering its cloud customers more powerful AI processing options. The move also reduces Google’s dependence on other major chipmakers such as Nvidia, giving the company more flexibility and control over its AI hardware infrastructure.

For many years, Google worked exclusively with Broadcom on the design and production of its TPUs, and it has not severed those ties: the company continues to co-design certain AI chips with Broadcom.

The development follows Google’s late-2024 introduction of its sixth-generation TPU, which gives both the company and its Cloud customers an alternative to Nvidia’s widely used processors. Despite these in-house efforts, Google continues to rely on external partners for various aspects of chip production, packaging, and quality testing.

In terms of investment, Google spent an estimated $6–9 billion last year on developing, producing, and deploying TPUs, according to research firm Omdia. The tech giant claims its TPUs are far more efficient and powerful than traditional processors: according to Google, they can process AI and machine learning (ML) tasks 15 to 30 times faster than regular CPUs (such as Intel or AMD chips) and GPUs (such as Nvidia or AMD graphics cards). Additionally, TPUs consume much less electricity than CPUs and GPUs, delivering 30 to 80 times more work per unit of power consumed.
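To make the performance-per-watt claim concrete, here is a minimal sketch of how such a figure is computed. The numbers below are purely hypothetical placeholders, not Google’s measured specifications; only the 15–30x and 30–80x ranges come from the article.

```python
# Illustrative performance-per-watt comparison.
# All hardware numbers here are hypothetical, chosen only to show the arithmetic
# behind a "work per unit of power" figure like the 30-80x range Google cites.

def perf_per_watt(ops_per_second: float, watts: float) -> float:
    """Work delivered per unit of power consumed."""
    return ops_per_second / watts

# Hypothetical baseline CPU: 1e12 ops/s drawing 150 W
cpu_efficiency = perf_per_watt(1e12, 150)

# Hypothetical accelerator: 20x the throughput at half the power draw
tpu_efficiency = perf_per_watt(20e12, 75)

# A chip that is both faster and lower-power compounds on this metric:
# 20x throughput * 2x lower power = 40x work per watt in this sketch.
print(f"{tpu_efficiency / cpu_efficiency:.0f}x more work per watt")
```

The point of the sketch is that efficiency gains multiply: a modest speedup combined with lower power draw yields a much larger advantage in work per watt.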

The potential move is noteworthy given Google’s continued investment of effort and capital to stay ahead in the AI race against global rivals such as OpenAI, Anthropic, and Microsoft. The company recently rolled out Gemma 3, the newest addition to its growing AI model family, and just last week Google DeepMind launched two advanced AI models – ‘Gemini Robotics’ and ‘Gemini Robotics-ER’ – designed to enhance robotic capabilities in real-world tasks. Notably, Google’s parent company, Alphabet, plans to invest $75 billion in the AI domain in 2025.

Adding to the competitive pressure, social media powerhouse Meta has already begun testing its first in-house AI chip. Meanwhile, the global AI chip market, valued at around $73.27 billion in 2024, is projected to reach roughly $927.76 billion by 2034.