Google I/O

AI (and by extension quantum computing) is the future of computing, whether we are talking about conversations with virtual assistants or raw computational power. Google’s I/O event shed more light on the advancements the company has made to get a head start in this race to the future, announcing two products that might not seem linked, but that will help Google capture a huge chunk of the future market.

Google AI Chips, the v4 TPUs:

Google has announced the next generation of its AI Tensor Processing Units, the v4, which the company says delivers twice the performance of the v3 TPUs.

These TPUs are basically the building blocks of Pods, supercomputers with over an exaflop of processing power. Don’t know what that means? Well, in the words of Sundar Pichai, 10 million laptops working together would deliver about as much computing power as a single Pod supercomputer. Phew, that’s a lot.
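For a rough sense of where that comparison comes from, here is a quick back-of-envelope calculation. The per-laptop number is an assumption on our part (roughly 0.1 teraflops of sustained performance), not a figure Google quoted:

```python
# Rough sanity check of the "10 million laptops" comparison.
# Assumption (ours, not Google's): a typical laptop sustains ~0.1 teraflops.
LAPTOP_FLOPS = 0.1e12   # assumed sustained performance of one laptop, in FLOPS
POD_FLOPS = 1e18        # one exaflop, the figure quoted for a TPU v4 Pod

laptops_needed = POD_FLOPS / LAPTOP_FLOPS
print(f"Laptops needed to match one Pod: {laptops_needed:,.0f}")  # ~10,000,000
```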

“This is the fastest system we’ve ever deployed at Google and a historic milestone for us,” Pichai said. “Previously to get an exaflop you needed to build a custom supercomputer, but we already have many of these deployed today and will soon have dozens of TPUv4 pods in our data centers, many of which will be operating at or near 90% carbon-free energy. And our TPUv4 pods will be available to our cloud customers later this year.”

These chips will help Google power many of its own machine learning workloads, but will also become available to developers as part of the Google Cloud Platform.
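For developers curious what that looks like in practice, here is a minimal sketch of attaching a TensorFlow training job to a Cloud TPU. The TPU name "my-tpu" is a placeholder and the tiny model is purely illustrative; this is the general pattern already used for earlier TPU generations, not anything v4-specific announced at I/O:

```python
import tensorflow as tf

# Resolve and initialize the Cloud TPU ("my-tpu" is a placeholder name).
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="my-tpu")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# TPUStrategy replicates the model across all available TPU cores.
strategy = tf.distribute.TPUStrategy(resolver)

with strategy.scope():
    # Any Keras model built inside this scope is placed on the TPU.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

# Calling model.fit(...) then runs the training steps across the TPU cores.
```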

LaMDA:

Yeah, so you don’t care about some chip with computational power you don’t need? Well, Google has something more customer-oriented in its AI bag, even though it’s still in development. The company is currently working on LaMDA, a new conversational system designed to make virtual assistants talk more like human beings, fueling more natural conversations.

During the conference, Google showed a conversation that LaMDA had with a Google employee while acting as Pluto. And for the most part, it actually felt like you were talking to the real Pluto. Not only was it able to share meaningful facts (like which spacecraft first reached it), but it also kept the conversation feeling much more natural. For example, the LaMDA Pluto expressed its grief over being referred to as a dwarf planet. Is this the start of the Matrix?

The product is still in development, and Google admits that it sometimes gives nonsensical responses. But over time, it should allow the company to provide better search results and make conversations with its products feel more natural.