This article was published on June 3, 2021

China’s ‘Wu Dao’ AI is 10X bigger than GPT-3, and it can sing

Size isn't everything


Image credit: Possessed Photography on Unsplash

China’s going all in on deep learning. The Beijing Academy of Artificial Intelligence (BAAI) recently released details concerning its “Wu Dao” AI system – and there’s a lot to unpack here.

Up front: Wu Dao is a multi-modal AI system. That means it can do a bunch of different things. It can generate text, audio, and images, and, according to Engadget, it can even “power virtual idols.”

The reason for all the hullabaloo surrounding Wu Dao is its size. This AI model is huge: it has a whopping 1.75 trillion parameters. For comparison, OpenAI’s biggest model, GPT-3, has just 175 billion.

Background: According to Zhang Hongjiang, the chairman of BAAI, the academy’s intent is to create the biggest, most powerful AI model possible.

Per the aforementioned Engadget report, Zhang said:

The path to general artificial intelligence is big models and big computer [sic]. What we are building is a power plant for the future of AI, with mega data, mega computing power, and mega models, we can transform data to fuel the AI applications of the future.

Quick take: This AI system sounds like a breakthrough interface for an array of deep learning tricks, but it’s doubtful this kind of brute-force method will eventually lead to general artificial intelligence.

It’s cool to know there’s a powerful AI out there that can make music videos, write poetry, and create captions for images on its own. And, with so many parameters, Wu Dao surely produces some incredibly convincing outputs.

But creating a general AI – that is, an AI capable of performing any task a human can – isn’t necessarily a matter of increasing the power and parameters of a deep learning system.

Details as to exactly how Wu Dao was trained, what was in its various datasets, and what practical applications it can be used for remain scarce. It’s impossible to do a direct comparison to GPT-3 at this point.

But even if we assume Wu Dao is 10 times better across the board, there’s still no reason to believe that will move the needle any closer to truly intelligent machines.

A steadily increasing number of AI and computer science experts believe that deep learning is a dead end on the road to general artificial intelligence.

We may already be seeing diminishing returns on scale if the most exciting thing about a system trained on supercomputer clusters, with 1.75 trillion parameters, is that it can generate digital pop stars.
