Machine Learning Times

US-China Tech War: Beijing-Funded AI Researchers Surpass Google and OpenAI with New Language Processing Model

 
Originally published in South China Morning Post, June 2, 2021.
  • The WuDao 2.0 natural language processing model has 1.75 trillion parameters, topping the 1.6 trillion of a similar model Google unveiled in January.
  • China has been pouring money into AI to try to close the gap with the US, which maintains an edge because of its dominance in semiconductors.

A government-funded artificial intelligence (AI) institute in Beijing unveiled on Monday the world’s most sophisticated natural language processing (NLP) model, surpassing those from Google and OpenAI, as China seeks to increase its technological competitiveness on the world stage.

The WuDao 2.0 model is a pre-trained AI model that uses 1.75 trillion parameters to simulate conversational speech, write poems, understand pictures and even generate recipes. The project was led by the non-profit research institute Beijing Academy of Artificial Intelligence (BAAI) and developed with more than 100 scientists from multiple organisations.

Parameters are the internal variables a machine learning model learns from data. As training progresses, the parameters are refined so that the algorithm gets better at producing the correct outcome. Once a model has been trained on a specific data set, such as samples of human speech, the resulting model can then be applied to similar problems.
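
For readers who want a concrete picture of what "refining parameters" means, the toy sketch below (written in Python with NumPy, and not part of the original article) fits a single made-up parameter to synthetic data by gradient descent; the data, learning rate, and variable names are illustrative assumptions only.

    import numpy as np

    # Toy illustration (nothing to do with WuDao itself): a single parameter "w"
    # is refined step by step so the model gets better at predicting y = 3x.
    rng = np.random.default_rng(0)
    x = rng.normal(size=100)
    y = 3.0 * x + rng.normal(scale=0.1, size=100)   # synthetic "training data"

    w = 0.0     # the model's parameter, initially uninformed
    lr = 0.1    # learning rate
    for step in range(200):
        pred = w * x                          # model's current guess
        grad = np.mean(2.0 * (pred - y) * x)  # gradient of the mean squared error
        w -= lr * grad                        # refine the parameter

    print(f"learned parameter w = {w:.3f}")   # ends up close to the true value 3.0

The trained value of w can then be reused to predict outputs for new inputs of the same kind, which is the sense in which a pre-trained model is "applied to similar problems"; models such as WuDao 2.0 do the same thing with trillions of parameters instead of one.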

In general, the more parameters a model contains, the more sophisticated it is. However, creating a more complex model requires time, money, and research breakthroughs.
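
To make the notion of a parameter count concrete, the hypothetical sketch below (again not from the article, and assuming the PyTorch library is available) tallies the parameters of a small, made-up network the same way figures like 1.75 trillion are reported: by summing the sizes of every weight matrix and embedding table.

    import torch.nn as nn

    # Hypothetical miniature network, purely for illustration; models such as
    # WuDao 2.0 or Google's January model scale this idea up to trillions.
    model = nn.Sequential(
        nn.Embedding(50_000, 512),   # token embedding table: 50,000 x 512 weights
        nn.Linear(512, 2048),        # feed-forward layer
        nn.GELU(),
        nn.Linear(2048, 512),
    )

    # A model's "size" is simply the total number of trainable values it holds.
    n_params = sum(p.numel() for p in model.parameters())
    print(f"total parameters: {n_params:,}")   # roughly 27.7 million here

Even this toy example hints at why scale is expensive: every additional layer multiplies the count, and each of those values must be stored, updated and moved across hardware throughout training.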

To continue reading this article, click here.
