China Telecom launches AI large model with hundreds of billions of parameters
Chinese article by 陈炳欣
English Editor 张未名
11-13 17:25

By Kate Yuan

(JW Insights) Nov 13 -- China Telecom, the country’s leading telecom carrier, unveiled its large language model Xingchen (星辰), which boasts hundreds of billions of parameters, at the Digital Technology Ecosystem Conference 2023 held on November 10 in southern China’s Guangzhou City.

Xingchen is an upgrade of China Telecom's self-developed large language model, expanding the parameter scale to hundreds of billions, with significant improvements in hallucination suppression, context window extrapolation, interactive experience, and multi-turn comprehension, the company said.

Xingchen's inference speed has increased 4.5-fold, and its Chinese image understanding and generation capabilities have improved by 30%. In terms of creative efficiency, production time has been cut by 92% compared with previous production tools, and design costs have fallen by 95%.

Xingchen includes a first batch of 12 industry-specific large language models for trial commercial use, covering sectors such as education, grassroots governance, government services, emergency response, medical insurance, transportation, construction and housing, and finance.

China Telecom also released a white paper on its industry large language model technology, sharing a series of technical standards covering everything from data annotation to deployment, as well as five industry training procedures for the large models.

China Telecom said it will open-source the tens-of-billions-parameter model by the end of this year and the hundreds-of-billions-parameter model in April next year. All underlying code will be open-sourced, and the company will also release more than 1TB of high-quality cleaned data, along with various toolchains built on the Xingchen foundation model.
