This post is also available in: עברית (Hebrew)
In a country where strict censorship is paired with highly competitive technological advancement, some friction is bound to arise. A recent report by the Financial Times revealed that AI companies in China are being reviewed by the government to ensure that their LLMs are in line with “core socialist values”.
The review, overseen by the Cyberspace Administration of China (CAC), the country’s primary internet regulator, evaluates AI models on their responses to politically sensitive topics, including questions about Chinese President Xi Jinping, and scrutinizes the models’ training data and safety protocols. According to the report, the review spans companies ranging from large corporations like ByteDance (parent company of TikTok) and Alibaba to smaller startups.
Just last year, China established comprehensive rules for generative AI, mandating compliance with “core values of socialism” and prohibiting the generation of “illegal content”. Implementing these rules involves a rigorous process of content filtering during AI model training, particularly because many Chinese models are still trained on English-language data. The filtering works in two ways: information deemed problematic is removed from the training data, and a database of potentially sensitive phrases is compiled so that the model learns not to use them. Popular chatbots in China often decline to discuss sensitive topics like the Tiananmen Square protests, although the CAC testing imposes a limit on how many questions a model may refuse outright, so for most prompts it must produce an answer that is deemed appropriate. To manage this risk, developers in China are integrating real-time filtering systems into their AI services that replace problematic responses instantly.
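The two-stage approach described above, filtering the training corpus against a blocklist and swapping out sensitive answers at serving time, can be sketched roughly as follows. This is a minimal illustration, not any company’s actual system; the phrases, function names, and fallback message are all hypothetical.

```python
# Hypothetical sketch of blocklist-based content filtering.
# Stage 1: remove flagged documents before training.
# Stage 2: replace flagged responses in real time at serving.

# Placeholder blocklist; a real deployment would maintain a large curated database.
SENSITIVE_PHRASES = {"sensitive phrase one", "sensitive phrase two"}

def contains_sensitive(text: str) -> bool:
    """Return True if any blocklisted phrase appears in the text."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in SENSITIVE_PHRASES)

def filter_training_corpus(documents: list[str]) -> list[str]:
    """Stage 1: drop any training document that trips the blocklist."""
    return [doc for doc in documents if not contains_sensitive(doc)]

def moderate_response(response: str) -> str:
    """Stage 2: replace a flagged model response with a safe fallback."""
    if contains_sensitive(response):
        return "I'm sorry, I can't discuss that topic."
    return response
```

Real systems are far more elaborate, using classifiers and contextual matching rather than plain substring checks, but the basic pipeline (filter what goes in, intercept what comes out) is the same.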
Despite being slow to launch a GPT competitor, owing to regulatory challenges and U.S. sanctions restricting access to the chips used to train LLMs, China remains a leader in global generative AI patents.
