The draft policy calls for top-tier public cloud service providers to collaborate and pool their computing power for use by Beijing-based tertiary institutions, research facilities, and small- and medium-sized enterprises.
Public cloud refers to any technology platform that lets customers access shared storage and computing power via the internet.
Under the policy, the development of computing power projects — including the Beijing AI Public Computing Platform in Haidian district and the Beijing Digital Economy Computing Centre in Chaoyang district — will be accelerated to support the training of large language models (LLMs) with hundreds of billions of parameters.
Parameters are the internal variables an AI model adjusts during training. In general, the more parameters a model has, the more capable it becomes. GPT-3, the model that originally underpinned Microsoft-backed OpenAI’s ChatGPT, has around 175 billion parameters. Google’s competing service Bard launched on LaMDA, a model with 137 billion parameters.
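To make the notion of a parameter count concrete, here is a minimal sketch that tallies the trainable parameters of a toy fully connected network. The layer sizes are invented for illustration; frontier LLMs such as GPT-3 use far larger transformer layers, but the counting principle is the same.

```python
def dense_layer_params(n_in, n_out):
    """A dense layer has one weight per input-output pair, plus one bias
    per output unit."""
    return n_in * n_out + n_out

# Toy network: 512-dimensional input -> 1024 hidden units -> 256 outputs.
layer_sizes = [512, 1024, 256]
total = sum(dense_layer_params(a, b)
            for a, b in zip(layer_sizes, layer_sizes[1:]))
print(total)  # 787,712 trainable parameters for this toy network
```

Scaling the same arithmetic to the stacked attention and feed-forward layers of a transformer is how figures like GPT-3's 175 billion are arrived at.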