On April 19, Alibaba DAMO Academy released PLUG, a large-scale language model with 27 billion parameters, which it describes as the world's largest pre-trained language model for Chinese text. PLUG integrates language understanding with creative text generation, and performs strongly in long-text generation tasks such as novel continuation, poetry generation, and intelligent question answering. Its goal is to substantially improve the performance of Chinese natural-language technology across a range of tasks through the capabilities of a super-large model, aiming ultimately to surpass human-level performance.
According to reports, in May last year the overseas company OpenAI released GPT-3, a model that can write fiction, hold conversations, compose musical scores, and write code. Since then, efforts to train comparable models for Chinese have attracted wide attention. Like GPT-3, the PLUG model released by Alibaba DAMO Academy is expected to see broad use in text generation. Super-large models of this kind are highly versatile and may become part of the new infrastructure of the artificial-intelligence era.
Compared with GPT-3, PLUG integrates the dual models of language understanding and language generation developed by DAMO Academy, improving the relevance of generated text through bidirectional understanding of the input. On language understanding tasks, PLUG scored 80.614 points, setting a new record on the classification leaderboard of CLUE, the industry's authoritative Chinese language understanding benchmark; on language generation tasks, PLUG improved results on several application datasets by more than 8%.