Human Language Understanding & Reasoning
Daedalus (IF 1.340), Pub Date: 2022-01-01, DOI: 10.1162/daed_a_01905
Christopher D. Manning

Abstract The last decade has yielded dramatic and quite surprising breakthroughs in natural language processing through the use of simple artificial neural network computations, replicated on a very large scale and trained over exceedingly large amounts of data. The resulting pretrained language models, such as BERT and GPT-3, have provided a powerful universal language understanding and generation base, which can easily be adapted to many understanding, writing, and reasoning tasks. These models show the first inklings of a more general form of artificial intelligence, which may lead to powerful foundation models in domains of sensory experience beyond just language.
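As a brief illustration of the abstract's claim that pretrained language models "can easily be adapted to many understanding, writing, and reasoning tasks", the following sketch (not part of the article) shows two off-the-shelf adaptations using the Hugging Face transformers library; the specific model names and prompts are assumptions chosen only for demonstration.

```python
# A minimal sketch (not from the article) of adapting pretrained language models
# to downstream tasks with off-the-shelf tooling. The library, model names, and
# example inputs below are illustrative assumptions, not the author's method.
from transformers import pipeline

# Masked-word prediction: a BERT-style model fills in the blank from context,
# exercising the "universal language understanding base" learned in pretraining.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")
for candidate in fill_mask("Natural language processing has seen dramatic [MASK] in the last decade."):
    print(candidate["token_str"], round(candidate["score"], 3))

# Zero-shot classification: the same kind of pretrained base, adapted to a new
# labeling task without any task-specific training data.
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
print(classifier("The model wrote a coherent summary of the paper.",
                 candidate_labels=["writing", "vision", "robotics"]))
```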
