How Do Large Language Models Capture the Ever-changing World Knowledge? A Review of Recent Advances

Publication Name

EMNLP 2023 - 2023 Conference on Empirical Methods in Natural Language Processing, Proceedings

Abstract

Although large language models (LLMs) are impressive in solving various tasks, they can quickly become outdated after deployment. Keeping them up to date is a pressing concern in the current era. This paper provides a comprehensive review of recent advances in aligning LLMs with ever-changing world knowledge without re-training from scratch. We categorize research works systematically and provide in-depth comparisons and discussion. We also discuss existing challenges and highlight future directions to facilitate research in this field.

Open Access Status

This publication is not available as open access

First Page

8289

Last Page

8311

Funding Sponsor

France Télécom
