Serial Lifelong Editing via Mixture of Knowledge Experts
Journal
Proceedings of the Annual Meeting of the Association for Computational Linguistics
Journal Volume
1
Start Page
30888
End Page
30903
ISSN
0736-587X
ISBN (of the container)
979-8-89176-251-0
ISBN
9798891762510
Date Issued
2025
Author(s)
Abstract
Updating large language models (LLMs) is challenging because real-world knowledge evolves. While existing Lifelong Knowledge Editing (LKE) methods efficiently apply sequentially incoming edits, they often struggle to precisely overwrite outdated knowledge with the latest version, resulting in conflicts that hinder LLMs from determining the correct answer. To address this Serial Lifelong Knowledge Editing (sLKE) problem, we propose a novel Mixture-of-Knowledge-Experts scheme with an Activation-guided Routing Mechanism (ARM), which assigns specialized experts to store domain-specific knowledge and ensures that each update completely overwrites old information with the latest data. Furthermore, we introduce a novel sLKE benchmark, in which answers to the same concept are updated repeatedly, to assess the ability of editing methods to refresh knowledge accurately. Experimental results on both LKE and sLKE benchmarks show that our ARM performs favorably against state-of-the-art (SOTA) knowledge editing methods. © 2025 Association for Computational Linguistics.
Event(s)
63rd Annual Meeting of the Association for Computational Linguistics, ACL 2025
Publisher
Association for Computational Linguistics (ACL)
Type
conference paper
