Add new paper: #39

Open
wyzh0912 opened this issue Feb 23, 2025 · 0 comments

Title

Mass-Editing Memory with Attention in Transformers: A cross-lingual exploration of knowledge

Published Date

2024-08-01

Source

ACL

Head Name

Memory Head

Summary

  • Innovation: The paper introduces Mass-Editing Memory with Attention in Transformers (MEMAT), a method that improves cross-lingual knowledge editing in transformer models by modifying only a small set of parameters and optimizing specific attention heads (see the illustrative sketch after this summary).

  • Tasks: The study evaluates MEMAT on cross-lingual datasets in English and Catalan, focusing on factual knowledge editing in language models and assessing the efficacy, generalization, and specificity of knowledge edits under various prompts and interventions.

  • Significant Result: MEMAT achieves more than a 10% improvement in magnitude metrics over previous methods, demonstrating enhanced portability and computational efficiency in cross-lingual knowledge editing without significantly degrading existing knowledge.
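The following is a minimal, hypothetical sketch (not the paper's MEMAT implementation) of the general idea of editing knowledge by updating only the parameters associated with a few selected attention heads. The head indices, loss, and training data here are placeholders chosen for illustration.

```python
# Illustrative sketch only: restrict a parameter update to the output-projection
# slices of a few attention heads, so an "edit" touches as little of the model
# as possible. This is NOT the MEMAT code, just the head-restricted-update idea.
import torch
import torch.nn as nn

n_heads, d_head = 8, 16
d_model = n_heads * d_head

attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

# Hypothetical choice: suppose heads 2 and 5 were identified as "memory heads".
target_heads = [2, 5]

# Freeze everything, then re-enable gradients only for the output projection.
for p in attn.parameters():
    p.requires_grad = False
attn.out_proj.weight.requires_grad = True

# Mask so the update only changes the out_proj columns that read from the
# selected heads (head h occupies columns h*d_head : (h+1)*d_head).
mask = torch.zeros_like(attn.out_proj.weight)
for h in target_heads:
    mask[:, h * d_head:(h + 1) * d_head] = 1.0

opt = torch.optim.Adam([attn.out_proj.weight], lr=1e-3)

x = torch.randn(4, 10, d_model)       # toy "prompt" activations
target = torch.randn(4, 10, d_model)  # toy edited target activations

for _ in range(5):
    out, _ = attn(x, x, x)
    loss = nn.functional.mse_loss(out, target)
    opt.zero_grad()
    loss.backward()
    # Zero gradients outside the selected head slices before stepping.
    attn.out_proj.weight.grad *= mask
    opt.step()
```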
