
Llama2 7b WhoIsHarryPotter

Developed by Microsoft
This model demonstrates selective forgetting: fine-tuning erases knowledge of the 'Harry Potter' book series from a large language model while preserving its performance on other tasks.
Downloads 520
Release Time : 10/3/2023

Model Overview

This model is a proof of concept for forgetting technology in large language models: it erases specific training data (content from the 'Harry Potter' series) from an LLM without retraining it from scratch, while performance on standard benchmarks remains nearly identical to the original model.

Model Features

Selective knowledge forgetting
Erases knowledge of a specific domain (the 'Harry Potter' series) from the model without retraining it from scratch
Performance preservation
Performs nearly identically to the original model on standard benchmarks; only the targeted knowledge is erased
Efficient fine-tuning
Knowledge erasure completes in roughly 0.5 hours on 4 A100 GPUs, using about 4 million training tokens
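The card does not spell out the fine-tuning recipe. In the accompanying paper, as I understand it, the erasure targets are built by comparing the baseline model's next-token logits with those of a "reinforced" model trained further on the target corpus, and penalizing tokens the reinforced model boosts. A minimal sketch of that label construction, assuming that formulation (the alpha value and all logits below are illustrative):

```python
import numpy as np

def generic_logits(v_baseline, v_reinforced, alpha=5.0):
    """Build 'generic' target logits for unlearning fine-tuning.

    Tokens whose logits the reinforced model raises above the baseline
    (i.e. tokens idiosyncratic to the target corpus) are suppressed;
    all other tokens keep their baseline logits.
    """
    return v_baseline - alpha * np.maximum(0.0, v_reinforced - v_baseline)

# Illustrative two-token vocabulary: the reinforced model boosts token 0,
# so the generic target pushes token 0 down and leaves token 1 unchanged.
v_b = np.array([1.0, 2.0])
v_r = np.array([3.0, 1.0])
print(generic_logits(v_b, v_r, alpha=1.0))  # token 0 suppressed: [-1.  2.]
```

The model is then fine-tuned toward these generic distributions on the target text, which is consistent with the short training budget stated above.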

Model Capabilities

Text generation
Selective knowledge forgetting
Dialogue systems

Use Cases

Research applications
Forgetting technology research
Used for testing and validating knowledge-forgetting techniques in large language models
Reported to erase over 95% of 'Harry Potter' series related knowledge
Model safety research
Research on removing sensitive or inappropriate content from models
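The 95% figure above is the kind of rate one could estimate with a completion probe: prompt the model with series-specific openings and check whether it still produces the canonical continuation. A toy sketch of such scoring; all prompts, reference continuations, and the stand-in model here are illustrative, not the actual evaluation data:

```python
def erasure_rate(probes, generate):
    """Fraction of probes where the model no longer produces the
    canonical continuation (case-insensitive substring match)."""
    forgotten = 0
    for prompt, canonical in probes:
        completion = generate(prompt)
        if canonical.lower() not in completion.lower():
            forgotten += 1
    return forgotten / len(probes)

# Hypothetical probes: (prompt, continuation a non-unlearned model gives).
PROBES = [
    ("Harry's two best friends are", "Ron and Hermione"),
    ("Harry studied magic at the school of", "Hogwarts"),
]

# Stand-in for an unlearned model: returns a generic continuation.
def fake_unlearned_model(prompt):
    return "Sarah and Mike"  # no canonical series knowledge

print(erasure_rate(PROBES, fake_unlearned_model))  # 1.0 for this stand-in
```

In practice `generate` would wrap the actual model's text generation, and the probe set would cover many facts from the target corpus.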