🚀 Adapter AdapterHub/roberta-base-pf-imdb for roberta-base
This is an adapter for the roberta-base model, trained on the sentiment/imdb dataset, and it includes a prediction head for classification.
This adapter was created for use with the **adapter-transformers** library.
🚀 Quick Start
Install dependencies
First, install adapter-transformers:
```bash
pip install -U adapter-transformers
```
⚠️ Important note
adapter-transformers is a fork of transformers that can be used as a drop-in replacement with adapter support. More information is available in the adapter-transformers documentation.
Load and activate the adapter
The adapter can now be loaded and activated as follows:
```python
from transformers import AutoModelWithHeads

# Load the base model with support for flexible prediction heads
model = AutoModelWithHeads.from_pretrained("roberta-base")
# Download the adapter (and its classification head) from the Hugging Face Hub
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-imdb", source="hf")
# Activate the adapter so it is used in every forward pass
model.active_adapters = adapter_name
```
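To sanity-check the loaded adapter, here is a minimal inference sketch. It assumes the standard roberta-base tokenizer and that the head's label order follows the usual IMDb convention (0 = negative, 1 = positive); verify this against the adapter's head config before relying on it:

```python
import torch
from transformers import RobertaTokenizer

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")

# Tokenize a sample review and run it through the adapter-equipped model
inputs = tokenizer("A wonderful film with a gripping story.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The classification head produces one logit per class;
# the label order (0 = negative, 1 = positive) is an assumption here
pred = outputs.logits.argmax(dim=-1).item()
print("positive" if pred == 1 else "negative")
```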
🔧 Technical Details
Architecture & Training
The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, the training configurations for all tasks can be found in that repository.
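For orientation, the following is a rough sketch of how such an adapter might be trained with adapter-transformers. It is not the repository's actual training script: the hyperparameters are placeholders, and `train_dataset` stands in for a tokenized IMDb training split that you would prepare yourself.

```python
from transformers import AutoModelWithHeads, TrainingArguments, AdapterTrainer

model = AutoModelWithHeads.from_pretrained("roberta-base")

# Add a fresh adapter plus a matching binary classification head,
# then freeze all base-model weights so only the adapter is trained
model.add_adapter("imdb")
model.add_classification_head("imdb", num_labels=2)
model.train_adapter("imdb")

# Placeholder hyperparameters; the real configs live in the linked repository
args = TrainingArguments(output_dir="out", learning_rate=1e-4, num_train_epochs=3)
trainer = AdapterTrainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```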
Evaluation Results
For more details on evaluation results, please refer to the paper.
📄 License
Citation
If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
```bibtex
@inproceedings{poth-etal-2021-pre,
    title = "{W}hat to Pre-Train on? {E}fficient Intermediate Task Selection",
    author = {Poth, Clifton and
      Pfeiffer, Jonas and
      R{\"u}ckl{\'e}, Andreas and
      Gurevych, Iryna},
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2021",
    address = "Online and Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.emnlp-main.827",
    pages = "10585--10605",
}
```