# 🚀 SKEP-Roberta

SKEP-Roberta is a pre-trained model for sentiment analysis. It incorporates sentiment knowledge enhanced pre-training, which effectively improves performance on sentiment analysis tasks.
## 🚀 Quick Start

### Install dependencies

Install the required libraries with:
```shell
pip install transformers torch
```
### Code example
```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Yaxin/roberta-large-ernie2-skep-en")
model = AutoModel.from_pretrained("Yaxin/roberta-large-ernie2-skep-en")
```
### Advanced usage
```python
import torch
from transformers import RobertaTokenizer, RobertaForMaskedLM

tokenizer = RobertaTokenizer.from_pretrained('Yaxin/roberta-large-ernie2-skep-en')
model = RobertaForMaskedLM.from_pretrained('Yaxin/roberta-large-ernie2-skep-en')
model.eval()

input_tx = "<s> He like play with student, so he became a <mask> after graduation </s>"
tokenized_text = tokenizer.tokenize(input_tx)
indexed_tokens = tokenizer.convert_tokens_to_ids(tokenized_text)
tokens_tensor = torch.tensor([indexed_tokens])
segments_tensors = torch.tensor([[0] * len(tokenized_text)])  # single segment

with torch.no_grad():
    outputs = model(tokens_tensor, token_type_ids=segments_tensors)
    predictions = outputs[0]

# Predict a token for every position except the special tokens at both ends
predicted_index = [torch.argmax(predictions[0, i]).item()
                   for i in range(1, len(tokenized_text) - 1)]
predicted_token = tokenizer.convert_ids_to_tokens(predicted_index)
print('Predicted tokens:', predicted_token)
```
## ✨ Key Features
SKEP (Sentiment Knowledge Enhanced Pre-training for Sentiment Analysis) was proposed by Baidu in 2020. It introduces a sentiment knowledge enhanced pre-training method for sentiment analysis: a sentiment masking strategy and three sentiment pre-training objectives inject several kinds of sentiment knowledge into the pre-trained model.

For more details, see: https://aclanthology.org/2020.acl-main.374.pdf
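The core idea of sentiment masking can be illustrated with a minimal sketch. This is *not* the official SKEP implementation: the toy lexicon and the `sentiment_mask` helper below are assumptions made for illustration. SKEP detects sentiment words (via a sentiment lexicon mined by pointwise mutual information) and masks them, so the pre-training objectives force the model to recover sentiment knowledge.

```python
# Illustrative sketch only (not Baidu's official SKEP code): mask tokens
# that appear in a sentiment lexicon so they become prediction targets.
SENTIMENT_LEXICON = {"good", "bad", "great", "terrible", "love", "hate"}  # toy lexicon

def sentiment_mask(tokens, mask_token="<mask>"):
    """Replace every sentiment word with the mask token and record targets."""
    masked, targets = [], []
    for i, tok in enumerate(tokens):
        if tok.lower() in SENTIMENT_LEXICON:
            masked.append(mask_token)
            targets.append((i, tok))  # position and original word to predict
        else:
            masked.append(tok)
    return masked, targets

tokens = "the food was great but the service was terrible".split()
masked, targets = sentiment_mask(tokens)
print(masked)   # sentiment words replaced by <mask>
print(targets)  # targets for the sentiment-word prediction objective
```

In the full method, these masked positions feed the three pre-training objectives (sentiment word prediction, word polarity prediction, and aspect-sentiment pair prediction); the sketch only shows the masking step.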
## 📦 Model Information
| Attribute | Details |
|---|---|
| Model name | skep-roberta-large |
| Language | English |
| Architecture | Layers: 24, Hidden size: 1024, Attention heads: 24 |
The released PyTorch model was converted from the officially released PaddlePaddle SKEP model, and a series of experiments were run to verify the accuracy of the conversion.

- Official PaddlePaddle SKEP repositories:
  - https://github.com/PaddlePaddle/PaddleNLP/blob/develop/paddlenlp/transformers/skep
  - https://github.com/baidu/Senta
- PyTorch conversion repository: not yet released
## 📄 Citation

If you use this model, please cite the following paper:
```bibtex
@article{tian2020skep,
  title={SKEP: Sentiment knowledge enhanced pre-training for sentiment analysis},
  author={Tian, Hao and Gao, Can and Xiao, Xinyan and Liu, Hao and He, Bolei and Wu, Hua and Wang, Haifeng and Wu, Feng},
  journal={arXiv preprint arXiv:2005.05635},
  year={2020}
}
```
Reference: https://github.com/nghuyong/ERNIE-Pytorch