🚀 Portuguese Clinical Named Entity Recognition - Symptom
This symptom NER model is part of the BioBERTpt project, which trained 13 clinical entity models (compatible with UMLS). All NER models from the "pucpr" user were trained on the Brazilian clinical corpus SemClinBr, using the BioBERTpt(all) model as the base, for 10 epochs with IOB2-format annotations.
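In IOB2 tagging, the first token of an entity is marked B-, continuation tokens are marked I-, and everything else is O. The snippet below is a rough illustration using the first example sentence from the table that follows; the label name "Sintoma" and the exact annotated span are assumptions for illustration only, not the model's confirmed label set.

```python
# Rough IOB2 illustration (assumed label name "Sintoma" and assumed span;
# the model's actual label set and SemClinBr annotation may differ).
tokens = ["Há", "15", "anos", "relata", "dor",        "lombar",
          "com", "irradiação", "para", "coxa", "direita", "."]
tags   = ["O",  "O",  "O",    "O",     "B-Sintoma",  "I-Sintoma",
          "O",  "O",          "O",    "O",    "O",       "O"]
```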
Visualization examples

| Text example | Details |
|---|---|
| "Há 15 anos relata dor lombar com irradiação para coxa direita." | The patient reports lower back pain for 15 years, radiating to the right thigh. |
| "Paciente segue internado, sem presença de edema." | The patient remains hospitalized, with no presence of edema. |
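A minimal usage sketch with the Hugging Face transformers token-classification pipeline is shown below. The model identifier is a hypothetical placeholder and should be replaced with this card's actual repository name under the "pucpr" user; aggregation_strategy="simple" is one convenient way to merge IOB2 sub-token predictions into whole entities.

```python
# Hedged sketch: load and run this symptom NER model with the transformers pipeline.
# The model ID below is a placeholder assumption; use the model's real repository name.
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

MODEL_ID = "pucpr/clinicalnerpt-sintoma"  # hypothetical ID, adjust as needed

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForTokenClassification.from_pretrained(MODEL_ID)

# aggregation_strategy="simple" groups IOB2 sub-token tags into whole entities.
ner = pipeline("ner", model=model, tokenizer=tokenizer, aggregation_strategy="simple")

text = "Há 15 anos relata dor lombar com irradiação para coxa direita."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

With aggregation enabled, each returned dictionary contains the grouped entity label, the matched text span, a confidence score, and character offsets.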
Dataset
SemClinBr - a Brazilian Portuguese corpus of annotated clinical narratives.

📚 Detailed documentation
Acknowledgements
This study was financed in part by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - Brasil (CAPES) - Finance Code 001.
Citation
If you use this model in your work, please cite the following paper:
@inproceedings{schneider-etal-2020-biobertpt,
title = "{B}io{BERT}pt - A {P}ortuguese Neural Language Model for Clinical Named Entity Recognition",
author = "Schneider, Elisa Terumi Rubel and
de Souza, Jo{\~a}o Vitor Andrioli and
Knafou, Julien and
Oliveira, Lucas Emanuel Silva e and
Copara, Jenny and
Gumiel, Yohan Bonescki and
Oliveira, Lucas Ferro Antunes de and
Paraiso, Emerson Cabrera and
Teodoro, Douglas and
Barra, Cl{\'a}udia Maria Cabral Moro",
booktitle = "Proceedings of the 3rd Clinical Natural Language Processing Workshop",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.clinicalnlp-1.7",
pages = "65--72",
abstract = "With the growing number of electronic health record data, clinical NLP tasks have become increasingly relevant to unlock valuable information from unstructured clinical text. Although the performance of downstream NLP tasks, such as named-entity recognition (NER), in English corpus has recently improved by contextualised language models, less research is available for clinical texts in low resource languages. Our goal is to assess a deep contextual embedding model for Portuguese, so called BioBERTpt, to support clinical and biomedical NER. We transfer learned information encoded in a multilingual-BERT model to a corpora of clinical narratives and biomedical-scientific papers in Brazilian Portuguese. To evaluate the performance of BioBERTpt, we ran NER experiments on two annotated corpora containing clinical narratives and compared the results with existing BERT models. Our in-domain model outperformed the baseline model in F1-score by 2.72{\%}, achieving higher performance in 11 out of 13 assessed entities. We demonstrate that enriching contextual embedding models with domain literature can play an important role in improving performance for specific NLP tasks. The transfer learning process enhanced the Portuguese biomedical NER model by reducing the necessity of labeled data and the demand for retraining a whole new model.",
}
Questions?
If you have any questions, please open a GitHub issue in the BioBERTpt repository.