
OpenBioNER Base

Developed by disi-unibo-nlp
OpenBioNER is a lightweight BERT model specifically designed for open-domain biomedical named entity recognition (NER). It can identify unseen entity types using only natural language descriptions of target entity types, without requiring retraining.
Downloads: 210
Release date: 4/25/2025

Model Overview

OpenBioNER is pre-trained on synthetic silver-standard data generated through self-supervision with large language models (LLMs). In zero-shot settings it outperforms LLM-based systems such as UniNER and GPT-4o, achieving up to 10% higher F1 scores across multiple biomedical benchmarks.
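The snippet below is a minimal, hypothetical sketch of how zero-shot inference with a natural-language entity description might look through the Hugging Face transformers API. The checkpoint name, the description-plus-text input format, and the token-classification head are assumptions for illustration and may differ from the official OpenBioNER interface.

```python
# Hypothetical usage sketch: checkpoint name and input format are assumptions,
# not taken from the official OpenBioNER documentation.
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

MODEL_ID = "disi-unibo-nlp/openbioner-base"  # assumed repository name

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForTokenClassification.from_pretrained(MODEL_ID)
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

# Zero-shot setup: the target entity type is described in natural language
# and paired with the sentence to tag (the pairing scheme is an assumption).
entity_description = "bacterium: the name of a bacterial species or strain"
text = "Streptococcus pneumoniae is a major cause of community-acquired pneumonia."

for entity in ner(f"{entity_description} [SEP] {text}"):
    print(entity["word"], entity["entity_group"], round(entity["score"], 3))
```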

Model Features

Zero-shot Learning Capability
Recognizes entity types never seen during training, given only a natural language description of each target type, with no retraining required.
Lightweight Design
With only 110M parameters, it is up to 4x smaller than baseline models such as GLiNER while delivering superior performance.
High Performance
Achieves up to 10% higher F1 scores across multiple biomedical benchmarks, surpassing models like GPT-4o and UniNER.

Model Capabilities

Biomedical Named Entity Recognition
Zero-shot Learning
Multi-entity Type Recognition

Use Cases

Biomedical Research
Bacterial Name Recognition
Identifies bacterial names, such as Streptococcus pneumoniae, in biomedical literature.
Achieves an F1 score of 49.1% on the BC2GM dataset.
Chemical Substance Recognition
Identifies chemical substance names from chemical literature.
Achieves an F1 score of 48.0% on the BC4CHEMD dataset.
Medical Information Extraction
Disease Name Recognition
Identifies disease names from clinical texts.
Achieves an F1 score of 58.5% on the NCBI dataset.
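For context, the F1 scores quoted above are entity-level scores of the kind commonly computed with seqeval over BIO-tagged predictions. The sketch below uses made-up sentences and labels purely to illustrate the metric; it does not reproduce results on BC2GM, BC4CHEMD, or NCBI.

```python
# Illustrative only: entity-level F1 over BIO-tagged sequences, the metric
# conventionally reported for NER benchmarks. Labels below are invented.
from seqeval.metrics import f1_score, classification_report

gold = [["B-Disease", "I-Disease", "O", "O"],
        ["O", "B-Disease", "O"]]
pred = [["B-Disease", "I-Disease", "O", "O"],
        ["O", "O", "O"]]

print(f"Entity-level F1: {f1_score(gold, pred):.3f}")
print(classification_report(gold, pred))
```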