RoBERTa Base RTE
A sequence classification model based on the roberta-base architecture, fine-tuned on the RTE task of the GLUE benchmark using the TextAttack framework
Release time: 3/2/2022
Model Overview
This model is a roberta-base sequence classification model fine-tuned for natural language understanding, specifically the RTE (Recognizing Textual Entailment) task from GLUE. Fine-tuning was performed with the TextAttack framework, whose adversarial training support is intended to improve robustness on classification tasks.
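For reference, a minimal inference sketch is shown below. It assumes the checkpoint is published on the Hugging Face Hub as textattack/roberta-base-RTE (adjust the ID if your copy lives elsewhere) and uses the standard transformers API; the label mapping should be verified against the loaded config.

```python
# Minimal inference sketch (assumed Hub ID: textattack/roberta-base-RTE).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "textattack/roberta-base-RTE"  # assumption; adjust to your checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id).eval()

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

# RTE is a sentence-pair task: encode premise and hypothesis together,
# using the same 128-token limit the model was trained with.
inputs = tokenizer(premise, hypothesis, truncation=True, max_length=128,
                   return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# In GLUE RTE, label 0 is "entailment" and label 1 is "not_entailment";
# check model.config.id2label to confirm the mapping for this checkpoint.
pred = logits.argmax(dim=-1).item()
print(pred, model.config.id2label.get(pred, pred))
```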
Model Features
Adversarial training enhancement
Adversarial training via the TextAttack framework improves the model's robustness against adversarial examples
GLUE dataset fine-tuning
Fine-tuned on the RTE task of the General Language Understanding Evaluation (GLUE) benchmark, making it suitable for natural language understanding tasks
Efficient training configuration
Trained with a learning rate of 2e-05 and a maximum sequence length of 128, reaching its best performance within 5 epochs (a reproduction sketch follows this list)
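The configuration above can be reproduced with a plain transformers Trainer loop. The sketch below is illustrative rather than the exact TextAttack training script; the batch size of 16 is an assumption, as it is not stated in this card.

```python
# Illustrative fine-tuning sketch using the hyperparameters listed above
# (learning rate 2e-05, max sequence length 128, 5 epochs).
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

raw = load_dataset("glue", "rte")  # columns: sentence1, sentence2, label
tokenizer = AutoTokenizer.from_pretrained("roberta-base")

def tokenize(batch):
    # Encode each premise/hypothesis pair up to 128 tokens.
    return tokenizer(batch["sentence1"], batch["sentence2"],
                     truncation=True, max_length=128)

encoded = raw.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained("roberta-base",
                                                           num_labels=2)

args = TrainingArguments(
    output_dir="roberta-base-rte",
    learning_rate=2e-5,
    num_train_epochs=5,
    per_device_train_batch_size=16,  # assumption: batch size not given in this card
)

trainer = Trainer(model=model, args=args,
                  train_dataset=encoded["train"],
                  eval_dataset=encoded["validation"],
                  tokenizer=tokenizer)  # default collator pads each batch
trainer.train()
```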
Model Capabilities
Text classification
Natural language understanding
Sentence-pair classification (textual entailment)
Use Cases
Natural language inference
Textual entailment
Determines whether a hypothesis sentence is entailed by a premise sentence; achieved 79.42% accuracy on the GLUE RTE validation set (see the evaluation sketch below)
Text analysis
Sentiment analysis
Classifies the sentiment orientation of text content, after further fine-tuning on sentiment-labeled data
Topic classification
Categorizes text into predefined topic classes, after further fine-tuning on topic-labeled data
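To sanity-check the reported 79.42% accuracy, the sketch below evaluates the checkpoint on the GLUE RTE validation split. The Hub ID is again an assumption, and the exact number may vary slightly with library versions and the checkpoint's label mapping.

```python
# Evaluation sketch for the GLUE RTE validation split.
import torch
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "textattack/roberta-base-RTE"  # assumption; adjust to your checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id).eval()

val = load_dataset("glue", "rte", split="validation")
correct = 0
for ex in val:
    inputs = tokenizer(ex["sentence1"], ex["sentence2"],
                       truncation=True, max_length=128, return_tensors="pt")
    with torch.no_grad():
        pred = model(**inputs).logits.argmax(dim=-1).item()
    correct += int(pred == ex["label"])

print(f"validation accuracy: {correct / len(val):.4f}")
```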