
xlm-roberta-large-pooled-cap-minor

Developed by poltextlab
A multilingual text classification model fine-tuned from xlm-roberta-large for minor topic code classification in the Comparative Agendas Project (CAP)
Downloads 61
Release Time: 4/9/2025

Model Overview

This model is a fine-tuned version of the xlm-roberta-large architecture, designed for zero-shot classification of multilingual (English, Danish) texts. It is primarily applied to minor topic code classification in the Comparative Agendas Project (CAP).
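
A minimal inference sketch with the Hugging Face transformers library. It assumes the model is published on the Hub under the repo ID poltextlab/xlm-roberta-large-pooled-cap-minor and that the minor topic codes are exposed through the config's id2label mapping; the input sentence is an invented example, not taken from the project's data.

```python
# Minimal inference sketch; the repo ID and example sentence are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "poltextlab/xlm-roberta-large-pooled-cap-minor"  # assumed repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "The committee debated additional funding for primary school teachers."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])  # predicted CAP minor topic code
```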

Model Features

Multilingual Support
Supports text classification in English and Danish (a batched English and Danish example follows this list)
Zero-shot Classification
Capable of assigning texts to categories it has not been explicitly trained on
Academic Specialization
The model is intended primarily for academic research; commercial use requires a separate application for permission
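
To illustrate the multilingual feature, the sketch below classifies an English and a Danish sentence in one batch with the transformers text-classification pipeline; the repo ID is the same assumption as above, and both sentences are invented examples.

```python
# Batched multilingual sketch; repo ID and example sentences are illustrative assumptions.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="poltextlab/xlm-roberta-large-pooled-cap-minor",  # assumed repo ID
)

texts = [
    "Parliament discussed stricter emission limits for heavy industry.",  # English
    "Regeringen foreslog en ny lov om sundhedsvæsenet.",                  # Danish
]

for text, result in zip(texts, classifier(texts, truncation=True)):
    print(f"{result['label']} ({result['score']:.2f}): {text}")
```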

Model Capabilities

Multilingual Text Classification
Zero-shot Learning
Topic Code Identification

Use Cases

Academic Research
Comparative Agenda Analysis
Used for thematic classification of political agenda texts
Achieves an accuracy of 0.65 and an F1 score of 0.64 on the English test set (see the evaluation sketch below)
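
A hedged sketch of how such an evaluation could be reproduced on a labeled test set. The CSV file name, its "text" and "label" columns, and the weighted F1 averaging are assumptions made for illustration and are not specified in the model card; the gold labels are assumed to use the same label strings the pipeline returns.

```python
# Evaluation sketch: accuracy and F1 on a labeled test set. File name, column names,
# and F1 averaging are assumptions; gold labels are assumed to match the label strings
# emitted by the model's id2label mapping (and hence by the pipeline).
import pandas as pd
from sklearn.metrics import accuracy_score, f1_score
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="poltextlab/xlm-roberta-large-pooled-cap-minor",  # assumed repo ID
)

test = pd.read_csv("english_test_set.csv")  # assumed columns: "text", "label"
predictions = [r["label"] for r in classifier(test["text"].tolist(), truncation=True)]

print("accuracy:", accuracy_score(test["label"], predictions))
print("F1 (weighted):", f1_score(test["label"], predictions, average="weighted"))
```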