BERTino

Developed by indigo-ai
A lightweight DistilBERT model pre-trained on a large-scale Italian corpus, suitable for various natural language processing tasks.
Downloads 64
Release Time: 3/2/2022

Model Overview

BERTino is an Italian pre-trained language model developed by indigo.ai, based on the DistilBERT architecture. It is task-agnostic and can be fine-tuned for various downstream tasks.
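For illustration, here is a minimal sketch of querying the pre-trained model as a masked language model with the Hugging Face transformers library. The Hub identifier "indigo-ai/BERTino" is an assumption based on the developer name above, not something stated on this page.

# Minimal masked-language-model sketch; the Hub ID "indigo-ai/BERTino"
# is an assumption inferred from the developer name.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="indigo-ai/BERTino")

# Italian example: "Rome is the [MASK] of Italy."
for prediction in fill_mask("Roma è la [MASK] d'Italia."):
    print(prediction["token_str"], round(prediction["score"], 3))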

Model Features

Lightweight and Efficient
Built on the DistilBERT architecture, which is smaller and faster than standard BERT models while retaining comparable accuracy (see the parameter-count sketch after this section).
Large-scale Pre-training
Pre-trained on 14 million sentences (12 GB of text) from the combined PAISÀ and ItWaC corpora.
Multi-task Adaptability
Validated for various downstream tasks such as part-of-speech tagging, named entity recognition, and sentence classification.
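As a quick check of the footprint claim, the sketch below loads the model and prints its parameter count. The Hub ID remains an assumption, and the roughly 110M figure for BERT-base is a general reference point rather than a number from this page.

# Sketch: inspect BERTino's parameter count; the Hub ID is an assumption.
from transformers import AutoModel

model = AutoModel.from_pretrained("indigo-ai/BERTino")
n_params = sum(p.numel() for p in model.parameters())
print(f"Parameters: {n_params / 1e6:.1f}M (BERT-base has roughly 110M)")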

Model Capabilities

Text Classification
Named Entity Recognition
Part-of-Speech Tagging
Semantic Understanding
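All of the capabilities above rest on the encoder's contextual token representations. As one example, the sketch below derives a sentence embedding by mean pooling over those representations; mean pooling is a generic technique, not one prescribed by the model's authors, and the Hub ID is again an assumption.

# Sketch: sentence embedding via mean pooling of token states.
# Mean pooling is a generic technique, not an official recommendation;
# the Hub ID "indigo-ai/BERTino" is an assumption.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("indigo-ai/BERTino")
model = AutoModel.from_pretrained("indigo-ai/BERTino")

inputs = tokenizer("Una frase italiana di esempio.", return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state   # (1, seq_len, 768)

mask = inputs["attention_mask"].unsqueeze(-1)    # (1, seq_len, 1)
embedding = (hidden * mask).sum(dim=1) / mask.sum(dim=1)   # (1, 768)
print(embedding.shape)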

Use Cases

Natural Language Processing
Part-of-Speech Tagging
Achieved an F1 score of 0.9801 on the Italian ISDT dataset
Fine-tuning took 9 minutes and 4 seconds; evaluation required only 3 seconds
Named Entity Recognition
Achieved an F1 score of 0.9038 on the Italian WikiNER dataset
Fine-tuning was nearly 50% faster than with the teacher model (a fine-tuning sketch follows this section)
Sentence Classification
Achieved an F1 score of 0.7788 on a multi-class classification task
Evaluation took only 6 seconds
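The POS-tagging and NER results above come from fine-tuning the pre-trained encoder. The sketch below shows one way to fine-tune for token classification with the Hugging Face Trainer; the Hub ID, tag set, dataset handling, and hyperparameters are illustrative placeholders, not the setup used to produce the reported scores.

# Sketch: fine-tune BERTino for token classification (e.g. POS tagging).
# The Hub ID, tag set, and hyperparameters are illustrative assumptions;
# this page does not specify the setup behind the reported scores.
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

labels = ["NOUN", "VERB", "ADJ", "OTHER"]  # placeholder tag set
tokenizer = AutoTokenizer.from_pretrained("indigo-ai/BERTino")
model = AutoModelForTokenClassification.from_pretrained(
    "indigo-ai/BERTino", num_labels=len(labels)
)

def encode(example):
    # Align word-level tags with sub-word tokens; -100 marks positions
    # (special tokens, continuation pieces) that the loss should ignore.
    enc = tokenizer(example["tokens"], is_split_into_words=True, truncation=True)
    tags, previous = [], None
    for word_id in enc.word_ids():
        tags.append(example["tags"][word_id]
                    if word_id is not None and word_id != previous else -100)
        previous = word_id
    enc["labels"] = tags
    return enc

# train_dataset / eval_dataset stand in for a word-tagged Italian corpus
# such as ISDT; loading and splitting it is left out of this sketch.
# trainer = Trainer(
#     model=model,
#     args=TrainingArguments(output_dir="bertino-pos", num_train_epochs=3),
#     data_collator=DataCollatorForTokenClassification(tokenizer),
#     train_dataset=train_dataset.map(encode),
#     eval_dataset=eval_dataset.map(encode),
# )
# trainer.train()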