AIbase

# Wikipedia fine-tuning

- **TrOCR Base Spanish** (qantev, MIT)
  Base version of the TrOCR model for Spanish printed text, built on the Transformer architecture and fine-tuned on a custom dataset.
  Tags: Text Recognition · Transformers · Multilingual
  170 · 5
- **AraBART Finetuned Wiki Ar** (Jezia, Apache-2.0)
  A machine translation model based on AraBART, fine-tuned on an Arabic Wikipedia dataset.
  Tags: Machine Translation · Transformers
  382 · 0
- **GPT2 Small Portuguese** (pierreguillou, MIT)
  A Portuguese language model fine-tuned from the GPT-2 small model on Portuguese Wikipedia; supports NLP tasks such as text generation.
  Tags: Large Language Model · Other
  10.09k · 45
- **GPT2 Small Spanish** (datificate, Apache-2.0)
  A Spanish language model based on the GPT-2 small architecture, fine-tuned on Spanish Wikipedia via transfer learning.
  Tags: Large Language Model · Spanish
  13.14k · 30
- **GPT2 Small Turkish** (gorkemgoknar, Apache-2.0)
  A fine-tuned version of the GPT-2 small English model, trained on Turkish Wikipedia articles; suitable for Turkish text generation tasks.
  Tags: Large Language Model · Other
  545 · 10
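Models like those listed above are typically published on the Hugging Face Hub and can be loaded with the `transformers` library. A minimal sketch, assuming the hub ID `pierreguillou/gpt2-small-portuguese` (inferred from the author and model name in the listing; verify the exact ID on huggingface.co before use):

```python
# Hedged sketch: text generation with one of the listed GPT-2 models via the
# Hugging Face transformers pipeline API. The model ID below is an assumption
# pieced together from the listing's author/name pair, not confirmed by it.
from transformers import pipeline

# Downloads the model on first use; the other GPT-2 models in the listing
# would be loaded the same way, only the hub ID changes.
generator = pipeline("text-generation", model="pierreguillou/gpt2-small-portuguese")

outputs = generator("A Wikipédia é", max_new_tokens=20, num_return_sequences=1)
print(outputs[0]["generated_text"])
```

The same pattern applies to the TrOCR and AraBART entries, with the pipeline task switched to `"image-to-text"` or `"translation"` respectively.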
AIbase: Empowering the Future, Your AI Solution Knowledge Base
© 2025 AIbase