T5 Small Standard Bahasa Cased

Developed by mesolitica
Pre-trained T5 small standard Malay language model supporting multiple natural language processing tasks.
Downloads: 28
Release Time: 3/2/2022

Model Overview

This is a small Malay language model based on the T5 architecture. It is pre-trained in a multitask setting and is suitable for a variety of NLP tasks such as question answering, summarization, and translation.

Model Features

Multitask pre-training
The model is pre-trained on 10 different tasks, including language masking, title prediction, question answering, and translation.
Malay language optimization
Trained and optimized specifically for Malay, so the supported NLP tasks operate natively on Malay text.
Prefix support
Tasks are selected by prepending a short prefix to the input text (e.g. for question answering, summarization, or translation), as sketched below.
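
A minimal usage sketch of the prefix mechanism, assuming the checkpoint is published on the Hugging Face Hub as mesolitica/t5-small-standard-bahasa-cased (the repo id is an assumption; confirm the exact name and prefix strings in the upstream model card):

from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "mesolitica/t5-small-standard-bahasa-cased"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# The task is selected by prepending a prefix to the input text;
# "soalan:" is the question-answering prefix used in the Use Cases example below.
prompt = "soalan: siapakah perdana menteri malaysia?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_length=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))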

Model Capabilities

Q&A generation
Text summarization
Title generation
Text paraphrasing
English-Malay translation
Malay-English translation
Knowledge graph generation
Semantic similarity judgment

Use Cases

Information retrieval
Q&A system
Answering factual questions about Malaysia
Example: input 'soalan: siapakah perdana menteri malaysia?' ("question: who is the prime minister of malaysia?"), output 'Mahathir Mohamad'
Content generation
News summarization
Generating concise summaries of news articles
Title generation
Generating appropriate titles for article content
Language translation
English-Malay translation
Translating English text to Malay
Malay-English translation
Translating Malay text to English (see the usage sketch below)
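
The translation use cases follow the same prefix mechanism. A minimal sketch, assuming the repo id above and prefix strings of the form "terjemah Inggeris ke Melayu:" and "terjemah Melayu ke Inggeris:" (these exact strings are assumptions; confirm them in the upstream model card):

from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "mesolitica/t5-small-standard-bahasa-cased"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

def generate(prompt: str) -> str:
    # Encode the prefixed prompt, generate, and decode the result.
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_length=128)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# English -> Malay and Malay -> English; the prefix strings are assumed, not confirmed.
print(generate("terjemah Inggeris ke Melayu: The prime minister gave a speech today."))
print(generate("terjemah Melayu ke Inggeris: Cuaca hari ini sangat baik."))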