# Efficient Sequence Compression
MrT5 Large
MrT5 is an efficient byte-level language model built on ByT5 that shortens input sequences by approximately 50% through dynamic token merging; a minimal sketch of the idea follows below.
Tags: Large Language Model · Transformers · Supports Multiple Languages
Publisher: stanfordnlp
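The listing above only names the mechanism, so here is a minimal, hypothetical sketch of what dynamic token merging over byte-level inputs can look like: an early encoder stage scores every byte position, and only the highest-scoring positions (roughly half here) are carried into the rest of the network, so later layers operate on a shorter sequence. The names (`DeleteGate`, `merge_tokens`) and the fixed top-k selection rule are illustrative assumptions, not MrT5's actual implementation.

```python
# Toy sketch of dynamic token merging over byte-level hidden states.
# DeleteGate and merge_tokens are hypothetical names for illustration only.
import torch
import torch.nn as nn


class DeleteGate(nn.Module):
    """Scores each byte position; low-scoring positions are dropped."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.scorer = nn.Linear(hidden_size, 1)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # hidden_states: (batch, seq_len, dim) -> keep scores: (batch, seq_len)
        return self.scorer(hidden_states).squeeze(-1)


def merge_tokens(hidden_states: torch.Tensor, scores: torch.Tensor,
                 keep_ratio: float = 0.5) -> torch.Tensor:
    """Keep roughly `keep_ratio` of the highest-scoring positions,
    preserving their original order, so later layers see a shorter sequence."""
    batch, seq_len, dim = hidden_states.shape
    k = max(1, int(seq_len * keep_ratio))
    kept = scores.topk(k, dim=1).indices.sort(dim=1).values   # (batch, k)
    index = kept.unsqueeze(-1).expand(-1, -1, dim)            # (batch, k, dim)
    return hidden_states.gather(1, index)


if __name__ == "__main__":
    torch.manual_seed(0)
    hidden = torch.randn(2, 64, 512)          # 64 byte positions per example
    gate = DeleteGate(hidden_size=512)
    compressed = merge_tokens(hidden, gate(hidden), keep_ratio=0.5)
    print(compressed.shape)                    # torch.Size([2, 32, 512]), ~50% shorter
```

In MrT5 the merging is described as dynamic, i.e. input-dependent, so the compression rate can vary per sequence; the fixed 50% cutoff above is only to keep the sketch deterministic and to mirror the roughly 50% reduction the listing mentions.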