# Rotary Position Embedding
## ModernBERT Base SQuAD2 v0.2
- License: Apache-2.0
- Task: Question Answering
- Library: Transformers
- Author: Praise2112 · Downloads: 42 · Likes: 2

QA model fine-tuned from ModernBERT-base-nli, supporting long-context processing.
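Extractive QA models of this kind score each context token as a potential answer start and end; the predicted answer is the highest-scoring valid span. A minimal NumPy sketch of that span-selection step (the `max_len` limit and the toy logits are illustrative, not taken from this model):

```python
import numpy as np

def best_span(start_logits, end_logits, max_len=15):
    """Pick the highest-scoring (start, end) pair with start <= end."""
    best, span = -np.inf, (0, 0)
    for s in range(len(start_logits)):
        for e in range(s, min(s + max_len, len(end_logits))):
            score = start_logits[s] + end_logits[e]
            if score > best:
                best, span = score, (s, e)
    return span

# Toy logits over six context tokens; the answer spans tokens 2..3.
start = np.array([0.1, 0.0, 3.0, 0.2, 0.1, 0.0])
end   = np.array([0.0, 0.1, 0.5, 2.5, 0.3, 0.1])
print(best_span(start, end))  # (2, 3)
```

Real pipelines add details (ignoring question tokens, handling no-answer for SQuAD2), but the argmax-over-spans core is the same.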
## EVA02 Small Patch14 224 (mim_in22k)
- License: MIT
- Task: Image Classification
- Library: Transformers
- Author: timm · Downloads: 705 · Likes: 0

EVA02 feature/representation model, pretrained on ImageNet-22k via masked image modeling, suitable for image classification and feature-extraction tasks.
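Masked image modeling pretrains an encoder by hiding a random subset of image patches and training the model to reconstruct them. A minimal NumPy sketch of the patch-masking step (the 40% mask ratio here is illustrative, not EVA02's actual setting):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_patch_mask(num_patches, mask_ratio=0.4):
    """Boolean mask marking which patches are hidden from the encoder."""
    n_mask = int(num_patches * mask_ratio)
    idx = rng.permutation(num_patches)[:n_mask]
    mask = np.zeros(num_patches, dtype=bool)
    mask[idx] = True
    return mask

# A 224x224 input with 14x14 patches yields 16*16 = 256 patches.
mask = random_patch_mask(256, mask_ratio=0.4)
print(mask.sum())  # 102 of 256 patches masked
```

The pretraining loss is then computed only on the masked positions, forcing the encoder to infer hidden content from visible context.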
## RoFormerV2 Chinese Char Large
- Task: Large Language Model
- Tags: Transformers, Chinese
- Author: junnyu · Downloads: 84 · Likes: 3

RoFormerV2 is an enhanced Transformer model based on rotary position embedding, developed by Zhuiyi Technology, supporting Chinese text-processing tasks.
## RoFormerV2 Chinese Char Base
- Task: Large Language Model
- Tags: Transformers, Chinese
- Author: junnyu · Downloads: 65 · Likes: 6

RoFormerV2 is an enhanced Transformer model based on rotary position embedding, optimized for Chinese text processing.
## RoFormer Chinese Small
- Task: Large Language Model
- Tags: Chinese
- Author: junnyu · Downloads: 599 · Likes: 2

RoFormer is a Transformer model enhanced by Rotary Position Embedding (RoPE), suitable for Chinese text-processing tasks.
## RoFormer Chinese Char Small
- Task: Large Language Model
- Tags: Chinese
- Author: junnyu · Downloads: 24 · Likes: 0

RoFormer is a Chinese Transformer model enhanced with Rotary Position Embedding, suitable for text-infilling tasks.
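Rotary Position Embedding, the technique these RoFormer models share, encodes position by rotating consecutive feature pairs of each query and key by position-dependent angles; attention scores then depend only on the *relative* offset between tokens. A minimal NumPy sketch of the idea (a single head, not any model's exact implementation):

```python
import numpy as np

def rope(x, pos, base=10000.0):
    """Rotate consecutive (even, odd) feature pairs of x by position-dependent angles."""
    d = x.shape[-1]
    inv_freq = base ** (-np.arange(0, d, 2) / d)  # one frequency per feature pair
    theta = pos * inv_freq
    cos, sin = np.cos(theta), np.sin(theta)
    x1, x2 = x[..., 0::2], x[..., 1::2]
    out = np.empty_like(x)
    out[..., 0::2] = x1 * cos - x2 * sin
    out[..., 1::2] = x1 * sin + x2 * cos
    return out

q = np.random.default_rng(0).standard_normal(8)
k = np.random.default_rng(1).standard_normal(8)

# The attention score depends only on the offset: positions (3, 1) and
# (10, 8) are both offset 2, so the two dot products agree.
s1 = rope(q, 3) @ rope(k, 1)
s2 = rope(q, 10) @ rope(k, 8)
print(np.isclose(s1, s2))  # True
```

Because rotations preserve norms and compose by angle addition, `rope(q, m) @ rope(k, n)` is a function of `m - n` alone, which is what lets RoPE-based models generalize across absolute positions.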
## GPT-J 6B
- License: Apache-2.0
- Task: Large Language Model
- Tags: English
- Author: EleutherAI · Downloads: 297.31k · Likes: 1,493

GPT-J 6B is a 6-billion-parameter autoregressive language model trained using the Mesh Transformer JAX framework, employing the same tokenizer as GPT-2/3.