
# RoPE positional encoding

## Nomic Xlm 2048
A fine-tuned version of the XLM-RoBERTa base model that replaces the original positional embeddings with RoPE (Rotary Position Embedding), supporting a sequence length of 2048.
Tags: Large Language Model, Transformers
Organization: nomic-ai · Downloads: 440 · Likes: 6
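
The entry above describes swapping the original absolute positional embeddings for RoPE. As a rough illustration only (PyTorch, with an assumed function name `rope` and toy dimensions; not the model's actual implementation), the sketch below rotates each pair of channels by a position-dependent angle, so relative position shows up in the query-key dot product:

```python
import torch

def rope(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Apply Rotary Position Embedding to x of shape (seq_len, dim).

    Each channel pair (2i, 2i+1) is rotated by angle position * theta_i,
    where theta_i = base ** (-2i / dim).
    """
    seq_len, dim = x.shape
    assert dim % 2 == 0, "RoPE pairs up channels, so dim must be even"
    # Per-pair rotation frequencies.
    inv_freq = base ** (-torch.arange(0, dim, 2, dtype=torch.float32) / dim)
    # Angle for every (position, channel-pair) combination: (seq_len, dim/2).
    angles = torch.arange(seq_len, dtype=torch.float32)[:, None] * inv_freq[None, :]
    cos, sin = angles.cos(), angles.sin()
    x_even, x_odd = x[:, 0::2], x[:, 1::2]
    # 2-D rotation applied independently to each channel pair.
    out = torch.empty_like(x)
    out[:, 0::2] = x_even * cos - x_odd * sin
    out[:, 1::2] = x_even * sin + x_odd * cos
    return out

# Queries and keys are rotated before attention; their dot product then
# depends only on the relative distance between positions.
q = rope(torch.randn(2048, 64))
k = rope(torch.randn(2048, 64))
scores = q @ k.T
```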
## Gpt Neox 20b
GPT-NeoX-20B is an open-source autoregressive language model with 20 billion parameters, based on the GPT-3 architecture and trained on The Pile dataset.
Tags: Large Language Model, Transformers, English
License: Apache-2.0
Organization: EleutherAI · Downloads: 345.06k · Likes: 559
## Slovak Gpt J 1.4B
A large Slovak language-generation model with 1.4 billion parameters, based on the GPT-J architecture.
Tags: Large Language Model, Transformers, Other
License: GPL-3.0
Organization: Milos · Downloads: 90 · Likes: 7
## Slovak Gpt J 162M
The first publicly available Transformer model trained on a Slovak-language corpus, with 162 million parameters.
Tags: Large Language Model, Transformers, Other
License: GPL-3.0
Organization: Milos · Downloads: 15 · Likes: 1