# 8K Long Context

## L3 8B Lunar Stheno
L3-8B-Lunar-Stheno is a merge of L3-8B-Lunaris-v1 and L3-8B-Stheno-v3.2 that addresses overly long responses and insufficient initiative while enhancing context awareness and text generation.
Tags: Large Language Model, Transformers
HiroseKoichi · 44 · 35
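
Because L3-8B-Lunar-Stheno is produced by merging two parent checkpoints, a minimal sketch of a simple linear weight merge is shown below. The parent repository ids, the 0.5 mixing ratio, and the output path are assumptions for illustration only; the published model's actual merge recipe (typically built with a tool such as mergekit) may differ.

```python
# Hypothetical sketch: linear (average) merge of two Llama-3 8B checkpoints,
# similar in spirit to how L3-8B-Lunar-Stheno combines its parents.
# NOTE: repo ids and the mixing ratio below are illustrative assumptions.
# Loading two 8B models in bf16 needs roughly 32 GB of RAM.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

BASE_A = "Sao10K/L3-8B-Lunaris-v1"    # assumed parent repo id
BASE_B = "Sao10K/L3-8B-Stheno-v3.2"   # assumed parent repo id
ALPHA = 0.5                            # illustrative mixing ratio

model_a = AutoModelForCausalLM.from_pretrained(BASE_A, torch_dtype=torch.bfloat16)
model_b = AutoModelForCausalLM.from_pretrained(BASE_B, torch_dtype=torch.bfloat16)

# Blend every matching floating-point tensor in place on model_a.
with torch.no_grad():
    state_b = model_b.state_dict()
    for name, param in model_a.state_dict().items():
        if param.is_floating_point():
            param.copy_(ALPHA * param + (1.0 - ALPHA) * state_b[name])

model_a.save_pretrained("L3-8B-merged-example")
AutoTokenizer.from_pretrained(BASE_A).save_pretrained("L3-8B-merged-example")
```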
## CogVLM2 Llama3 Chat 19B Int4
CogVLM2 is a multimodal dialogue model built on Meta-Llama-3-8B-Instruct. It supports both Chinese and English, offers an 8K context length, and can process images at up to 1344×1344 resolution.
Tags: Text-to-Image, Transformers, English, Other
THUDM · 467 · 28
## Meta Llama 3 8B
Meta Llama 3 is a family of large language models developed by Meta, released at 8B and 70B parameter scales and optimized for dialogue and safety.
Tags: Large Language Model, Transformers, English
meta-llama · 509.91k · 6,151
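
As a point of reference, the sketch below shows one common way to load the instruct variant of Llama 3 8B with the Hugging Face transformers library and generate a reply inside its 8K-token context window. The meta-llama repositories are gated, so this assumes the license has been accepted and a local auth token is configured; the prompt text is illustrative.

```python
# Minimal sketch: load Meta Llama 3 8B Instruct and generate a short reply.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "Summarize the idea of an 8K context window."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Llama 3's context window is 8K tokens, so prompt plus output must stay within it.
outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```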
## TableLlama
TableLlama is an open-source, general-purpose large model built for a broad range of table tasks. It is trained on the TableInstruct dataset and handles contexts up to 8K tokens in length.
Tags: Large Language Model, Transformers, English
osunlp · 257 · 29
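
A hedged sketch of prompting TableLlama about a small table follows. The repository id osunlp/TableLlama and the instruction-style prompt layout are assumptions for illustration; the exact template the model was trained with is defined by the TableInstruct dataset and should be checked against the official release. The table contents are toy data.

```python
# Hedged sketch: ask TableLlama a question about a tiny toy table.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "osunlp/TableLlama"  # assumed Hugging Face repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Toy table, purely illustrative.
table = (
    "| city | population |\n"
    "| Columbus | 905,748 |\n"
    "| Cleveland | 372,624 |"
)
prompt = (
    "Below is a table followed by a question.\n\n"
    f"{table}\n\n"
    "Question: Which city has the larger population?\nAnswer:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# TableLlama supports contexts up to 8K tokens, so much larger tables fit in one prompt.
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```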