
14B

Developed by CausalLM
A 14B-parameter causal language model fully compatible with the Meta LLaMA 2 architecture, reported to outperform all models under 70B parameters on multiple benchmarks
Downloads: 236
Release Date: October 22, 2023

Model Overview

A large language model built on the Qwen and LLaMA2 model families, specializing in text generation with bilingual support (English/Chinese) and demonstrating strong performance on academic benchmarks

Model Features

High Performance
Outperforms all sub-70B models on benchmarks including MMLU and C-Eval, with GSM8K mathematical reasoning scores surpassing MetaMath-13B and Qwen-14B
Multilingual Support
Supports English and Chinese, with Japanese benchmark performance approaching that of state-of-the-art Japanese models
Full Compatibility
Fully compatible with the LLaMA2 architecture, with support for the GGUF, GPTQ, and AWQ quantization formats
High-quality Training Data
A 1.3-billion-token SFT dataset in which 90% of sentences were manually or synthetically rewritten, incorporating curated content from Wikipedia and other sources
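Since the model is LLaMA2-compatible, it can be used through standard Hugging Face tooling. The sketch below is illustrative only: the repository id "CausalLM/14B" and the ChatML-style prompt format are assumptions not stated on this page, so verify both against the official model card before use.

```python
# Minimal sketch of prompting the model via Hugging Face transformers.
# Assumptions (not taken from this page): the repository id "CausalLM/14B"
# and the ChatML prompt format used by this model family.

def build_chatml_prompt(user_message: str,
                        system_message: str = "You are a helpful assistant.") -> str:
    """Format a single-turn prompt in ChatML style (assumed format)."""
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

def generate_reply(user_message: str,
                   model_id: str = "CausalLM/14B",
                   max_new_tokens: int = 128) -> str:
    """Load the full-precision checkpoint and generate a completion.

    GPTQ/AWQ quantized variants load the same way once the matching
    quantization library is installed; GGUF files target llama.cpp instead.
    """
    # Imported here so the prompt helper works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    inputs = tokenizer(build_chatml_prompt(user_message),
                       return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens and decode only the newly generated reply.
    reply_ids = output[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(reply_ids, skip_special_tokens=True)
```

Running the 14B checkpoint in full precision requires substantial GPU memory, which is why the quantized GGUF/GPTQ/AWQ releases mentioned above are the practical choice on consumer hardware.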

Model Capabilities

Text Generation
Mathematical Reasoning
Multilingual Understanding
Academic Q&A

Use Cases

Academic Research
STEM Q&A
Answering questions in science, technology, engineering and mathematics fields
MMLU STEM accuracy of 64.19%, surpassing all sub-70B models
Educational Assistance
Math Problem Solving
Solving complex mathematical reasoning problems
GSM8K zero-shot mathematical reasoning accuracy of 70.13%