
Swallow 70b Hf

Developed by tokyotech-llm
An open-source large language model based on the Llama 2 series, enhanced for Japanese language capabilities and available in 7B, 13B, and 70B sizes as well as instruction-tuned variants
Downloads: 2,088
Release Date: 11/25/2023

Model Overview

A Japanese-optimized large language model developed by Tokyo Institute of Technology (tokyotech-llm). It improves performance on Japanese tasks through continued pre-training and instruction fine-tuning, and supports bilingual Japanese-English text generation.

Model Features

Japanese-Optimized Vocabulary
The tokenizer vocabulary is expanded with Japanese-specific tokens, significantly improving the efficiency of Japanese text processing (see the tokenizer sketch after this list)
Multiple Size Options
Available in 7B/13B/70B parameter sizes to meet different computational needs
Instruction-Tuned Version
Optimized instruction-following capability through supervised fine-tuning
Ongoing Updates
The team iterates frequently, with multiple improved versions released in 2024
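The tokenizer expansion can be inspected directly. Below is a minimal sketch, assuming the model is published on Hugging Face under the repo ID tokyotech-llm/Swallow-70b-hf and that the transformers library is installed; the sample sentence and expected token savings are illustrative, not measured.

# Tokenize a Japanese sentence with the expanded Swallow vocabulary
# (repo ID tokyotech-llm/Swallow-70b-hf is an assumption; illustrative only).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("tokyotech-llm/Swallow-70b-hf")

text = "吾輩は猫である。名前はまだ無い。"  # sample Japanese sentence
tokens = tokenizer.tokenize(text)
print(len(tokens), tokens)  # typically fewer tokens than the base Llama 2 tokenizer produces

Fewer tokens per Japanese sentence means shorter sequences, which translates into faster inference and more effective context usage for Japanese text.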

Model Capabilities

Japanese Text Generation
English Text Generation
Instruction Understanding and Execution
Open-Ended Question Answering
Machine Reading Comprehension
Automatic Summarization
Mathematical Reasoning
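
These capabilities can be exercised with a standard causal-language-model pipeline. The following is a minimal generation sketch, assuming the Hugging Face repo ID tokyotech-llm/Swallow-70b-hf, a multi-GPU machine with enough memory for a 70B model, and the transformers and torch libraries; the prompt and sampling settings are arbitrary examples.

# Minimal Japanese text-generation sketch for the base (non-instruct) model.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "tokyotech-llm/Swallow-70b-hf"  # assumed repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # halve memory relative to fp32
    device_map="auto",           # spread layers across available GPUs
)

prompt = "東京工業大学の主なキャンパスは、"  # Japanese continuation prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Because this is the base pre-trained variant, it works best as a text-completion model; the instruction-tuned versions are the better choice for question answering and instruction following.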

Use Cases

Education
Japanese Learning Assistance
Generates Japanese learning materials and exercises
Improves Japanese learning efficiency
Content Creation
Japanese Content Generation
Automatically generates text that reads naturally in Japanese
Speeds up Japanese content production workflows
Research
Japanese NLP Research
Serves as a baseline model for Japanese natural language processing research
Advances Japanese AI technology development