
LongAlpaca 7B 32k Chinese

Developed by yuyijiong
A Chinese long-text dialogue model based on Llama 2, supporting a 32k-token context length and suited to long-text QA, summarization, and other tasks
Downloads: 32
Release Time: 10/25/2023

Model Overview

Fine-tuned from the llama2-chat model using the LongLoRA training technique with position interpolation, it offers strong long-text processing capabilities and supports multi-document retrieval and paper summarization at the 10,000-word level.
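A minimal inference sketch is shown below. It assumes the checkpoint is published on Hugging Face under the repo id yuyijiong/LongAlpaca-7b-32k-chinese and uses a plain document-plus-question prompt; the model repository should be consulted for the exact prompt template.

```python
# Minimal inference sketch (not from the model card): the repo id and the
# prompt format are assumptions, not confirmed by the listing above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yuyijiong/LongAlpaca-7b-32k-chinese"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision so a 7B model fits on one GPU
    device_map="auto",
)

# A long Chinese document followed by an instruction about it.
document = open("long_paper.txt", encoding="utf-8").read()
prompt = f"{document}\n\n请总结上文的主要内容。"  # "Please summarize the main points of the text above."

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=512, do_sample=False)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```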

Model Features

Ultra-long context processing
Extends the context window to 32k tokens through position interpolation, enabling it to process texts on the order of 10,000 words (see the configuration sketch after this list)
Chinese optimization
Fine-tuned on Chinese long-instruction datasets and specifically optimized for Chinese long-text processing
Multi-document QA
Supports processing multiple reference documents simultaneously and generating comprehensive answers
Streaming generation support
Compatible with StreamingLLM, enabling generation of ultra-long text
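As a rough illustration of the position-interpolation idea behind the 32k window, the snippet below shows how linear RoPE scaling is typically configured in transformers for a Llama 2 base model. The released checkpoint presumably already ships its scaled RoPE settings in config.json, so this is illustrative rather than required for using the model.

```python
# Illustrative only: linear position interpolation stretches Llama 2's
# 4096-token RoPE positions by a factor of 8, giving 4096 * 8 = 32768 tokens.
from transformers import AutoConfig, AutoModelForCausalLM

base_id = "meta-llama/Llama-2-7b-chat-hf"  # base model named in the overview

config = AutoConfig.from_pretrained(base_id)
config.rope_scaling = {"type": "linear", "factor": 8.0}
config.max_position_embeddings = 32768

model = AutoModelForCausalLM.from_pretrained(base_id, config=config)
```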

Model Capabilities

Long-text QA
Multi-document information integration
Academic paper summarization
Chinese dialogue generation
Long-text instruction understanding

Use Cases

Academic research
Paper summarization
Summarizes key points of long academic papers
Achieved a ROUGE-L score of 0.15166 on the VCSUM dataset
Information retrieval
Multi-document QA
Extracts information from multiple related documents to answer complex questions
Achieved a ROUGE-L score of 0.18369 on the DuReader dataset (a scoring sketch follows below)
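For reference, the snippet below shows one common way to compute a ROUGE-L F1 score for Chinese text, tokenizing with jieba and scoring with the rouge package. This is a generic sketch, not necessarily the evaluation script behind the numbers above, and the example sentences are made up.

```python
# Generic ROUGE-L computation for Chinese text; requires `pip install jieba rouge`.
import jieba
from rouge import Rouge

reference = "论文提出了一种长文本位置插值方法。"   # hypothetical reference summary
hypothesis = "论文提出了长文本的位置插值方法。"     # hypothetical model output

# The rouge package expects space-separated tokens, so segment with jieba first.
ref_tokens = " ".join(jieba.cut(reference))
hyp_tokens = " ".join(jieba.cut(hypothesis))

scores = Rouge().get_scores(hyp_tokens, ref_tokens)[0]
print(scores["rouge-l"]["f"])  # ROUGE-L F1, comparable in form to the scores above
```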