Cocom V1 4 Mistral 7b
Developed by naver
COCOM is an efficient context compression method that compresses long contexts into a small number of context embeddings, speeding up generation time for question-answering tasks.
Downloads: 17
Released: 10/14/2024
Model Summary
COCOM is an efficient context compression method for retrieval-augmented generation (RAG). It speeds up generation by compressing long contexts into a small number of context embeddings, and it supports different compression rates to trade decoding time against answer quality.
Model Features
Efficient context compression
Compresses long contexts into a small number of context embeddings, significantly reducing decoding time.
Multi-context support
Handles multiple contexts efficiently, making it suitable for complex question-answering scenarios.
Adjustable compression rate
Supports different compression rates, letting users trade decoding time against answer quality (see the sketch after this list).
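As a rough illustration of that trade-off, the compression rate appears to be encoded in the checkpoint name ('v1-4' presumably meaning rate 4). Below is a minimal sketch assuming sibling checkpoints exist for other rates; only naver/cocom-v1-4-mistral-7b is confirmed by this card, so other repo ids are hypothetical.

```python
from transformers import AutoModel

# Assumption: one checkpoint per compression rate, with the rate embedded in
# the repo id. A higher rate compresses contexts into fewer embeddings,
# decoding faster at a potential cost in answer quality.
compression_rate = 4  # only this value is confirmed by this card
model = AutoModel.from_pretrained(
    f'naver/cocom-v1-{compression_rate}-mistral-7b',
    trust_remote_code=True,
)
```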
Model Capabilities
Context compression
Question-answer generation
Retrieval-augmented generation (RAG)
Use Cases
Information retrieval and question answering
Querying film and TV cast members
Quickly answers questions about who plays a given character in a film or TV series.
Achieves up to a 5.69x speed-up over existing methods.
🚀 COCOM: An Efficient Context Compression Model
COCOM is an effective context compression method that reduces long contexts to a handful of context embeddings, speeding up generation time for question answering.
🚀 Quick Start
Model Overview
Retrieval-augmented generation (RAG) overcomes the limited knowledge of large language models (LLMs) by extending the input with external context. A major drawback of RAG, however, is that decoding time grows significantly as the input gets longer. To address this challenge, we present COCOM, an effective context compression method that reduces long contexts to a handful of context embeddings, speeding up generation time. Our method allows for different compression rates, trading decoding time against answer quality. Compared to earlier methods, COCOM handles multiple contexts more effectively, significantly reducing decoding time for long inputs. Compared to existing efficient context compression methods, it achieves higher performance while delivering a speed-up of up to 5.69x.
Model Inference
For batch processing, the model expects the following inputs:

- `questions` (`list`): a list of questions.
- `contexts` (`list of lists`): one list of contexts per question, with the same number of contexts for every question. The model was fine-tuned (and runs inference) with 5 contexts per question.

The model compresses the contexts into context embeddings and answers each question based on the provided context embeddings.
Code Example
```python
from transformers import AutoModel

# Load the COCOM model; trust_remote_code is required for the custom model class.
model = AutoModel.from_pretrained('naver/cocom-v1-4-mistral-7b', trust_remote_code=True)
model = model.to('cuda')

# One question paired with its list of 5 retrieved passages.
contexts = [[
    'Rosalind Bailey. Rosalind Bailey Rosalind Bailey (born 1946) is a British actress, known for her portrayal of Sarah Headley ("née" Lytton) in the 1970s and 1980s BBC television drama "When the Boat Comes In". Bailey has appeared in numerous British television drama series, including "Byker Grove", "Distant Shores" and "Burn Up". Her stage work includes playing Miss Mary Shepherd in Alan Bennett\'s play "The Lady in the Van".',
    'Malcolm Terris. Malcolm Terris Malcolm Terris (born 11 January 1941 in Sunderland, County Durham) is a British actor. He had a lengthy career in a large number of television programmes. Possibly his best-known role was in "When the Boat Comes In", a popular 1970s series, where he played the part of Matt Headley. His film career includes appearances in "The First Great Train Robbery" (1978), "McVicar" (1980), "The Plague Dogs" (1982, voice only), "Slayground" (1983), "The Bounty" (1984) as Thomas Huggan, ship\'s surgeon, "Mata Hari" (1985), "Revolution" (1985), "Scandal" (1989), and "Chaplin" (1992). His TV appearances include: One episode of',
    'When the Boat Comes In. When the Boat Comes In When the Boat Comes In is a British television period drama produced by the BBC between 1976 and 1981. The series stars James Bolam as Jack Ford, a First World War veteran who returns to his poverty-stricken (fictional) town of Gallowshield in the North East of England. The series dramatises the political struggles of the 1920s and 1930s and explores the impact of national and international politics upon Ford and the people around him. Section:Production. The majority of episodes were written by creator James Mitchell, but in Series 1 north-eastern',
    'Susie Youssef. Youssef began her comedy career as a writer for "The Ronnie Johns Half Hour" in 2006, and made her acting debut in the short film "Clicked" in the role of Lina in 2011. In 2014, she played Jane in the short film "Kevin Needs to Make New Friends: Because Everyone Hates Him for Some Reason" and then turned to television where she appeared in "The Chaser\'s Media Circus". In 2014, Youssef played the lead role of Sarah in the Hayloft Project\'s stage play "The Boat People" which won the Best On Stage award at the FBi SMAC Awards',
    'Madelaine Newton. Madelaine Newton Madelaine Newton is a British actress best known for her portrayal of Dolly in 1970s BBC television drama "When the Boat Comes In". She is married to actor Kevin Whately, known for his role as Robert "Robbie" Lewis in both "Inspector Morse" and its spin-off "Lewis". They have two children. She starred alongside her husband in the "Inspector Morse" episode "Masonic Mysteries" as Beryl Newsome - the love-interest of Morse - whom Morse was wrongly suspected of murdering. She played Whately\'s on-screen wife in the 1988 Look and Read children\'s serial, Geordie Racer. She also made',
]]
questions = ['who played sarah hedley in when the boat comes in?']

# Compress the contexts into context embeddings and generate one answer per question.
answers = model.generate_from_text(contexts=contexts, questions=questions, max_new_tokens=128)
print(answers)
```
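Given the first passage above, the expected answer is "Rosalind Bailey". For batched inference, the per-question context count must stay fixed (5 here); a minimal sketch, where the placeholder strings stand in for real retrieved passages:

```python
# Hypothetical batch: two questions, each paired with exactly 5 passages,
# matching the context count used during fine-tuning.
questions = ['first question?', 'second question?']
contexts = [
    ['p1', 'p2', 'p3', 'p4', 'p5'],  # passages retrieved for the first question
    ['q1', 'q2', 'q3', 'q4', 'q5'],  # passages retrieved for the second question
]
answers = model.generate_from_text(contexts=contexts, questions=questions, max_new_tokens=128)
```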
Model Information

Property | Details
---|---
Library name | transformers
Base model | mistralai/Mistral-7B-Instruct-v0.2
References

```bibtex
@misc{rau2024contextembeddingsefficientanswer,
  title={Context Embeddings for Efficient Answer Generation in RAG},
  author={David Rau and Shuai Wang and Hervé Déjean and Stéphane Clinchant},
  year={2024},
  eprint={2407.09252},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2407.09252},
}
```
Model pipeline diagram (image omitted from this extract)