Cocom V1 4 Mistral 7b
Developed by naver
COCOM is an efficient context compression method that compresses long contexts into a small number of context embeddings, speeding up generation time for question-answering tasks.
Model Overview
COCOM is an efficient context compression method for retrieval-augmented generation (RAG). It speeds up generation by compressing long contexts into a small number of context embeddings, and it supports different compression rates to trade decoding time against answer quality.
Model Features
Efficient context compression: long contexts are compressed into a small number of context embeddings, significantly reducing decoding time.
Multi-context support: multiple contexts are handled efficiently, suiting complex question-answering scenarios.
Adjustable compression rate: different compression rates let users trade decoding time against answer quality.
Model Capabilities
Context compression
Answer generation
Retrieval-augmented generation (RAG)
Use Cases
Information retrieval and question answering
Film and TV character lookup: quickly answers questions about which actor played a role in a film or series, with up to a 5.69x speed-up over existing methods.
🚀 COCOM: Efficient Context Compression
COCOM is an effective context compression method that reduces long contexts to a small number of context embeddings, speeding up generation time for question answering.
🚀 Quick Start
Model Overview
Retrieval-Augmented Generation (RAG) overcomes the limited knowledge of large language models (LLMs) by extending the input with external context. A major drawback of RAG, however, is that decoding time grows significantly as the input gets longer. To address this challenge, we present COCOM, an effective context compression method that reduces long contexts to a handful of context embeddings, speeding up generation time. Our method allows for different compression rates, trading decoding time for answer quality. Compared to earlier methods, COCOM handles multiple contexts more effectively, significantly reducing decoding time for long inputs. It achieves a speed-up of up to 5.69x over existing efficient context compression methods while also delivering higher performance.
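As a rough sketch of how the compression rate relates to input size (the mapping below is an assumption for illustration; the exact scheme is defined in the paper), a context of n tokens compressed at rate r is represented by about n / r context embeddings:

```python
import math

# Illustrative assumption: at compression rate `rate`, a passage of
# `n_tokens` tokens is represented by roughly ceil(n_tokens / rate)
# context embeddings fed to the decoder instead of the raw tokens.
def num_context_embeddings(n_tokens: int, rate: int) -> int:
    return math.ceil(n_tokens / rate)

for rate in (4, 16, 128):
    # e.g. a 512-token passage: rate 4 -> 128 embeddings, rate 128 -> 4
    print(rate, num_context_embeddings(512, rate))
```

Higher rates mean fewer embeddings and faster decoding at some cost in answer quality; the "4" in cocom-v1-4-mistral-7b presumably denotes this checkpoint's compression rate.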
Model Inference
For batched processing, the model takes the following inputs:
questions (list): a list of questions.
contexts (list of lists): one list of contexts per question, with the same number of contexts for every question. The model was fine-tuned (and runs inference) with 5 contexts per question.
The model compresses the contexts into context embeddings and answers the question based on the provided context embeddings.
Code Example

```python
from transformers import AutoModel

# Load COCOM; the model ships custom modeling code, hence trust_remote_code=True
model = AutoModel.from_pretrained('naver/cocom-v1-4-mistral-7b', trust_remote_code=True)
model = model.to('cuda')

# One list of 5 retrieved passages for the single question below
contexts = [[
    'Rosalind Bailey. Rosalind Bailey Rosalind Bailey (born 1946) is a British actress, known for her portrayal of Sarah Headley ("née" Lytton) in the 1970s and 1980s BBC television drama "When the Boat Comes In". Bailey has appeared in numerous British television drama series, including "Byker Grove", "Distant Shores" and "Burn Up". Her stage work includes playing Miss Mary Shepherd in Alan Bennett’s play "The Lady in the Van".',
    'Malcolm Terris. Malcolm Terris Malcolm Terris (born 11 January 1941 in Sunderland, County Durham) is a British actor. He had a lengthy career in a large number of television programmes. Possibly his best-known role was in "When the Boat Comes In", a popular 1970s series, where he played the part of Matt Headley. His film career includes appearances in "The First Great Train Robbery" (1978), "McVicar" (1980), "The Plague Dogs" (1982, voice only), "Slayground" (1983), "The Bounty" (1984) as Thomas Huggan, ship’s surgeon, "Mata Hari" (1985), "Revolution" (1985), "Scandal" (1989), and "Chaplin" (1992). His TV appearances include: One episode of',
    'When the Boat Comes In. When the Boat Comes In When the Boat Comes In is a British television period drama produced by the BBC between 1976 and 1981. The series stars James Bolam as Jack Ford, a First World War veteran who returns to his poverty-stricken (fictional) town of Gallowshield in the North East of England. The series dramatises the political struggles of the 1920s and 1930s and explores the impact of national and international politics upon Ford and the people around him. Section:Production. The majority of episodes were written by creator James Mitchell, but in Series 1 north-eastern',
    'Susie Youssef. Youssef began her comedy career as a writer for "The Ronnie Johns Half Hour" in 2006, and made her acting debut in the short film "Clicked" in the role of Lina in 2011. In 2014, she played Jane in the short film "Kevin Needs to Make New Friends: Because Everyone Hates Him for Some Reason" and then turned to television where she appeared in "The Chaser’s Media Circus". In 2014, Youssef played the lead role of Sarah in the Hayloft Project’s stage play "The Boat People" which won the Best On Stage award at the FBi SMAC Awards',
    'Madelaine Newton. Madelaine Newton Madelaine Newton is a British actress best known for her portrayal of Dolly in 1970s BBC television drama "When the Boat Comes In". She is married to actor Kevin Whately, known for his role as Robert "Robbie" Lewis in both "Inspector Morse" and its spin-off "Lewis". They have two children. She starred alongside her husband in the "Inspector Morse" episode "Masonic Mysteries" as Beryl Newsome - the love-interest of Morse - whom Morse was wrongly suspected of murdering. She played Whately’s on-screen wife in the 1988 Look and Read children’s serial, Geordie Racer. She also made'
]]

questions = ['who played sarah hedley in when the boat comes in?']

# Compress the contexts into context embeddings and generate the answer
answers = model.generate_from_text(contexts=contexts, questions=questions, max_new_tokens=128)
print(answers)
```
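For batched inference, pass several questions at once; contexts must then contain one five-passage list per question. A minimal sketch continuing from the example above (the second question is hypothetical, and its context list reuses the first purely for illustration):

```python
# Batch of two questions; each entry in contexts_batch holds 5 passages.
contexts_batch = [contexts[0], contexts[0]]  # same passages twice, for illustration only
questions_batch = [
    'who played sarah hedley in when the boat comes in?',
    'who played matt headley in when the boat comes in?',
]
answers = model.generate_from_text(
    contexts=contexts_batch,
    questions=questions_batch,
    max_new_tokens=128,
)
print(answers)  # one answer string per question
```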
Model Information

| Property | Details |
|---|---|
| Library name | transformers |
| Base model | mistralai/Mistral-7B-Instruct-v0.2 |
References

```bibtex
@misc{rau2024contextembeddingsefficientanswer,
      title={Context Embeddings for Efficient Answer Generation in RAG},
      author={David Rau and Shuai Wang and Hervé Déjean and Stéphane Clinchant},
      year={2024},
      eprint={2407.09252},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2407.09252},
}
```