# t5-small Exported to the OpenVINO IR
This repository provides the t5-small model exported to the OpenVINO Intermediate Representation (IR), enabling tasks such as summarization and translation with OpenVINO-accelerated inference.
## Quick Start
The model can be used directly with the Transformers `pipeline` API; see the usage example below to get started.
## Features
- Multilingual Support: Supports languages such as English, French, Romanian, and German.
- Training Data: Trained on the c4 dataset.
- Task Capabilities: Handles summarization and translation, with inference optimized through OpenVINO (see the sketch after this list).
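The language pairs listed above map onto the translation tasks that Transformers registers for T5. The following is a minimal sketch, assuming the `translation_en_to_de` and `translation_en_to_ro` pipeline tasks and the checkpoint used in the rest of this card; the sample sentence is illustrative.

```python
from transformers import AutoTokenizer, pipeline
from optimum.intel.openvino import OVModelForSeq2SeqLM

model_id = "echarlaix/t5-small-openvino"
model = OVModelForSeq2SeqLM.from_pretrained(model_id, use_cache=False)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# T5 registers translation tasks for the language pairs it was trained on.
en_to_de = pipeline("translation_en_to_de", model=model, tokenizer=tokenizer)
en_to_ro = pipeline("translation_en_to_ro", model=model, tokenizer=tokenizer)

sentence = "The weather is nice today."  # illustrative input
print(en_to_de(sentence))
print(en_to_ro(sentence))
```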
## Installation
No dedicated installation steps are documented; the usage examples below rely on `transformers` and Optimum with OpenVINO support, typically installed with `pip install optimum[openvino]`.
## Usage Examples
### Basic Usage
```python
from transformers import AutoTokenizer, pipeline
from optimum.intel.openvino import OVModelForSeq2SeqLM

model_id = "echarlaix/t5-small-openvino"

# Load the model exported to the OpenVINO IR and its tokenizer.
model = OVModelForSeq2SeqLM.from_pretrained(model_id, use_cache=False)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Build an English-to-French translation pipeline on top of the OpenVINO model.
translation_pipe = pipeline("translation_en_to_fr", model=model, tokenizer=tokenizer)

text = "He never went out without a book under his arm, and he often came back with two."
result = translation_pipe(text)
print(result)
```
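Since T5 treats summarization as another text-to-text task, the same exported model can back a summarization pipeline. This is a minimal sketch under the assumption that the checkpoint and tokenizer from the example above are reused; the input text and length limits are illustrative.

```python
from transformers import AutoTokenizer, pipeline
from optimum.intel.openvino import OVModelForSeq2SeqLM

model_id = "echarlaix/t5-small-openvino"
model = OVModelForSeq2SeqLM.from_pretrained(model_id, use_cache=False)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Build a summarization pipeline on top of the OpenVINO-exported model.
summarizer = pipeline("summarization", model=model, tokenizer=tokenizer)

long_text = (
    "OpenVINO converts deep learning models into an intermediate representation "
    "that can run efficiently on Intel CPUs, GPUs, and other accelerators, "
    "reducing latency without retraining the original model."
)
print(summarizer(long_text, max_length=30, min_length=5))
```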
## Documentation
### Model description
T5 is an encoder-decoder model pre-trained on a multi-task mixture of unsupervised and supervised tasks and for which each task is converted into a text-to-text format.
For more information, please take a look at the original paper.

- Paper: [Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer](https://arxiv.org/abs/1910.10683)
- Authors: Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu
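To make the text-to-text format concrete, here is a hedged sketch that calls `generate` directly: the task is expressed as a plain-text prefix on the input, and the answer comes back as text. It assumes the exported checkpoint used elsewhere in this card and the standard T5 task prefixes.

```python
from transformers import AutoTokenizer
from optimum.intel.openvino import OVModelForSeq2SeqLM

model_id = "echarlaix/t5-small-openvino"
model = OVModelForSeq2SeqLM.from_pretrained(model_id, use_cache=False)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Every task is phrased as text: a prefix names the task, the body carries the input.
inputs = tokenizer("translate English to German: The house is wonderful.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```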
### Information Table

| Property | Details |
|----------|---------|
| Model Type | t5-small exported to the OpenVINO IR |
| Training Data | c4 |
| Supported Languages | English, French, Romanian, German |
| Tasks | Summarization, translation (inference optimized with OpenVINO) |
## License
This project is licensed under the Apache-2.0 license.