# Abstractive Summarization Optimization
## bart_mofe_rl_xsum

MoFE (Mixture of Factual Experts) is a model designed to control hallucination in abstractive summarization: it mixes factual expert models to reduce factual inaccuracies in generated summaries.

- License: BSD-3-Clause
- Task: Text Generation
- Library: Transformers · Language: English
- Author: praf-choub
- Downloads: 23 · Likes: 0
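
A minimal usage sketch with the Hugging Face `transformers` pipeline, assuming the checkpoint is published under the Hub ID `praf-choub/bart_mofe_rl_xsum` (inferred from the author and model names above; verify the exact ID on the Hub before use):

```python
from transformers import pipeline

# Hub ID is an assumption inferred from this listing; the name suggests a
# BART-based MoFE model fine-tuned on the XSum dataset.
summarizer = pipeline("summarization", model="praf-choub/bart_mofe_rl_xsum")

article = (
    "The local council approved a new recycling scheme on Tuesday, which "
    "officials say will cut landfill waste by a third over five years."
)

# XSum-style models generate short, highly abstractive one-sentence summaries.
result = summarizer(article, max_length=40, min_length=5, do_sample=False)
print(result[0]["summary_text"])
```
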
## barthez

BARThez is a French sequence-to-sequence model pre-trained on the BART architecture, particularly well suited to generative tasks such as abstractive summarization.

- License: Apache-2.0
- Task: Large Language Model
- Library: Transformers · Language: French
- Author: moussaKam
- Downloads: 1,487 · Likes: 17
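
BARThez itself is a pre-trained base model, so abstractive summarization normally goes through a fine-tuned checkpoint. A minimal sketch with the `transformers` pipeline, assuming a summarization fine-tune published as `moussaKam/barthez-orangesum-abstract` (an ID inferred from the BARThez project; verify it on the Hub before use):

```python
from transformers import pipeline

# Assumed Hub ID of a BARThez checkpoint fine-tuned for French abstractive
# summarization on OrangeSum; the base moussaKam/barthez model would need
# task-specific fine-tuning before it can summarize.
summarizer = pipeline("summarization", model="moussaKam/barthez-orangesum-abstract")

article = (
    "Le conseil municipal a adopté mardi un nouveau plan de recyclage qui "
    "devrait réduire d'un tiers les déchets mis en décharge d'ici cinq ans."
)

result = summarizer(article, max_length=40, min_length=5, do_sample=False)
print(result[0]["summary_text"])
```
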