
BART Large ToS

Developed by ML-unipi
BART is a Transformer model with an encoder-decoder architecture, fine-tuned for abstractive summarization of terms of service
Downloads 21
Release Time: 8/29/2022

Model Overview

The BART model is pre-trained on English text and fine-tuned for abstractive summarization of terms of service; it excels at text generation and comprehension tasks
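
As a minimal usage sketch (assuming the checkpoint is published on the Hugging Face Hub as ML-unipi/bart-large-tos, inferred from the developer and model names above, and using an invented example clause), the model can be called through the transformers summarization pipeline:

```python
from transformers import pipeline

# "ML-unipi/bart-large-tos" is the assumed Hugging Face Hub ID,
# inferred from the developer and model names above.
summarizer = pipeline("summarization", model="ML-unipi/bart-large-tos")

# Invented clause, for illustration only.
tos_clause = (
    "By accessing the service you agree that the provider may collect usage "
    "data, share it with third-party analytics partners, and suspend or "
    "terminate your account at any time without prior notice."
)

# The pipeline returns one dict per input text.
print(summarizer(tos_clause)[0]["summary_text"])
```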

Model Features

Bidirectional Encoder and Autoregressive Decoder
Combines a BERT-style bidirectional encoder with a GPT-style autoregressive decoder, integrating comprehension and generation capabilities (sketched in the example after this list)
Text Reconstruction Pre-training
Pre-trained by corrupting text with noise and reconstructing the original, which improves the model's robustness
Specialized Fine-tuning for Terms of Service
Specifically optimized for summarizing contract and terms-of-service text, making it suitable for legal document processing
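
A lower-level sketch of the encoder-decoder flow, under the same assumed Hub ID: the encoder reads the whole clause bidirectionally in one pass, and the decoder then generates the summary autoregressively (here with beam search).

```python
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "ML-unipi/bart-large-tos"  # assumed Hub ID, as above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

clause = "The provider reserves the right to modify these terms at any time."

# The bidirectional encoder sees the entire input at once ...
inputs = tokenizer(clause, return_tensors="pt", truncation=True, max_length=1024)

# ... while the autoregressive decoder emits the summary token by token,
# conditioning on the encoder output and on the tokens already generated.
with torch.no_grad():
    summary_ids = model.generate(**inputs, num_beams=4, early_stopping=True)

print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```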

Model Capabilities

Abstractive Text Summarization
Text Comprehension
Contract Clause Analysis

Use Cases

Legal Document Processing
Terms of Service Summarization
Automatically generate concise summaries of contracts/terms of service
Produces succinct summaries retaining key information
Legal Document Analysis
Extract core clauses from complex legal documents
General Text Processing
News Summarization
Generate key-point summaries of news articles
As shown in the examples, the model can generate accurate summaries of roughly 30-130 words (see the sketch below)
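
If the 30-130 word range maps onto the pipeline's min_length / max_length generation arguments (an assumption; these arguments count tokens, which only roughly corresponds to words), the limits can be set explicitly as follows, using the same assumed Hub ID and placeholder article text:

```python
from transformers import pipeline

summarizer = pipeline("summarization", model="ML-unipi/bart-large-tos")  # assumed Hub ID

# Placeholder text; substitute a full news article or terms-of-service document.
article = (
    "The city council approved a new transit plan on Tuesday that expands bus "
    "service to the outer districts, funds two additional light-rail lines, and "
    "raises parking fees downtown to pay for the changes over the next decade."
)

# do_sample=False keeps the output deterministic.
summary = summarizer(article, min_length=30, max_length=130, do_sample=False)
print(summary[0]["summary_text"])
```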