
Rubert Tiny2 Sentence Compression

Developed by cointegrated
A Russian sentence compression model based on the rubert-tiny2 architecture, capable of predicting and removing words that do not affect the core semantics of a sentence.
Downloads 613
Release Time: 5/19/2022

Model Overview

This model performs extractive sentence compression: for each word it predicts the probability that the word can be deleted, then removes the most deletable words. The compressed output may not be fully grammatical, but it retains the core semantics of the original sentence.
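The post-processing step can be sketched in a few lines. This is a minimal illustration, not the model's actual inference code: the word list and deletion probabilities below are made-up stand-ins for what the model would predict, and the `compress` helper is a hypothetical name.

```python
# Sketch of the compression step, assuming the model has already produced
# one deletion probability per word. The values below are illustrative
# stand-ins, not real model output.

def compress(words, del_probs, threshold=0.5):
    """Keep words whose predicted deletion probability is below the threshold."""
    return [w for w, p in zip(words, del_probs) if p < threshold]

# English stand-in sentence for readability; the real model operates on Russian.
words = ["The", "very", "old", "house", "was", "completely", "destroyed"]
probs = [0.10, 0.85, 0.40, 0.05, 0.20, 0.90, 0.05]

print(" ".join(compress(words, probs)))  # "The old house was destroyed"
```

Because the surviving words keep their original order, the result is a shorter sentence rather than a reordered summary.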

Model Features

Semantic-preserving compression
Removes non-core words based on per-word deletion probability predictions, maximizing retention of the original sentence's meaning.
Adjustable compression strength
Supports controlling the compression level via threshold or retention ratio parameters.
Lightweight architecture
Optimized model based on rubert-tiny2, suitable for resource-constrained scenarios.
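The two compression controls mentioned above can be sketched as follows. Both function names and the probability values are hypothetical; the only assumption taken from the model card is that each word comes with a predicted deletion probability.

```python
# Two ways to control compression strength, assuming per-word deletion
# probabilities from the model (illustrative values below).

def compress_by_threshold(words, del_probs, threshold):
    # A higher threshold deletes fewer words, giving a longer output.
    return [w for w, p in zip(words, del_probs) if p < threshold]

def compress_by_ratio(words, del_probs, keep_ratio):
    # Keep the keep_ratio fraction of words least likely to be deletable,
    # preserving the original word order.
    n_keep = max(1, round(len(words) * keep_ratio))
    keep_idx = sorted(sorted(range(len(words)), key=lambda i: del_probs[i])[:n_keep])
    return [words[i] for i in keep_idx]

words = ["The", "report", "was", "quite", "thoroughly", "reviewed"]
probs = [0.15, 0.05, 0.30, 0.90, 0.80, 0.10]

print(compress_by_threshold(words, probs, 0.5))  # drops "quite", "thoroughly"
print(compress_by_ratio(words, probs, 0.5))      # keeps the 3 least deletable words
```

The threshold variant gives output whose length varies with the sentence, while the ratio variant guarantees a fixed compression rate regardless of how confident the model is.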

Model Capabilities

Russian sentence compression
Extractive summarization
Text simplification

Use Cases

Text processing
News summarization
Extracting core information from news sentences
Compressed sentences retain key facts
Document simplification
Reducing text length while maintaining readability
Generates a more concise version of the document