MeaningBERT

Developed by davebulaval
An automated, trainable metric for evaluating semantic preservation between sentences.
Downloads: 785
Release Time: 11/14/2023

Model Overview

MeaningBERT is a BERT-based model specifically designed to evaluate the degree of semantic preservation between two sentences. Its design goal is to provide an automated semantic evaluation metric highly correlated with human judgments, suitable for quality assessment in scenarios such as text simplification and paraphrasing.
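
The snippet below is a minimal usage sketch, not an official example: it assumes the checkpoint is published on the Hugging Face Hub as "davebulaval/MeaningBERT" and exposes a single regression head that scores semantic preservation on a 0-100 scale; the sentences and variable names are illustrative only.

# Minimal usage sketch. Assumes the checkpoint is available on the Hugging Face Hub
# as "davebulaval/MeaningBERT" with a single regression head scoring semantic
# preservation on a 0-100 scale; check the model card for the exact interface.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("davebulaval/MeaningBERT")
model = AutoModelForSequenceClassification.from_pretrained("davebulaval/MeaningBERT")
model.eval()

source = "The committee approved the new budget after a long debate."
simplified = "The committee approved the new budget."

# Encode the sentence pair and predict a semantic-preservation score.
inputs = tokenizer(source, simplified, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()

print(f"Semantic preservation score: {score:.1f}")  # higher means better preserved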

Model Features

Semantic Preservation Evaluation: Specifically designed to quantitatively assess the degree of semantic preservation between two sentences.
High Correlation with Human Judgment: Model outputs align closely with human judgments of semantic preservation.
Automated Sanity Checks: Built-in automated tests for identical and unrelated sentence pairs (a sketch of such checks follows this list).
Improved Training Scheme: Trained for 500 epochs with more robust data augmentation techniques.
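
As an illustration of the sanity checks described above, the hypothetical helper below verifies that an identical sentence pair scores near the top of the 0-100 scale and an unrelated pair near the bottom. The score_fn wrapper, thresholds, and test sentences are assumptions for this sketch, not the model's official test suite.

# Hypothetical sanity checks mirroring the built-in tests described above.
# score_fn(sentence_a, sentence_b) is assumed to return a 0-100 semantic-preservation
# score, for example a wrapper around the Hugging Face call shown earlier.
def run_sanity_checks(score_fn, high=95.0, low=5.0):
    sentence = "The cat sat on the mat."
    unrelated = "Quarterly revenue grew by twelve percent."

    # An identical pair should preserve meaning almost perfectly.
    assert score_fn(sentence, sentence) >= high, "identical pair should score near 100"
    # An unrelated pair should preserve almost no meaning.
    assert score_fn(sentence, unrelated) <= low, "unrelated pair should score near 0"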

Model Capabilities

Sentence Semantic Similarity Evaluation
Text Simplification Quality Assessment
Paraphrased Text Quality Evaluation
Automated Semantic Preservation Testing

Use Cases

Text Processing Quality Assessment
Text Simplification Evaluation: Assess the degree of semantic preservation between a simplified text and the original; scores correlate highly with human evaluation results.
Paraphrasing Quality Detection: Detect whether paraphrased text preserves the core semantics of the original sentence, identifying semantic deviations.
Educational Technology
Language Learning Assistance: Evaluate semantic preservation when learners paraphrase sentences, providing objective semantic preservation scores.