
sup-simcse-bert-large-uncased

Developed by princeton-nlp
A BERT-based sentence embedding model optimized through contrastive learning, suited to feature extraction tasks
Downloads: 1,545
Released: 3/2/2022

Model Overview

This model uses SimCSE contrastive learning to optimize BERT's sentence embeddings, improving performance on semantic similarity tasks. A minimal usage sketch follows.
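As a quick illustration, the sketch below extracts sentence embeddings with the Hugging Face transformers library. The model ID is inferred from this card, and the pooler output is used as the sentence embedding, one common pooling choice for supervised SimCSE checkpoints; verify both against the official princeton-nlp/SimCSE repository before relying on them.

```python
# Minimal usage sketch, assuming the model ID below matches this card.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "princeton-nlp/sup-simcse-bert-large-uncased"
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID).eval()

sentences = ["A man is playing a guitar.", "Someone strums a guitar."]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Sentence embeddings: one vector per input sentence. The pooler output
# is an assumption here; shape is [2, 1024] for BERT-large.
embeddings = outputs.pooler_output
```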

Model Features

Contrastive Learning Optimization
Uses the SimCSE contrastive objective to improve the uniformity of the embedding space while preserving alignment between semantically related sentences (see the sketch after this list)
Supervised Training Enhancement
Trains on entailment pairs from the SNLI and MNLI datasets, with contradiction pairs serving as hard negatives, to further improve sentence representation quality
Anisotropy Improvement
Mitigates the anisotropy of vanilla BERT embeddings, producing a more uniform semantic space
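The supervised objective described above can be summarized in code. The sketch below is a hedged reconstruction of the supervised SimCSE loss from its published description: an InfoNCE loss over NLI triplets in which entailment hypotheses are positives, contradiction hypotheses are hard negatives, and other in-batch examples act as additional negatives. The 0.05 temperature follows the SimCSE paper's default; the function itself is an illustration, not the authors' training code.

```python
# Hedged reconstruction of the supervised SimCSE objective: InfoNCE
# over NLI triplets, with contradiction hypotheses as hard negatives.
import torch
import torch.nn.functional as F

def supervised_simcse_loss(anchor, positive, hard_negative, temperature=0.05):
    """Each argument is a [batch, hidden] tensor of sentence embeddings."""
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    hard_negative = F.normalize(hard_negative, dim=-1)

    # Cosine similarities of every anchor to every positive and every
    # hard negative in the batch (in-batch negatives come for free).
    sim_pos = anchor @ positive.t() / temperature        # [batch, batch]
    sim_neg = anchor @ hard_negative.t() / temperature   # [batch, batch]
    logits = torch.cat([sim_pos, sim_neg], dim=1)        # [batch, 2*batch]

    # Row i's correct "class" is its own positive at column i.
    labels = torch.arange(anchor.size(0), device=anchor.device)
    return F.cross_entropy(logits, labels)
```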

Model Capabilities

Sentence Feature Extraction
Semantic Similarity Computation
Text Representation Learning

Use Cases

Semantic Analysis
Semantic Textual Similarity (STS)
Computes semantic similarity scores between two sentences, as sketched below
Performs strongly on STS benchmarks (specific metrics are not provided here)
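A hedged sketch of STS-style scoring: encode both sentences and take the cosine similarity of their embeddings as the score. The sentence pair is illustrative, not drawn from any benchmark, and the pooling choice carries the same caveat as above.

```python
# Sketch: cosine similarity between SimCSE embeddings as an STS score.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "princeton-nlp/sup-simcse-bert-large-uncased"  # inferred from this card
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID).eval()

def sts_score(a: str, b: str) -> float:
    inputs = tokenizer([a, b], padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        emb = model(**inputs).pooler_output  # pooling choice as noted earlier
    emb = F.normalize(emb, dim=-1)
    return (emb[0] @ emb[1]).item()          # cosine similarity in [-1, 1]

print(sts_score("A man is playing a guitar.", "Someone strums a guitar."))
```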
Downstream NLP Tasks
Transfer Learning Features
Provides pre-trained sentence features for a variety of downstream NLP tasks; see the sketch below
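A minimal sketch of that transfer-learning pattern, assuming scikit-learn is available: freeze the encoder, extract embeddings once, and fit a lightweight classifier on top. The texts and labels are toy examples, not a real dataset.

```python
# Sketch: frozen SimCSE embeddings as features for a downstream classifier.
import torch
from sklearn.linear_model import LogisticRegression
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "princeton-nlp/sup-simcse-bert-large-uncased"
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID).eval()

texts = ["great movie, loved it", "terrible plot",
         "fantastic acting", "boring and slow"]
labels = [1, 0, 1, 0]  # toy sentiment labels

inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    features = model(**inputs).pooler_output.numpy()  # frozen features

clf = LogisticRegression(max_iter=1000).fit(features, labels)
print(clf.predict(features))
```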