unsup-simcse-bert-large-uncased

Developed by princeton-nlp
SimCSE is a simple contrastive learning framework for generating high-quality sentence embeddings, particularly suitable for unsupervised learning scenarios.
Downloads: 32
Release time: 3/2/2022

Model Overview

This model is based on the BERT-large architecture and optimizes sentence embedding representations through contrastive learning. It can be used for tasks such as feature extraction and semantic similarity calculation.
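A minimal sketch of extracting sentence embeddings and comparing them, assuming the Hugging Face Hub id `princeton-nlp/unsup-simcse-bert-large-uncased` and the standard `transformers` API (SimCSE uses the `[CLS]` token representation as the sentence embedding at inference time):

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Assumed Hub id for this model card.
name = "princeton-nlp/unsup-simcse-bert-large-uncased"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)

sentences = ["A dog is playing in the park.", "A puppy runs across the grass."]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    # Take the [CLS] hidden state as the sentence embedding.
    embeddings = model(**inputs).last_hidden_state[:, 0]

# Cosine similarity between the two sentence embeddings.
sim = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(sim.item())
```

Semantically related sentence pairs such as the two above should score noticeably higher than unrelated pairs.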

Model Features

Unsupervised Contrastive Learning
Adopts a simple contrastive learning framework that can train high-quality sentence embeddings without labeled data.
BERT Architecture Optimization
Optimized based on the BERT-large architecture, improving the uniformity of the embedding space while maintaining good alignment.
Efficient Training
Trained with a relatively small batch size (64) and learning rate (1e-5).
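The unsupervised objective behind the features above can be sketched as an InfoNCE loss in which two dropout-noised encodings of the same sentence form a positive pair. The encoder below is a stand-in (random dropout noise on fixed vectors), not the actual BERT model:

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(batch, drop_p=0.1):
    """Stand-in encoder: dropout-style noise makes two passes over
    the same input yield two different 'views' of each sentence."""
    mask = rng.random(batch.shape) > drop_p
    return batch * mask / (1.0 - drop_p)

def info_nce(z1, z2, temperature=0.05):
    """InfoNCE loss: z1[i] should match z2[i] against all other z2[j]."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sims = z1 @ z2.T / temperature           # (batch, batch) similarity matrix
    sims -= sims.max(axis=1, keepdims=True)  # numerical stability
    log_probs = sims - np.log(np.exp(sims).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))      # positive pairs sit on the diagonal

batch = rng.normal(size=(64, 1024))  # batch size 64, as noted above
loss = info_nce(encode(batch), encode(batch))
print(loss)
```

Because the two views of each sentence stay close while other in-batch sentences act as negatives, minimizing this loss pulls positives together and pushes negatives apart.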

Model Capabilities

Sentence Embedding Generation
Semantic Similarity Calculation
Text Feature Extraction

Use Cases

Semantic Analysis
Semantic Text Similarity Calculation
Calculates the semantic similarity between two sentences; performs strongly on STS (semantic textual similarity) benchmarks.
Information Retrieval
Document Retrieval
A document retrieval system based on semantic similarity.
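A minimal sketch of such a retrieval step, assuming sentence embeddings have already been computed (random vectors stand in for real SimCSE embeddings here):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in embeddings; in practice these come from the SimCSE encoder.
doc_embeddings = rng.normal(size=(100, 1024))  # 100 documents
# A query that is semantically close to document 42 (small perturbation).
query_embedding = doc_embeddings[42] + 0.01 * rng.normal(size=1024)

def top_k(query, docs, k=5):
    """Rank documents by cosine similarity to the query."""
    docs_n = docs / np.linalg.norm(docs, axis=1, keepdims=True)
    query_n = query / np.linalg.norm(query)
    scores = docs_n @ query_n
    idx = np.argsort(-scores)[:k]
    return idx, scores[idx]

idx, scores = top_k(query_embedding, doc_embeddings)
print(idx[0])  # the perturbed query retrieves document 42 first
```

For large collections, the same cosine-similarity ranking is usually served through an approximate nearest-neighbor index rather than a full matrix product.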