Compositional Bert Large Uncased
Developed by perceptiveshawty
CompCSE and SimCSE are contrastive learning-based sentence embedding models for calculating sentence similarity.
Downloads 754
Release Time : 7/25/2023
Model Overview
These models are trained with contrastive learning, which maps semantically similar sentences to nearby points in the embedding space. They are primarily used for sentence similarity calculation and semantic search tasks.
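The core operation described above — scoring how close two sentences are in embedding space — is typically done with cosine similarity. The sketch below uses small hand-made vectors in place of real model output (the values and names are illustrative, not produced by CompCSE/SimCSE):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional "sentence embeddings" standing in for real encoder output.
emb_cat    = np.array([0.9, 0.1, 0.0, 0.2])   # "A cat sleeps on the sofa."
emb_kitten = np.array([0.8, 0.2, 0.1, 0.3])   # "A kitten naps on the couch."
emb_car    = np.array([0.0, 0.9, 0.8, 0.1])   # "The car needs new tires."

print(cosine_similarity(emb_cat, emb_kitten))  # high: semantically close
print(cosine_similarity(emb_cat, emb_car))     # low: unrelated
```

With a real checkpoint, the embeddings would come from the model's encoder; only the similarity computation shown here stays the same.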
Model Features
Contrastive learning
Uses contrastive learning techniques to optimize sentence embeddings, bringing similar sentences closer in the vector space.
Efficient training
Trained with unsupervised or weakly supervised objectives, reducing reliance on labeled data.
Semantic understanding
Captures deep semantic information in sentences rather than relying on surface-level word matching.
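The contrastive-learning feature above can be made concrete with an InfoNCE-style loss, as used in SimCSE: each sentence embedding is pulled toward its positive pair (e.g. a second dropout pass of the same sentence) and pushed away from the other sentences in the batch. This is a minimal numpy sketch of the loss computation only, not the model's training code; the temperature value is a common default, assumed here for illustration:

```python
import numpy as np

def info_nce_loss(z1: np.ndarray, z2: np.ndarray, temperature: float = 0.05) -> float:
    """SimCSE-style InfoNCE loss over a batch.

    z1[i] and z2[i] are two embeddings of the same sentence (a positive pair);
    every other row of z2 serves as an in-batch negative for z1[i].
    """
    # L2-normalize so dot products equal cosine similarities.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / temperature                  # (batch, batch) similarities
    sim = sim - sim.max(axis=1, keepdims=True)     # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    # Positives sit on the diagonal: minimizing the loss raises their
    # similarity relative to all other pairings in the batch.
    return float(-np.mean(np.diag(log_prob)))
```

Well-aligned positive pairs drive this loss toward zero, which is exactly the "bringing similar sentences closer" behavior listed above.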
Model Capabilities
Sentence similarity calculation
Semantic search
Text clustering
Information retrieval
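The semantic search and information retrieval capabilities listed above reduce to a nearest-neighbor lookup over precomputed embeddings. A minimal sketch, assuming the embeddings were produced by a sentence encoder beforehand (the function name and toy vectors are illustrative):

```python
import numpy as np

def semantic_search(query_emb: np.ndarray, doc_embs: np.ndarray, top_k: int = 3):
    """Return (index, score) pairs for the top_k documents closest to the query.

    Similarity is cosine; doc_embs has one embedding per row.
    """
    q = query_emb / np.linalg.norm(query_emb)
    d = doc_embs / np.linalg.norm(doc_embs, axis=1, keepdims=True)
    scores = d @ q
    top = np.argsort(-scores)[:top_k]
    return [(int(i), float(scores[i])) for i in top]

docs = np.array([[1.0, 0.0],    # toy embedding of document 0
                 [0.0, 1.0],    # document 1
                 [0.9, 0.1]])   # document 2
print(semantic_search(np.array([1.0, 0.0]), docs, top_k=2))  # doc 0 first, then doc 2
```

For large repositories the exhaustive dot product would be replaced with an approximate nearest-neighbor index, but the scoring logic is the same.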
Use Cases
Information retrieval
Document similarity search
Quickly find semantically similar documents in a large-scale document repository.
Improves search precision and recall
Question answering systems
Question matching
Match user questions with similar questions in the knowledge base.
Improves response accuracy of the QA system