Bert Base Uncased Finetuned Quac
Developed by OrfeasTsk
Pre-trained language model based on the Transformer architecture, fine-tuned on the QuAC (Question Answering in Context) dataset and suitable for various NLP tasks
Downloads 25
Release Time : 3/9/2022
Model Overview
A bidirectional Transformer encoder that acquires general language-understanding capabilities through pre-training on a large-scale corpus, then adapted to extractive question answering by fine-tuning on QuAC
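As a sketch of how an extractive QA head on top of such an encoder produces answers: the model emits a start logit and an end logit per context token, and the answer is the highest-scoring span with start ≤ end. The logit values below are toy placeholders, not real model output.

```python
import numpy as np

def best_span(start_logits, end_logits, max_len=15):
    """Pick the highest-scoring (start, end) token span with start <= end."""
    best, best_score = (0, 0), -np.inf
    for s, s_logit in enumerate(start_logits):
        # Only consider spans up to max_len tokens long.
        for e in range(s, min(s + max_len, len(end_logits))):
            score = s_logit + end_logits[e]
            if score > best_score:
                best, best_score = (s, e), score
    return best

# Toy logits over a 5-token context (placeholders for model output).
start = np.array([0.1, 2.0, 0.3, 0.0, 0.2])
end   = np.array([0.0, 0.1, 1.5, 0.4, 0.2])
print(best_span(start, end))  # (1, 2)
```

The selected token indices are then mapped back to character offsets in the original context to produce the answer string.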
Model Features
Bidirectional context understanding
Achieves bidirectional context encoding through the masked language modeling (MLM) pre-training task
Transfer learning friendly
Can be adapted to downstream NLP tasks through fine-tuning
Attention mechanism
Multi-head self-attention layers, in which every token attends to every other token, capture long-range dependencies
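The attention mechanism above can be sketched as single-head scaled dot-product self-attention (a simplified NumPy illustration; BERT itself uses multiple heads plus residual and normalization layers):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a token sequence X."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # every token scores every other token
    weights = softmax(scores, axis=-1)   # each row is a distribution over tokens
    return weights @ V, weights

rng = np.random.default_rng(0)
seq_len, d_model, d_head = 6, 8, 4
X = rng.normal(size=(seq_len, d_model))          # toy token embeddings
Wq, Wk, Wv = (rng.normal(size=(d_model, d_head)) for _ in range(3))
out, attn = self_attention(X, Wq, Wk, Wv)
```

Because the attention weights connect all token pairs directly, dependencies between distant tokens are captured in a single layer rather than propagated step by step.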
Model Capabilities
Text feature extraction
Semantic similarity calculation
Text classification
Question answering systems
Named entity recognition
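For the semantic-similarity capability, a common recipe is to mean-pool the encoder's token embeddings into a sentence vector and compare sentences by cosine similarity. The sketch below uses random arrays as stand-ins for real encoder output:

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Average token embeddings over non-padding positions."""
    mask = attention_mask[:, None].astype(float)
    return (token_embeddings * mask).sum(axis=0) / mask.sum()

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(1)
# Placeholder (seq_len, hidden) arrays standing in for encoder output.
emb_a = rng.normal(size=(5, 8))
emb_b = rng.normal(size=(7, 8))
sent_a = mean_pool(emb_a, np.ones(5, dtype=int))
sent_b = mean_pool(emb_b, np.ones(7, dtype=int))
sim = cosine_similarity(sent_a, sent_b)  # value in [-1, 1]
```

The attention mask matters in practice: padding tokens must be excluded from the average, or short sentences get diluted embeddings.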
Use Cases
Text analysis
Sentiment analysis
Classify user comments as positive or negative sentiment
Typical accuracy >90% (estimated)
Information extraction
Entity recognition
Extract entities such as person names, locations, and organizations from text
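Token-classification models for this use case typically emit BIO tags (B- begins an entity, I- continues it, O is outside), which are then grouped into entity spans. A minimal decoder, with a hypothetical tagged example:

```python
def decode_bio(tokens, tags):
    """Group BIO-tagged tokens into (entity_text, entity_type) spans."""
    entities, current, etype = [], [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                entities.append((" ".join(current), etype))
            current, etype = [token], tag[2:]
        elif tag.startswith("I-") and current and tag[2:] == etype:
            current.append(token)
        else:  # "O" tag or inconsistent I- tag closes the open entity
            if current:
                entities.append((" ".join(current), etype))
            current, etype = [], None
    if current:
        entities.append((" ".join(current), etype))
    return entities

tokens = ["Barack", "Obama", "visited", "Paris", "."]
tags   = ["B-PER", "I-PER", "O", "B-LOC", "O"]
print(decode_bio(tokens, tags))  # [('Barack Obama', 'PER'), ('Paris', 'LOC')]
```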