Legalbert

Developed by casehold
A BERT-based pre-trained model specialized for legal texts, optimized for the characteristics of legal language
Downloads: 467
Release Time: 3/2/2022

Model Overview

This model is a BERT variant further pre-trained on a large corpus of legal judgment texts and designed for natural language processing tasks in the legal domain, such as legal text classification and case analysis.
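As a minimal sketch, the model can be loaded for masked-token prediction with the Hugging Face Transformers library. The repository id "casehold/legalbert" and the example sentence are assumptions for illustration; check the model hub for the exact name.

```python
# Minimal sketch: masked-token prediction with the Transformers library.
# "casehold/legalbert" is the assumed repository id for this model.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "casehold/legalbert"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

text = "The court granted the motion for summary [MASK]."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the highest-scoring token there.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```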

Model Features

Legal domain specialization
Further pre-trained on 37GB of legal judgment texts, optimized for legal terminology and text structure
Large-scale training data
Training corpus includes 3,446,187 legal judgments, far exceeding the scale of original BERT training data
Multi-task support
Supports masked language modeling, next sentence prediction, and legal-specific tasks like CaseHOLD multiple-choice questions
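The CaseHOLD task pairs a citing passage with several candidate holdings and asks the model to choose the correct one. The sketch below shows how such an example could be scored with a multiple-choice head; the repository id, prompt, and candidate holdings are invented for illustration, and the freshly initialized head would need fine-tuning on CaseHOLD before its scores are meaningful.

```python
# Hedged sketch: scoring a CaseHOLD-style multiple-choice example.
# The repository id and the example texts below are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForMultipleChoice

model_id = "casehold/legalbert"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# The multiple-choice head starts untrained and must be fine-tuned on CaseHOLD.
model = AutoModelForMultipleChoice.from_pretrained(model_id)

prompt = "The defendant argued that the warrantless search violated the Fourth Amendment (<HOLDING>)."
holdings = [
    "holding that warrantless searches are per se unreasonable, subject to limited exceptions",
    "holding that the statute of limitations barred the claim",
    "holding that the contract failed for lack of consideration",
]

# Pair the prompt with each candidate and batch them as a single example of
# shape (1, num_choices, seq_len), the layout the multiple-choice head expects.
encoded = tokenizer([prompt] * len(holdings), holdings,
                    padding=True, truncation=True, return_tensors="pt")
inputs = {k: v.unsqueeze(0) for k, v in encoded.items()}

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_choices)
print("Best-scoring holding index:", logits.argmax(dim=-1).item())
```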

Model Capabilities

Legal text understanding
Legal text classification
Legal multiple-choice question answering
Legal masked-token prediction (fill-in-the-blank completion)
Legal semantic analysis

Use Cases

Legal text analysis
Overruling prediction
Analyze sentences from judicial opinions to identify language that overrules a prior precedent
Terms of service classification
Automatically classify clauses in contracts and terms-of-service documents (a fine-tuning sketch follows this list)
Legal education
CaseHOLD multiple-choice question answering
Assist in answering case-based multiple-choice questions in legal education
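For the terms-of-service classification use case above, a sequence-classification head can be attached to the encoder. The sketch below assumes the repository id "casehold/legalbert" and a binary fair/unfair clause labeling scheme; the head is untrained until fine-tuned on labeled clauses.

```python
# Hedged sketch: clause classification with a sequence-classification head.
# The repository id, label scheme, and example clause are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "casehold/legalbert"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# num_labels=2 assumes a binary "potentially unfair vs. fair" clause scheme;
# the classification head is randomly initialized and needs fine-tuning on
# labeled terms-of-service clauses before its predictions are usable.
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

clause = ("The provider may terminate the service at any time, "
          "for any reason, without prior notice.")
inputs = tokenizer(clause, truncation=True, return_tensors="pt")
with torch.no_grad():
    probs = torch.softmax(model(**inputs).logits, dim=-1)
print("Class probabilities:", probs.squeeze().tolist())
```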