Bert Base Japanese V3 Nli Jsnli Jnli Jsick
Developed by akiFQC
A Japanese natural language inference cross-encoder fine-tuned from tohoku-nlp/bert-base-japanese-v3, supporting entailment, neutral, and contradiction judgments.
Text Classification
Tags: Safetensors, Supports Multiple Languages, Japanese, Text Inference, Zero-shot Classification, BERT, Cross-Encoder
Downloads 51
Release Time: 4/26/2024
Model Overview
This model is a cross-encoder for Japanese Natural Language Inference (NLI) tasks, capable of determining the logical relationship (entailment, neutral, or contradiction) between two sentences. Based on the BERT architecture, it was trained on the Japanese NLI datasets JSNLI, JNLI, and JSICK.
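As a cross-encoder, the model scores a premise and a hypothesis jointly in a single forward pass. Below is a minimal inference sketch using Hugging Face transformers; it assumes the repository id matches the model name shown above, that the checkpoint exposes the standard sequence-classification interface, and that the label order is taken from the model config rather than hard-coded. The Japanese example sentences are illustrative.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed repo id, inferred from the model name; verify on the Hub.
# The Tohoku BERT tokenizer also requires the fugashi and unidic-lite packages.
MODEL = "akiFQC/bert-base-japanese-v3-nli-jsnli-jnli-jsick"

tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
model.eval()

premise = "犬が公園を走っている。"    # "A dog is running in the park."
hypothesis = "動物が屋外にいる。"    # "An animal is outdoors."

# Encode the sentence pair jointly, as a cross-encoder requires.
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
probs = logits.softmax(dim=-1).squeeze()

# Read label names from the config instead of assuming an order.
for i, p in enumerate(probs.tolist()):
    print(f"{model.config.id2label[i]}: {p:.3f}")
```

Because both sentences attend to each other inside one BERT pass, a cross-encoder is typically more accurate than a bi-encoder for pairwise judgments, at the cost of not producing reusable sentence embeddings.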
Model Features
Japanese-specific NLI Model
A natural language inference model specifically optimized for Japanese text
Multi-dataset Training
Trained on multiple Japanese NLI datasets including JSNLI, JNLI, and JSICK
Zero-shot Classification Capability
Can be used for zero-shot classification tasks
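The zero-shot classification capability works by recasting classification as NLI: each candidate label is wrapped in a hypothesis template, and the label whose hypothesis gets the highest entailment score wins. The sketch below shows that recipe with a hypothetical `entail_score` callback standing in for the cross-encoder (here stubbed with a toy keyword scorer so the logic is self-contained); the template string is an illustrative assumption, not the model's documented prompt.

```python
def zero_shot_classify(text, labels, entail_score,
                       template="この文は{}に関するものです。"):
    """Pick the label whose templated hypothesis is most entailed by `text`.

    entail_score(premise, hypothesis) -> float is supplied by the caller,
    e.g. the cross-encoder's entailment probability.
    """
    scores = {lab: entail_score(text, template.format(lab)) for lab in labels}
    best = max(scores, key=scores.get)
    return best, scores


# Toy stub standing in for the NLI model: scores a hypothesis by how many
# label-associated keywords appear in the premise.
def toy_score(premise, hypothesis):
    keywords = {"スポーツ": ["野球", "試合"], "政治": ["選挙", "国会"]}
    for label, kws in keywords.items():
        if label in hypothesis:
            return float(sum(kw in premise for kw in kws))
    return 0.0


label, scores = zero_shot_classify(
    "東京で野球の試合が行われた。",  # "A baseball game was held in Tokyo."
    ["スポーツ", "政治"],            # candidate labels: sports, politics
    toy_score,
)
print(label)  # → スポーツ
```

Swapping `toy_score` for the real cross-encoder's entailment probability turns this into a working zero-shot classifier over arbitrary label sets.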
Model Capabilities
Natural Language Inference
Text Relation Judgment
Zero-shot Classification
Use Cases
Text Analysis
Logical Relationship Judgment
Determine whether the relationship between two sentences is entailment, neutral, or contradiction
Achieved 91.4% accuracy on the JGLUE-JNLI validation set
Content Moderation
Fact Verification
Verify whether text content is consistent with known facts