
BERT Base Japanese V3 NLI JSNLI

Developed by akiFQC
A Japanese natural language inference (NLI) model based on the BERT architecture and trained on the JSNLI dataset; it determines the logical relationship (entailment / neutral / contradiction) between a pair of sentences
Downloads: 203
Release Time: 4/11/2024

Model Overview

This model is a cross-encoder built on tohoku-nlp/bert-base-japanese-v3 and fine-tuned for Japanese natural language inference. Given a sentence pair, it outputs a probability distribution over the three logical relationships (entailment, neutral, contradiction).
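A minimal inference sketch under the assumption that the model is published on the Hugging Face Hub as akiFQC/bert-base-japanese-v3-nli-jsnli and that the usual Japanese tokenizer dependencies for tohoku-nlp BERT (fugashi, unidic-lite) are installed. The softmax helper is a standalone illustration of how raw classifier logits become the three-label distribution described above.

```python
import math

# Label order is an assumption; check the model's config.json (id2label) to confirm.
LABELS = ("entailment", "neutral", "contradiction")

def logits_to_probs(logits):
    """Convert raw classifier logits into a probability distribution
    over the three NLI labels via a numerically stable softmax."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return dict(zip(LABELS, (e / total for e in exps)))

def predict(premise, hypothesis):
    """Hedged sketch of running the cross-encoder itself.

    Assumes `transformers` and `torch` are installed; the model id is
    taken from this card's title and may need adjusting."""
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    name = "akiFQC/bert-base-japanese-v3-nli-jsnli"  # assumed Hub id
    tok = AutoTokenizer.from_pretrained(name)
    model = AutoModelForSequenceClassification.from_pretrained(name)
    # Cross-encoder: premise and hypothesis are encoded jointly in one pass.
    inputs = tok(premise, hypothesis, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits[0].tolist()
    return logits_to_probs(logits)
```

For example, `predict("犬が公園を走っている。", "動物が外にいる。")` would return a dict mapping each of the three labels to a probability.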

Model Features

Japanese-Specific Model
Built on the tohoku-nlp Japanese BERT model and tailored specifically to Japanese text inference tasks
Cross-Encoder Architecture
Encodes both sentences jointly, capturing finer-grained interactions between them than a dual (bi-)encoder can
Zero-shot Classification Capability
Can be applied directly to text classification tasks without task-specific fine-tuning
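The zero-shot capability above follows the standard NLI-as-classification recipe: each candidate label is turned into a hypothesis via a template, and labels are ranked by the entailment probability the model assigns. A sketch of that logic, with the hypothesis template and scoring callable as illustrative assumptions rather than anything this card specifies:

```python
def zero_shot_classify(text, labels, entail_score,
                       template="この文は{}に関するものです。"):
    """Rank candidate labels by the entailment probability assigned to
    "text entails template(label)".

    `entail_score(premise, hypothesis)` is any callable returning an
    entailment probability; in practice it would wrap the cross-encoder.
    The Japanese hypothesis template ("This sentence is about {}.") is
    an illustrative assumption.
    """
    scores = {lab: entail_score(text, template.format(lab)) for lab in labels}
    best = max(scores, key=scores.get)
    return best, scores
```

Because the cross-encoder must score every (text, label) pair separately, classification cost grows linearly with the number of candidate labels.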

Model Capabilities

Natural Language Inference
Zero-shot Classification
Text Semantic Relationship Judgment

Use Cases

Text Understanding
Logical Consistency Verification
Verify whether two Japanese sentences stand in an entailment or contradiction relationship
Outputs a probability distribution over the three relationship types
Intelligent Customer Service
Q&A Pair Verification
Assess how well a user question logically matches a knowledge-base answer
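For Q&A pair verification, the three-way distribution can be collapsed into a single matching score, e.g. by rewarding entailment and penalizing contradiction. The weighting and threshold below are illustrative choices, not something this card specifies:

```python
def qa_match_score(probs):
    """Collapse a three-way NLI distribution into one matching score:
    reward entailment, penalize contradiction, ignore neutral.
    `probs` is a dict like the one the cross-encoder's softmax yields."""
    return probs["entailment"] - probs["contradiction"]

def is_consistent(probs, threshold=0.5):
    """Accept a Q&A pair when the matching score clears a threshold
    (0.5 here is an arbitrary illustrative default)."""
    return qa_match_score(probs) >= threshold
```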