HeBERT

Developed by avichr
HeBERT is a pre-trained language model for Hebrew, based on the BERT architecture, focused on polarity analysis and emotion recognition tasks.
Downloads 102.19k
Release Time: 3/2/2022

Model Overview

HeBERT is a pre-trained BERT model optimized for Hebrew, supporting tasks such as masked language modeling, sentiment classification, and named entity recognition, with particularly strong performance on sentiment analysis.
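As a quick illustration of the masked language modeling capability, the sketch below uses the Hugging Face transformers fill-mask pipeline. It assumes the checkpoint is published on the Hub under the developer name listed above as avichr/heBERT; adjust the model ID if your copy lives elsewhere.

```python
# Minimal sketch: masked-token prediction with HeBERT.
# Assumes the checkpoint is available on the Hugging Face Hub as "avichr/heBERT".
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="avichr/heBERT", tokenizer="avichr/heBERT")

# Hebrew sentence with one [MASK] token ("Paris is the [MASK] city of France").
for prediction in fill_mask("פריז היא עיר ה[MASK] של צרפת."):
    print(prediction["token_str"], round(prediction["score"], 3))
```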

Model Features

Hebrew-specific optimization
Pre-trained specifically for Hebrew language characteristics, outperforming general multilingual models in Hebrew NLP tasks
High-quality sentiment annotation data
Uses crowdsourced annotation data validated by Krippendorff's alpha coefficient to ensure the reliability of sentiment labels
Multi-task support
The same architecture supports multiple downstream tasks such as masked prediction, sentiment analysis, and named entity recognition

Model Capabilities

Text sentiment analysis
Named entity recognition (see the sketch after this list)
Masked language modeling
Emotion classification
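For the named entity recognition capability listed above, a token-classification pipeline is the natural fit. The checkpoint name avichr/heBERT_NER below is an assumption used for illustration; substitute whichever NER fine-tune of HeBERT you actually deploy.

```python
# Hypothetical sketch: Hebrew NER with a HeBERT-based token-classification model.
# The model ID "avichr/heBERT_NER" is assumed for illustration only.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="avichr/heBERT_NER",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entities
)

# Example sentence: "Benjamin Netanyahu visited Jerusalem yesterday."
for entity in ner("בנימין נתניהו ביקר אתמול בירושלים."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```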

Use Cases

Social media analysis
News comment section sentiment monitoring
Analyzes sentiment tendencies in Hebrew news website comments
Can identify eight basic emotions, such as anger and happiness
Business intelligence
Hebrew product review analysis
Automatically classifies the sentiment polarity of user reviews
Provides positive/negative sentiment scores (see the sketch after this list)
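The review-analysis use case above can be sketched with the text-classification pipeline. The checkpoint name avichr/heBERT_sentiment_analysis and the exact label set are assumptions; the fine-tune you use may expose different identifiers or labels.

```python
# Minimal sketch: polarity scores for Hebrew product reviews.
# The model ID "avichr/heBERT_sentiment_analysis" is assumed for illustration.
from transformers import pipeline

sentiment = pipeline(
    "sentiment-analysis",
    model="avichr/heBERT_sentiment_analysis",
    top_k=None,  # return a score for every polarity label, not just the top one
)

reviews = [
    "המוצר הגיע מהר והאיכות מצוינת",    # "arrived quickly, excellent quality"
    "ההזמנה התעכבה והמוצר הגיע פגום",  # "the order was delayed and arrived damaged"
]
for review, scores in zip(reviews, sentiment(reviews)):
    print(review, scores)
```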