Fake News Detection Model
This model is designed to classify news articles as real or fake based on their textual content, offering high-accuracy predictions.
Quick Start
To use this model, you can interact with it via the Hugging Face Inference API or integrate it into your Python-based applications.
Basic Usage
```python
import requests

# Hugging Face Inference API endpoint for the hosted model.
url = "https://api-inference.huggingface.co/models/your-username/fake-news-bert"
headers = {"Authorization": "Bearer YOUR_HUGGINGFACE_API_KEY"}

# The article text to classify is sent as the "inputs" field of the JSON payload.
payload = {"inputs": "The news article content here"}

response = requests.post(url, headers=headers, json=payload)
prediction = response.json()
print(f"Prediction: {prediction}")
```
Features
This model uses a BERT-based transformer model (`bert-base-uncased`) fine-tuned on a custom dataset of news articles. It can accurately predict whether a given news article is fake or real.
Installation
The Quick Start example above only requires the `requests` package (`pip install requests`). To run the model locally, as in the usage example below, install the `transformers` library and a backend such as PyTorch (`pip install transformers torch`).
Usage Examples
The Quick Start example above shows how to query the model through the Inference API. You can also run the model locally with the `transformers` library, as sketched below.
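A minimal local-inference sketch using the `transformers` pipeline, assuming the checkpoint is published under the placeholder ID `your-username/fake-news-bert` from the Quick Start:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint locally (placeholder repo ID; replace with the
# actual model name). The weights are downloaded and cached on first use.
classifier = pipeline(
    "text-classification",
    model="your-username/fake-news-bert",
)

article = "The news article content here"
result = classifier(article)[0]

# The label names ("fake"/"real" vs. "LABEL_0"/"LABEL_1") depend on the id2label
# mapping saved with the model; inspect classifier.model.config.id2label.
print(f"{result['label']} ({result['score']:.2%})")
```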
Documentation
Model Overview
This model is designed to classify news articles as real or fake based on their textual content. It uses a BERT-based transformer model (`bert-base-uncased`) fine-tuned on a custom dataset of news articles. The model predicts whether a given article is fake or real with high accuracy.
Datasets Used
The model was trained on a variety of datasets, including:
- Fake News Dataset: Contains labeled news articles with "fake" or "real" classifications.
- News Articles Dataset: A collection of news articles used for training and validation.
Languages
The model primarily works with English-language news articles, but it could be extended to other languages with appropriate data.
Metrics
The model's performance was evaluated on the following metrics:
| Metric | Value |
|--------|-------|
| Accuracy | 99.58% |
| Precision | 99.27% |
| Recall | 99.88% |
| ROC-AUC | 99.99% |
| F1-Score | 99.57% |
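For context, metrics like these are typically computed on a held-out test set with scikit-learn. The snippet below is only an illustration with placeholder arrays, not the original evaluation code; `y_prob` holds the model's probability for the positive ("fake") class:

```python
from sklearn.metrics import (
    accuracy_score,
    precision_score,
    recall_score,
    roc_auc_score,
    f1_score,
)

# Illustrative placeholders: ground-truth labels, predicted labels, and the
# model's predicted probability for the "fake" class on a held-out test set.
y_true = [1, 0, 1, 1, 0]
y_pred = [1, 0, 1, 0, 0]
y_prob = [0.98, 0.03, 0.91, 0.45, 0.12]

print(f"Accuracy : {accuracy_score(y_true, y_pred):.4f}")
print(f"Precision: {precision_score(y_true, y_pred):.4f}")
print(f"Recall   : {recall_score(y_true, y_pred):.4f}")
print(f"ROC-AUC  : {roc_auc_score(y_true, y_prob):.4f}")
print(f"F1-Score : {f1_score(y_true, y_pred):.4f}")
```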
Model Details
| Property | Details |
|----------|---------|
| Base Model | `bert-base-uncased` |
| Fine-Tuning | Fine-tuned on a news dataset with labeled examples of real and fake news. |
| Training Epochs | 3 |
| Batch Size | 32 |
| Optimizer | Adam with weight decay |
| Learning Rate | 2e-5 |
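These hyperparameters map directly onto `transformers.TrainingArguments`. The sketch below shows one way the fine-tuning could be reproduced; the tiny in-memory dataset, the label order, and the `weight_decay` value are illustrative assumptions, not the original training setup:

```python
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Start from bert-base-uncased with a two-class classification head.
model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Placeholder dataset standing in for the labeled news articles described above;
# the 0 = real / 1 = fake ordering here is illustrative.
raw = Dataset.from_dict({
    "text": ["An example real news article.", "An example fake news article."],
    "label": [0, 1],
})
dataset = raw.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
)

# Hyperparameters from the table: 3 epochs, batch size 32, learning rate 2e-5.
# Trainer uses AdamW (Adam with weight decay) by default; the decay value below
# is an assumption, as the card does not state it.
training_args = TrainingArguments(
    output_dir="fake-news-bert",
    num_train_epochs=3,
    per_device_train_batch_size=32,
    learning_rate=2e-5,
    weight_decay=0.01,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=dataset,
    tokenizer=tokenizer,  # enables dynamic padding via the default data collator
)
trainer.train()
```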
Technical Details
The model uses a BERT-based architecture fine-tuned on a news dataset. Fine-tuning adjusts the pre-trained model's weights to fit the task of news classification. Performance is evaluated with accuracy, precision, recall, ROC-AUC, and F1-score.
License
This model is licensed under the Apache 2.0 License.