This project focuses on fake news classification using two distinct approaches: fine-tuning the DistilBERT model, and combining XGBoost with DistilBERT embeddings. DistilBERT, a lightweight distilled version of BERT, is fine-tuned to directly classify news articles as either fake or real. In the second approach, sentence embeddings are extracted from DistilBERT and used as input features for an XGBoost classifier, which performs the same classification task. The project compares the performance of both models on accuracy, precision, recall, and F1-score to identify the better method for detecting fake news.
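The second approach and the evaluation step can be sketched as follows. This is a minimal, hedged illustration, not the repository's actual code: synthetic 768-dimensional vectors stand in for DistilBERT sentence embeddings, and scikit-learn's `GradientBoostingClassifier` is used as a lightweight stand-in for XGBoost so the example runs without extra dependencies.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score
from sklearn.model_selection import train_test_split

# Synthetic 768-dim feature vectors stand in for DistilBERT sentence embeddings;
# labels 0/1 stand in for real/fake news articles.
X, y = make_classification(
    n_samples=400, n_features=768, n_informative=20, random_state=0
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Gradient boosting on the embedding features (stand-in for XGBoost).
clf = GradientBoostingClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)
pred = clf.predict(X_test)

# The same four metrics the project uses to compare both models.
metrics = {
    "accuracy": accuracy_score(y_test, pred),
    "precision": precision_score(y_test, pred),
    "recall": recall_score(y_test, pred),
    "f1": f1_score(y_test, pred),
}
print(metrics)
```

In the real pipeline, `X` would be produced by running each article through DistilBERT (e.g. mean-pooling the last hidden states) rather than generated synthetically.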
Hawawou/fake_news_classification