Hawawou/fake_news_classification

This project focuses on fake news classification using two distinct approaches: fine-tuning the DistilBERT model, and training an XGBoost classifier on DistilBERT embeddings. DistilBERT, a lightweight version of BERT, is fine-tuned to directly classify news articles as either fake or real. In the second approach, sentence embeddings are extracted from a frozen DistilBERT model and used as input features for an XGBoost classifier, which performs the same classification task. The project compares both models on accuracy, precision, recall, and F1-score to identify the better method for detecting fake news. Illustrative sketches of the two approaches follow.
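
A minimal sketch of the first approach (fine-tuning DistilBERT as a sequence classifier with Hugging Face `transformers`). The data file `news.csv` and its `text`/`label` columns are placeholder assumptions, not necessarily the files used in this repo:

```python
import numpy as np
from datasets import load_dataset
from sklearn.metrics import accuracy_score, precision_recall_fscore_support
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

MODEL_NAME = "distilbert-base-uncased"

# Load a CSV with "text" and "label" (0 = real, 1 = fake assumed) columns
# and hold out 20% for evaluation.
data = load_dataset("csv", data_files="news.csv")["train"].train_test_split(
    test_size=0.2, seed=42
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)

def tokenize(batch):
    # Truncate/pad each article to DistilBERT's 512-token limit.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=512)

data = data.map(tokenize, batched=True)

def compute_metrics(eval_pred):
    # The metrics the project compares across both models.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(labels, preds, average="binary")
    return {"accuracy": accuracy_score(labels, preds),
            "precision": precision, "recall": recall, "f1": f1}

model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="distilbert-fake-news",
                           num_train_epochs=3,
                           per_device_train_batch_size=16),
    train_dataset=data["train"],
    eval_dataset=data["test"],
    compute_metrics=compute_metrics,
)

trainer.train()
print(trainer.evaluate())  # accuracy, precision, recall, F1 on the held-out split
```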

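And a sketch of the second approach: a frozen DistilBERT model produces one embedding per article (here the [CLS] token's last hidden state, one possible pooling choice), and an XGBoost classifier is trained on those vectors. Data loading and column names are again placeholders:

```python
import numpy as np
import pandas as pd
import torch
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from transformers import AutoModel, AutoTokenizer
from xgboost import XGBClassifier

MODEL_NAME = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME).eval()

@torch.no_grad()
def embed(texts, batch_size=32):
    """Return one 768-dim embedding per text (the [CLS] position of the last hidden state)."""
    vectors = []
    for i in range(0, len(texts), batch_size):
        enc = tokenizer(texts[i:i + batch_size], truncation=True, padding=True,
                        max_length=512, return_tensors="pt")
        out = model(**enc).last_hidden_state[:, 0, :]  # [CLS] position
        vectors.append(out.cpu().numpy())
    return np.vstack(vectors)

df = pd.read_csv("news.csv")  # placeholder file name
X = embed(df["text"].tolist())
y = df["label"].values

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

clf = XGBClassifier(n_estimators=300, max_depth=6, learning_rate=0.1)
clf.fit(X_train, y_train)

# Same evaluation metrics as the fine-tuned model; label names assume 0 = real, 1 = fake.
print(classification_report(y_test, clf.predict(X_test), target_names=["real", "fake"]))
```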