Please use this identifier to cite or link to this item: https://hdl.handle.net/11499/48490
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Bozuyla, Mehmet | - |
dc.date.accessioned | 2023-01-09T21:37:59Z | - |
dc.date.available | 2023-01-09T21:37:59Z | - |
dc.date.issued | 2022 | - |
dc.identifier.issn | 2667-8055 | - |
dc.identifier.uri | https://doi.org/10.36306/konjes.995060 | - |
dc.identifier.uri | https://search.trdizin.gov.tr/yayin/detay/1121074 | - |
dc.identifier.uri | https://hdl.handle.net/11499/48490 | - |
dc.description.abstract | The increasing use of social media and the internet generates a large amount of information to be analyzed from various perspectives. Fake news, in particular, is false news presented as factual, and it is generally fabricated with a manipulative aim. Fake news identification is essentially a natural language processing problem, and machine learning algorithms have emerged as automated predictors for it. Well-known machine learning algorithms such as Naïve Bayes (NB) and Random Forest (RF) have been applied successfully to the fake news identification problem. Turkish is a morphologically rich, agglutinative language, which normally demands extensive language pre-processing and feature selection. Recent neural language models such as Bidirectional Encoder Representations from Transformers (BERT) offer morphologically rich languages like Turkish a relatively straightforward pipeline for solving natural language problems. In this work, we compared NB, RF, Support Vector Machine (SVM), Naïve Bayes Multinomial (NBM) and Logistic Regression (LR) on top of correlation-based feature selection against the newly proposed Turkish BERT model (BERTurk) for identifying Turkish fake news, and we obtained 99.90 % accuracy in fake news identification with a highly efficient model that requires no substantial language pre-processing. | en_US |
dc.language.iso | en | en_US |
dc.relation.ispartof | Konya Mühendislik Bilimleri Dergisi (Online) | en_US |
dc.rights | info:eu-repo/semantics/openAccess | en_US |
dc.subject | Machine learning | en_US |
dc.subject | Text mining | en_US |
dc.subject | Bidirectional Encoder Representations from Transformers (BERT) | en_US |
dc.subject | Fake news | en_US |
dc.subject | BERTurk | en_US |
dc.title | Advanced Turkish fake news prediction with bidirectional encoder representations from transformers | en_US |
dc.type | Article | en_US |
dc.identifier.volume | 10 | en_US |
dc.identifier.issue | 3 | en_US |
dc.identifier.startpage | 750 | en_US |
dc.identifier.endpage | 761 | en_US |
dc.department | PAU | en_US |
dc.identifier.doi | 10.36306/konjes.995060 | - |
dc.relation.publicationcategory | Article - National Peer-Reviewed Journal - Institutional Faculty Member (Makale - Ulusal Hakemli Dergi - Kurum Öğretim Elemanı) | en_US |
dc.identifier.trdizinid | 1121074 | en_US |
dc.identifier.wos | WOS:001313258400017 | - |
item.fulltext | With Fulltext | - |
item.languageiso639-1 | en | - |
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | - |
item.openairetype | Article | - |
item.grantfulltext | open | - |
item.cerifentitytype | Publications | - |
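The abstract above summarizes two routes: classical classifiers (NB, RF, SVM, NBM, LR) applied on top of correlation-based feature selection, and the BERTurk transformer model. Below is a minimal, illustrative sketch of both routes in Python. It is not the authors' code: the toy in-line texts, the chi-squared selector (used here as a stand-in for correlation-based feature selection, which scikit-learn does not ship), the Multinomial NB baseline, the `dbmdz/bert-base-turkish-cased` checkpoint choice, and all hyperparameters are assumptions made for illustration only.

```python
# Illustrative sketch only; not the setup described in the paper.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import Pipeline

# Placeholder Turkish snippets with binary labels (1 = fake, 0 = real).
texts = ["ornek sahte haber metni", "ornek gercek haber metni",
         "bir baska sahte haber", "bir baska gercek haber"]
labels = [1, 0, 1, 0]

# (1) Classical baseline: TF-IDF features -> top-k feature selection -> Naive Bayes.
#     chi2 is an assumed stand-in for the paper's correlation-based feature selection.
baseline = Pipeline([
    ("tfidf", TfidfVectorizer()),
    ("select", SelectKBest(chi2, k=5)),   # k chosen arbitrarily for this toy example
    ("clf", MultinomialNB()),
])
baseline.fit(texts, labels)
print("baseline prediction:", baseline.predict(["yeni bir haber metni"]))

# (2) Transformer route: tokenize with the public BERTurk checkpoint and score with a
#     sequence-classification head (randomly initialised until it is fine-tuned).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "dbmdz/bert-base-turkish-cased"   # public BERTurk model on Hugging Face
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

batch = tokenizer(texts, padding=True, truncation=True, max_length=64,
                  return_tensors="pt")
with torch.no_grad():
    logits = model(**batch).logits
print("BERTurk logits shape:", logits.shape)   # (4, 2): one score pair per text
```

In practice the transformer's classification head would be fine-tuned on the labeled corpus before its logits are used for prediction; the sketch only shows the pipeline shape.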
Appears in Collections:
Mühendislik Fakültesi Koleksiyonu
TR Dizin İndeksli Yayınlar Koleksiyonu / TR Dizin Indexed Publications Collection
Files in This Item:
File | Size | Format |
---|---|---|
document (10).pdf | 985.7 kB | Adobe PDF |
Page view(s): 46 (checked on Aug 24, 2024)
Download(s): 4 (checked on Aug 24, 2024)
Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.