Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.11851/12724
Full metadata record
DC Field | Value | Language
dc.contributor.authorTeke, Saziye Hande-
dc.contributor.authorYilmaz, Muhammed-
dc.contributor.authorOzbayoglu, A. Murat-
dc.date.accessioned2025-10-10T15:47:28Z-
dc.date.available2025-10-10T15:47:28Z-
dc.date.issued2025-
dc.identifier.isbn9798331566555-
dc.identifier.urihttps://doi.org/10.1109/SIU66497.2025.11112324-
dc.identifier.urihttps://hdl.handle.net/20.500.11851/12724-
dc.descriptionIsik Universityen_US
dc.description.abstractThis study proposes a new self-knowledge distillation method to enhance the classification performance of deep learning models. Unlike traditional knowledge distillation, which relies on the softmax outputs of a teacher model, the proposed method provides greater flexibility in learning: whereas standard distillation constrains the student model to mimic the teacher, the proposed approach integrates the teacher's predictions as an additional input, allowing the model to preserve its own learning dynamics. This balances the teacher's guidance with independent feature learning, strengthening decision boundaries. The method was tested on the CIFAR-10 and CIFAR-100 datasets and evaluated using a basic convolutional neural network and the ResNet50 architecture. The results demonstrate that the method offers a valuable performance improvement in applications where classification accuracy is crucial.en_US
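The core idea in the abstract — feeding the teacher's softmax output to the student as an extra input rather than forcing the student to match it with a distillation loss — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the linear student head `W` and all shapes are hypothetical, chosen only to show the input-concatenation step.

```python
import numpy as np

def softmax(z, axis=-1):
    """Numerically stable softmax."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
n_samples, n_features, n_classes = 4, 8, 3

x = rng.normal(size=(n_samples, n_features))            # input features
teacher_logits = rng.normal(size=(n_samples, n_classes))
teacher_probs = softmax(teacher_logits)                 # teacher softmax "features"

# Instead of a KL-divergence term pulling the student toward the teacher's
# distribution, the teacher's softmax output is concatenated onto the
# student's input; the student keeps its ordinary cross-entropy objective
# on the true labels and its own learning dynamics.
student_input = np.concatenate([x, teacher_probs], axis=1)

# Hypothetical linear student head, for illustration only.
W = rng.normal(size=(n_features + n_classes, n_classes))
student_probs = softmax(student_input @ W)
```

Here `student_input` has `n_features + n_classes` columns (8 + 3 = 11 in this toy setup), and `student_probs` rows sum to 1 as valid class distributions.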
dc.language.isotren_US
dc.publisherInstitute of Electrical and Electronics Engineers Inc.en_US
dc.relation.ispartof33rd IEEE Conference on Signal Processing and Communications Applications, SIU 2025 -- Istanbul; Isik University Sile Campusen_US
dc.rightsinfo:eu-repo/semantics/closedAccessen_US
dc.subjectConvolutional Neural Networken_US
dc.subjectDeep Learningen_US
dc.subjectImage Classificationen_US
dc.subjectSelf Knowledge Distillationen_US
dc.subjectSoftmaxen_US
dc.subjectClassification (Of Information)en_US
dc.subjectConvolutionen_US
dc.subjectConvolutional Neural Networksen_US
dc.subjectDeep Neural Networksen_US
dc.subjectDistillationen_US
dc.subjectLearning Systemsen_US
dc.subjectTeachingen_US
dc.subjectClassification Performanceen_US
dc.subjectDistillation Methoden_US
dc.subjectLearning Modelsen_US
dc.subjectNeural-Networksen_US
dc.subjectTeachers'en_US
dc.titleSoftmax Öznitelikleri ile Özbilgi Damıtma Yoluyla Derin Sinir Ağlarının Görüntü Sınıflandırma Performansının İyileştirilmesien_US
dc.title.alternativeImproving Image Classification Performance of Deep Neural Networks Through Self-Knowledge Distillation With Softmax Featuresen_US
dc.typeConference Objecten_US
dc.departmentTOBB University of Economics and Technologyen_US
dc.identifier.scopus2-s2.0-105015528740-
dc.identifier.doi10.1109/SIU66497.2025.11112324-
dc.authorscopusid60093147600-
dc.authorscopusid57221948826-
dc.authorscopusid57947593100-
dc.relation.publicationcategoryKonferans Öğesi - Uluslararası - Kurum Öğretim Elemanı / Conference Item - International - Institutional Faculty Memberen_US
dc.identifier.scopusqualityN/A-
dc.identifier.wosqualityN/A-
item.languageiso639-1tr-
item.openairetypeConference Object-
item.grantfulltextnone-
item.openairecristypehttp://purl.org/coar/resource_type/c_18cf-
item.cerifentitytypePublications-
item.fulltextNo Fulltext-
Appears in Collections:Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.