Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.11851/5526
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Şener, B.
dc.contributor.author: Güdelek, M. U.
dc.contributor.author: Özbayoğlu, Ahmet Murat
dc.contributor.author: Ünver, Hakkı Özgür
dc.date.accessioned: 2021-09-11T15:19:10Z
dc.date.available: 2021-09-11T15:19:10Z
dc.date.issued: 2021 (en_US)
dc.identifier.issn: 0263-2241
dc.identifier.uri: https://doi.org/10.1016/j.measurement.2021.109689
dc.identifier.uri: https://hdl.handle.net/20.500.11851/5526
dc.description.abstract: Regenerative chatter is harmful to machining operations and must be avoided to increase production efficiency. The recent success of deep learning methods in many fields also presents an excellent opportunity to advance chatter detection and its wider industrial adoption. In this work, a chatter detection method based on a deep convolutional neural network (DCNN) is presented. The method uses a cardinal model-based chatter solution to precisely label regenerative chatter levels. During milling, vibration data are collected via a non-invasive data acquisition strategy. Given the nonlinear and non-stationary characteristics of chatter, the continuous wavelet transform (CWT) is used as the pre-processing technique to reveal chatter-rich information. Afterward, the resulting images are used for training and testing of the developed DCNN. Validation of the method revealed that when cutting parameters are also included as input features to the DCNN, average accuracy reached 99.88%. © 2021 Elsevier Ltd (en_US)
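The abstract's pre-processing step — turning a vibration signal into a CWT scalogram image that a CNN can classify — can be sketched as follows. This is a minimal illustration only, not the paper's implementation: it uses a frequency-domain Morlet CWT with a simplified normalization, a synthetic two-tone signal standing in for real milling vibration data, and assumed values for the sampling rate, frequency band, and scalogram height.

```python
import numpy as np

def morlet_scalogram(signal, scales, dt, w0=6.0):
    """Magnitude of the continuous wavelet transform, computed with a
    Morlet mother wavelet in the frequency domain. Each row corresponds
    to one scale, so the result is a time-frequency image suitable as
    CNN input. Normalization is simplified for illustration."""
    n = len(signal)
    omega = 2.0 * np.pi * np.fft.fftfreq(n, dt)   # angular frequency grid
    sig_hat = np.fft.fft(signal)
    out = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        # Analytic Morlet wavelet in the frequency domain (positive freqs only)
        psi_hat = (np.pi ** -0.25) * np.exp(-0.5 * (s * omega - w0) ** 2) * (omega > 0)
        out[i] = np.abs(np.fft.ifft(sig_hat * psi_hat) * np.sqrt(s))
    return out

# Toy "vibration" record (assumed, not the paper's data): a steady 50 Hz
# component, plus a chatter-like 120 Hz burst in the second half.
fs = 1000.0                           # sampling rate in Hz (assumed)
t = np.arange(1000) / fs              # 1 second of data
sig = np.sin(2 * np.pi * 50.0 * t)
sig[500:] += 0.8 * np.sin(2 * np.pi * 120.0 * t[500:])

freqs = np.linspace(20.0, 200.0, 32)  # band of interest in Hz (assumed)
scales = w0_over = 6.0 / (2.0 * np.pi * freqs)   # Morlet scale <-> frequency
scalogram = morlet_scalogram(sig, scales, dt=1.0 / fs)
print(scalogram.shape)                # (32, 1000)
```

Each such image (optionally stacked with cutting parameters, as the abstract notes) would then be fed to the DCNN classifier; the burst shows up as added energy in the high-frequency rows of the second half of the image.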
dc.description.sponsorship: Türkiye Bilimsel ve Teknolojik Araştırma Kurumu: 118M414 (en_US)
dc.language.iso: en (en_US)
dc.publisher: Elsevier B.V. (en_US)
dc.relation.ispartof: Measurement: Journal of the International Measurement Confederation (en_US)
dc.rights: info:eu-repo/semantics/closedAccess (en_US)
dc.subject: CWT (en_US)
dc.subject: DCNN (en_US)
dc.subject: Deep learning (en_US)
dc.subject: Milling (en_US)
dc.subject: Regenerative chatter (en_US)
dc.title: A novel chatter detection method for milling using deep convolution neural networks (en_US)
dc.type: Article (en_US)
dc.department: Faculties, Faculty of Engineering, Department of Computer Engineering (en_US)
dc.department: Faculties, Faculty of Engineering, Department of Mechanical Engineering (en_US)
dc.department: Faculties, Faculty of Engineering, Department of Artificial Intelligence Engineering (en_US)
dc.department: Fakülteler, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü (tr_TR)
dc.department: Fakülteler, Mühendislik Fakültesi, Makine Mühendisliği Bölümü (tr_TR)
dc.department: Fakülteler, Mühendislik Fakültesi, Yapay Zeka Mühendisliği Bölümü (tr_TR)
dc.identifier.volume: 182 (en_US)
dc.identifier.wos: WOS:000684200500008 (en_US)
dc.identifier.scopus: 2-s2.0-85107718180 (en_US)
dc.institutionauthor: Güdelek, Mehmet Uğur
dc.institutionauthor: Özbayoğlu, Ahmet Murat
dc.institutionauthor: Ünver, Hakkı Özgür
dc.identifier.doi: 10.1016/j.measurement.2021.109689
dc.relation.publicationcategory: Article - International Refereed Journal - Institutional Academic Staff (en_US)
dc.identifier.scopusquality: Q1
item.openairetype: Article
item.languageiso639-1: en
item.grantfulltext: none
item.fulltext: No Fulltext
item.openairecristype: http://purl.org/coar/resource_type/c_18cf
item.cerifentitytype: Publications
crisitem.author.dept: 02.1. Department of Artificial Intelligence Engineering
crisitem.author.dept: 02.7. Department of Mechanical Engineering
Appears in Collections:
Bilgisayar Mühendisliği Bölümü / Department of Computer Engineering
Makine Mühendisliği Bölümü / Department of Mechanical Engineering
Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
Yapay Zeka Mühendisliği Bölümü / Department of Artificial Intelligence Engineering
Web of Science™ citations: 44 (checked on Dec 21, 2024)
Page view(s): 364 (checked on Dec 23, 2024)

Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.