Convolution smoothing and non-convex regularization for support vector machine in high dimensions

dc.authorid0000-0002-7201-6963
dc.authorid0000-0003-1840-9958
dc.contributor.authorWang, Kangning
dc.contributor.authorYang, Junning
dc.contributor.authorPolat, Kemal
dc.contributor.authorAlhudhaif, Adi
dc.contributor.authorSun, Xiaofei
dc.date.accessioned2024-09-25T20:00:02Z
dc.date.available2024-09-25T20:00:02Z
dc.date.issued2024
dc.departmentBAİBÜ, Faculty of Engineering, Department of Electrical and Electronics Engineeringen_US
dc.description.abstractThe support vector machine (SVM) is a well-known statistical learning tool for binary classification. One serious drawback of the SVM is that it can be adversely affected by redundant variables, and research has shown that variable selection is crucial for achieving good classification accuracy. Accordingly, a number of SVM variable selection methods have been developed, and they share a unified formulation of an empirical hinge loss plus a sparse penalty. However, the computational complexity of existing methods is high, especially for large-scale problems, owing to the non-smoothness of the hinge loss. To address this issue, we first propose a convolution smoothing approach that replaces the non-smooth hinge loss with a smooth surrogate that is asymptotically equivalent to it. We then construct a computationally more efficient SVM variable selection procedure by minimizing a non-convex penalized, convolution-smoothed hinge loss. In theory, we prove that the resulting variable selection procedure possesses the oracle property when the number of predictors diverges. Numerical experiments confirm the good performance of the new method.en_US
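The abstract combines two ingredients: a kernel-convolution-smoothed hinge loss and a non-convex sparsity penalty. The following minimal Python sketch illustrates those ingredients only; it is not the authors' implementation. The Gaussian kernel, the bandwidth, the SCAD penalty as the non-convex choice, and all function names are assumptions made for illustration.

```python
# Illustrative sketch (not the paper's code): a Gaussian-convolution-smoothed
# hinge loss plus a SCAD penalty, the two ingredients described in the abstract.
import numpy as np
from scipy.stats import norm

def smoothed_hinge(margins, h=0.5):
    """Hinge loss max(0, 1 - u) convolved with a Gaussian kernel of bandwidth h.

    Closed form: E[max(0, a + h*Z)] = a*Phi(a/h) + h*phi(a/h) with a = 1 - u
    and Z standard normal; as h -> 0 this recovers the ordinary hinge loss.
    """
    a = 1.0 - margins
    return a * norm.cdf(a / h) + h * norm.pdf(a / h)

def scad_penalty(beta, lam=0.1, a=3.7):
    """SCAD penalty, a standard non-convex sparsity penalty (a = 3.7 is customary)."""
    t = np.abs(beta)
    p1 = lam * t
    p2 = (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1))
    p3 = lam**2 * (a + 1) / 2
    return np.where(t <= lam, p1, np.where(t <= a * lam, p2, p3)).sum()

def objective(beta, X, y, h=0.5, lam=0.1):
    """Penalized empirical risk: mean smoothed hinge loss plus SCAD penalty."""
    margins = y * (X @ beta)
    return smoothed_hinge(margins, h).mean() + scad_penalty(beta, lam)

# Toy usage: two informative predictors among many redundant ones.
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta_true = np.zeros(p); beta_true[:2] = [1.5, -1.0]
y = np.sign(X @ beta_true + 0.3 * rng.standard_normal(n))
print(objective(np.zeros(p), X, y), objective(beta_true, X, y))
```

Because the smoothed objective is differentiable, a gradient-based solver can be applied to it directly, which is the computational advantage the abstract attributes to convolution smoothing; the choice of solver here is left open.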
dc.description.sponsorshipNNSF project of China [11901356]en_US
dc.description.sponsorshipThe research was supported by the NNSF project of China (11901356).en_US
dc.identifier.doi10.1016/j.asoc.2024.111433
dc.identifier.issn1568-4946
dc.identifier.issn1872-9681
dc.identifier.scopus2-s2.0-85186633514en_US
dc.identifier.scopusqualityQ1en_US
dc.identifier.urihttps://doi.org/10.1016/j.asoc.2024.111433
dc.identifier.urihttps://hdl.handle.net/20.500.12491/14050
dc.identifier.volume155en_US
dc.identifier.wosWOS:001195152500001en_US
dc.identifier.wosqualityN/Aen_US
dc.indekslendigikaynakWeb of Scienceen_US
dc.indekslendigikaynakScopusen_US
dc.institutionauthorPolat, Kemal
dc.institutionauthorid0000-0003-1840-9958
dc.language.isoenen_US
dc.publisherElsevieren_US
dc.relation.ispartofApplied Soft Computingen_US
dc.relation.publicationcategoryMakale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanıen_US
dc.rightsinfo:eu-repo/semantics/closedAccessen_US
dc.snmzYK_20240925en_US
dc.subjectSupport Vector Machineen_US
dc.subjectConvolution-Type Smoothingen_US
dc.subjectHigh Dimensionalityen_US
dc.subjectPenalized Learningen_US
dc.subjectSmooth Hinge Lossen_US
dc.subjectSparse Penalty Formulationen_US
dc.titleConvolution smoothing and non-convex regularization for support vector machine in high dimensionsen_US
dc.typeArticleen_US

Files

Original bundle
Listing 1 - 1 of 1
Name: kangning-wang.pdf
Size: 801.96 KB
Format: Adobe Portable Document Format
Description: Full text