Convolution smoothing and non-convex regularization for support vector machine in high dimensions
dc.authorid | 0000-0002-7201-6963 | |
dc.authorid | 0000-0003-1840-9958 | |
dc.contributor.author | Wang, Kangning | |
dc.contributor.author | Yang, Junning | |
dc.contributor.author | Polat, Kemal | |
dc.contributor.author | Alhudhaif, Adi | |
dc.contributor.author | Sun, Xiaofei | |
dc.date.accessioned | 2024-09-25T20:00:02Z | |
dc.date.available | 2024-09-25T20:00:02Z | |
dc.date.issued | 2024 | |
dc.department | BAİBÜ, Faculty of Engineering, Department of Electrical and Electronics Engineering | en_US |
dc.description.abstract | The support vector machine (SVM) is a well-known statistical learning tool for binary classification. One serious drawback of SVM is that it can be adversely affected by redundant variables, and research has shown that variable selection is crucial for achieving good classification accuracy. Hence, several SVM variable selection studies have been devoted to this problem, and they share a unified formulation of empirical hinge loss plus a sparse penalty. However, a noteworthy issue is that the computational complexity of existing methods is high, especially for large-scale problems, due to the non-smoothness of the hinge loss. To solve this issue, we first propose a convolution smoothing approach, which turns the non-smooth hinge loss into a smooth surrogate that is asymptotically equivalent to it. Moreover, we construct a computationally more efficient SVM variable selection procedure by implementing a non-convex penalized convolution-smoothed hinge loss. In theory, we prove that the resulting variable selection procedure possesses the oracle property when the number of predictors is diverging. Numerical experiments also confirm the good performance of the new method. | en_US |
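The abstract describes smoothing the non-smooth hinge loss by convolving it with a kernel. The paper's exact surrogate is not reproduced in this record; as an illustrative sketch only, assuming a uniform kernel on [-h, h], the convolution of the hinge loss max(0, 1 - u) with that kernel admits the closed form below (the function name `smoothed_hinge` and the kernel choice are this sketch's assumptions, not the authors' stated construction).

```python
import numpy as np

def hinge(u):
    """Standard (non-smooth) hinge loss: max(0, 1 - u)."""
    return np.maximum(0.0, 1.0 - u)

def smoothed_hinge(u, h):
    """Hinge loss convolved with a uniform kernel on [-h, h].

    Closed form of (1/(2h)) * integral of hinge(u - t) over t in [-h, h]:
      1 - u               for u <= 1 - h   (unchanged linear part)
      (1 + h - u)^2/(4h)  for 1-h < u < 1+h (smooth quadratic transition)
      0                   for u >= 1 + h   (unchanged flat part)
    As h -> 0 this recovers the hinge loss pointwise.
    """
    u = np.asarray(u, dtype=float)
    return np.where(
        u <= 1.0 - h, 1.0 - u,
        np.where(u >= 1.0 + h, 0.0, (1.0 + h - u) ** 2 / (4.0 * h)),
    )
```

The quadratic band of width 2h around the hinge kink at u = 1 is what makes the surrogate differentiable everywhere, which is the property that enables faster smooth-optimization solvers for large-scale problems.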
dc.description.sponsorship | NNSF project of China [11901356] | en_US |
dc.description.sponsorship | The research was supported by NNSF project of China (11901356). | en_US |
dc.identifier.doi | 10.1016/j.asoc.2024.111433 | |
dc.identifier.issn | 1568-4946 | |
dc.identifier.issn | 1872-9681 | |
dc.identifier.scopus | 2-s2.0-85186633514 | en_US |
dc.identifier.scopusquality | Q1 | en_US |
dc.identifier.uri | https://doi.org/10.1016/j.asoc.2024.111433 | |
dc.identifier.uri | https://hdl.handle.net/20.500.12491/14050 | |
dc.identifier.volume | 155 | en_US |
dc.identifier.wos | WOS:001195152500001 | en_US |
dc.identifier.wosquality | N/A | en_US |
dc.indekslendigikaynak | Web of Science | en_US |
dc.indekslendigikaynak | Scopus | en_US |
dc.institutionauthor | Polat, Kemal | |
dc.institutionauthorid | 0000-0003-1840-9958 | |
dc.language.iso | en | en_US |
dc.publisher | Elsevier | en_US |
dc.relation.ispartof | Applied Soft Computing | en_US |
dc.relation.publicationcategory | Article - International Peer-Reviewed Journal - Institutional Faculty Member | en_US |
dc.rights | info:eu-repo/semantics/closedAccess | en_US |
dc.snmz | YK_20240925 | en_US |
dc.subject | Support Vector Machine | en_US |
dc.subject | Convolution-Type Smoothing | en_US |
dc.subject | High Dimensionality | en_US |
dc.subject | Penalized Learning | en_US |
dc.subject | Smooth Hinge Loss | en_US |
dc.subject | Sparse Penalty Formulation | en_US |
dc.title | Convolution smoothing and non-convex regularization for support vector machine in high dimensions | en_US |
dc.type | Article | en_US |
Files
Original bundle
- Name: kangning-wang.pdf
- Size: 801.96 KB
- Format: Adobe Portable Document Format
- Description: Full text