Wednesday, October 15, 2025
Time: 16:10-17:00
Venue: Mathematics Building, Room 527
Speaker: Prof. Chien-Tong Lin (Department of Applied Mathematics, National Sun Yat-sen University)
Title: High-dimensional Model Selection via Chebyshev’s Greedy Algorithm
Abstract:
Chien-Tong Lin¹, Chi-Shian Dai², You-Lin Chen³, and Ching-Kang Ing⁴
¹Feng Chia University
²University of Wisconsin-Madison
³Cognitive Computing Lab, Baidu Research
⁴National Tsing Hua University
Assuming sparsity of the regression coefficients is fundamental to ultra-high-dimensional variable selection. However, the true sparsity of practical data is typically uncertain, making it necessary to devise a variable selection technique that performs well across a range of sparsity settings. In this talk, we investigate the convergence rate of Chebyshev's greedy algorithm (CGA) for regression models in which the true coefficient vector satisfies a general weak sparsity condition. We determine the number of CGA iterations using a data-driven approach we develop, and show that the optimal convergence rate can be achieved even when the actual sparsity level is unknown. Our convergence theory relies only on the convexity and smoothness of the population loss function, allowing the analysis of a broad family of regression models and providing optimality guarantees under weak assumptions. As a specific example, we apply our method to generalized linear models (GLMs) and give sufficient conditions under which the optimal rate is achieved. Thorough simulation studies and real-data analyses are provided to support the theory.
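For intuition, the greedy selection step of CGA can be sketched in the special case of squared-error loss, where it coincides with the orthogonal greedy algorithm: at each iteration, pick the predictor whose gradient of the empirical loss is largest in absolute value, then refit over all selected predictors. The function below is an illustrative sketch under these assumptions (fixed iteration count, least-squares loss), not the paper's exact procedure or its data-driven stopping rule:

```python
import numpy as np

def cga_least_squares(X, y, max_iter=10):
    """Sketch of Chebyshev's greedy algorithm for squared-error loss.

    At each step, select the coordinate with the largest absolute
    gradient of the empirical loss at the current fit, then minimize
    the loss over the span of all selected predictors (a refit).
    """
    n, p = X.shape
    selected = []
    residual = y.copy()
    beta = np.zeros(0)
    for _ in range(max_iter):
        # Gradient of the empirical squared loss w.r.t. each coefficient
        grad = X.T @ residual / n
        # Exclude predictors already selected, then take the max |gradient|
        grad[selected] = 0.0
        j = int(np.argmax(np.abs(grad)))
        selected.append(j)
        # Refit: minimize the loss over the selected predictors
        beta, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
        residual = y - X[:, selected] @ beta
    return selected, beta
```

In the talk's general setting, the squared loss is replaced by the empirical loss of the chosen regression model (e.g. a GLM negative log-likelihood), and `max_iter` is chosen by the data-driven criterion rather than fixed in advance.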
Keywords: Variable selection, Weak sparsity, Chebyshev's Greedy Algorithm, High-dimensional Akaike's Information Criterion.