A Foray into Parallel Optimisation Algorithms for High Dimension Low Sample Size Generalized Distance Weighted Discrimination Problems

Srivathsan Amruth, Xin Yee Lam

High dimension, low sample size (HDLSS) data are prevalent in many fields of study. Recently there has been increased focus on using machine learning and statistical methods to mine valuable information from such data sets, and hence growing interest in efficient learning in high dimensions. Naturally, as the dimension of the input data increases, the learning task becomes more difficult due to increasing computational and statistical complexity. This makes it crucial to overcome the curse of dimensionality in a given dataset, within a reasonable time frame, in a bid to obtain the insights required to keep a competitive edge. For HDLSS problems, classical methods such as support vector machines can be applied, but they suffer from data piling at the margin. Questioning the geometric assumptions placed on the input data naturally leads to convex optimisation formulations, and this gives rise to methods such as distance weighted discrimination (DWD), which can be modelled as a second-order cone programming problem and solved by interior-point methods when the sample size and feature dimension of the data are moderate. In this paper, our focus is on designing an even more scalable and robust algorithm for solving large-scale generalized DWD problems.



