Scaling Up Large-scale Sparse Learning and Its Application to Medical Imaging

Abstract Large-scale $\ell_1$-regularized loss minimization problems arise in high-dimensional applications such as compressed sensing and high-dimensional supervised learning, including classification and regression. In many applications, it remains challenging to apply sparse learning models to large-scale problems with massive data samples and high-dimensional features. One popular and promising strategy is to scale up the optimization in parallel. Parallel solvers run multiple cores on a shared-memory system or in a distributed environment to speed up computation, but their practical usage is limited by the huge dimension of the feature space and by synchronization problems.

In this dissertation, I carry out the resea...
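The core problem named in the abstract, $\ell_1$-regularized loss minimization, can be illustrated with a minimal serial solver. The sketch below uses ISTA (iterative soft-thresholding) for the lasso objective $\tfrac{1}{2}\|Ax-b\|_2^2 + \lambda\|x\|_1$; this is a generic textbook method chosen for illustration, not the parallel algorithm developed in the dissertation, and all function names are my own.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1: shrink each entry toward zero by t.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, b, lam, iters=3000):
    # Minimize 0.5 * ||Ax - b||^2 + lam * ||x||_1 via proximal gradient.
    # Step size 1 / sigma_max(A)^2 guarantees convergence for this objective.
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)          # gradient of the smooth least-squares term
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Small synthetic example: recover a 3-sparse signal from noiseless measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 1.5]
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
```

The $\ell_1$ penalty drives most coordinates of `x_hat` exactly to zero, which is the sparsity property the dissertation's large-scale solvers are designed to preserve at scale.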
Created Date 2017
Contributor Li, Qingyang (Author) / Ye, Jieping (Advisor) / Xue, Guoliang (Advisor) / He, Jingrui (Committee member) / Wang, Yalin (Committee member) / Li, Jing (Committee member) / Arizona State University (Publisher)
Subject Computer science / Dictionary Learning / Distributed Computing / Machine Learning / Medical Imaging / Parallel Computing / Sparse Learning
Type Doctoral Dissertation
Extent 125 pages
Language English
Reuse Permissions All Rights Reserved
Note Doctoral Dissertation Computer Science 2017
Collaborating Institutions Graduate College / ASU Library
Additional Formats MODS / OAI Dublin Core / RIS

Full Text 2.9 MB application/pdf
Description Dissertation/Thesis