Learning with sparsity: Structures, optimization and applications.
Record Type:
Language materials, printed : Monograph/item
Title/Author:
Learning with sparsity: Structures, optimization and applications.
Author:
Chen, Xi.
Description:
173 p.
Notes:
Source: Dissertation Abstracts International, Volume: 74-12(E), Section: B.
Contained By:
Dissertation Abstracts International 74-12B(E).
Subject:
Computer Science; Operations Research; Statistics.
Online resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3573623
ISBN:
9781303441745
LDR
:03303nam a2200325 4500
001
1960545
005
20140623111234.5
008
150210s2013 ||||||||||||||||| ||eng d
020
$a
9781303441745
035
$a
(MiAaPQ)AAI3573623
035
$a
AAI3573623
040
$a
MiAaPQ
$c
MiAaPQ
100
1
$a
Chen, Xi.
$3
1017731
245
1 0
$a
Learning with sparsity: Structures, optimization and applications.
300
$a
173 p.
500
$a
Source: Dissertation Abstracts International, Volume: 74-12(E), Section: B.
500
$a
Adviser: Jaime Carbonell.
502
$a
Thesis (Ph.D.)--Carnegie Mellon University, 2013.
520
$a
The development of modern information technology has enabled collecting data of unprecedented size and complexity. Examples include web text data, microarray & proteomics, and data from scientific domains (e.g., meteorology). To learn from these high dimensional and complex data, traditional machine learning techniques often suffer from the curse of dimensionality and unaffordable computational cost. However, learning from large-scale high-dimensional data promises big payoffs in text mining, gene analysis, and numerous other consequential tasks.
520
$a
Recently developed sparse learning techniques provide us a suite of tools for understanding and exploring high dimensional data from many areas in science and engineering. By exploring sparsity, we can always learn a parsimonious and compact model which is more interpretable and computationally tractable at application time. When it is known that the underlying model is indeed sparse, sparse learning methods can provide us a more consistent model and much improved prediction performance. However, the existing methods are still insufficient for modeling complex or dynamic structures of the data, such as those evidenced in pathways of genomic data, gene regulatory network, and synonyms in text data.
520
$a
This thesis develops structured sparse learning methods along with scalable optimization algorithms to explore and predict high dimensional data with complex structures. In particular, we address three aspects of structured sparse learning: 1. Efficient and scalable optimization methods with fast convergence guarantees for a wide spectrum of high-dimensional learning tasks, including single or multi-task structured regression, canonical correlation analysis as well as online sparse learning. 2. Learning dynamic structures of different types of undirected graphical models, e.g., conditional Gaussian or conditional forest graphical models. 3. Demonstrating the usefulness of the proposed methods in various applications, e.g., computational genomics and spatial-temporal climatological data.
520
$a
In addition, we also design specialized sparse learning methods for text mining applications, including ranking and latent semantic analysis. In the last part of the thesis, we also present the future direction of the high-dimensional structured sparse learning from both computational and statistical aspects.
590
$a
School code: 0041.
650
4
$a
Computer Science.
$3
626642
650
4
$a
Operations Research.
$3
626629
650
4
$a
Statistics.
$3
517247
690
$a
0984
690
$a
0796
690
$a
0463
710
2
$a
Carnegie Mellon University.
$b
Machine Learning.
$3
2094142
773
0
$t
Dissertation Abstracts International
$g
74-12B(E).
790
$a
0041
791
$a
Ph.D.
792
$a
2013
793
$a
English
856
4 0
$u
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3573623
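The abstract above (MARC field 520) centers on sparse learning: an l1 penalty drives most model coefficients to exactly zero, yielding the "parsimonious and compact model" the author describes. As a minimal illustration only (not part of the catalog record; the names `ista_lasso`, `soft_threshold`, and `lam` are illustrative assumptions, and this is the standard ISTA proximal-gradient method for the Lasso, not the specific algorithms developed in the thesis), the idea can be sketched as:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the l1 norm: shrink each entry toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista_lasso(X, y, lam, n_iter=500):
    """Solve min_w 0.5*||Xw - y||^2 + lam*||w||_1 by proximal gradient (ISTA)."""
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)           # gradient of the smooth least-squares part
        w = soft_threshold(w - grad / L, lam / L)
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 1.0]              # only 3 of 20 features actually matter
y = X @ w_true + 0.01 * rng.standard_normal(100)
w_hat = ista_lasso(X, y, lam=1.0)
print(np.count_nonzero(np.abs(w_hat) > 1e-8))  # far fewer than 20 nonzeros
```

The soft-thresholding step is what produces exact zeros, which is why the fitted model is both more interpretable and cheaper to apply, as the abstract argues.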
Items (1 record, page 1)
Inventory Number: W9255373
Location Name: 電子資源 (Electronic Resources)
Item Class: 11.線上閱覽_V (Online reading)
Material type: 電子書 (E-book)
Call number: EB
Usage Class: 一般使用 (Normal)
Loan Status: On shelf
No. of reservations: 0
Opac note: (none)
Attachments: (none)