Optimization-Based Modeling in Investment and Data Science.
Record type:
Bibliographic – electronic resource : Monograph/item
Title/Author:
Optimization-Based Modeling in Investment and Data Science.
Author:
Sun, Qingyun.
Publisher:
Ann Arbor : ProQuest Dissertations & Theses, 2019.
Extent:
123 p.
Notes:
Source: Dissertations Abstracts International, Volume: 82-10, Section: B.
Contained By:
Dissertations Abstracts International, 82-10B.
Subject:
Sparsity.
Electronic resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=28330446
ISBN:
9798597029917
Sun, Qingyun. Optimization-Based Modeling in Investment and Data Science. Ann Arbor : ProQuest Dissertations & Theses, 2019. 123 p.
Source: Dissertations Abstracts International, Volume: 82-10, Section: B.
Thesis (Ph.D.)--Stanford University, 2019.
This item must not be sold to any third party vendors.
Optimization has played a key role in numerous fields, including data science, statistics, machine learning, decision science, control, and quantitative investment. Optimization offers a way for users to focus on the modeling step. By formulating an optimization problem:
• you articulate what you want and what is acceptable to you;
• you say what you want, but not how to get it.
Convex optimization has been a very successful and powerful modeling framework. By formulating a problem as convex optimization, practitioners can focus on the modeling side without worrying about designing problem-specific optimization algorithms at prototyping time. However, there are hurdles in applying this convex modeling framework. First, many signal processing and machine learning problems are most naturally formulated as non-convex problems. Second, not all convex problems are tractable. Third, it may be hard to encode knowledge of the data into a simple regularizer or constraint and to specify the mathematical form of the optimization problem. This thesis treats several topics in optimization-based modeling:
1) a distributional robust Kelly strategy for investment and gambling;
2) convex sparse blind deconvolution;
3) missing-data imputation via a new structure called a matrix network;
4) a neural proximal method for compressive sensing.
In these works, I try to expand the boundary of convex-optimization-based modeling by overcoming several hurdles. In the distributional robust Kelly problem, the original distributionally robust optimization formulation is convex but not tractable; we transform the problem into a tractable form. Blind deconvolution has long been perceived as a non-convex problem; for sparse blind deconvolution, we propose a scalable convex formulation and find a phase transition for the convex algorithm.
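The thesis's distributionally robust formulation is not reproduced in this record, but the classical Kelly criterion it robustifies can be sketched as a one-dimensional concave maximization of expected log wealth growth. The sketch below is a minimal illustration in plain Python; the function names and the ternary-search solver are illustrative choices, not taken from the thesis.

```python
import math

def expected_log_growth(f, p, b):
    """E[log wealth growth] when betting fraction f of wealth:
    win with probability p at net odds b, otherwise lose the stake."""
    return p * math.log(1 + f * b) + (1 - p) * math.log(1 - f)

def kelly_fraction(p, b, tol=1e-12):
    """Maximize the concave log-growth over f in [0, 1) by ternary search."""
    lo, hi = 0.0, 1.0 - 1e-9
    while hi - lo > tol:
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if expected_log_growth(m1, p, b) < expected_log_growth(m2, p, b):
            lo = m1
        else:
            hi = m2
    return 0.5 * (lo + hi)

# 60% chance of winning an even-money bet: the classical closed form
# gives f* = p - (1 - p)/b = 0.2, matching the numerical maximizer.
f_star = kelly_fraction(0.6, 1.0)
```

The distributionally robust variant studied in the thesis replaces the single known probability `p` with a worst case over an uncertainty set of distributions; the point of the thesis's contribution is that this worst-case problem can be rewritten in a tractable convex form.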
In the missing-data imputation problem, we study a slice-wise missing pattern in tensor-type data that is beyond the capability of typical tensor-completion algorithms. We propose a new type of underlying low-dimensional structure that allows us to impute the missing data. In the first three topics, we solve these problems via convex optimization formulations. In the last topic, we step out of the safety zone of convexity. For the linear inverse problem, we go beyond sparsity and the 1-norm regularizer for compressive sensing. To model complex structure in natural and medical images, we propose a learning-based idea: parameterize the proximal map of an unknown regularizer. This idea is inspired by the convex optimization modeling framework and by learning-based methods, although the result need not correspond to a convex optimization problem.
ISBN: 9798597029917
Subjects--Topical Terms: Sparsity.
Subjects--Index Terms: Optimization problems
LDR  :03821nmm a2200337 4500
001  2346861
005  20220706051309.5
008  241004s2019 ||||||||||||||||| ||eng d
020  $a 9798597029917
035  $a (MiAaPQ)AAI28330446
035  $a (MiAaPQ)STANFORDss773zy3288
035  $a AAI28330446
040  $a MiAaPQ $c MiAaPQ
100 1  $a Sun, Qingyun. $3 3686059
245 10 $a Optimization-Based Modeling in Investment and Data Science.
260 1  $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2019
300  $a 123 p.
500  $a Source: Dissertations Abstracts International, Volume: 82-10, Section: B.
500  $a Advisor: Boyd, Stephen P.; Donoho, David Leigh; Candes, Emmanuel J.
502  $a Thesis (Ph.D.)--Stanford University, 2019.
506  $a This item must not be sold to any third party vendors.
520  $a Optimization has played a key role in numerous fields, including data science, statistics, machine learning, decision science, control, and quantitative investment. Optimization offers a way for users to focus on the modeling step. By formulating an optimization problem, you articulate what you want and what is acceptable to you; you say what you want, but not how to get it. Convex optimization has been a very successful and powerful modeling framework. By formulating a problem as convex optimization, practitioners can focus on the modeling side without worrying about designing problem-specific optimization algorithms at prototyping time. However, there are hurdles in applying this convex modeling framework. First, many signal processing and machine learning problems are most naturally formulated as non-convex problems. Second, not all convex problems are tractable. Third, it may be hard to encode knowledge of the data into a simple regularizer or constraint and to specify the mathematical form of the optimization problem. This thesis treats several topics in optimization-based modeling: 1) a distributional robust Kelly strategy for investment and gambling; 2) convex sparse blind deconvolution; 3) missing-data imputation via a new structure called a matrix network; 4) a neural proximal method for compressive sensing. In these works, I try to expand the boundary of convex-optimization-based modeling by overcoming several hurdles. In the distributional robust Kelly problem, the original distributionally robust optimization formulation is convex but not tractable; we transform the problem into a tractable form. Blind deconvolution has long been perceived as a non-convex problem; for sparse blind deconvolution, we propose a scalable convex formulation and find a phase transition for the convex algorithm. In the missing-data imputation problem, we study a slice-wise missing pattern in tensor-type data that is beyond the capability of typical tensor-completion algorithms. We propose a new type of underlying low-dimensional structure that allows us to impute the missing data. In the first three topics, we solve these problems via convex optimization formulations. In the last topic, we step out of the safety zone of convexity. For the linear inverse problem, we go beyond sparsity and the 1-norm regularizer for compressive sensing. To model complex structure in natural and medical images, we propose a learning-based idea: parameterize the proximal map of an unknown regularizer. This idea is inspired by the convex optimization modeling framework and by learning-based methods, although the result need not correspond to a convex optimization problem.
590  $a School code: 0212.
650  4 $a Sparsity. $3 3680690
650  4 $a Approximation. $3 3560410
650  4 $a Convex analysis. $3 3681761
650  4 $a Information processing. $3 3561808
650  4 $a Algorithms. $3 536374
650  4 $a Optimization. $3 891104
650  4 $a Signal processing. $3 533904
650  4 $a Neural networks. $3 677449
653  $a Optimization problems
653  $a Convex optimization
690  $a 0984
690  $a 0364
710 2  $a Stanford University. $3 754827
773 0  $t Dissertations Abstracts International $g 82-10B.
790  $a 0212
791  $a Ph.D.
792  $a 2019
793  $a English
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=28330446
Holdings (1 item):
Barcode: W9469299
Location: Electronic resources
Circulation category: 11.線上閱覽_V (online reading)
Material type: E-book
Call number: EB
Use type: Normal
Loan status: On shelf
Attachments: 0