Learning and Investigating a Style-Free Representation for Fast, Flexible, and High-Quality Neural Style Transfer.
Record type: Bibliographic, electronic resource : Monograph/item
Title/Author: Learning and Investigating a Style-Free Representation for Fast, Flexible, and High-Quality Neural Style Transfer.
Author: Zhang, Chi.
Publisher: Ann Arbor : ProQuest Dissertations & Theses, 2019
Physical description: 44 p.
Notes: Source: Masters Abstracts International, Volume: 80-10.
Contained by: Masters Abstracts International, 80-10.
Subject: Computer science.
Electronic resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=13806604
ISBN: 9781392010044
LDR    03447nmm a2200313 4500
001    2209662
005    20191104073800.5
008    201008s2019 ||||||||||||||||| ||eng d
020    $a 9781392010044
035    $a (MiAaPQ)AAI13806604
035    $a (MiAaPQ)ucla:17540
035    $a AAI13806604
040    $a MiAaPQ $c MiAaPQ
100 1  $a Zhang, Chi. $3 1033852
245 10 $a Learning and Investigating a Style-Free Representation for Fast, Flexible, and High-Quality Neural Style Transfer.
260 1  $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2019
300    $a 44 p.
500    $a Source: Masters Abstracts International, Volume: 80-10.
500    $a Publisher info.: Dissertation/Thesis.
500    $a Advisor: Zhu, Song-Chun.
502    $a Thesis (M.S.)--University of California, Los Angeles, 2019.
506    $a This item must not be sold to any third party vendors.
520    $a We have witnessed an unprecedented boom in the research area of artistic style transfer ever since Gatys et al. introduced the neural method. One of the remaining challenges is to balance a trade-off among three critical aspects, namely speed, flexibility, and quality: (i) the vanilla optimization-based algorithm produces impressive results for arbitrary styles but is unsatisfyingly slow due to its iterative nature; (ii) the fast approximation methods based on feed-forward neural networks generate satisfactory artistic effects but are bound to only a limited number of styles; and (iii) feature-matching methods like AdaIN achieve arbitrary style transfer in real time but at the cost of compromised quality. We find it considerably difficult to balance the trade-off well by merely using a single feed-forward step and ask, instead, whether there exists an algorithm that can adapt quickly to any style while the adapted model maintains high efficiency and good image quality. Motivated by this idea, we propose a novel method, coined MetaStyle, which formulates neural style transfer as a bilevel optimization problem and combines learning with only a few post-processing update steps to adapt to a fast approximation model. The qualitative and quantitative analysis in the experiments demonstrates that the proposed approach achieves high-quality arbitrary artistic style transfer effectively, with a good trade-off among speed, flexibility, and quality. We also investigate the style-free representation learned by MetaStyle. Apart from style interpolation and video style transfer, we implement well-known style transfer methods and examine their results after substituting the original content image inputs with the style-free representation learned by MetaStyle. This can be thought of as inserting a preprocessing step into the content transformation branch. We show in the experiments that models trained with the MetaStyle preprocessing step produce consistently lower style loss and total loss, with slightly higher content loss, compared to their counterparts without MetaStyle preprocessing. The stylized results therefore achieve a better balance in combining semantics and styles. This shows that MetaStyle also learns a more general content representation for adapting to different artistic styles.
590    $a School code: 0031.
650  4 $a Computer science. $3 523869
690    $a 0984
710 2  $a University of California, Los Angeles. $b Computer Science. $3 2104007
773 0  $t Masters Abstracts International $g 80-10.
790    $a 0031
791    $a M.S.
792    $a 2019
793    $a English
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=13806604
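The abstract (field 520 above) formulates neural style transfer as a bilevel optimization problem: an outer loop learns a shared, style-free initialization, and an inner loop specializes it to any one style with only a few update steps. Purely as a rough illustration of that two-level structure, and not the thesis's actual model, losses, or training code, the following minimal Python/NumPy toy uses a first-order (Reptile-style) meta-update and a quadratic per-style loss as a stand-in for a real stylization objective; all names and constants here are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
dim, inner_steps, inner_lr, meta_lr = 8, 3, 0.3, 0.5
styles = [rng.normal(size=dim) for _ in range(20)]   # stand-ins for training styles

def style_loss_grad(w, target):
    # gradient of the stand-in loss 0.5 * ||w - target||^2
    return w - target

theta = np.zeros(dim)                      # shared "style-free" initialization
for _ in range(200):                       # outer (meta) loop over training styles
    target = styles[rng.integers(len(styles))]
    w = theta.copy()
    for _ in range(inner_steps):           # inner loop: fast adaptation to one style
        w -= inner_lr * style_loss_grad(w, target)
    theta += meta_lr * (w - theta)         # first-order (Reptile-style) meta-update

# A few steps now specialize the meta-trained initialization to an unseen style.
new_target = rng.normal(size=dim)
w = theta.copy()
for _ in range(inner_steps):
    w -= inner_lr * style_loss_grad(w, new_target)
print("loss before adaptation:", 0.5 * float(np.sum((theta - new_target) ** 2)))
print("loss after", inner_steps, "steps:", 0.5 * float(np.sum((w - new_target) ** 2)))

In the method described by the abstract, the inner loop would instead update an image transformation network under content and style losses; this toy only shows how a few inner steps adapt a meta-trained initialization to a new style.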
Holdings (1 record)
Barcode: W9386211
Location: Electronic Resources
Circulation category: 11. Online reading_V
Material type: E-book
Call number: EB
Use type: Normal
Loan status: On shelf
Holds: 0