Learning and Investigating a Style-Free Representation for Fast, Flexible, and High-Quality Neural Style Transfer.
Record Type:
Electronic resources : Monograph/item
Title/Author:
Learning and Investigating a Style-Free Representation for Fast, Flexible, and High-Quality Neural Style Transfer.
Author:
Zhang, Chi.
Published:
Ann Arbor : ProQuest Dissertations & Theses, 2019.
Description:
44 p.
Notes:
Source: Masters Abstracts International, Volume: 80-10.
Contained By:
Masters Abstracts International, 80-10.
Subject:
Computer science.
Online resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=13806604
ISBN:
9781392010044
Zhang, Chi.
Learning and Investigating a Style-Free Representation for Fast, Flexible, and High-Quality Neural Style Transfer.
- Ann Arbor : ProQuest Dissertations & Theses, 2019. - 44 p.
Source: Masters Abstracts International, Volume: 80-10.
Thesis (M.S.)--University of California, Los Angeles, 2019.
This item must not be sold to any third party vendors.
We have witnessed an unprecedented boom in the research area of artistic style transfer ever since Gatys et al. introduced the neural method. One of the remaining challenges is to balance a trade-off among three critical aspects, namely speed, flexibility, and quality: (i) the vanilla optimization-based algorithm produces impressive results for arbitrary styles, but is unsatisfyingly slow due to its iterative nature; (ii) the fast approximation methods based on feed-forward neural networks generate satisfactory artistic effects, but are bound to a limited number of styles; and (iii) feature-matching methods such as AdaIN achieve arbitrary style transfer in real time, but at the cost of compromised quality. We find it considerably difficult to balance the trade-off well with merely a single feed-forward step, and instead ask whether there exists an algorithm that could adapt quickly to any style while the adapted model maintains high efficiency and good image quality. Motivated by this idea, we propose a novel method, coined MetaStyle, which formulates neural style transfer as a bilevel optimization problem and combines learning with only a few post-processing update steps to adapt to a fast approximation model. The qualitative and quantitative analyses in the experiments demonstrate that the proposed approach achieves high-quality arbitrary artistic style transfer effectively, with a good trade-off among speed, flexibility, and quality. We also investigate the style-free representation learned by MetaStyle. Apart from style interpolation and video style transfer, we implement well-known style transfer methods and examine their results after substituting the original content image inputs with the style-free representations learned by MetaStyle; this can be thought of as inserting a preprocessing step into the content transformation branch. The experiments show that models trained with the MetaStyle preprocessing step produce consistently lower style loss and total loss, with slightly higher content loss, than their counterparts without it, so the stylized results strike a better balance between semantics and style. This shows that MetaStyle also learns a more general content representation for adapting to different artistic styles.
ISBN: 9781392010044
Subjects--Topical Terms: Computer science.
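The abstract above names two concrete mechanisms worth unpacking: AdaIN-style feature matching, and a bilevel (meta-learning) optimization that adapts a shared initialization to a new style in a few update steps. The sketch below is illustrative only, not the thesis code: adain follows the published Adaptive Instance Normalization formula, while meta_step substitutes a first-order Reptile-style update for the full bilevel optimization, and every helper name and hyperparameter here (make_loss_for_style, inner_steps, inner_lr, meta_lr) is an assumption for illustration.

    # Illustrative sketch only; assumes PyTorch and feature maps of shape (N, C, H, W).
    import copy
    import torch

    def adain(content_feat: torch.Tensor, style_feat: torch.Tensor,
              eps: float = 1e-5) -> torch.Tensor:
        """Adaptive Instance Normalization: re-normalize each channel of the
        content features to the per-channel mean/std of the style features."""
        c_mean = content_feat.mean(dim=(2, 3), keepdim=True)
        c_std = content_feat.std(dim=(2, 3), keepdim=True) + eps
        s_mean = style_feat.mean(dim=(2, 3), keepdim=True)
        s_std = style_feat.std(dim=(2, 3), keepdim=True) + eps
        return s_std * (content_feat - c_mean) / c_std + s_mean

    def meta_step(model: torch.nn.Module, make_loss_for_style, styles,
                  inner_steps: int = 2, inner_lr: float = 1e-3,
                  meta_lr: float = 1e-4) -> None:
        """One outer update of the bilevel objective. Inner loop: adapt a copy
        of the shared initialization to one style with a few gradient steps.
        Outer loop: move the initialization toward the adapted weights."""
        for style in styles:
            adapted = copy.deepcopy(model)
            opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
            loss_fn = make_loss_for_style(style)  # content + style loss, assumed given
            for _ in range(inner_steps):
                opt.zero_grad()
                loss_fn(adapted).backward()
                opt.step()
            # First-order (Reptile-style) meta update toward the adapted solution.
            with torch.no_grad():
                for p, q in zip(model.parameters(), adapted.parameters()):
                    p.add_(meta_lr * (q - p))

The Reptile-style update is used here only because it avoids second-order gradients; the thesis formulates the problem as bilevel optimization, for which a MAML-style inner/outer differentiation would be the more direct instantiation.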
LDR
:03447nmm a2200313 4500
001
2209662
005
20191104073800.5
008
201008s2019 ||||||||||||||||| ||eng d
020
$a
9781392010044
035
$a
(MiAaPQ)AAI13806604
035
$a
(MiAaPQ)ucla:17540
035
$a
AAI13806604
040
$a
MiAaPQ
$c
MiAaPQ
100
1
$a
Zhang, Chi.
$3
1033852
245
1 0
$a
Learning and Investigating a Style-Free Representation for Fast, Flexible, and High-Quality Neural Style Transfer.
260
1
$a
Ann Arbor :
$b
ProQuest Dissertations & Theses,
$c
2019
300
$a
44 p.
500
$a
Source: Masters Abstracts International, Volume: 80-10.
500
$a
Publisher info.: Dissertation/Thesis.
500
$a
Advisor: Zhu, Song-Chun.
502
$a
Thesis (M.S.)--University of California, Los Angeles, 2019.
506
$a
This item must not be sold to any third party vendors.
520
$a
We have witnessed an unprecedented boom in the research area of artistic style transfer ever since Gatys et al. introduced the neural method. One of the remaining challenges is to balance a trade-off among three critical aspects, namely speed, flexibility, and quality: (i) the vanilla optimization-based algorithm produces impressive results for arbitrary styles, but is unsatisfyingly slow due to its iterative nature; (ii) the fast approximation methods based on feed-forward neural networks generate satisfactory artistic effects, but are bound to a limited number of styles; and (iii) feature-matching methods such as AdaIN achieve arbitrary style transfer in real time, but at the cost of compromised quality. We find it considerably difficult to balance the trade-off well with merely a single feed-forward step, and instead ask whether there exists an algorithm that could adapt quickly to any style while the adapted model maintains high efficiency and good image quality. Motivated by this idea, we propose a novel method, coined MetaStyle, which formulates neural style transfer as a bilevel optimization problem and combines learning with only a few post-processing update steps to adapt to a fast approximation model. The qualitative and quantitative analyses in the experiments demonstrate that the proposed approach achieves high-quality arbitrary artistic style transfer effectively, with a good trade-off among speed, flexibility, and quality. We also investigate the style-free representation learned by MetaStyle. Apart from style interpolation and video style transfer, we implement well-known style transfer methods and examine their results after substituting the original content image inputs with the style-free representations learned by MetaStyle; this can be thought of as inserting a preprocessing step into the content transformation branch. The experiments show that models trained with the MetaStyle preprocessing step produce consistently lower style loss and total loss, with slightly higher content loss, than their counterparts without it, so the stylized results strike a better balance between semantics and style. This shows that MetaStyle also learns a more general content representation for adapting to different artistic styles.
590
$a
School code: 0031.
650
4
$a
Computer science.
$3
523869
690
$a
0984
710
2
$a
University of California, Los Angeles.
$b
Computer Science.
$3
2104007
773
0
$t
Masters Abstracts International
$g
80-10.
790
$a
0031
791
$a
M.S.
792
$a
2019
793
$a
English
856
4 0
$u
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=13806604
Items
Inventory Number: W9386211
Location Name: 電子資源 (Electronic Resources)
Item Class: 11.線上閱覽_V (Online Reading)
Material type: 電子書 (e-book)
Call number: EB
Usage Class: 一般使用 (Normal)
Loan Status: On shelf
No. of reservations: 0