Entropic Regularization in Wasserstein Gans: Robustness, Generalization and Privacy /
Record Type:
Electronic resources : Monograph/item
Title/Author:
Entropic Regularization in Wasserstein Gans: Robustness, Generalization and Privacy / Daria Reshetova.
Author:
Reshetova, Daria,
Description:
1 electronic resource (124 pages)
Notes:
Source: Dissertations Abstracts International, Volume: 85-06, Section: B.
Contained By:
Dissertations Abstracts International, 85-06B.
Subject:
Mathematical functions.
Online resource:
https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=30726895
ISBN:
9798381019605
Summary:
In recent years, GANs have emerged as a powerful method for learning distributions from data by modeling the target distribution as a function of a known distribution. The function, often referred to as the generator, is optimized to minimize a chosen distance measure between the generated and target distributions. One commonly used measure for this purpose is the so-called Wasserstein distance. However, the Wasserstein distance is hard to compute and optimize, and in practice, entropic regularization techniques are used to facilitate its computation and improve numerical convergence. Introducing regularization, however, changes the problem we are trying to solve, and hence the learned solution. While the computational advantages of entropic regularization have been well established in the recent literature, the influence of the regularization on the learned generative model and the distribution it generates has remained poorly understood. In this thesis, we study the consequences of regularizing Wasserstein GANs with entropic regularization and show three important impacts of the regularization: 1) we shed light on how entropic regularization impacts the learned GAN solution; 2) we show that it improves sample complexity by removing the curse of dimensionality; 3) we show that entropic regularization can be used to effectively train GANs from differentially privatized data.
First, we mathematically derive the learned distribution of the GAN to show the effects of the regularization in a benchmark setting. Prior works focus primarily on evaluating GANs on real data, typically images, and although clearly valuable, such evaluations are often subjective due to a lack of clear baselines for benchmarking. The target distribution of images is too complex to get a precise form of the generated distribution even in the population setting. However, in a benchmark setting where the generator is linear and the target distribution is a high-dimensional Gaussian, we are able to characterize the learned solution and prove that regularization leads to robustness of the learned distribution as well as feature selection. We additionally show that debiasing the entropy-regularized Wasserstein GAN with the autocorrective term (using the Sinkhorn divergence as a loss function) leads to an unbiased solution.
Second, we study the effect of regularization on the sample complexity. Without regularization, it is known that Wasserstein GANs suffer from the curse of dimensionality: the number of samples needed for the empirical solution to approximate the population solution with some given error scales exponentially in the dimension of the problem. This dissertation shows that entropic regularization can resolve the curse of dimensionality, enabling convergence at the parametric rate for a large class of generators and distributions, namely Lipschitz generators and sub-Gaussian real distributions. These conditions are commonly satisfied in practice (for example, for images and sigmoid or tanh activations in the last layer of the generator). We present a theorem that quantifies the benefits of using the entropy-regularized Wasserstein distance as a loss function. Our findings indicate that this regularization technique can effectively mitigate the challenges posed by high-dimensional data, offering a more robust and efficient learning process.
Finally, we show that entropic regularization can extend the application of GANs to the realm of sensitive data privatized with differential privacy.
Language:
English
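The entropic regularization and Sinkhorn debiasing discussed in the summary above are commonly formalized as follows; this is a standard formulation from the optimal-transport literature, and the exact cost and normalization conventions used in the dissertation may differ.

\[
  W_\varepsilon(\mu,\nu) \;=\; \min_{\pi \in \Pi(\mu,\nu)} \int c(x,y)\,\mathrm{d}\pi(x,y) \;+\; \varepsilon\,\mathrm{KL}\!\left(\pi \,\middle\|\, \mu \otimes \nu\right),
  \qquad
  S_\varepsilon(\mu,\nu) \;=\; W_\varepsilon(\mu,\nu) \;-\; \tfrac{1}{2}\,W_\varepsilon(\mu,\mu) \;-\; \tfrac{1}{2}\,W_\varepsilon(\nu,\nu),
\]

where \Pi(\mu,\nu) is the set of couplings of \mu and \nu, c is the ground cost (for example, squared Euclidean distance), \varepsilon > 0 is the regularization strength, and S_\varepsilon is the Sinkhorn divergence: W_\varepsilon debiased by the two self-transport ("autocorrective") terms so that S_\varepsilon(\mu,\mu) = 0.

As a rough illustration of why the entropic penalty makes this loss computable from samples, a minimal Sinkhorn iteration on two empirical point clouds might look like the sketch below. This is an illustrative outline only: the function names (sinkhorn_cost, sinkhorn_divergence), the uniform marginals, and the default parameters are assumptions made for the example, not details taken from the dissertation, and GAN training would additionally differentiate this loss through the generator's output.

import numpy as np

def sinkhorn_cost(x, y, eps=0.5, n_iters=200):
    """Entropy-regularized OT cost (transport term only) between samples x: (n, d) and y: (m, d)."""
    n, m = x.shape[0], y.shape[0]
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)      # uniform empirical marginals
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)   # squared Euclidean ground cost
    K = np.exp(-C / eps)                                  # Gibbs kernel
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iters):                              # Sinkhorn fixed-point scaling updates
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]                       # approximate optimal transport plan
    return float((P * C).sum())                           # <P, C>; the epsilon * KL term is omitted here

def sinkhorn_divergence(x, y, eps=0.5):
    """Debiased loss: subtracting the self-transport terms removes the entropic bias at x == y."""
    return (sinkhorn_cost(x, y, eps)
            - 0.5 * sinkhorn_cost(x, x, eps)
            - 0.5 * sinkhorn_cost(y, y, eps))

# Tiny usage example on synthetic 2-D Gaussians standing in for "real" and "generated" samples.
rng = np.random.default_rng(0)
real = rng.normal(size=(256, 2))
fake = rng.normal(loc=1.0, size=(256, 2))
print(sinkhorn_divergence(real, fake))

The subtraction of the two self-transport terms in sinkhorn_divergence corresponds to the autocorrective debiasing mentioned in the summary; without it, the plain entropy-regularized cost yields a biased solution, which is the effect the dissertation characterizes in the linear-generator Gaussian benchmark.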
LDR
:04925nmm a22003973i 4500
001
2400496
005
20250522084139.5
006
m o d
007
cr|nu||||||||
008
251215s2023 miu||||||m |||||||eng d
020
$a
9798381019605
035
$a
(MiAaPQD)AAI30726895
035
$a
(MiAaPQD)STANFORDbm099mt8200
035
$a
AAI30726895
040
$a
MiAaPQD
$b
eng
$c
MiAaPQD
$e
rda
100
1
$a
Reshetova, Daria,
$e
author.
$3
3770513
245
1 0
$a
Entropic Regularization in Wasserstein Gans: Robustness, Generalization and Privacy /
$c
Daria Reshetova.
264
1
$a
Ann Arbor :
$b
ProQuest Dissertations & Theses,
$c
2023
300
$a
1 electronic resource (124 pages)
336
$a
text
$b
txt
$2
rdacontent
337
$a
computer
$b
c
$2
rdamedia
338
$a
online resource
$b
cr
$2
rdacarrier
500
$a
Source: Dissertations Abstracts International, Volume: 85-06, Section: B.
500
$a
Advisors: Ozgur, Ayfer; Gamal, Abbas El; Weissman, Tsachy. Committee members: Bent, Stacey F.
502
$b
Ph.D.
$c
Stanford University
$d
2023.
520
$a
In recent years, GANs have emerged as a powerful method for learning distributions from data by modeling the target distribution as a function of a known distribution. The function, often referred to as the generator, is optimized to minimize a chosen distance measure between the generated and target distributions. One commonly used measure for this purpose is the so-called Wasserstein distance. However, the Wasserstein distance is hard to compute and optimize, and in practice, entropic regularization techniques are used to facilitate its computation and improve numerical convergence. Introducing regularization, however, changes the problem we are trying to solve, and hence the learned solution. While the computational advantages of entropic regularization have been well established in the recent literature, the influence of the regularization on the learned generative model and the distribution it generates has remained poorly understood. In this thesis, we study the consequences of regularizing Wasserstein GANs with entropic regularization and show three important impacts of the regularization: 1) we shed light on how entropic regularization impacts the learned GAN solution; 2) we show that it improves sample complexity by removing the curse of dimensionality; 3) we show that entropic regularization can be used to effectively train GANs from differentially privatized data. First, we mathematically derive the learned distribution of the GAN to show the effects of the regularization in a benchmark setting. Prior works focus primarily on evaluating GANs on real data, typically images, and although clearly valuable, such evaluations are often subjective due to a lack of clear baselines for benchmarking. The target distribution of images is too complex to get a precise form of the generated distribution even in the population setting. However, in a benchmark setting where the generator is linear and the target distribution is a high-dimensional Gaussian, we are able to characterize the learned solution and prove that regularization leads to robustness of the learned distribution as well as feature selection. We additionally show that debiasing the entropy-regularized Wasserstein GAN with the autocorrective term (using the Sinkhorn divergence as a loss function) leads to an unbiased solution. Second, we study the effect of regularization on the sample complexity. Without regularization, it is known that Wasserstein GANs suffer from the curse of dimensionality: the number of samples needed for the empirical solution to approximate the population solution with some given error scales exponentially in the dimension of the problem. This dissertation shows that entropic regularization can resolve the curse of dimensionality, enabling convergence at the parametric rate for a large class of generators and distributions, namely Lipschitz generators and sub-Gaussian real distributions. These conditions are commonly satisfied in practice (for example, for images and sigmoid or tanh activations in the last layer of the generator). We present a theorem that quantifies the benefits of using the entropy-regularized Wasserstein distance as a loss function. Our findings indicate that this regularization technique can effectively mitigate the challenges posed by high-dimensional data, offering a more robust and efficient learning process. Finally, we show that entropic regularization can extend the application of GANs to the realm of sensitive data privatized with differential privacy.
546
$a
English
590
$a
School code: 0212
650
4
$a
Mathematical functions.
$3
3564295
650
4
$a
Linear programming.
$3
560448
650
4
$a
Mathematicians.
$3
514781
650
4
$a
Privacy.
$3
528582
650
4
$a
Entropy.
$3
546219
650
4
$a
Neural networks.
$3
677449
650
4
$a
Mathematics.
$3
515831
690
$a
0800
690
$a
0405
710
2
$a
Stanford University.
$e
degree granting institution.
$3
3765820
720
1
$a
Ozgur, Ayfer
$e
degree supervisor.
720
1
$a
Gamal, Abbas El
$e
degree supervisor.
720
1
$a
Weissman, Tsachy
$e
degree supervisor.
773
0
$t
Dissertations Abstracts International
$g
85-06B.
790
$a
0212
791
$a
Ph.D.
792
$a
2023
856
4 0
$u
https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=30726895
Items
1 record
Inventory Number: W9508816
Location Name: Electronic Resources (電子資源)
Item Class: 11. Online Reading (11.線上閱覽_V)
Material Type: E-book (電子書)
Call Number: EB
Usage Class: Normal Use (一般使用)
Loan Status: On shelf
No. of reservations: 0