A Study of Field Theories Via Neural Networks /
Record Type: Electronic resources : Monograph/item
Title/Author: A Study of Field Theories Via Neural Networks / Anindita Maiti.
Author: Maiti, Anindita
Description: 1 electronic resource (229 pages)
Notes: Source: Dissertations Abstracts International, Volume: 84-10, Section: B.
Contained By: Dissertations Abstracts International, 84-10B.
Subject: Theoretical physics.
Online resource: https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=30419826
ISBN: 9798379417994
We propose a theoretical understanding of neural networks in terms of Wilsonian effective field theory. The correspondence relies on the fact that many asymptotic neural networks are drawn from Gaussian processes, the analog of non-interacting field theories. Moving away from the asymptotic limit yields a non-Gaussian process and corresponds to turning on particle interactions, allowing for the computation of correlation functions of neural network outputs with Feynman diagrams. Minimal non-Gaussian process likelihoods are determined by the most relevant non-Gaussian terms, according to the flow in their coefficients induced by the Wilsonian renormalization group. This yields a direct connection between overparameterization and simplicity of neural network likelihoods. Whether the coefficients are constants or functions may be understood in terms of GP limit symmetries, as expected from 't Hooft's technical naturalness. General theoretical calculations are matched to neural network experiments in the simplest class of models allowing the correspondence. Our formalism is valid for any of the many architectures that become a GP in an asymptotic limit, a property preserved under certain types of training.
Parameter-space and function-space provide two different duality frames in which to study neural networks. We demonstrate that symmetries of network densities may be determined via dual computations of network correlation functions, even when the density is unknown and the network is not equivariant. Symmetry-via-duality relies on invariance properties of the correlation functions, which stem from the choice of network parameter distributions. Input and output symmetries of neural network densities are determined, which recover known Gaussian process results in the infinite width limit. The mechanism may also be utilized to determine symmetries during training, when parameters are correlated, as well as symmetries of the Neural Tangent Kernel. We demonstrate that the amount of symmetry in the initialization density affects the accuracy of networks trained on Fashion-MNIST, and that symmetry breaking helps only when it is in the direction of ground truth.
We study the origin of non-Gaussianities in neural network field densities, and demonstrate two distinct methods to constrain these systematically. As examples, we engineer a few nonperturbative neural network field distributions. Lastly, we demonstrate a measure for the locality of neural network actions, via cluster decomposition of connected correlation functions of network output ensembles.
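The abstract's starting point, that wide networks are drawn from a Gaussian process so non-Gaussianity is a finite-width effect, can be made concrete with a toy numerical check. The sketch below is an illustration under assumed conventions, not the dissertation's own code: a single-hidden-layer tanh network at a fixed input is a sum of width-many i.i.d. terms, so by the central limit theorem its output distribution approaches a Gaussian as the width grows, and its excess kurtosis (the leading connected four-point statistic) should shrink roughly like 1/width. The function name `sample_outputs` and the 1/sqrt(width) scaling are choices made here to keep the limit finite.

```python
import numpy as np

def sample_outputs(width, n_nets, x=1.0, seed=0):
    """Draw outputs f(x) for an ensemble of randomly initialized
    single-hidden-layer tanh networks of the given width."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((n_nets, width))  # input weights
    b = rng.standard_normal((n_nets, width))  # biases
    v = rng.standard_normal((n_nets, width))  # output weights
    # 1/sqrt(width) scaling keeps the output variance finite as width grows
    return (v * np.tanh(w * x + b)).sum(axis=1) / np.sqrt(width)

def excess_kurtosis(f):
    """Zero for an exactly Gaussian distribution."""
    f = f - f.mean()
    return (f ** 4).mean() / (f ** 2).mean() ** 2 - 3.0

narrow = excess_kurtosis(sample_outputs(width=2, n_nets=200_000))
wide = excess_kurtosis(sample_outputs(width=512, n_nets=200_000))
print(narrow, wide)  # non-Gaussianity shrinks roughly like 1/width
```

The narrow ensemble shows a clearly nonzero excess kurtosis while the wide ensemble's is close to zero, matching the abstract's picture of finite-width corrections as interactions turned on away from the GP limit.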
Language: English
ISBN: 9798379417994
Subjects--Topical Terms: Theoretical physics.
Subjects--Index Terms: Field theory
LDR  03949nmm a22004093i 4500
001  2400425
005  20250522084123.5
006  m o d
007  cr|nu||||||||
008  251215s2023 miu||||||m |||||||eng d
020  $a 9798379417994
035  $a (MiAaPQD)AAI30419826
035  $a AAI30419826
040  $a MiAaPQD $b eng $c MiAaPQD $e rda
100 1  $a Maiti, Anindita, $e author. $3 3770411
245 12 $a A Study of Field Theories Via Neural Networks / $c Anindita Maiti.
264  1 $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2023
300  $a 1 electronic resource (229 pages)
336  $a text $b txt $2 rdacontent
337  $a computer $b c $2 rdamedia
338  $a online resource $b cr $2 rdacarrier
500  $a Source: Dissertations Abstracts International, Volume: 84-10, Section: B.
500  $a Advisors: Halverson, James. Committee members: Nelson, Brent; Ruehle, Fabian; Bao, Ning.
502  $b Ph.D. $c Northeastern University $d 2023.
520  $a We propose a theoretical understanding of neural networks in terms of Wilsonian effective field theory. The correspondence relies on the fact that many asymptotic neural networks are drawn from Gaussian processes, the analog of non-interacting field theories. Moving away from the asymptotic limit yields a non-Gaussian process and corresponds to turning on particle interactions, allowing for the computation of correlation functions of neural network outputs with Feynman diagrams. Minimal non-Gaussian process likelihoods are determined by the most relevant non-Gaussian terms, according to the flow in their coefficients induced by the Wilsonian renormalization group. This yields a direct connection between overparameterization and simplicity of neural network likelihoods. Whether the coefficients are constants or functions may be understood in terms of GP limit symmetries, as expected from 't Hooft's technical naturalness. General theoretical calculations are matched to neural network experiments in the simplest class of models allowing the correspondence. Our formalism is valid for any of the many architectures that become a GP in an asymptotic limit, a property preserved under certain types of training. Parameter-space and function-space provide two different duality frames in which to study neural networks. We demonstrate that symmetries of network densities may be determined via dual computations of network correlation functions, even when the density is unknown and the network is not equivariant. Symmetry-via-duality relies on invariance properties of the correlation functions, which stem from the choice of network parameter distributions. Input and output symmetries of neural network densities are determined, which recover known Gaussian process results in the infinite width limit. The mechanism may also be utilized to determine symmetries during training, when parameters are correlated, as well as symmetries of the Neural Tangent Kernel. We demonstrate that the amount of symmetry in the initialization density affects the accuracy of networks trained on Fashion-MNIST, and that symmetry breaking helps only when it is in the direction of ground truth. We study the origin of non-Gaussianities in neural network field densities, and demonstrate two distinct methods to constrain these systematically. As examples, we engineer a few nonperturbative neural network field distributions. Lastly, we demonstrate a measure for the locality of neural network actions, via cluster decomposition of connected correlation functions of network output ensembles.
546  $a English
590  $a School code: 0160
650  4 $a Theoretical physics. $3 2144760
653  $a Field theory
653  $a Machine learning theory
653  $a Neural networks
653  $a Wilsonian field theory
690  $a 0753
690  $a 0800
710 2  $a Northeastern University. $b Physics. $e degree granting institution. $3 3770412
720 1  $a Halverson, James $e degree supervisor.
773 0  $t Dissertations Abstracts International $g 84-10B.
790  $a 0160
791  $a Ph.D.
792  $a 2023
856 40 $u https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=30419826
Items (1 record)
Inventory Number: W9508745
Location Name: Electronic resources (電子資源)
Item Class: 11. Online reading (線上閱覽)
Material type: E-book (電子書)
Call number: EB
Usage Class: Normal use (一般使用)
Loan Status: On shelf
No. of reservations: 0