Abstract:
This work studies the simplest classical model systems that describe the
physics of phase transitions, the nearest-neighbour Ising models, using
neural networks, which are often associated with artificial intelligence.
There are two distinct parts to this: one, extracting useful physical
information, such as order parameters, from thermalized samples, and two,
finding more efficient ways of obtaining the thermalized samples in the first
place. Existing works already show that feeding thermalized samples to a
trained neural network can identify points in the phase space (such as
temperature) associated with drastic changes like phase transitions, without
identifying order parameters. We note that the outputs of these networks
across the phase space describe an order parameter that is similar to, yet
distinct from, a standard order parameter like magnetization.
We study the sensitivity of neural networks to changes along the phase space
of the Edwards-Anderson (EA) model with a stochastic Hamiltonian.
Traditionally, Markov Chain Monte Carlo (MCMC) methods are used to sample
thermalized spin lattices of models like Ising. A recent work proposed an
alternative: autoregressive neural networks that produce unchained,
uncorrelated samples of thermalized spin lattices along with
log-probabilities for every sample. However, their network was
computationally expensive to implement on large lattices. We build on this
to optimize the network design for speed, using insights from the underlying
Boltzmann distribution for Ising models. We then obtain a method that can
sample lattices with time complexity at most linear in the number of spins
in the lattice, making it a truly viable alternative to MCMC as a sampling
procedure.
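The autoregressive sampling scheme mentioned above can be illustrated with a minimal sketch. The raster-order conditional model below (a logistic function of already-sampled neighbours with a fixed coupling) and the function name `sample_autoregressive` are illustrative assumptions standing in for a trained network's learned conditionals; the point is that each spin is drawn in one pass, so the cost is linear in the number of spins and each sample comes with an exact log-probability.

```python
import numpy as np

def sample_autoregressive(L, coupling=0.4, rng=None):
    """Sample an L x L spin lattice one site at a time in raster order.

    Each spin's conditional probability depends only on already-sampled
    neighbours (up and left), so the whole lattice is drawn in a single
    O(L^2) pass -- linear in the number of spins. The conditional model
    here is a hypothetical stand-in for a trained network's conditionals.
    """
    rng = np.random.default_rng() if rng is None else rng
    spins = np.zeros((L, L), dtype=int)
    log_prob = 0.0  # exact log-probability of the generated sample
    for i in range(L):
        for j in range(L):
            # Effective field from previously sampled (up/left) neighbours.
            h = 0.0
            if i > 0:
                h += spins[i - 1, j]
            if j > 0:
                h += spins[i, j - 1]
            p_up = 1.0 / (1.0 + np.exp(-2.0 * coupling * h))
            s = 1 if rng.random() < p_up else -1
            spins[i, j] = s
            log_prob += np.log(p_up if s == 1 else 1.0 - p_up)
    return spins, log_prob

lattice, logp = sample_autoregressive(8, rng=np.random.default_rng(0))
```

Because each sample is generated independently from scratch, successive samples carry none of the autocorrelation that MCMC chains suffer from.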