Research Assistant at the University of Liverpool with a Ph.D. in particle physics, earned using quantum computing and machine learning methods.
Experienced in programming and optimizing CPU and GPU simulations for supercomputers, and in statistical data analysis.
Regularly invited to present research at international conferences, and published in academic journals including PRL, which targets a broad physics audience.
Click the PDF icon to download my CV.
The remainder of this webpage is devoted to an introduction to the physics research I have worked on.
If you are more interested in the programs I wrote, head directly to my GitHub.
Warning: this website is under construction! Apologies for any mess or missing content.
Lattice QCD
My main research interest is to understand the interactions of the strong force,
one of the
four fundamental forces, using numerical methods. The strong interaction is responsible
for holding together subatomic particles called hadrons, the two most famous being
protons and neutrons. Studying these small particles requires merging quantum mechanics
and relativity, which led to the development of quantum field theory starting in the 1920s.
The quantum field theory that describes the strong interaction is called quantum chromodynamics (QCD),
for which there is no analytic solution.
Lattice QCD (LQCD) is the only first-principles approach to understanding the fundamentals of QCD, and thus
the structure and interactions of hadrons. This approach approximates the real world, which is infinite in
volume and continuous in space and time, by replacing it with a discrete lattice containing a finite number of points.
The "matter" particles, quarks, are placed on the sites of this lattice and the "force carrying" particles, gluons, live in-between the sites.
Calculations are performed on supercomputers via Monte Carlo importance sampling.
The below gif is a visualization of this Monte Carlo process, where each frame corresponds to one statistical sample.
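To give a flavour of what one Monte Carlo "frame" means, here is a minimal sketch of Metropolis sampling for a free scalar field on a one-dimensional lattice. This is my own toy illustration: production LQCD samples gauge fields with far more sophisticated algorithms (e.g. hybrid Monte Carlo), and every number below is an arbitrary illustrative choice.

```python
import numpy as np

# Cartoon of lattice Monte Carlo: Metropolis updates of a free scalar
# field on a small 1-D periodic lattice. Real LQCD uses gauge fields
# and heavier algorithms (HMC); the importance-sampling idea is the same.
rng = np.random.default_rng(0)
N, mass2, sweeps = 32, 0.5, 200
phi = np.zeros(N)

def action(phi):
    # Discretized Euclidean action: nearest-neighbour kinetic term + mass term
    return np.sum(0.5 * (np.roll(phi, -1) - phi) ** 2 + 0.5 * mass2 * phi ** 2)

samples = []
for sweep in range(sweeps):
    for i in range(N):
        old, S_old = phi[i], action(phi)
        phi[i] += rng.normal(0.0, 0.5)                    # propose a local change
        if rng.random() >= np.exp(min(0.0, S_old - action(phi))):
            phi[i] = old                                  # reject: keep old value
    samples.append(phi.copy())                            # one frame = one sample
```

Each entry of `samples` plays the role of one frame of the gif: a single field configuration drawn with probability weighted by the action.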
Scattering
Within LQCD I work on multiparticle scattering projects; think of billiard balls colliding on a billiard table.
Billiard balls scatter classically and only change their direction (state) when physically touching.
Hadrons change their state throughout the scattering process via gluons carrying the strong force, even when the hadrons are physically separated.
The nice thing about doing LQCD scattering calculations is that we can compare with experimental measurements of hadron scattering properties.
For two-particle scattering we can completely specify the scattering process by a single parameter called the phase shift.
This parameter can be determined experimentally and accessed indirectly from LQCD simulations.
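For intuition about what a phase shift looks like near a resonance, the textbook Breit-Wigner form can be plotted in a few lines. The mass and width below are ballpark, rho-meson-like numbers chosen purely for illustration; they are not values from my analysis.

```python
import numpy as np

# Textbook Breit-Wigner parametrization of a resonant phase shift.
# m_R and Gamma are illustrative, rho-meson-like values in GeV.
def phase_shift(E, m_R=0.77, Gamma=0.15):
    # delta(E) rises through 90 degrees as E crosses the resonance mass
    return np.degrees(np.arctan2(Gamma / 2.0, m_R - E))

E = np.linspace(0.3, 1.2, 200)
delta = phase_shift(E)                      # smooth curve rising towards 180 degrees
print(round(float(phase_shift(0.77)), 1))   # 90.0 exactly at the resonance mass
```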
The below plot shows a comparison of these determinations for two-pion scattering for three different physical cases.
The data points with error bars come from experimental measurements of the phase shift, the lines come from my LQCD determination.
The only noticeable discrepancy comes from the middle figure, which we believe to be from the absence of some of the quarks in our simulation.
Due to computational resource constraints we simply cannot simulate QCD with all six physical quarks, and our simulations use only two.
There are two forefronts in LQCD scattering research: one is the study of baryon-baryon scattering, the other is pushing to understand scattering of three or more particles.
My research group was the first to publish results for a three-particle resonance, the a1(1260), in which we used LQCD data to estimate the physical resonance parameters using the correct description of three-particle interactions.
The below plot shows the central result, a direct comparison between lattice and experimental results; the x-axis of the plot is the mass of the particle and the y-axis is its (decay) width.
The blue points show the lattice samples of the resonance position at a heavier than physical pion mass. These should be compared with the orange square which shows the PDG value extracted from experiments.
We do not a priori expect overlap, since the lattice simulation is done with unphysical parameters; nevertheless we find rough agreement in the mass of the a1(1260).
Quantum Computing for Supersymmetry
Quantum computing is an approach to computing by performing operations on quantum states instead of classical ones.
The central idea to understand is that quantum bits, qubits, can be in superpositions of states: a qubit can be in the state 0 and the state 1 at the same time.
A classical bit can only be in the state 0 or the state 1, thus an operation can only be applied to 0 or onto 1.
If a qubit is placed into a superposition of states, a single operation can act on 0 and 1 simultaneously.
This property can be leveraged algorithmically to gain, for certain problems, an exponential speedup over classical computations.
Qubit states are nicely visualized on the Bloch sphere, below, where the south pole is 1 and the north pole is 0.
States of qubits are represented as arrows pointing from the center to the edge of a unit sphere (radius 1).
Classical bits are restricted to vectors pointing to the north or south pole, while a qubit state can be represented by a vector pointing anywhere on the sphere.
Operations on qubits transform the vector along the surface of the sphere.
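These statements are easy to see in a state-vector picture. The sketch below is my own illustration in plain NumPy (simulating, not running on, a quantum device): it prepares the equal superposition with a Hadamard gate and reads off the Bloch vector.

```python
import numpy as np

# A qubit state is a 2-component complex vector; |0> and |1> sit at the poles.
ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

psi = H @ ket0            # equal superposition (|0> + |1>)/sqrt(2)
probs = np.abs(psi) ** 2  # Born rule: measurement probabilities

# Bloch vector (<X>, <Y>, <Z>): this state points to the equator
bloch = np.array([
    2 * np.real(np.conj(psi[0]) * psi[1]),
    2 * np.imag(np.conj(psi[0]) * psi[1]),
    np.abs(psi[0]) ** 2 - np.abs(psi[1]) ** 2,
])
print(probs)  # [0.5 0.5] -- one gate acted on |0> and |1> simultaneously
```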
For any quantum field theory, quantum computing is an approach that lets us completely avoid the Monte Carlo approach of LQCD.
This is advantageous because one major drawback of LQCD is having to work in an imaginary-time formalism; for scattering this means we can only indirectly compare with experiments, which of course take place in real time.
Quantum simulations will always be performed in real-time formalisms, since the quantum computer is essentially prepared in states that mimic the states of the quantum field theory.
Apart from scattering, we can also compute the energy levels of the theory using the variational quantum eigensolver (VQE).
The approach is essentially an application of the variational method using both classical and quantum devices.
First a parameterized state is prepared on the quantum computer and the state's energy is computed.
Then these parameters are tuned classically so that the energy of the state is minimized, giving an upper bound on the true ground-state energy.
The VQE is an excellent algorithm for the NISQ era because the quantum part of the calculation is extremely simple and does not require full fault tolerance, since we are not searching for a precise answer.
The VQE has also been extended to solve for multiple energy levels using a deflation method, called variational quantum deflation (VQD).
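The VQE loop can be sketched in a few lines by simulating the quantum half classically. The one-qubit Hamiltonian and single-parameter Ry ansatz below are illustrative choices of mine, not taken from any particular paper, and a finite-difference gradient descent stands in for the classical optimizer.

```python
import numpy as np

# Minimal VQE sketch. The "quantum" half (state preparation and energy
# measurement) is simulated classically; Hamiltonian and ansatz are toys.
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
H = Z + 0.5 * X                       # toy one-qubit Hamiltonian

def energy(theta):
    # Ansatz |psi(theta)> = Ry(theta)|0>, prepared "on the device"
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return psi @ H @ psi              # measured expectation value <psi|H|psi>

# Classical loop: tune theta to push the energy down towards the ground state
theta = 0.1
for _ in range(500):
    grad = (energy(theta + 1e-4) - energy(theta - 1e-4)) / 2e-4
    theta -= 0.1 * grad

print(energy(theta))   # converges to the exact ground energy -sqrt(1.25)
```

The final energy is a variational upper bound; here the ansatz is expressive enough that it lands on the exact value.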
I have applied quantum computing methods to supersymmetric theories, a proposed extension to the standard set of symmetries used in quantum field theories, the Poincaré symmetries (translations, rotations, and boosts).
The additional symmetry is a transformation that exchanges bosons (integer-spin particles) and fermions (half-integer spin) while leaving the physics invariant, meaning the physical system is indifferent to which type of particle is present.
While we do not have any experimental evidence to suggest supersymmetry is a correct description of the universe, it is a useful theory for studying symmetry breaking, especially since supersymmetry must be broken if it is the underlying theory of the universe.
In LQCD simulations, deciding whether supersymmetry is broken or preserved is exponentially hard due to the sign problem.
With quantum computing it is a straightforward task, because supersymmetry is preserved if and only if the ground-state energy is exactly zero.
Furthermore, any non-zero energy level must be paired: there are always two states of the same positive energy, one bosonic and one fermionic.
This lets us apply the VQE and VQD to supersymmetric systems to easily check whether or not we expect supersymmetry to be broken.
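As a toy demonstration of both facts, supersymmetric quantum mechanics with a harmonic superpotential can be diagonalized in a truncated basis: the bosonic and fermionic Hamiltonians H_B = A†A and H_F = AA† then exhibit an exactly zero ground-state energy (unbroken supersymmetry) and exact pairing of the non-zero levels. This is my own illustrative sketch, not a calculation from the research described above.

```python
import numpy as np

# Toy check: SUSY quantum mechanics with a harmonic superpotential,
# truncated to N basis states. (Illustrative sketch, not from a paper.)
N = 20
A = np.diag(np.sqrt(np.arange(1.0, N)), 1)  # lowering operator, Fock basis

H_B = A.T @ A   # bosonic sector:   spectrum 0, 1, 2, ...
H_F = A @ A.T   # fermionic sector: spectrum 1, 2, 3, ... (one truncation artifact)

e_B = np.sort(np.linalg.eigvalsh(H_B))
e_F = np.sort(np.linalg.eigvalsh(H_F))[1:]  # drop the spurious zero from truncation

print(e_B[:4])  # [0. 1. 2. 3.] -> exact zero ground state: SUSY unbroken
print(e_F[:3])  # [1. 2. 3.]    -> every non-zero level appears in both sectors
```

A VQE run on H_B that converges to exactly zero, rather than some positive value, is then the signal that supersymmetry is preserved.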
Other Interests
I am extremely interested in Rust and video game development. More details coming soon...