Participants
Adam O'Hare
The time-dependent Schrödinger equation (TDSE) gives the most accurate description of the motion of matter on the quantum scale and is therefore key to understanding some of the most fundamental processes in nature. However, solving it requires complex mathematics and is computationally very time-consuming. With recent developments in deep-learning algorithms, machine learning has the potential to extract and learn patterns in the propagation of the TDSE, significantly enhancing the ability to perform simulations without relying on trial and error.
This project proposes using a long short-term memory (LSTM) model to accurately predict future steps of wavefunctions without the need to solve the TDSE directly. This is achieved using two models working simultaneously to predict both the position and amplitude of the wave at each time-step, which are then combined into the overall wavefunction prediction.
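A minimal sketch of this two-model idea is shown below, assuming PyTorch and an illustrative Gaussian wave-packet form; the network sizes, the `WavePropertyLSTM` and `predict_wavefunction` names, and the fixed packet width are assumptions for illustration, not the project's actual implementation.

```python
import torch
import torch.nn as nn


class WavePropertyLSTM(nn.Module):
    """Predicts the next value of one wave property (position or amplitude)
    from a window of previous time-steps. Hyperparameters are illustrative."""

    def __init__(self, input_size=1, hidden_size=64, num_layers=2):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, input_size)

    def forward(self, x):
        # x: (batch, time_steps, input_size)
        out, _ = self.lstm(x)
        # Use the final hidden state to predict the next time-step.
        return self.head(out[:, -1, :])


def predict_wavefunction(position_model, amplitude_model,
                         pos_history, amp_history, grid):
    """Combine the two predictions into a wavefunction on a spatial grid.
    The Gaussian shape and fixed width are assumptions for this sketch."""
    next_pos = position_model(pos_history)   # (batch, 1)
    next_amp = amplitude_model(amp_history)  # (batch, 1)
    width = 1.0                               # assumed fixed wave-packet width
    return next_amp * torch.exp(-((grid - next_pos) ** 2) / (2 * width ** 2))


# Example usage with random stand-in data
position_model = WavePropertyLSTM()
amplitude_model = WavePropertyLSTM()
pos_history = torch.randn(8, 20, 1)   # batch of 8 sequences, 20 past time-steps
amp_history = torch.randn(8, 20, 1)
grid = torch.linspace(-10.0, 10.0, 256)
psi_next = predict_wavefunction(position_model, amplitude_model,
                                pos_history, amp_history, grid)
print(psi_next.shape)  # torch.Size([8, 256])
```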
Funded by: Newcastle University Research Scholarship
Project Supervisor: Professor Tom Penfold