Testing general relativity with gravitational wave parameter estimation using machine learning
Summary
The ability to detect gravitational waves from coalescing binary black holes has opened up the possibility of testing general relativity in its most extreme regime of highly dynamical, strong spacetime curvature. Model-independent methods exist that allow for deviations from the predictions of general relativity by introducing testing parameters at selected places in the mathematical expression for the waveform, and these are routinely applied to signals from binary black holes. However, their computational requirements do not scale well with the increased number of detections expected from upcoming upgrades of the LIGO and Virgo interferometers. Furthermore, the added testing parameters enlarge the parameter space that must be explored, demanding yet more computational resources. A faster but equally robust method is therefore urgently needed.
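As an illustrative example of such a parameterized test (a common choice in the literature, not necessarily the specific construction used here), each post-Newtonian phase coefficient $\varphi_i$ of the frequency-domain waveform can be multiplied by a fractional testing parameter $\delta\hat{\varphi}_i$ that vanishes if general relativity is correct:

$$\varphi_i \;\rightarrow\; \varphi_i \left(1 + \delta\hat{\varphi}_i\right).$$

Measuring the posterior distribution of each $\delta\hat{\varphi}_i$ and checking its consistency with zero then constitutes a null test of general relativity.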
In this research, we present a simulation-based technique to speed up these measurements. To begin with, we consider individual binary black hole signals and apply a normalizing flow neural network to rapidly infer all parameters. The testing parameters studied in this thesis relate to a modified dispersion relation in the propagation of gravitational waves. We show that the presented model reliably infers the probability distributions of the testing parameters.
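To make the core mechanism concrete, the following is a minimal, self-contained sketch of the affine-coupling layer that underlies normalizing flows of this kind (RealNVP-style). It is not the thesis model: the scale and shift functions below are fixed illustrative formulas standing in for the small neural networks that would be trained in practice. It shows the two properties that enable fast inference: exact invertibility and a cheap, triangular Jacobian for the log-density.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "learned" parameters of the coupling layer: in a real flow,
# s and t would be outputs of neural networks conditioned on x1 (and, for
# posterior estimation, on the observed gravitational-wave data).
def scale_and_shift(x1):
    s = np.tanh(0.5 * x1)  # log-scale, bounded for numerical stability
    t = 0.1 * x1           # shift
    return s, t

def forward(x):
    """Map data x = (x1, x2) to latent z, returning log|det J|."""
    x1, x2 = x[:, :1], x[:, 1:]
    s, t = scale_and_shift(x1)
    z2 = x2 * np.exp(s) + t      # affine transform of the second half only
    log_det = s.sum(axis=1)      # Jacobian is triangular: det = prod(exp(s))
    return np.concatenate([x1, z2], axis=1), log_det

def inverse(z):
    """Exact inverse: x1 passes through, so s and t are recomputable."""
    z1, z2 = z[:, :1], z[:, 1:]
    s, t = scale_and_shift(z1)
    x2 = (z2 - t) * np.exp(-s)
    return np.concatenate([z1, x2], axis=1)

def log_prob(x):
    """Density under the flow: standard-normal base + change of variables."""
    z, log_det = forward(x)
    base = -0.5 * (z ** 2).sum(axis=1) - z.shape[1] / 2 * np.log(2 * np.pi)
    return base + log_det

x = rng.normal(size=(5, 2))
z, _ = forward(x)
assert np.allclose(inverse(z), x)  # invertibility check
print(log_prob(x))                 # per-sample log-densities
```

Stacking many such layers, with trained networks for `scale_and_shift`, yields a flexible density that can be sampled and evaluated in milliseconds, which is what makes amortized posterior estimation for the testing parameters feasible.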