Tetrad Constraints Beyond The Linear Case
Tetrad constraints can reveal the existence and structure of latent parents shared by groups of observed variables. However, because they are defined as vanishing constraints on the covariance matrix, they are unspecified when the variables are related through non-linear functions. Still, the distributions of the observed variables might contain enough information to judge whether a tetrad constraint holds even when the relations in the model are non-linear. To find out whether this is the case, a random forest classifier is trained on kernel mean embeddings of the distributions of the observed variables. The classifier is tested against the Wishart test, a statistical test for tetrad constraints, on test data sampled from a multitude of different pure and impure measurement models. It is found that if the test and training data share the same underlying graphical structures and data-generating process, the classifier can beat the Wishart test in cases where the tetrad constraints are not specified. If, however, the test data is sampled from a more complex graph than the training data, the classifier's results degrade. A possible explanation is that the more complex graphs produce distributions that do not occur in the training graphs. Further research is needed to determine how the variety of training distributions can be increased without the search space growing too large.
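The vanishing constraint mentioned above can be illustrated with a minimal numerical sketch (not taken from the thesis): in a linear one-factor measurement model X_i = lambda_i * L + eps_i, the population covariances satisfy sigma_ij = lambda_i * lambda_j for i != j, so tetrad differences such as sigma_12 * sigma_34 - sigma_13 * sigma_24 vanish. The loadings and noise scale below are arbitrary choices for the example.

```python
import numpy as np

# One-factor model with four observed children of a single latent L.
# Loadings and noise level are illustrative, not from the thesis.
rng = np.random.default_rng(0)
n = 200_000
L = rng.normal(size=n)
loadings = np.array([0.9, 0.8, 0.7, 0.6])
X = np.outer(L, loadings) + 0.5 * rng.normal(size=(n, 4))

S = np.cov(X, rowvar=False)

def tetrad(S, i, j, k, l):
    """Empirical tetrad difference sigma_ij * sigma_kl - sigma_ik * sigma_jl."""
    return S[i, j] * S[k, l] - S[i, k] * S[j, l]

# All three tetrads over the four variables should be near zero,
# since sigma_ij * sigma_kl = lambda_i lambda_j lambda_k lambda_l
# for every pairing of distinct indices.
t1 = tetrad(S, 0, 1, 2, 3)
t2 = tetrad(S, 0, 2, 1, 3)
t3 = tetrad(S, 0, 3, 1, 2)
print(t1, t2, t3)
```

Once the structural equations become non-linear, the covariance matrix no longer factorizes this way, which is why the constraint is unspecified in that case.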
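The classification pipeline can also be sketched, under heavy simplification: each dataset is summarized by an approximate kernel mean embedding (here via random Fourier features for an RBF kernel), and a classifier is trained on those embeddings. A nearest-centroid rule stands in for the random forest used in the thesis, and "pure" versus "impure" is reduced to a one-factor model versus independent noise; all names and parameters are assumptions for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 4     # number of observed variables
F = 100   # number of random Fourier features
W = rng.normal(size=(D, F))              # RBF frequencies
b = rng.uniform(0, 2 * np.pi, size=F)    # phase offsets

def mean_embedding(X):
    """Approximate kernel mean embedding: average RFF feature map over samples."""
    return np.sqrt(2.0 / F) * np.cos(X @ W + b).mean(axis=0)

def sample_pure(n):
    """One-factor measurement model: tetrad constraints hold."""
    L = rng.normal(size=n)
    return np.outer(L, [0.9, 0.8, 0.7, 0.6]) + 0.5 * rng.normal(size=(n, D))

def sample_impure(n):
    """Independent noise: no shared latent parent, stand-in for 'impure'."""
    return rng.normal(size=(n, D))

# Train: average embedding per class over many sampled datasets.
train = {c: np.mean([mean_embedding(f(1000)) for _ in range(50)], axis=0)
         for c, f in [("pure", sample_pure), ("impure", sample_impure)]}

def classify(X):
    """Assign a dataset to the class with the nearest mean embedding."""
    e = mean_embedding(X)
    return min(train, key=lambda c: np.linalg.norm(e - train[c]))

correct = sum(classify(sample_pure(1000)) == "pure" for _ in range(20))
correct += sum(classify(sample_impure(1000)) == "impure" for _ in range(20))
print(correct, "of 40 test datasets classified correctly")
```

The sketch also hints at the generalization problem the abstract reports: the nearest-centroid rule (like the random forest) can only separate embeddings that resemble those seen during training, so test graphs producing unfamiliar distributions would degrade its accuracy.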