arXiv: stat.ML

Consistent causal discovery with equal error variances: a least-squares perspective

Abstract

We consider the problem of recovering the true causal structure among a set of variables generated by a linear acyclic structural equation model (SEM) whose error terms are independent and have equal variances. It is well known that, under this assumption, the true underlying directed acyclic graph (DAG) encoding the causal structure is uniquely identifiable. In this work, we establish that the sum over all variables of the minimum expected squared error, when each variable is predicted by the best linear combination of its parents in a candidate DAG, is minimised if and only if that candidate DAG is a supergraph of the true DAG. This property is further utilised to design a Bayesian DAG selection method that recovers the true graph consistently.
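
To illustrate the least-squares property stated above (this is only a rough numerical sketch, not the paper's Bayesian DAG selection procedure), the Python snippet below simulates a three-variable linear SEM with equal error variances and compares the sum of sample residual variances under the true DAG, a supergraph of it, and a DAG with a reversed edge. The example graph, coefficients, and the helper `residual_variance_sum` are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch, assuming a toy SEM X1 -> X2 -> X3 with unit error variances:
# the sum of least-squares residual variances approximates the sum of minimum
# expected squared errors and should be (near-)minimal for the true DAG and
# any supergraph of it, and strictly larger for a non-supergraph.
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Simulate the true linear SEM with equal (unit) error variances.
e = rng.normal(scale=1.0, size=(n, 3))
x1 = e[:, 0]
x2 = 0.8 * x1 + e[:, 1]
x3 = -0.5 * x2 + e[:, 2]
X = np.column_stack([x1, x2, x3])

def residual_variance_sum(X, parents):
    """Sum over variables of the residual variance when each variable is
    regressed (ordinary least squares) on its parent set in a candidate DAG."""
    total = 0.0
    for j, pa in enumerate(parents):
        y = X[:, j]
        if pa:
            Z = X[:, pa]
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
            resid = y - Z @ beta
        else:
            resid = y
        total += resid.var()
    return total

true_dag = [[], [0], [1]]        # parents of X1, X2, X3 in the true graph
supergraph = [[], [0], [0, 1]]   # extra edge X1 -> X3: same minimal score
reversed_dag = [[1], [], [1]]    # edge X1 -> X2 reversed: strictly larger score

for name, g in [("true DAG", true_dag),
                ("supergraph", supergraph),
                ("reversed edge", reversed_dag)]:
    print(f"{name:13s} score = {residual_variance_sum(X, g):.3f}")
```

With these assumed coefficients, the true DAG and its supergraph both score close to 3 (the sum of the three unit error variances), while the reversed-edge DAG scores noticeably higher, in line with the "if and only if" characterisation in the abstract.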