Problems with information theoretic approaches to causal learning

10/24/2021
by Nithin Nagaraj, et al.

The language of information theory is favored in both causal reasoning and machine learning frameworks. But is there a better language? In this study, we demonstrate the pitfalls of information-theoretic estimation using first-order statistics on (short) sequences for causal learning. We recommend data compression based approaches for causality testing, since these make far fewer assumptions about the data than information-theoretic measures and are more robust to finite data length effects. We conclude with a discussion of the challenges posed in modeling the effect of conditioning a process X on another process Y in causal machine learning. Specifically, conditioning can increase 'confusion', which is difficult to model with classical information theory. A conscious causal agent creates new choices, decisions and meaning, which poses huge challenges for AI.
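
The sketch below is a minimal illustration (not taken from the paper) of the finite-length pitfall the abstract refers to: a first-order plug-in estimate of mutual information between two *independent* symbolic sequences is biased upward for short lengths, so unrelated processes can appear coupled. The 4-symbol alphabet, sequence lengths, and trial count are illustrative choices; the compression-complexity causality measures the authors recommend as a remedy are not reproduced here.

```python
# Sketch: upward bias of first-order (plug-in) mutual information on short
# sequences. X and Y are independent, so the true I(X;Y) is 0 bits, yet the
# plug-in estimate is clearly positive for small n and shrinks as n grows.

import math
import random
from collections import Counter

def plugin_entropy(symbols):
    """First-order maximum-likelihood Shannon entropy estimate, in bits."""
    n = len(symbols)
    counts = Counter(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def plugin_mutual_information(x, y):
    """Plug-in estimate I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return plugin_entropy(x) + plugin_entropy(y) - plugin_entropy(list(zip(x, y)))

random.seed(0)
alphabet = [0, 1, 2, 3]   # illustrative 4-symbol alphabet
trials = 200

for n in (20, 50, 200, 2000):
    estimates = []
    for _ in range(trials):
        x = [random.choice(alphabet) for _ in range(n)]
        y = [random.choice(alphabet) for _ in range(n)]   # independent of x
        estimates.append(plugin_mutual_information(x, y))
    mean_mi = sum(estimates) / trials
    print(f"n = {n:5d}   mean plug-in I(X;Y) = {mean_mi:.3f} bits   (true value: 0)")
```

For a 4x4 alphabet the expected spurious value at n = 20 is roughly 0.3 bits, which is why naive information-theoretic tests on short sequences can suggest causal coupling where none exists.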
