Causal discovery for observational sciences using supervised machine learning

Research output: Contribution to journal › Journal article › Research › peer-review

Standard

Causal discovery for observational sciences using supervised machine learning. / Petersen, Anne Helby; Ramsey, Joseph; Ekstrøm, Claus Thorn; Spirtes, Peter.

In: Journal of Data Science, Vol. 21, No. 2, 2023, p. 255-280.


Harvard

Petersen, AH, Ramsey, J, Ekstrøm, CT & Spirtes, P 2023, 'Causal discovery for observational sciences using supervised machine learning', Journal of Data Science, vol. 21, no. 2, pp. 255-280. https://doi.org/10.6339/23-JDS1088

APA

Petersen, A. H., Ramsey, J., Ekstrøm, C. T., & Spirtes, P. (2023). Causal discovery for observational sciences using supervised machine learning. Journal of Data Science, 21(2), 255-280. https://doi.org/10.6339/23-JDS1088

Vancouver

Petersen AH, Ramsey J, Ekstrøm CT, Spirtes P. Causal discovery for observational sciences using supervised machine learning. Journal of Data Science. 2023;21(2):255-280. https://doi.org/10.6339/23-JDS1088

Author

Petersen, Anne Helby ; Ramsey, Joseph ; Ekstrøm, Claus Thorn ; Spirtes, Peter. / Causal discovery for observational sciences using supervised machine learning. In: Journal of Data Science. 2023 ; Vol. 21, No. 2. pp. 255-280.

BibTeX

@article{30fc766bb5a347189cd7e77c1db1ab36,
title = "Causal discovery for observational sciences using supervised machine learning",
abstract = "Causal inference can estimate causal effects, but unless data are collected experimentally, statistical analyses must rely on pre-specified causal models. Causal discovery algorithms are empirical methods for constructing such causal models from data. Several asymptotically correct discovery methods already exist, but they generally struggle on smaller samples. Moreover, most methods focus on very sparse causal models, which may not always be a realistic representation of real-life data generating mechanisms. Finally, while causal relationships suggested by the methods often hold true, their claims about causal non-relatedness have high error rates. This non-conservative error trade off is not ideal for observational sciences, where the resulting model is directly used to inform causal inference: A causal model with many missing causal relations entails too strong assumptions and may lead to biased effect estimates. We propose a new causal discovery method that addresses these three shortcomings: Supervised learning discovery (SLdisco). SLdisco uses supervised machine learning to obtain a mapping from observational data to equivalence classes of causal models. We evaluate SLdisco in a large simulation study based on Gaussian data and we consider several choices of model size and sample size. We find that SLdisco is more conservative, only moderately less informative and less sensitive towards sample size than existing procedures. We furthermore provide a real epidemiological data application. We use random subsampling to investigate real data performance on small samples and again find that SLdisco is less sensitive towards sample size and hence seems to better utilize the information available in small datasets.",
author = "Petersen, {Anne Helby} and Joseph Ramsey and Ekstr{\o}m, {Claus Thorn} and Peter Spirtes",
year = "2023",
doi = "10.6339/23-JDS1088",
language = "English",
volume = "21",
pages = "255--280",
journal = "Journal of Data Science",
issn = "1683-8602",
number = "2",

}

RIS

TY - JOUR

T1 - Causal discovery for observational sciences using supervised machine learning

AU - Petersen, Anne Helby

AU - Ramsey, Joseph

AU - Ekstrøm, Claus Thorn

AU - Spirtes, Peter

PY - 2023

Y1 - 2023

N2 - Causal inference can estimate causal effects, but unless data are collected experimentally, statistical analyses must rely on pre-specified causal models. Causal discovery algorithms are empirical methods for constructing such causal models from data. Several asymptotically correct discovery methods already exist, but they generally struggle on smaller samples. Moreover, most methods focus on very sparse causal models, which may not always be a realistic representation of real-life data-generating mechanisms. Finally, while causal relationships suggested by the methods often hold true, their claims about causal non-relatedness have high error rates. This non-conservative error trade-off is not ideal for observational sciences, where the resulting model is directly used to inform causal inference: A causal model with many missing causal relations entails too strong assumptions and may lead to biased effect estimates. We propose a new causal discovery method that addresses these three shortcomings: Supervised learning discovery (SLdisco). SLdisco uses supervised machine learning to obtain a mapping from observational data to equivalence classes of causal models. We evaluate SLdisco in a large simulation study based on Gaussian data and we consider several choices of model size and sample size. We find that SLdisco is more conservative, only moderately less informative and less sensitive to sample size than existing procedures. We furthermore provide a real epidemiological data application. We use random subsampling to investigate real-data performance on small samples and again find that SLdisco is less sensitive to sample size and hence seems to better utilize the information available in small datasets.

AB - Causal inference can estimate causal effects, but unless data are collected experimentally, statistical analyses must rely on pre-specified causal models. Causal discovery algorithms are empirical methods for constructing such causal models from data. Several asymptotically correct discovery methods already exist, but they generally struggle on smaller samples. Moreover, most methods focus on very sparse causal models, which may not always be a realistic representation of real-life data-generating mechanisms. Finally, while causal relationships suggested by the methods often hold true, their claims about causal non-relatedness have high error rates. This non-conservative error trade-off is not ideal for observational sciences, where the resulting model is directly used to inform causal inference: A causal model with many missing causal relations entails too strong assumptions and may lead to biased effect estimates. We propose a new causal discovery method that addresses these three shortcomings: Supervised learning discovery (SLdisco). SLdisco uses supervised machine learning to obtain a mapping from observational data to equivalence classes of causal models. We evaluate SLdisco in a large simulation study based on Gaussian data and we consider several choices of model size and sample size. We find that SLdisco is more conservative, only moderately less informative and less sensitive to sample size than existing procedures. We furthermore provide a real epidemiological data application. We use random subsampling to investigate real-data performance on small samples and again find that SLdisco is less sensitive to sample size and hence seems to better utilize the information available in small datasets.

U2 - 10.6339/23-JDS1088

DO - 10.6339/23-JDS1088

M3 - Journal article

VL - 21

SP - 255

EP - 280

JO - Journal of Data Science

JF - Journal of Data Science

SN - 1683-8602

IS - 2

ER -
