Abstract
We propose a novel approach for learning graphical models when data from different experimental conditions are available. We argue that classical constraint-based algorithms can be readily applied to mixtures of experimental data, given an appropriate conditional independence test. We show that, when perfect statistical inference is assumed, a sound conditional independence test for mixtures of experimental data consists of evaluating the null hypothesis of conditional independence separately for each experimental condition. We then indicate how this test can be modified in order to take statistical errors into account. Finally, we provide proof-of-concept results demonstrating the validity of our claims.
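The test described above can be sketched in code: evaluate conditional independence separately within each experimental condition, then declare dependence if any condition rejects the null. The sketch below is a hypothetical illustration, not the authors' implementation; it assumes continuous data, uses a Fisher z-test of partial correlation as the per-condition test, and a Holm step-down correction (one plausible way to account for statistical errors across conditions).

```python
import math
import numpy as np

def ci_pvalue(x, y, z):
    """Fisher z-test p-value for the partial correlation of x and y
    given the columns of z (an n-by-k array, possibly k = 0)."""
    n = len(x)
    # Residualize x and y on z (with an intercept term).
    Z = np.column_stack([np.ones(n), z]) if z.size else np.ones((n, 1))
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    r = np.corrcoef(rx, ry)[0, 1]
    k = z.shape[1]
    # Fisher's z-transform; two-sided p-value via the normal tail.
    stat = math.sqrt(n - k - 3) * np.arctanh(np.clip(r, -0.999999, 0.999999))
    return math.erfc(abs(stat) / math.sqrt(2))

def mixture_ci_test(datasets, x_idx, y_idx, z_idx, alpha=0.05):
    """Test X independent of Y given Z across experimental conditions.

    Runs the per-condition test in each dataset and applies a Holm
    step-down correction: the variables are declared dependent if any
    condition rejects the per-condition null (hypothetical combination
    rule). Returns True if independence is accepted in every condition.
    """
    pvals = sorted(
        ci_pvalue(d[:, x_idx], d[:, y_idx], d[:, z_idx]) for d in datasets
    )
    m = len(pvals)
    for i, p in enumerate(pvals):  # Holm's sequentially rejective procedure
        if p < alpha / (m - i):
            return False  # dependent in at least one condition
    return True  # independent in every condition
```

A caller would pass one data matrix per experimental condition, for example `mixture_ci_test([data_cond1, data_cond2], 0, 1, [2])` to test whether columns 0 and 1 are independent given column 2 in both conditions.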
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Lagani, V., Tsamardinos, I., Triantafillou, S. (2012). Learning from Mixture of Experimental Data: A Constraint-Based Approach. In: Maglogiannis, I., Plagianakos, V., Vlahavas, I. (eds) Artificial Intelligence: Theories and Applications. SETN 2012. Lecture Notes in Computer Science, vol 7297. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-30448-4_16
DOI: https://doi.org/10.1007/978-3-642-30448-4_16
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-30447-7
Online ISBN: 978-3-642-30448-4
eBook Packages: Computer Science (R0)