[Fig. 5 plots omitted. (a) Legend listing the 8 approaches: fuzzy PC gauss, fuzzy FCI gauss, fuzzy PC KCI, fuzzy FCI KCI, fuzzy PC SCI 20, fuzzy FCI SCI 20, corr τ, random. (b) Precision vs. n. (c) Recall vs. n.]
Fig. 5: For the 8 approaches presented in the legend (5a),
mean precision (5b) and recall (5c) scores as a function of
the number of observations n. All results are averaged over
at least 485 simulations.
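The precision and recall scores reported in Fig. 5 can be sketched as follows for a recovered edge set compared against a ground-truth graph; the function name and edge representation are our own illustrative choices, not the paper's implementation:

```python
def edge_precision_recall(true_edges, found_edges):
    """Precision and recall of discovered edges against the ground truth.

    Edges are normalized to frozensets {a, b} so that orientation is
    ignored, as in a skeleton-level evaluation.
    """
    true_set = {frozenset(e) for e in true_edges}
    found_set = {frozenset(e) for e in found_edges}
    tp = len(true_set & found_set)  # correctly recovered edges
    precision = tp / len(found_set) if found_set else 0.0
    recall = tp / len(true_set) if true_set else 0.0
    return precision, recall

# Toy example: truth is X-Y and Y-Z; the algorithm finds X-Y plus a
# spurious X-Z, giving precision 1/2 and recall 1/2.
p, r = edge_precision_recall([("X", "Y"), ("Y", "Z")],
                             [("X", "Y"), ("X", "Z")])
print(p, r)  # 0.5 0.5
```

Averaging such per-simulation scores over the ≥485 runs yields the curves in Fig. 5b and 5c.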
A first perspective for this work is to apply our proposal
to a real-world application in collaboration with domain experts.
A second task will be to improve the fuzzy set extraction step:
we plan to optimize the interpretability and causal relevance
of the extracted fuzzy sets. We also plan to handle joint effects,
that is, the case of "and"/"or" combinations in the fuzzy
premises. However, introducing joint effects into our fuzzy
causal discovery procedure would increase its complexity, which
would in turn need to be optimized.
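As a minimal illustration of the "and"/"or" combinations mentioned above, membership degrees of fuzzy premises are classically combined with a t-norm and a t-conorm; the sketch below uses the standard min/max pair and is only an assumption about how joint premises could be evaluated, not the paper's procedure:

```python
def fuzzy_and(*degrees):
    """'and' combination of fuzzy premises: minimum t-norm."""
    return min(degrees)

def fuzzy_or(*degrees):
    """'or' combination of fuzzy premises: maximum t-conorm."""
    return max(degrees)

# Hypothetical membership degrees of one observation in two fuzzy sets,
# e.g. "temperature is high" = 0.8 and "pressure is low" = 0.3.
mu_high, mu_low = 0.8, 0.3
print(fuzzy_and(mu_high, mu_low))  # 0.3 -> "high and low" premise
print(fuzzy_or(mu_high, mu_low))   # 0.8 -> "high or low" premise
```

Testing a joint premise as a cause would require considering combinations of fuzzy sets rather than single sets, which is the source of the combinatorial complexity noted above.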