The FEDHC Bayesian network learning algorithm

Michail Tsagris

The paper proposes a new hybrid Bayesian network learning algorithm, termed Forward Early Dropping Hill Climbing (FEDHC), designed to work with either continuous or categorical data. FEDHC consists of a skeleton identification phase (learning the conditional associations among the variables) followed by a scoring phase that assigns the causal directions. For the case of continuous data, a version of FEDHC that is robust to outliers is also proposed. The paper demonstrates that the only existing implementation of MMHC in the statistical software R is prohibitively expensive, and a new implementation is offered. FEDHC is tested via Monte Carlo simulations that clearly show it is computationally efficient and produces Bayesian networks of similar or higher accuracy than those of MMHC and PCHC. FEDHC yields more accurate Bayesian networks than PCHC with continuous data, but less accurate ones with categorical data. Finally, an application of the FEDHC, PCHC and MMHC algorithms to real data from the field of economics is demonstrated using the statistical software R.
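The skeleton phase described above relies on forward selection with early dropping: candidate variables whose conditional association with the target becomes non-significant are removed from all subsequent iterations, which keeps the search cheap. A minimal illustrative sketch of that idea in Python follows (the paper's own implementation is in R); the Fisher z-test for partial correlation, the OLS-residual computation of partial correlation, and all function names here are assumptions for illustration, not the authors' code.

```python
import numpy as np
from math import erfc, sqrt

def fisher_z_pvalue(r, n, k):
    """Two-sided p-value for a (partial) correlation r via Fisher's
    z-transform; n = sample size, k = size of the conditioning set."""
    r = float(np.clip(r, -0.999999, 0.999999))
    z = 0.5 * np.log((1 + r) / (1 - r)) * sqrt(n - k - 3)
    return erfc(abs(z) / sqrt(2))  # normal tail probability

def partial_corr(x, y, Z):
    """Partial correlation of x and y given the columns of Z,
    computed from OLS residuals."""
    if Z.shape[1] == 0:
        return np.corrcoef(x, y)[0, 1]
    A = np.column_stack([np.ones(len(x)), Z])
    rx = x - A @ np.linalg.lstsq(A, x, rcond=None)[0]
    ry = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

def forward_early_dropping(X, y, alpha=0.05):
    """One forward-selection pass with early dropping: at each step,
    every remaining variable not significant at level alpha (given the
    variables selected so far) is dropped for good."""
    n, p = X.shape
    selected, remaining = [], list(range(p))
    while remaining:
        pvals = {j: fisher_z_pvalue(
                        partial_corr(X[:, j], y, X[:, selected]),
                        n, len(selected))
                 for j in remaining}
        # early dropping: discard all non-significant candidates
        remaining = [j for j in remaining if pvals[j] <= alpha]
        if not remaining:
            break
        best = min(remaining, key=pvals.get)
        selected.append(best)
        remaining.remove(best)
    return selected
```

In a full skeleton-identification phase this pass would be run with each variable in turn as the target, after which the scoring phase (e.g. hill climbing) orients the resulting undirected edges.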
