We introduce Ensembled Dynamic Bayesian Networks (EDBN), an ensemble approach for learning salient dependence relationships and predicting values in continuous temporal data. By training each Bayesian network on both a subset of the data (bagging) and a subset of the attributes (randomization), EDBN produces models for continuous domains that can identify the important variables in a dataset and the relationships between them. We use linear Gaussian distributions within our ensembles, allowing EDBN to handle continuous data while keeping network-level inference efficient; ensembling these networks lets EDBN represent nonlinear relationships. EDBN can also be used to explore the properties of a domain, both by counting how frequently two variables are found to be dependent on one another during network construction, and by estimating each variable's importance for prediction.
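The core ensembling idea described above can be sketched in a few lines. This is a minimal illustration, not the thesis implementation: each member here is a plain linear-Gaussian (least-squares) model fit to a bootstrap sample of the rows and a random subset of the attributes, and predictions are averaged across members. All names and parameters (`fit_edbn_sketch`, `n_members`, `n_feats`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_edbn_sketch(X, y, n_members=25, n_feats=2):
    """Toy sketch of the ensemble recipe: each member is trained on a
    bootstrap sample of rows (bagging) and a random attribute subset
    (randomization), with a linear-Gaussian (least-squares) model."""
    members = []
    n, d = X.shape
    for _ in range(n_members):
        rows = rng.integers(0, n, size=n)                   # bootstrap rows
        cols = rng.choice(d, size=n_feats, replace=False)   # random attributes
        A = np.column_stack([X[rows][:, cols], np.ones(n)]) # add intercept
        w, *_ = np.linalg.lstsq(A, y[rows], rcond=None)
        members.append((cols, w))
    return members

def predict(members, X):
    """Average the members' predictions; although each member is linear,
    the ensemble mean over varied subsets can track nonlinear structure."""
    preds = []
    for cols, w in members:
        A = np.column_stack([X[:, cols], np.ones(len(X))])
        preds.append(A @ w)
    return np.mean(preds, axis=0)

# Usage on synthetic data with a nonlinear target
X = rng.normal(size=(200, 4))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]
members = fit_edbn_sketch(X, y)
yhat = predict(members, X)
```

Counting how often each attribute appears in a member's `cols` gives a crude analogue of the dependence-frequency statistics the abstract mentions.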
We empirically demonstrate the effectiveness of EDBN on two meteorological domains. The first is the Storm Scale Ensemble Forecast (SSEF), which contains forecasts from multiple meteorological models. We show that EDBN can combine these forecasts to produce rainfall predictions that outperform the mean of the individual forecasts. In the second domain, we demonstrate EDBN's utility for storm prediction, with empirical results showing that EDBN outperforms both the model forecast and a persistence baseline.
Scott Hellman (2012). Learning Ensembled Dynamic Bayesian Networks. Master's Thesis, School of Computer Science, University of Oklahoma.
The code for the thesis is available upon request.
Created by amcgovern [at] ou.edu.
Last modified June 12, 2017 12:57 PM