Learning Conditional Independence Relations from a Probabilistic Model
Author
Abstract
We consider the problem of learning conditional independencies, expressed as a Markov network, from a probabilistic model. An efficient algorithm employing a greedy search has been developed earlier with promising empirical results. However, two issues were not addressed. First, the reason why the myopic search works so well globally has not been fully understood. Second, whether the algorithm can find a correct Markov network in all cases has not been formally established. In this paper, we prove that, for any given probabilistic model, the algorithm will always produce a Markov network whose structure is an independence map of the underlying model and whose associated probability distribution is identical to the underlying model. The proof also offers deeper insight into the algorithm's working mechanism. As the problem of learning a minimal independence map of a given probabilistic model is NP-hard in general, our polynomial-time algorithm does not guarantee minimality in all cases. We show, however, that if the given probabilistic model belongs to a subclass that has a singly connected independence map, the algorithm will always produce a Markov network whose structure is a minimal independence map.
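The abstract does not reproduce the algorithm itself. As a minimal sketch of the idea it builds on, the pairwise Markov property states that, for a positive distribution, an independence map may omit the edge {x, y} exactly when x is independent of y given all remaining variables. The sketch below assumes a hypothetical oracle `independent(x, y, cond)` that answers such queries against the given model; it is an illustration of the construction, not the paper's greedy procedure.

```python
from itertools import combinations

def markov_network_sketch(variables, independent):
    """Build an undirected structure via the pairwise Markov property.

    `independent(x, y, cond)` is a hypothetical oracle reporting whether
    x is independent of y given the variable set `cond` in the model.
    """
    edges = set()
    for x, y in combinations(variables, 2):
        rest = set(variables) - {x, y}
        # Keep the edge only if x and y stay dependent given everything else.
        if not independent(x, y, rest):
            edges.add(frozenset({x, y}))
    return edges
```

For strictly positive distributions this pairwise construction is known to yield the unique minimal independence map; in general it is only a simplified view of the problem the paper studies.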
Related papers
Bayesian Test of Significance for Conditional Independence: The Multinomial Model
Conditional independence tests have recently received special attention in the machine learning and computational intelligence literature as an important indicator of the relationships among the variables used by such models. In the field of probabilistic graphical models, which includes Bayesian network models, conditional independence tests are especially important for the task of learning ...
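The excerpt stops before the test is defined. For orientation only, the sketch below implements the classical frequentist G² (likelihood-ratio) test of X ⊥ Y | Z on a three-way contingency table of counts; it is a conventional stand-in, not the Bayesian significance test this paper proposes.

```python
import numpy as np
from scipy.stats import chi2

def g_test_ci(counts):
    """G^2 test of X independent of Y given Z on a 3-way table of counts.

    `counts[i, j, k]` is the observed count of (X=i, Y=j, Z=k).
    Returns the G^2 statistic and its asymptotic p-value.
    """
    counts = np.asarray(counts, dtype=float)
    g2 = 0.0
    for k in range(counts.shape[2]):
        slab = counts[:, :, k]  # X-by-Y table within stratum Z=k
        n = slab.sum()
        if n == 0:
            continue
        # Expected counts under independence within this stratum.
        expected = np.outer(slab.sum(axis=1), slab.sum(axis=0)) / n
        mask = slab > 0
        g2 += 2.0 * (slab[mask] * np.log(slab[mask] / expected[mask])).sum()
    nx, ny, nz = counts.shape
    df = (nx - 1) * (ny - 1) * nz  # degrees of freedom
    return g2, chi2.sf(g2, df)    # small p-value => reject independence
```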
Markov Properties for Linear Causal Models with Correlated Errors
A linear causal model with correlated errors, represented by a DAG with bi-directed edges, can be tested by the set of conditional independence relations implied by the model. A global Markov property specifies, by the d-separation criterion, the set of all conditional independence relations holding in any model associated with a graph. A local Markov property specifies a much smaller set of co...
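The d-separation criterion referenced above can be decided by the standard moralization reduction: restrict the DAG to the ancestors of the three node sets, marry co-parents, drop edge directions, delete the conditioning set, and test reachability. The sketch below is an illustrative implementation of that textbook reduction, not a procedure from the paper; `parents` maps each DAG node to its set of parents.

```python
from collections import deque

def d_separated(parents, xs, ys, zs):
    """Decide whether node sets `xs` and `ys` are d-separated by `zs`."""
    xs, ys, zs = set(xs), set(ys), set(zs)

    # 1. Ancestral set of xs, ys, and zs (each node counts as its own ancestor).
    anc, stack = set(), list(xs | ys | zs)
    while stack:
        v = stack.pop()
        if v not in anc:
            anc.add(v)
            stack.extend(parents.get(v, ()))

    # 2. Moralize: undirected parent-child edges plus edges between co-parents.
    adj = {v: set() for v in anc}
    for v in anc:
        ps = [p for p in parents.get(v, ()) if p in anc]
        for p in ps:
            adj[v].add(p)
            adj[p].add(v)
        for i, p in enumerate(ps):
            for q in ps[i + 1:]:
                adj[p].add(q)
                adj[q].add(p)

    # 3. Remove zs and test whether ys is reachable from xs.
    seen = set(xs - zs)
    queue = deque(seen)
    while queue:
        v = queue.popleft()
        if v in ys:
            return False  # connecting path exists: not d-separated
        for w in adj[v] - zs - seen:
            seen.add(w)
            queue.append(w)
    return True
```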
Generalized Conditional Independence and Decomposition Cognizant Curvature: Implications for Function Optimization
We introduce conditional independence for real-valued set functions, which generalizes probabilistic conditional independence. We show that a natural semantics of conditional independence is that of local modularity. Generalized conditional independence leads to a spectrum between two extremes: modular functions and functions without any local modularity. We develop a decomposition theory and re...
Reasoning about Independence in Probabilistic Models of Relational Data
Bayesian networks leverage conditional independence to compactly encode joint probability distributions. Many learning algorithms exploit the constraints implied by observed conditional independencies to learn the structure of Bayesian networks. The rules of d-separation provide a theoretical and algorithmic framework for deriving conditional independence facts from model structure. However, t...
An Introduction to Inference and Learning in Bayesian Networks
Bayesian networks (BNs) are modern tools for modeling phenomena in dynamic and static systems and are used in areas such as disease diagnosis, weather forecasting, decision making, and clustering. A BN is a probabilistic graphical model that represents causal relations among random variables and consists of a directed acyclic graph and a set of conditional probabilities. Structure...