Combinatorial optimization for graphical structures in machine learning
Graphs are an essential topic in machine learning. In this proposal, we explore problems in graphical models (where a probability distribution has conditional independencies specified by a graph), causality (where a directed graph specifies causal directions), and clustering (where a weighted graph denotes related items).

For our first contribution, we consider the facility location problem. Our goal is to select a set of k "facilities" so as to maximize the average benefit each "town" receives from its most beneficial selected facility. As input, we receive a bipartite graph in which every edge carries a weight denoting the benefit the town would receive from the facility if it were selected. The input graph is often dense, with O(n²) edges. We analyze sparsifying the graph with nearest neighbor methods and give a tight characterization of how sparse each method can make the graph while approximately preserving the value of the optimal solution. We then evaluate these approaches experimentally and observe that they yield large speedups while retaining high-quality solutions.

Next, we consider the MAP inference problem in discrete Markov Random Fields: finding the most likely configuration of the field. One common approach to finding the MAP solution is integer programming. We analyze the complexity of integer programming under the assumption that the linear programming relaxation of the integer program has a polynomial number of fractional vertices, and we show that under this assumption the optimal MAP assignment can be found in polynomial time. This result generalizes to arbitrary integer programs whose solution set is contained in the unit hypercube: any such program whose linear programming relaxation has a polynomial number of fractional vertices can be solved in polynomial time.
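The sparsify-then-optimize pipeline for facility location can be sketched as follows. This is a minimal illustration under our own assumptions, not the thesis's algorithm: `sparsify_topt` keeps each town's t largest benefit edges (a nearest-neighbor-style sparsification), and `greedy_facility_location` is the classical greedy method for this submodular objective. The function names and the benefit matrix layout are hypothetical.

```python
import numpy as np

def sparsify_topt(benefits, t):
    """Nearest-neighbor-style sparsification: keep only each town's
    t largest benefit edges, zeroing out the rest.

    benefits: (n_towns, n_facilities) array of edge weights.
    """
    sparse = np.zeros_like(benefits)
    for i, row in enumerate(benefits):
        top = np.argpartition(row, -t)[-t:]   # indices of the t largest entries
        sparse[i, top] = row[top]
    return sparse

def greedy_facility_location(benefits, k):
    """Classical greedy for the (submodular) facility location objective:
    repeatedly add the facility with the largest marginal gain in total
    benefit, where each town counts its best selected facility."""
    n_towns, _ = benefits.shape
    best = np.zeros(n_towns)                  # current best benefit per town
    chosen = []
    for _ in range(k):
        # marginal gain of adding each facility j, computed in one shot
        gains = np.maximum(benefits, best[:, None]).sum(axis=0) - best.sum()
        j = int(np.argmax(gains))
        chosen.append(j)
        best = np.maximum(best, benefits[:, j])
    return chosen, best.mean()
```

Running the greedy on the sparsified matrix instead of the dense one is what yields the speedup: each marginal-gain computation touches only the surviving edges.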
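To make the LP relaxation concrete, here is the standard local-polytope relaxation for a toy two-variable binary MRF, written with scipy's `linprog`. On a tree such as this single edge the relaxation is known to be tight, so the LP optimum lands on an integral vertex; the potentials and variable ordering below are illustrative assumptions, not taken from the thesis.

```python
import numpy as np
from scipy.optimize import linprog

# Variable order: mu1(0), mu1(1), mu2(0), mu2(1),
#                 mu12(0,0), mu12(0,1), mu12(1,0), mu12(1,1)
theta1 = np.array([0.0, 1.0])          # unary potentials on x1
theta2 = np.array([0.0, 0.5])          # unary potentials on x2
theta12 = np.array([2.0, 0.0,          # pairwise potentials, row-major
                    0.0, 2.0])

# linprog minimizes, so negate to maximize the total potential
c = -np.concatenate([theta1, theta2, theta12])

A_eq = np.array([
    [1, 1, 0, 0,  0,  0,  0,  0],   # mu1 sums to 1
    [0, 0, 1, 1,  0,  0,  0,  0],   # mu2 sums to 1
    [1, 0, 0, 0, -1, -1,  0,  0],   # mu12(0,.) marginalizes to mu1(0)
    [0, 1, 0, 0,  0,  0, -1, -1],   # mu12(1,.) marginalizes to mu1(1)
    [0, 0, 1, 0, -1,  0, -1,  0],   # mu12(.,0) marginalizes to mu2(0)
    [0, 0, 0, 1,  0, -1,  0, -1],   # mu12(.,1) marginalizes to mu2(1)
])
b_eq = np.zeros(6)
b_eq[:2] = 1.0

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1))
```

Here the optimal configuration is (x1, x2) = (1, 1) with value 1 + 0.5 + 2 = 3.5, and the LP recovers it exactly. On loopy graphs the relaxation can have fractional vertices; the thesis's result concerns the regime where there are only polynomially many of them.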
We then consider the minimum cost intervention design problem: given the essential graph of a causal graph and a cost to intervene on each variable, identify a set of interventions with minimum total cost that can learn any causal graph consistent with the given essential graph. We first show that this problem is NP-hard. We then prove that a greedy algorithm achieves a constant factor approximation for it; our proof of this guarantee uses tools from submodular optimization and knapsack quantization.

Next, we consider the problem of learning Ising models when an adversary can corrupt the samples we receive. For this problem we give nearly tight lower and upper bounds on the sample complexity.

Finally, we consider the problem of conditional sampling from invertible generative models. We first establish hardness results for generating conditional samples. We then develop a variational inference scheme that approximately solves the problem; our approach leverages the given invertible model to improve sample quality.
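The flavor of a greedy guarantee of this kind can be illustrated with cost-effectiveness greedy for weighted set cover, a simplified stand-in for the actual intervention-design algorithm: think of candidates as interventions, elements as the edge orientations each would resolve, and each step picking the candidate that covers the most uncovered elements per unit cost. The toy instance and all names below are hypothetical.

```python
def greedy_cover(elements, sets, costs):
    """Cost-effectiveness greedy for weighted set cover: repeatedly pick the
    candidate covering the most uncovered elements per unit of cost.

    sets:  dict mapping candidate name -> set of elements it covers
    costs: dict mapping candidate name -> positive cost
    """
    uncovered = set(elements)
    chosen, total = [], 0.0
    while uncovered:
        best = max(sets, key=lambda s: len(sets[s] & uncovered) / costs[s])
        if not sets[best] & uncovered:
            raise ValueError("remaining elements cannot be covered")
        chosen.append(best)
        total += costs[best]
        uncovered -= sets[best]
    return chosen, total
```

This rule gives the classical logarithmic guarantee for set cover; the thesis's constant factor for intervention design relies on additional structure (submodularity plus knapsack quantization), not on this generic argument.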