Graphical
model estimation is a seemingly impossible task
when several pairs of variables are not observed
jointly. Recovering the edges of the graph in
such settings requires one to infer conditional
dependencies between variables with no evidence
of their marginal dependence. This unexplored
statistical problem arises in several
situations, such as in large-scale neuroimaging,
where technological limitations make it
impossible to record the activity of thousands
of neurons simultaneously. We call this
statistical challenge the "Graph Quilting
problem". We study this problem for Gaussian
Graphical models and first show that, under mild
conditions, it is possible to correctly identify
edges connecting the observed pairs of nodes.
Additionally, we show that we can recover a
minimal superset of the edges connecting
variables that are never jointly observed. Thus,
we show that one can infer conditional
relationships even when marginal relationships
are unknown. To accomplish this, we devise a
novel technique that we call the
"Recursive-Complement" algorithm. We propose an
L1-regularized graph quilting estimator and
establish its rates of convergence for graph
estimation and selection in high dimensions. We
illustrate our approach using synthetic data, as
well as data obtained from in vivo calcium
imaging of ten thousand neurons in mouse visual
cortex. We further discuss several other
applications.
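
To make the observation pattern concrete, the following is a minimal sketch (not the authors' implementation) of the Graph Quilting data structure: variables are recorded in overlapping sessions, so the empirical covariance is available only for pairs that appear together in at least one session. The variable names, block layout, and chain-graph precision matrix are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: p = 6 variables, but the recording technology can
# only observe two overlapping subsets ("sessions") at a time.
p = 6
blocks = [list(range(0, 4)), list(range(2, 6))]  # overlap on variables 2, 3

# Simulate latent data from a sparse Gaussian (chain-graph precision matrix).
Theta = (np.eye(p)
         + np.diag(0.4 * np.ones(p - 1), 1)
         + np.diag(0.4 * np.ones(p - 1), -1))
Sigma = np.linalg.inv(Theta)
X = rng.multivariate_normal(np.zeros(p), Sigma, size=5000)

# Quilted covariance: an entry is observed only if both variables appear
# together in at least one session; all other pairs are missing (NaN).
S = np.full((p, p), np.nan)
for b in blocks:
    S[np.ix_(b, b)] = np.cov(X[:, b], rowvar=False)

observed = ~np.isnan(S)
# Pairs such as (0, 5) are never jointly observed, so their marginal
# covariance -- and hence any direct evidence of dependence -- is missing.
print(observed.astype(int))
```

The estimation task is then to recover the support of the precision matrix from `S` restricted to the observed mask, which is what the L1-regularized graph quilting estimator addresses.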