Sharmodeep Bhattacharyya (Oregon State University)

Title: Change Point Detection in Network Sequences

Abstract: We consider the problem of change point detection in sequences of network data, in both offline and online settings. For the offline problem, we develop algorithms based on cumulative sum (CUSUM) statistics, using binary segmentation to handle multiple change points. Our algorithms for online change point detection are likewise based on CUSUM statistics and window-based approaches. We also provide a theoretical analysis of the change point detection methods for network sequences in which each network snapshot is generated from an inhomogeneous random graph model, and from its variants in different situations. The proposed algorithms detect change points optimally in the offline setup and consistently in the online setup, even for sparse networks. We validate and compare our change point estimators using simulated data sets.
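
Below is a minimal, illustrative sketch of a CUSUM-type scan for a single change point in a network sequence, not the speaker's exact statistic or algorithm: it compares averaged adjacency matrices before and after each candidate split in Frobenius norm, weighted by the standard CUSUM factor. The function names and the threshold are illustrative assumptions.

```python
# Illustrative CUSUM sketch for offline change point detection in a network
# sequence (assumed setup, not the talk's exact method). Each snapshot is an
# n x n symmetric adjacency matrix.
import numpy as np

def cusum_statistics(adjacency_seq):
    """adjacency_seq: array of shape (T, n, n). Returns CUSUM values for splits t = 1..T-1."""
    A = np.asarray(adjacency_seq, dtype=float)
    T = A.shape[0]
    stats = np.zeros(T - 1)
    for t in range(1, T):
        left = A[:t].mean(axis=0)          # average network before the split
        right = A[t:].mean(axis=0)         # average network after the split
        scale = np.sqrt(t * (T - t) / T)   # standard CUSUM weighting
        stats[t - 1] = scale * np.linalg.norm(left - right, ord="fro")
    return stats

def detect_single_change_point(adjacency_seq, threshold):
    """Return the estimated change point (1-based) if the max CUSUM exceeds the threshold."""
    stats = cusum_statistics(adjacency_seq)
    t_hat = int(np.argmax(stats)) + 1
    return t_hat if stats[t_hat - 1] > threshold else None

# Toy example: 20 snapshots whose edge probability shifts from 0.1 to 0.3 at t = 10.
rng = np.random.default_rng(0)
n, T = 50, 20
P = np.where(np.arange(T)[:, None, None] < 10, 0.1, 0.3)
A = (rng.random((T, n, n)) < P).astype(float)
A = np.triu(A, 1); A = A + A.transpose(0, 2, 1)   # symmetrize, drop self-loops
print(detect_single_change_point(A, threshold=1.0))
```

For multiple change points, the same scan can be applied recursively to the segments on either side of a detected change point, in the spirit of binary segmentation.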


Eric D. Kolaczyk (Boston University)

Title: How hard is it to work with a network "average"?

Abstract: I will briefly summarize work with colleagues that places the question raised in the title into an appropriate context, with respect to the corresponding geometry and probability, in order to motivate principled notions of network averages with provably analogous behavior to that of your standard scalar and vector averages. Leveraging these results for the purposes of statistical inference in practice, however, requires addressing various computational challenges, particularly in the case of unlabeled networks. The extent to which a latent network perspective could alleviate some of these challenges is an interesting open question.
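
As a toy illustration of what a "network average" can mean (an assumption for exposition, not the speaker's construction): for labeled networks under the Frobenius distance on adjacency matrices, the Fréchet mean reduces to the entrywise average. The computational difficulty alluded to in the abstract arises for unlabeled networks, where one must additionally optimize over node permutations.

```python
# Toy Frechet mean of labeled networks under the Frobenius distance
# (illustrative only; the unlabeled case requires matching nodes across graphs).
import numpy as np

def frechet_mean_labeled(adjacency_list):
    """Minimizer of sum_j ||M - A_j||_F^2 over matrices M: the entrywise mean."""
    A = np.stack(adjacency_list).astype(float)
    return A.mean(axis=0)

rng = np.random.default_rng(1)
graphs = [(rng.random((30, 30)) < 0.2).astype(float) for _ in range(5)]
graphs = [np.triu(G, 1) + np.triu(G, 1).T for G in graphs]  # symmetric, no self-loops
M = frechet_mean_labeled(graphs)
print(M.shape, M.max())
```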


Liza Levina (University of Michigan)

Title: Latent space models for multiplex networks with shared structure

Abstract: Latent space models are frequently used for modelling single-layer networks and include many popular special cases, such as the stochastic block model and the random dot product graph. Yet in practice, more complex network structures are becoming increasingly common. Here we propose a new latent space model for multiplex networks: multiple, heterogeneous networks observed on a shared node set. Multiplex networks can represent a network sample with shared node labels, a network evolving over time, or a single network with multiple types of edges. The key feature of our model is that it learns from data how much of the network structure is shared between layers, and pools information across layers as appropriate. We establish identifiability, develop a fitting procedure using convex optimization in combination with a nuclear norm penalty, and prove a guarantee of recovery for the latent positions as long as there is sufficient separation between the shared and the individual latent subspaces. The new model compares favorably to other methods in the literature on simulated networks and on a multiplex network describing the worldwide trade of agricultural products. Joint work with Peter MacDonald and Ji Zhu.
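
The following is a hedged sketch of the nuclear-norm machinery that convex fitting procedures of this kind typically rely on, not the authors' estimator for the multiplex model: the proximal operator of the nuclear norm is singular value soft-thresholding, shown here as a closed-form low-rank denoising of a single observed layer. The function names and the penalty level are illustrative assumptions.

```python
# Generic nuclear-norm-penalized low-rank fit to one adjacency matrix
# (an assumed illustration of the building block, not the paper's method).
import numpy as np

def svd_soft_threshold(M, tau):
    """Prox of tau * ||.||_* : shrink singular values toward zero by tau."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def low_rank_denoise(A, tau):
    """Closed-form minimizer of 0.5 * ||A - P||_F^2 + tau * ||P||_*."""
    return svd_soft_threshold(A, tau)

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 3))                       # toy latent positions
prob = 1.0 / (1.0 + np.exp(-(X @ X.T) / 3.0))      # edge probabilities from latent space
A = (rng.random((60, 60)) < prob).astype(float)
A = np.triu(A, 1); A = A + A.T
P_hat = low_rank_denoise(A, tau=10.0)
print(np.linalg.matrix_rank(P_hat, tol=1e-6))
```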


Marianna Pensky (University of Central Florida)

Title: ALMA: Alternating Minimization Algorithm for Clustering Mixture Multilayer Network

Abstract: The paper considers a Mixture Multilayer Stochastic Block Model (MMLSBM), where layers can be partitioned into groups of similar networks, and networks in each group are equipped with a distinct Stochastic Block Model. The goal is to partition the multilayer network into clusters of similar layers, and to identify communities in those layers. Jing et al. (2020) introduced the MMLSBM and developed a clustering methodology, TWIST, based on regularized tensor decomposition. The present paper proposes a different technique, an alternating minimization algorithm (ALMA), that aims at simultaneous recovery of the layer partition, together with estimation of the matrices of connection probabilities of the distinct layers. Compared to TWIST, ALMA achieves higher accuracy both theoretically and numerically. Joint work with Xing Fan, Feng Yu and Teng Zhang.
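
Below is a simplified, hedged caricature of alternating minimization for grouping the layers of a multilayer network, not the exact ALMA updates (which also recover the within-layer communities): alternately (i) average the adjacency matrices within each layer group to estimate its connection-probability matrix, and (ii) reassign each layer to the group whose estimate is closest in Frobenius norm. All names and the initialization are illustrative assumptions.

```python
# Caricature of alternating minimization for layer clustering (assumed sketch).
import numpy as np

def alternating_layer_clustering(layers, n_groups, iters=20, seed=0):
    """layers: array (L, n, n). Returns (labels, group_means)."""
    A = np.asarray(layers, dtype=float)
    L = A.shape[0]
    rng = np.random.default_rng(seed)
    labels = rng.integers(n_groups, size=L)          # random initial partition
    for _ in range(iters):
        # (i) estimate each group's connection-probability matrix by averaging
        means = np.stack([A[labels == g].mean(axis=0) if np.any(labels == g)
                          else A[rng.integers(L)] for g in range(n_groups)])
        # (ii) reassign each layer to the closest group estimate
        dists = np.array([[np.linalg.norm(A[l] - means[g]) for g in range(n_groups)]
                          for l in range(L)])
        new_labels = dists.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels, means

# Example: two groups of layers with different block structure.
rng = np.random.default_rng(3)
n = 40
def sbm_layer(p_in, p_out):
    z = np.repeat([0, 1], n // 2)
    P = np.where(z[:, None] == z[None, :], p_in, p_out)
    A = (rng.random((n, n)) < P).astype(float)
    A = np.triu(A, 1)
    return A + A.T
layers = np.stack([sbm_layer(0.5, 0.1) for _ in range(10)] +
                  [sbm_layer(0.3, 0.3) for _ in range(10)])
labels, _ = alternating_layer_clustering(layers, n_groups=2)
print(labels)
```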


Carey E. Priebe (Johns Hopkins University)

Title: Practical nonparametric spectral graph comparison

Abstract: Given a collection of networks G_j, j=1,2,...,m, we wish to construct an m-by-m dissimilarity matrix D whence multiple graph inference may proceed. We focus on practical issues associated with the nonparametric pairwise spectral approach of Tang et al., Bernoulli, 2017.
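
A crude sketch of assembling such an m-by-m dissimilarity matrix is given below; it uses distances between top adjacency eigenvalues as a placeholder dissimilarity, which is an assumption for illustration and not the nonparametric test statistic of Tang et al.

```python
# Placeholder spectral dissimilarity matrix for a collection of graphs
# (illustrative sketch; not the Tang et al. pairwise statistic).
import numpy as np

def top_eigenvalues(A, d):
    vals = np.linalg.eigvalsh(A)                     # symmetric adjacency matrix
    return np.sort(np.abs(vals))[::-1][:d]

def spectral_dissimilarity_matrix(graphs, d=5):
    specs = [top_eigenvalues(A, d) for A in graphs]
    m = len(graphs)
    D = np.zeros((m, m))
    for i in range(m):
        for j in range(i + 1, m):
            D[i, j] = D[j, i] = np.linalg.norm(specs[i] - specs[j])
    return D

rng = np.random.default_rng(4)
graphs = []
for p in (0.1, 0.1, 0.3):                            # third graph differs
    A = (rng.random((50, 50)) < p).astype(float)
    A = np.triu(A, 1)
    graphs.append(A + A.T)
print(np.round(spectral_dissimilarity_matrix(graphs), 2))
```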


Purnamrita Sarkar (University of Texas at Austin)

Title: Subsampling and Jackknife for networks

Abstract: Networks show up in a variety of applications, ranging from social networks to brain networks in neuroscience, from recommender systems to the internet, and from who-eats-whom networks in ecosystems to protein-protein interaction networks. An important question that often arises is how to estimate the underlying distribution of network statistics, such as eigenvalues and subgraph densities. In this talk I will discuss some recent work on subsampling and the jackknife for these inferential tasks. Despite the dependence in network data induced by edges between pairs of nodes, we show that, under the graphon model, the jackknife and subsampling largely behave like their IID counterparts. This is joint work with Robert Lunde and Qiaohui Lin.
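
The following is a generic node-subsampling sketch, offered as an illustration of the idea rather than the authors' exact procedure or theoretical scaling: repeatedly take induced subgraphs on b randomly chosen nodes, recompute a network statistic (here, edge density), and use the spread of the subsample values to gauge its variability.

```python
# Generic node subsampling for a network statistic (assumed illustration).
import numpy as np

def edge_density(A):
    n = A.shape[0]
    return A[np.triu_indices(n, 1)].mean()

def node_subsample(A, statistic, b, n_reps=200, seed=0):
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    vals = np.empty(n_reps)
    for r in range(n_reps):
        idx = rng.choice(n, size=b, replace=False)   # sample b nodes without replacement
        vals[r] = statistic(A[np.ix_(idx, idx)])     # statistic on the induced subgraph
    return vals

rng = np.random.default_rng(5)
n = 200
A = (rng.random((n, n)) < 0.05).astype(float)
A = np.triu(A, 1); A = A + A.T
vals = node_subsample(A, edge_density, b=50)
print(edge_density(A), vals.std())
```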