RIS ID

109929

Publication Details

Krivitsky, P. N. (2017). Using contrastive divergence to seed Monte Carlo MLE for exponential-family random graph models. Computational Statistics and Data Analysis, 107, 149-161.

Abstract

Exponential-family models for dependent data have applications in a wide variety of areas, but the dependence often results in an intractable likelihood, requiring either analytic approximation or MCMC-based techniques to fit; the latter require an initial parameter configuration to seed their simulations. A poor initial configuration can lead to slow convergence or outright failure. The approximate techniques that could be used to find such a configuration tend not to be as general as the simulation-based methods and require an implementation separate from that of the MLE-finding algorithm. Contrastive divergence is a more recent simulation-based approximation technique that uses a series of abridged MCMC runs instead of running them to stationarity. Combining it with the importance-sampling Monte Carlo MLE yields a method for obtaining adequate initial values that is applicable to a wide variety of modeling scenarios. Practical issues such as stopping criteria and selection of tuning parameters are also addressed. A simple generalization of the Monte Carlo MLE partial stepping algorithm to curved exponential families (applicable to MLE-finding as well) is also proposed. The proposed approach reuses the aspects of an MLE implementation that are model-specific, so little to no additional implementer effort is required to obtain adequate initial parameters. This is demonstrated on a series of network datasets and models drawn from the exponential-family random graph model computation literature, and the limitations of the techniques considered are also explored.
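
For intuition, the following is a minimal Python sketch of the kind of CD-k update the abstract describes, applied to a toy Bernoulli random graph with a single edge-count statistic. All names, tuning values, and the toy model are illustrative assumptions, not the paper's implementation, which targets general exponential-family random graph model software.

```python
import numpy as np

def cd_k_seed(theta0, y_obs, stats, mcmc_step, k=5, step_size=0.02,
              n_iter=500, rng=None):
    """CD-k gradient ascent: approximate the score g(y_obs) - E_theta[g(Y)]
    by running only k MCMC steps from the observed configuration,
    rather than running the chain to stationarity."""
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float)
    g_obs = stats(y_obs)
    for _ in range(n_iter):
        y = y_obs.copy()
        for _ in range(k):                       # abridged MCMC run
            y = mcmc_step(y, theta, rng)
        theta = theta + step_size * (g_obs - stats(y))
    return theta

# Toy model (illustrative): Bernoulli random graph, edge-count statistic.
def edge_count(adj):
    return np.array([adj[np.triu_indices_from(adj, k=1)].sum()])

def toggle_one_dyad(adj, theta, rng):
    """One Metropolis-Hastings step: propose toggling a random dyad."""
    n = adj.shape[0]
    i, j = sorted(rng.choice(n, size=2, replace=False))
    delta = 1 - 2 * adj[i, j]                    # change in edge count if toggled
    if np.log(rng.uniform()) < theta[0] * delta:
        adj[i, j] = adj[j, i] = 1 - adj[i, j]
    return adj

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 30
    y_obs = (rng.uniform(size=(n, n)) < 0.2).astype(int)
    y_obs = np.triu(y_obs, 1)
    y_obs += y_obs.T                             # symmetric, no self-loops
    theta_seed = cd_k_seed(np.zeros(1), y_obs, edge_count, toggle_one_dyad, rng=rng)
    print("CD seed for the edge parameter:", theta_seed)
```

A seed obtained this way would then be passed to a full Monte Carlo MLE routine as its starting configuration. The fixed step size, number of iterations, and number of MCMC steps k above are placeholders; the paper itself addresses stopping criteria and tuning-parameter selection.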

Link to publisher version (DOI)

http://dx.doi.org/10.1016/j.csda.2016.10.015