University of Wollongong
Using contrastive divergence to seed Monte Carlo MLE for exponential-family random graph models

preprint
posted on 2024-11-16, 00:33 authored by Pavel Krivitsky
Exponential-family models for dependent data have applications in a wide variety of areas, but the dependence often results in an intractable likelihood, requiring either analytic approximation or MCMC-based techniques to fit; the latter require an initial parameter configuration to seed their simulations, and a poor value can lead to slow convergence or outright failure. The approximate techniques that could be used to seed them tend not to be as general as the simulation-based ones, and they require an implementation separate from that of the MLE-finding algorithm. Contrastive divergence is a more recent simulation-based approximation technique that uses a series of abridged MCMC runs instead of running them to stationarity. We combine it with importance-sampling Monte Carlo MLE into a general method for obtaining adequate initial values for the MLE-finding techniques, describe and extend it to a wide variety of modeling scenarios, and address practical issues such as stopping criteria and selection of tuning parameters. Our approach reuses the model-specific aspects of an MLE implementation, so little to no additional implementer effort is required to obtain adequate initial parameters. We demonstrate this on a series of network datasets and models drawn from the ERGM computation literature.
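The core contrastive-divergence idea described above — nudging the parameter by the gap between the observed sufficient statistic and the statistic after only k abridged MCMC sweeps restarted at the data — can be sketched as follows. This is an illustrative toy, not the paper's implementation: it uses an edge-count-only Bernoulli ERGM (where dyads are independent, so the CD fixed point coincides with the MLE, the logit of the observed density), and the function name and tuning defaults are hypothetical.

```python
import math
import random

def cd_seed(y_obs, n, k=2, steps=300, gamma=0.01, seed=0):
    """Contrastive-divergence seeding for a one-parameter (edge-count)
    Bernoulli ERGM on n nodes. y_obs is the observed edge set.
    Each iteration restarts the chain at the data, runs k abridged
    Gibbs sweeps, and takes a stochastic-approximation step on theta."""
    rng = random.Random(seed)
    dyads = [(i, j) for i in range(n) for j in range(i + 1, n)]
    s_obs = len(y_obs)                 # observed sufficient statistic
    theta = 0.0                        # naive starting value
    for _ in range(steps):
        y = set(y_obs)                 # abridged run starts at the data
        for _ in range(k):             # k Gibbs sweeps, not run to stationarity
            for d in dyads:
                # For the edge-only model, P(edge present) = logistic(theta)
                p = 1.0 / (1.0 + math.exp(-theta))
                if rng.random() < p:
                    y.add(d)
                else:
                    y.discard(d)
        # CD update: move theta toward matching the observed statistic
        theta += gamma * (s_obs - len(y))
    return theta
```

For instance, with 14 edges observed among the 45 dyads of a 10-node graph, the returned `theta` should land near the exact MLE `log(14/31)`; in a real workflow this value would then seed a full Monte Carlo MLE run rather than be reported as the estimate itself.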

History

Article/chapter number

11-15

Total pages

26

Language

English
