Stochastic gradient algorithms are widely used for large-scale learning and inference problems. However, their use in practice is typically guided by heuristics and trial-and-error rather than rigorous, generalizable theory. We take a step toward …
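As a point of reference for the algorithms discussed, here is a minimal sketch of plain stochastic gradient descent on a synthetic least-squares problem. This is an illustrative toy, not the analysis or method of the paper; the data, step size, and epoch count are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression data: y = X @ w_true + noise (illustrative).
n, d = 200, 3
w_true = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(n, d))
y = X @ w_true + 0.1 * rng.normal(size=n)

def sgd(X, y, lr=0.05, epochs=50):
    """Plain SGD on squared loss, one randomly ordered sample per step."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            grad = (X[i] @ w - y[i]) * X[i]  # gradient of 0.5 * (x·w - y)^2
            w -= lr * grad
    return w

w_hat = sgd(X, y)
```

In practice the step size and schedule are exactly the kind of heuristic choices the abstract refers to; here a fixed `lr=0.05` suffices because the toy problem is well conditioned.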

We consider sequential prediction with expert advice when data are generated from distributions varying arbitrarily within an unknown constraint set. We quantify relaxations of the classical i.i.d. assumption in terms of such constraint sets, with …
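The standard baseline in prediction with expert advice is the exponentially weighted average (Hedge) forecaster, sketched below on random losses. This is a generic illustration of the setting, not the paper's algorithm; the learning rate `eta` and loss matrix are assumptions.

```python
import numpy as np

def hedge(losses, eta=0.5):
    """Exponentially weighted average forecaster.

    losses: (T, K) array of per-round expert losses in [0, 1].
    Returns the learner's cumulative expected loss under the
    distribution maintained by multiplicative-weights updates.
    """
    T, K = losses.shape
    log_w = np.zeros(K)                # log-weights, start uniform
    total = 0.0
    for t in range(T):
        p = np.exp(log_w - log_w.max())
        p /= p.sum()                   # current distribution over experts
        total += p @ losses[t]         # expected loss this round
        log_w -= eta * losses[t]       # multiplicative-weights update
    return total

rng = np.random.default_rng(1)
L = rng.random((100, 5))               # 100 rounds, 5 experts
alg = hedge(L)
best = L.sum(axis=0).min()             # best expert in hindsight
```

For losses in [0, 1], the regret `alg - best` is at most `log(K)/eta + eta*T/8`; distributional constraints of the kind the abstract studies interpolate between this worst-case guarantee and the i.i.d. regime.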

This work extends Roberts et al. (1997) by considering limits of Random Walk Metropolis (RWM) applied to block IID target distributions, with corresponding block-independent proposals. The extension verifies the robustness of the optimal scaling …
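The optimal scaling result being extended says that for product-form targets, RWM proposal variances should shrink like `l^2/d`, with the asymptotically optimal `l ≈ 2.38` yielding an acceptance rate near 0.234. A minimal sketch on an i.i.d. Gaussian target (an assumption for illustration, not the block-IID targets of the paper):

```python
import numpy as np

def rwm(logpi, x0, sigma, n_steps, rng):
    """Random Walk Metropolis with isotropic Gaussian proposals of scale sigma."""
    x = np.array(x0, dtype=float)
    lp = logpi(x)
    accepts = 0
    samples = np.empty((n_steps, x.size))
    for t in range(n_steps):
        prop = x + sigma * rng.normal(size=x.size)
        lp_prop = logpi(prop)
        if np.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
            x, lp = prop, lp_prop
            accepts += 1
        samples[t] = x
    return samples, accepts / n_steps

d = 50
logpi = lambda x: -0.5 * np.sum(x**2)     # product of d standard normals
rng = np.random.default_rng(2)
sigma = 2.38 / np.sqrt(d)                 # l / sqrt(d) with l ≈ 2.38
samples, acc = rwm(logpi, np.zeros(d), sigma, 20_000, rng)
```

With this scaling the empirical acceptance rate lands in the neighborhood of the 0.234 optimum; the robustness question in the abstract is whether the same tuning rule survives when independence holds only across blocks.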