Multiscale modeling has become an invaluable tool in many research areas, coupling models and simulations across scales with the goal of systematically ensuring consistency between resolutions. Often, multiscale modeling proceeds in a bottom-up fashion, with higher-resolution models coarse-grained to remove presumably less important degrees of freedom. While some loss of potentially relevant information is inherent in this process, it is also necessary for simulations to reach longer time and length scales. Recently, however, machine-learned generative models have emerged as a route to tighter coupling between resolutions, even promising to accelerate higher-resolution simulations through coupled sampling of lower-resolution models. I will present our recent work in developing probabilistic backmappings that “close the loop” in multiscale modeling by recovering higher-resolution information. Because they are probabilistic, our backmapping models can be incorporated into molecular dynamics and Monte Carlo simulations via reweighting or Metropolization, which grounds these machine-learned models in physical principles. I will discuss how such grounding not only ensures that generated configurations satisfy a specified thermodynamic ensemble, but also provides a general route to critically assessing the performance of machine-learned generative models.
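To make the idea of Metropolization concrete, the following is a minimal illustrative sketch, not the speaker's actual method: a tractable stand-in "generative" proposal (here simply a Gaussian, with a hypothetical one-dimensional double-well potential) is Metropolized as an independence sampler so that the resulting chain targets the Boltzmann distribution, regardless of imperfections in the proposal model.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 1.0  # inverse temperature

def U(x):
    # Hypothetical 1D double-well potential with minima at x = +/-1.
    return (x**2 - 1.0)**2

def log_q(x, mu=0.0, sigma=1.5):
    # Log-density of the Gaussian proposal, standing in for a
    # machine-learned generative model with a tractable likelihood.
    return -0.5 * ((x - mu) / sigma)**2 - np.log(sigma * np.sqrt(2.0 * np.pi))

def sample_q(mu=0.0, sigma=1.5):
    return mu + sigma * rng.standard_normal()

x = 0.0
samples = []
for _ in range(20000):
    x_new = sample_q()
    # Metropolis-Hastings acceptance for an independence proposal:
    # log a = [-beta*U(x') + log q(x)] - [-beta*U(x) + log q(x')],
    # which corrects the proposal toward p(x) ~ exp(-beta*U(x)).
    log_a = (-beta * U(x_new) + log_q(x)) - (-beta * U(x) + log_q(x_new))
    if np.log(rng.random()) < log_a:
        x = x_new
    samples.append(x)

samples = np.asarray(samples)
# The accepted chain concentrates around the two wells near x = +/-1,
# even though the Gaussian proposal alone is unimodal.
```

The same acceptance rule applies when the proposal is a learned backmapping: as long as its likelihood can be evaluated, rejection corrects any mismatch between the model and the target ensemble, which is the sense in which Metropolization grounds generated configurations in a specified thermodynamic ensemble.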