This work aims to estimate generative distributions over structured objects that satisfy semantic constraints expressed in first-order logic. The authors achieve this by adding a "semantic loss" to the GAN's learning objective and using knowledge compilation (KC) to build a circuit that allows efficient evaluation of that loss. Experiments on game-level generation tasks and a molecule generation task support the proposed method.

Strengths: i) incorporating structured constraints into GAN models is both intellectually and practically interesting; ii) the experiments are comprehensive and convincing in most cases; and iii) the paper is clearly written for the most part.

Weaknesses: i) the main novelty, and the similarities, differences, and limitations relative to closely related approaches (e.g., Xu et al., 2018), are not clearly discussed; and ii) the technical description of the KC techniques should be made clearer and self-contained.

The paper is recommended for acceptance. The authors are urged to incorporate the reviewers' suggestions and address the outlined weaknesses in the final draft.

Jingyi Xu et al. "A Semantic Loss Function for Deep Learning with Symbolic Knowledge." International Conference on Machine Learning, 2018.
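For reference, the semantic loss of Xu et al. (2018) penalizes a network for assigning probability mass to assignments that violate a logical constraint. The sketch below is a minimal brute-force version over binary variables, written for illustration only; the KC circuit built in the paper under review computes the same weighted model count without enumerating all assignments. The `exactly_one` constraint is an assumed example, not one from the paper.

```python
import itertools
import math

def semantic_loss(probs, constraint):
    """Semantic loss (Xu et al., 2018): -log of the total probability
    the predicted distribution assigns to constraint-satisfying states.
    Brute-force enumeration for illustration; a compiled KC circuit
    evaluates the same weighted model count efficiently."""
    wmc = 0.0  # weighted model count of satisfying assignments
    for bits in itertools.product([0, 1], repeat=len(probs)):
        if constraint(bits):
            p = 1.0
            for b, q in zip(bits, probs):
                p *= q if b else (1.0 - q)
            wmc += p
    return -math.log(wmc)

# Hypothetical constraint: exactly one variable is true (one-hot).
exactly_one = lambda bits: sum(bits) == 1

# The loss shrinks as the predicted probabilities concentrate
# on assignments that satisfy the constraint.
loss = semantic_loss([0.9, 0.05, 0.05], exactly_one)
```

In practice this loss is added, with a weighting coefficient, to the standard GAN objective, so the generator is pushed toward constraint-satisfying outputs.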