__ Summary and Contributions__: This paper proposes ACNet, a generic form of Archimedean copulas. The authors thoroughly study its theoretical properties. In particular, they show that Archimedean copulas can be learned in noisy settings and that conditionals can be extracted easily after parameter learning. Experimentally, they show that ACNets are able to express Clayton, Joe and Frank copulas. In addition, they show that ACNet performs as well as the best parametric copula on three small datasets.
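For concreteness, the bivariate Archimedean form and the kind of conditional extraction mentioned above can be illustrated with the parametric Clayton copula (my own sketch, not the authors' code; the closed forms are the standard textbook ones):

```python
import numpy as np

def clayton_cdf(u, v, theta=2.0):
    """Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta)."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

def clayton_conditional(v, u, theta=2.0):
    """Conditional CDF C(v | u) = dC(u, v)/du, available in closed form
    for Clayton; ACNet extracts the analogous quantity by differentiating
    its learned generator."""
    s = u ** -theta + v ** -theta - 1.0
    return u ** (-theta - 1.0) * s ** (-1.0 / theta - 1.0)
```

For a learned generator the same partial derivative is obtained by automatic differentiation instead of a closed form.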

__ Strengths__: The paper is well written (apart from some typos) and easy to follow. ACNet is a novel idea; it is well introduced, and the discussion about the interpretation of its parameters is very interesting. In addition, the authors show why Archimedean copulas could be preferred to other density estimation methods such as GANs, VAEs or NFs.

__ Weaknesses__: The experiments presented in the paper are not very convincing if the authors want to keep the comparison with neural density estimation methods. Indeed, they do not compare their results with NFs, whereas they could easily report the results of some standard flows in Table 2. They could also apply their method to the standard benchmarks in neural density estimation (POWER, GAS, MINIBOONE, HEPMASS and BSDS300). However, it seems to me that ACNet may be too expensive to run on large datasets due to the inversion required at both training and testing time. This does not completely undermine the contribution, but it should be discussed if it is what prevents further comparison. Finally, the comparison with the parametric copulas shows that ACNet does not lead to an improvement if we are able to select the best parametric copula, which again raises the computation-time question: is training ACNet faster than training the three types of copulas considered in this work?

__ Correctness__: The empirical methodology could be improved. I suggest the following:
- Error bars.
- Explicit test and train split (do you perform the uniformisation independently?).
- Computing time.
- Comparison to other methods and on more standard benchmarks.

__ Clarity__: The paper is well written but there are many singular/plural mistakes in the text.

__ Relation to Prior Work__: I think the authors should perform a more thorough comparison with other density estimation techniques (flows + other copulas + related neural-network works).
You could take a look at the following papers:
- Masked autoregressive flows for density estimation
- Neural autoregressive flows
- Neural spline flows.

__ Reproducibility__: Yes

__ Additional Feedback__: ——————— Post review response feedback ————————
In light of the other reviews and the response sent by the authors, I believe the idea presented is sound and deserves to be published, and I have decided to raise my score to 6. However, the authors have a lot of work ahead in order to provide the results they claim they will add. What intrigues me most is the response regarding the way the authors performed uniformisation. The authors said they performed it independently on the train and test sets. It seems to me that this is in general a very bad practice. Instead, you should consider each test sample independently and thus be able to apply a uniformisation even if the test set is made of few samples. I would like the authors to clarify what they did in their experiments and to be careful with such an uncommon (because undesirable) practice.
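To make the practice I have in mind concrete: fit the empirical CDF on the training data only, then map each test sample through it independently, so the transform never depends on the rest of the test set. A minimal sketch (my own illustration, using a rank-based ECDF):

```python
import numpy as np

def fit_ecdf(train_col):
    """Build an ECDF from the training column only."""
    xs = np.sort(np.asarray(train_col))
    n = len(xs)
    def ecdf(x):
        # rank / (n + 1) keeps transformed values strictly below 1
        return np.searchsorted(xs, x, side="right") / (n + 1.0)
    return ecdf

# usage: each test sample is transformed with the *train* ECDF,
# so a test set of any size (even a single point) can be uniformised
rng = np.random.default_rng(0)
train = rng.normal(size=1000)
ecdf = fit_ecdf(train)
u_test = ecdf(rng.normal(size=5))
```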
In addition, the authors avoided commenting on my question regarding the computational cost compared to the copulas they compare to. It is not yet clear to me what exactly the advantage of the proposed method is over simply testing the other copulas; this should probably be clarified somewhere.

__ Summary and Contributions__: This paper proposes a differentiable way to estimate Archimedean copulas. This is especially interesting for approximating copulas which can't be estimated in closed form. It enables learning such copulas while at the same time retaining the benefits offered by the copula approach. In their results they show that it performs similarly to or better than Clayton, Frank and Gumbel copulas on real and synthetic data.
I think the concerns the other reviewers are raising are valid, and I also agree that they are not all properly addressed by the authors. I feel like the authors incorporated ~50% of the reviewers' feedback to improve the manuscript and are defensive about the other half (which is not that uncommon).
However, I still think the paper could be accepted, mainly for two reasons:
- we kind of all agree that the paper is interesting and unique. I think the paper could make the conference more interesting and diversify the topics of accepted papers. Also, none of the original reviews was worse than marginally below threshold.
- we probably all agree that the method might be limited in certain directions (mainly dataset complexity or computational cost). I personally don't value computation times that highly (especially if all of them are under 2 hours); the real issues usually arise when the computational complexity explodes, which I don't think is the case here.

__ Strengths__: The paper is nicely written and motivated. The motivation to use Copulas is convincing and the approach taken is sound to me. The experiments show the basic properties necessary to render the approach useful.

__ Weaknesses__: They did not cite my work, when talking about where Copulas are applied :)
I guess the only thing that I would see as negative is probably the experimental evaluation. While the evaluation done is good and necessary, it might not convince everybody. In particular, it would be nice to compare against copula networks or the mixture methods cited in the related-works section. I'm however not familiar with those works, and am therefore not sure whether this comparison would be feasible and fair.

__ Correctness__: The methods are described and formulated nicely. I did not, however, check all the details carefully, especially not the sketch of the proof of Theorem 2.

__ Clarity__: The paper is nicely written

__ Relation to Prior Work__: yes - except for not citing my work :P

__ Reproducibility__: Yes

__ Additional Feedback__: The broader impact statement was a joy to read - at the same time it is legit.

__ Summary and Contributions__: The paper introduces a method for learning Archimedean copulas by making use of deep learning. Specifically, they introduce a novel differentiable neural network architecture to this end.

__ Strengths__: Archimedean copulas are an important tool for data-driven learning of the dependence structure between variables. The idea of using a deep learning architecture to learn the copula is novel and useful. The proposed architecture is carefully crafted to address the copula learning problem. Its usefulness becomes even more apparent when it comes to modelling dependencies among more than two variables. Traditionally, this is a difficult problem to address; for instance, vine structures are used, which are both cumbersome to design and hard to train effectively.

__ Weaknesses__: The provided empirical evidence is far from convincing. Many more datasets are needed for comparison, especially comparisons against alternative vine structures in multivariate scenarios. A comparative discussion on computational costs is also a must.

__ Correctness__: The theoretical claims are correct, as is the derivation of the model.

__ Clarity__: The paper is generally well-written, and provides all the details needed for it to be self-contained.

__ Relation to Prior Work__: The authors provide a brief summary of related work. I do have the feeling, though, that a somewhat lengthier discussion of vines would help readers unfamiliar with the topic.

__ Reproducibility__: Yes

__ Additional Feedback__: I have read the rebuttal. It clarified some issues, but also left many points that need further consideration.

__ Summary and Contributions__: This paper proposes a new neural network module for estimating the copula of data within the Archimedean copula family. In particular, the generator characterizing the Archimedean copula is parameterized with the proposed ACNet structure, which is designed to satisfy the complete monotonicity property and to be differentiable. In addition, a Markov renewal interpretation of the weights is provided. Empirical results show that the proposed approach is flexible enough to fit data generated from known Archimedean copulas. For real data, the proposed approach yields even lower test loss than the parametric copulas considered.

__ Strengths__: The proposed approach is novel in terms of designing a new neural network architecture to learn the generator of the copula, improving the tradeoff between flexibility and tractability within the Archimedean copula family.
The copula-based framework focuses on the dependence structure of data and neglects the marginal information. Probabilistic quantities such as conditional probabilities can be evaluated by the CDF/copula function.
Although the Archimedean copula family has its own limitations, the experiments highlight its advantages in modeling some special aspects of data, such as tail dependence.

__ Weaknesses__: Although the methodology development is quite novel, the empirical validation is weak.
(1) Although Archimedean copulas are able to model dependence structures in high dimensions, it seems that only bivariate cases are evaluated in this paper.
(2) Meanwhile, I am wondering whether there are other approaches in the literature for parametrizing the generator of an Archimedean copula, which could serve as baselines for comparison.
(3) Most of the experimental results are qualitative. How is the test loss defined? Is there a more rigorous metric for goodness-of-fit testing of copulas in the bivariate/high-dimensional case?

__ Correctness__: The proposed approach seems correct but I didn't closely check the mathematical derivations.

__ Clarity__: This paper is well-written and organized.

__ Relation to Prior Work__: It seems there is a literature on semiparametric Archimedean copulas that is missing from the discussion.
For example,
Hernández-Lobato, José Miguel, and Alberto Suárez. "Semiparametric bivariate Archimedean copulas." Computational statistics & data analysis 55.6 (2011): 2038-2058.
Vandenhende, François, and Philippe Lambert. "Local dependence estimation using semiparametric Archimedean copulas." Canadian Journal of Statistics 33.3 (2005): 377-388.
Hoyos-Argüelles, Ricardo, and Luis Nieto-Barajas. "A Bayesian semiparametric Archimedean copula." Journal of Statistical Planning and Inference 206 (2020): 298-311.
Najjari, Vadoud, Tomáš Bacigál, and Hasan Bal. "An Archimedean copula family with hyperbolic cotangent generator." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 22.05 (2014): 761-768.

__ Reproducibility__: Yes

__ Additional Feedback__: What do you mean by "two-phased" nature in the dataset?
*******
After Rebuttal: Thank you for the response! The points related to semiparametric Archimedean copula approaches are fair. I still have hesitations about what makes ACNet an approach of wide applicability. I understand that the bivariate copula is the cornerstone for high dimensions, but for the bivariate case there are also many other choices. To show the advantages of the proposed Archimedean-copula-based approach, I think it would be helpful to evaluate it in, and also compare it to, the fewer available choices in high-dimensional cases (such as the Gaussian copula). I also find the training-with-uncertainty part very interesting, but the empirical evaluation of this part needs more work.