synthval.metrics.KLDivergenceEstimation
=======================================

.. py:class:: synthval.metrics.KLDivergenceEstimation(drop_duplicates=True)

   Bases: :py:obj:`SimilarityMetric`

   Similarity metric computing an estimation of the Kullback-Leibler
   divergence, based on the methodology proposed in the referenced paper.
   Note that the algorithm used may cause a division-by-zero error if
   duplicates are present in the distributions under consideration.

   .. attribute:: drop_duplicates

      Flag controlling whether duplicates in the distributions are dropped
      automatically (default: True).

      :type: bool, optional

   .. rubric:: References

   Pérez-Cruz, F., "Kullback-Leibler divergence estimation of continuous
   distributions," IEEE International Symposium on Information Theory, 2008.

   .. py:method:: calculate(real_dist_df, synth_dist_df)

      Compute an estimation of the Kullback-Leibler divergence between two
      sets of samples originating from the multivariate distributions
      real_dist and synth_dist.

      :param real_dist_df: Set of samples representing distribution real_dist.
      :type real_dist_df: pandas.DataFrame
      :param synth_dist_df: Set of samples representing distribution synth_dist.
      :type synth_dist_df: pandas.DataFrame
      :returns: A numpy array containing the estimated value of the
                Kullback-Leibler divergence.
      :rtype: numpy.ndarray
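   .. rubric:: Example

   A minimal usage sketch, assuming the real and synthetic feature tables are
   available as numeric :py:class:`pandas.DataFrame` objects with matching
   columns; the file names below are hypothetical:

   .. code-block:: python

      import pandas as pd

      from synthval.metrics import KLDivergenceEstimation

      # Hypothetical feature tables; any numeric DataFrames with the
      # same columns should work.
      real_df = pd.read_csv("real_features.csv")
      synth_df = pd.read_csv("synth_features.csv")

      # drop_duplicates=True guards against the division-by-zero issue
      # caused by repeated rows.
      metric = KLDivergenceEstimation(drop_duplicates=True)
      kl_estimate = metric.calculate(real_df, synth_df)
      print(kl_estimate)

   For illustration only, the following is a self-contained sketch of the
   1-nearest-neighbour estimator described in the referenced paper; it is not
   this library's implementation, and the helper name ``nn_kl_estimate`` is
   hypothetical:

   .. code-block:: python

      import numpy as np
      import pandas as pd
      from scipy.spatial import cKDTree

      def nn_kl_estimate(real_df: pd.DataFrame, synth_df: pd.DataFrame) -> float:
          """1-nearest-neighbour estimate of D(real || synth) (Perez-Cruz, 2008)."""
          x = real_df.to_numpy(dtype=float)   # n samples from real_dist
          y = synth_df.to_numpy(dtype=float)  # m samples from synth_dist
          n, d = x.shape
          m = y.shape[0]

          # Distance from each real sample to the nearest *other* real
          # sample (k=2 so the query skips the point itself).
          rho = cKDTree(x).query(x, k=2)[0][:, 1]
          # Distance from each real sample to the nearest synthetic sample.
          nu = cKDTree(y).query(x, k=1)[0]

          # Duplicate rows produce zero distances here, which is the
          # division-by-zero issue that drop_duplicates is meant to avoid.
          return d / n * np.sum(np.log(nu / rho)) + np.log(m / (n - 1))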