dodiscover.ci.kernel_utils.f_divergence_score#

dodiscover.ci.kernel_utils.f_divergence_score(y_stat_q, y_stat_p)[source]#

Compute f-divergence upper bound on KL-divergence.

See Definition 4 in [1]; the objective is negated here so that the upper bound can be minimized via gradient descent.

The f-divergence bound gives an upper bound on KL-divergence:

\[D_{KL}(p \| q) \le \sup_f \mathbb{E}_{x \sim q}[\exp(f(x) - 1)] - \mathbb{E}_{x \sim p}[f(x)]\]
Parameters:
y_stat_q : array_like of shape (n_samples_q,)

Samples from the distribution Q, the variational class.

y_stat_p : array_like of shape (n_samples_p,)

Samples from the distribution P, the joint distribution.

Returns:
f_div : float

The f-divergence score.
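A minimal sketch of how this score can be computed empirically, assuming `y_stat_q` and `y_stat_p` hold the critic outputs \(f(x)\) evaluated on samples from Q and P. This is an illustrative NumPy implementation of the bound above, not necessarily the library's exact code:

```python
import numpy as np


def f_divergence_score_sketch(y_stat_q, y_stat_p):
    """Empirical estimate of E_{x~q}[exp(f(x) - 1)] - E_{x~p}[f(x)].

    Both inputs are 1-D arrays of critic values f(x); the expectations
    in the bound are replaced by sample means.
    """
    y_stat_q = np.asarray(y_stat_q, dtype=float)
    y_stat_p = np.asarray(y_stat_p, dtype=float)
    # First term: mean of exp(f(x) - 1) over samples from Q.
    term_q = np.mean(np.exp(y_stat_q - 1))
    # Second term: mean of f(x) over samples from P.
    term_p = np.mean(y_stat_p)
    return term_q - term_p


# Sanity check: with the constant critic f(x) = 1, exp(f - 1) = 1,
# so the score is 1 - 1 = 0.
score = f_divergence_score_sketch(np.ones(5), np.ones(5))
```

Note that the score depends only on the critic values, not on the raw samples, so the caller is responsible for evaluating the variational function \(f\) on each distribution first.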