Description Usage Arguments Value Note Author(s) References
This function calculates the symmetrised Kullback-Leibler divergence (KL divergence) between each pair of classes. Designed for KLFDA.
KL_divergence(obj)
obj: The KLFDA object. Users can modify it to suit their own purposes.
Returns a symmetrised version of the KL divergence between each pair of classes.
This function is useful for estimating the loss between the reduced features and the original features. The same measure is used in t-SNE to assess projection performance.
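As a minimal sketch of the quantity this function reports (not the package's implementation), the symmetrised KL divergence between two discrete class distributions P and Q is KL(P||Q) + KL(Q||P). The helper names below are illustrative only:

```python
import numpy as np

def kl(p, q):
    # KL(P || Q) for discrete distributions; assumes strictly positive entries
    return float(np.sum(p * np.log(p / q)))

def symmetrised_kl(p, q):
    # Symmetrised KL: KL(P||Q) + KL(Q||P); symmetric in its arguments,
    # zero iff the two distributions are identical
    return kl(p, q) + kl(q, p)

p = np.array([0.6, 0.3, 0.1])
q = np.array([0.2, 0.5, 0.3])
d = symmetrised_kl(p, q)
```

Unlike plain KL divergence, this quantity is the same whichever class is taken as the reference, which is why it suits pairwise class comparison.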
qinxinghu@gmail.com
Van Erven, T., & Harremos, P. (2014). Renyi divergence and Kullback-Leibler divergence. IEEE Transactions on Information Theory, 60(7), 3797-3820.
Pierre Enel (2019). Kernel Fisher Discriminant Analysis (https://www.github.com/p-enel/MatlabKFDA), GitHub.