View source: R/information.plugin.R
CMI.plugin — R Documentation
CMI.plugin measures the expected mutual information between two random variables conditioned on a third, estimated by the plug-in method from their joint probability distribution table.
CMI.plugin(probs, unit = c("log", "log2", "log10"))
probs: the joint probability distribution table of the three random variables.
unit: the base of the logarithm. The default is the natural logarithm ("log"). To report the information in bits, set the unit to "log2"; "log10" is also available.
CMI.plugin returns the conditional mutual information I(X;Y|Z) as a single numeric value.
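The plug-in estimate follows directly from the definition I(X;Y|Z) = sum over (x,y,z) of p(x,y,z) * log[ p(x,y,z) * p(z) / (p(x,z) * p(y,z)) ], applied to the empirical probability table. The following is a minimal sketch of that calculation, not the package's actual implementation; the helper name cmi_plugin_sketch is hypothetical.

```r
# Hypothetical sketch of the plug-in conditional mutual information,
# computed from a 3-dimensional joint probability array `probs`.
cmi_plugin_sketch <- function(probs, base = exp(1)) {
  p_xz <- apply(probs, c(1, 3), sum)  # marginal p(x, z)
  p_yz <- apply(probs, c(2, 3), sum)  # marginal p(y, z)
  p_z  <- apply(probs, 3, sum)        # marginal p(z)
  cmi <- 0
  for (i in seq_len(dim(probs)[1]))
    for (j in seq_len(dim(probs)[2]))
      for (k in seq_len(dim(probs)[3])) {
        p <- probs[i, j, k]
        if (p > 0)  # skip empty cells: 0 * log(0) is taken as 0
          cmi <- cmi + p * log(p * p_z[k] / (p_xz[i, k] * p_yz[j, k]), base = base)
      }
  cmi
}
```

Passing base = 2 would yield the result in bits, matching the effect of unit = "log2".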
Wyner, A. D. (1978). A definition of conditional mutual information for arbitrary ensembles. Information and Control, 38(1), 51-59.
# three numeric vectors corresponding to three continuous random variables
x <- c(0.0, 0.2, 0.2, 0.7, 0.9, 0.9, 0.9, 0.9, 1.0)
y <- c(1.0, 2.0, 12, 8.0, 1.0, 9.0, 0.0, 3.0, 9.0)
z <- c(3.0, 7.0, 2.0, 11, 10, 10, 14, 2.0, 11)
# corresponding joint count table estimated by the "uniform width" algorithm
count_xyz <- discretize3D(x, y, z, "uniform_width")
# the joint probability distribution table of the count data
library("entropy")
probs_xyz <- freqs.empirical(count_xyz)
# corresponding conditional mutual information
CMI.plugin(probs_xyz)
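Since log2(x) = log(x)/log(2), switching the unit only rescales the result; for example, to express the same estimate in bits rather than nats:

```r
# conditional mutual information of the same table, in bits
CMI.plugin(probs_xyz, unit = "log2")
```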