Description

nmfsc: R implementation of non-negative matrix factorization with sparseness constraints (nmfsc).
Usage

nmfsc(X, p=5, cyc=100, sL=0.6, sZ=0.6)
Arguments

X: the data matrix.
p: number of hidden factors = number of biclusters; default = 5.
cyc: maximal number of iterations; default = 100.
sL: sparseness of the loadings; default = 0.6.
sZ: sparseness of the factors; default = 0.6.
Details

Non-negative matrix factorization represents the positive matrix X by positive matrices L and Z that are sparse. The reconstruction objective is the Euclidean distance, subject to sparseness constraints.
Essentially the model is the sum of outer products of vectors:
X = ∑_{i=1}^{p} λ_i z_i^T
where the number of summands p is the number of biclusters. The matrix factorization is
X = L Z
Here λ_i ∈ R^n, z_i ∈ R^l, L ∈ R^{n×p}, Z ∈ R^{p×l}, and X ∈ R^{n×l}.
If the nonzero components of the sparse vectors are grouped together, then each outer product results in a matrix with a nonzero block and zeros elsewhere.
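The block structure can be illustrated with a few lines of R. This is a minimal sketch, not part of the package: it builds two pairs of sparse non-negative vectors with grouped nonzero components and checks that the sum of their outer products equals the product L Z.

lambda1 <- c(1, 2, 1, 0, 0, 0)         # loading vector, nonzeros grouped in rows 1-3
z1      <- c(3, 1, 2, 0, 0, 0, 0, 0)   # factor vector, nonzeros grouped in columns 1-3
lambda2 <- c(0, 0, 0, 2, 1, 1)         # loading vector, nonzeros grouped in rows 4-6
z2      <- c(0, 0, 0, 0, 1, 2, 1, 0)   # factor vector, nonzeros grouped in columns 5-7

L <- cbind(lambda1, lambda2)           # n x p loadings
Z <- rbind(z1, z2)                     # p x l factors

X1 <- lambda1 %o% z1 + lambda2 %o% z2  # sum of outer products: two nonzero blocks
X2 <- L %*% Z                          # matrix factorization X = L Z
all.equal(X1, X2, check.attributes = FALSE)   # TRUE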
Model selection is performed by constrained optimization according to Hoyer (2004): the Euclidean distance (the Frobenius norm) is minimized subject to sparseness and non-negativity constraints.
It proceeds by gradient descent on the Euclidean objective and thereafter projects the single vectors of L and of Z so that they fulfill the sparseness and non-negativity constraints.
The projection minimizes the Euclidean distance to the original vector, given an l_1-norm and an l_2-norm, while enforcing non-negativity.
The projection is a convex quadratic problem that is solved iteratively, with at least one component set to zero at each iteration. Instead of the l_1-norm, a sparseness measure is used that relates the l_1-norm to the l_2-norm.
The code is implemented in R.
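For concreteness, the following is a sketch of the sparseness measure from Hoyer (2004) that relates the l_1-norm to the l_2-norm; it is written here for illustration and is not the package code. The projection itself is available in the package as projFuncPos and projFunc (see See Also); how the arguments sL and sZ map onto this [0,1] scale is assumed from Hoyer's convention rather than stated on this page.

# sparseness(x) = (sqrt(n) - ||x||_1 / ||x||_2) / (sqrt(n) - 1)
# 0 for a constant vector, 1 for a vector with a single nonzero component
sparseness <- function(x) {
  n <- length(x)
  (sqrt(n) - sum(abs(x)) / sqrt(sum(x^2))) / (sqrt(n) - 1)
}

sparseness(c(1, 1, 1, 1))    # 0: maximally dense
sparseness(c(1, 0, 0, 0))    # 1: maximally sparse
sparseness(c(2, 1, 0.5, 0))  # intermediate, on the same scale as sL and sZ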
Value

An object of the class Factorization.
Author(s)

Sepp Hochreiter
References

Patrik O. Hoyer, ‘Non-negative Matrix Factorization with Sparseness Constraints’, Journal of Machine Learning Research 5:1457-1469, 2004.

D. D. Lee and H. S. Seung, ‘Algorithms for non-negative matrix factorization’, In Advances in Neural Information Processing Systems 13, 556-562, 2001.
See Also

fabia, fabias, fabiap, fabi, fabiasp, mfsc, nmfdiv, nmfeu, nmfsc, extractPlot, extractBic, plotBicluster, Factorization, projFuncPos, projFunc, estimateMode, makeFabiaData, makeFabiaDataBlocks, makeFabiaDataPos, makeFabiaDataBlocksPos, matrixImagePlot, fabiaDemo, fabiaVersion
Examples

#---------------
# TEST
#---------------
dat <- makeFabiaDataBlocks(n = 100, l = 50, p = 3, f1 = 5, f2 = 5,
  of1 = 5, of2 = 10, sd_noise = 3.0, sd_z_noise = 0.2, mean_z = 2.0,
  sd_z = 1.0, sd_l_noise = 0.2, mean_l = 3.0, sd_l = 1.0)
X <- dat[[1]]
Y <- dat[[2]]
X <- abs(X)
resEx <- nmfsc(X,3,30,0.6,0.6)
## Not run:
#---------------
# DEMO
#---------------
dat <- makeFabiaDataBlocks(n = 1000, l = 100, p = 10, f1 = 5, f2 = 5,
  of1 = 5, of2 = 10, sd_noise = 3.0, sd_z_noise = 0.2, mean_z = 2.0,
  sd_z = 1.0, sd_l_noise = 0.2, mean_l = 3.0, sd_l = 1.0)
X <- dat[[1]]
Y <- dat[[2]]
X <- abs(X)
resToy <- nmfsc(X,13,100,0.6,0.6)
extractPlot(resToy,ti="NMFSC",Y=Y)
## End(Not run)