ITERATIVE sIB ALGORITHM BASED ON SIMULATED ANNEALING
H. Yuan, Y. Ye, and J. Deng
DOI: 10.2316/Journal.202.2010.3.202-2717
From Journal: (202) International Journal of Computers and Applications, 2010