- 1. R. O. Duda and P. E. Hart, Pattern Classification and Scene Analysis, John Wiley & Sons, New York, 1973.
- 2. V. Castelli and T. M. Cover, "Classification rules in the unknown mixture parameter case: relative value of labeled and unlabeled examples," Proc. 1994 IEEE Int. Symp. Inform. Theory, p. 111, Trondheim, Norway, 1994.
- 3. V. Castelli and T. M. Cover, "On the exponential value of labeled samples," to appear in Pattern Recognition Letters.
- 4. T. M. Cover, "Learning and generalization," in Proc. 4th Annual Workshop on Computational Learning Theory (eds. L. G. Valiant and M. K. Warmuth), p. 3, Morgan Kaufmann, San Mateo, California, 1991.
- 5. D. Haussler, "Decision theoretic generalizations of the PAC model for neural net and other learning applications," Technical Report UCSC-CRL-91-02, University of California, Santa Cruz, 1989.
- 6. D. Pollard, Convergence of Stochastic Processes, Springer-Verlag, New York, 1984.
- 7. J. Ratsaby, The Complexity of Learning from a Mixture of Labeled and Unlabeled Examples, Ph.D. Thesis, University of Pennsylvania, 1994.
- 8. H. Teicher, "Identifiability of finite mixtures," Annals of Mathematical Statistics, vol. 34, pp. 1265-1269, 1963.
- 9. L. G. Valiant, "A theory of the learnable," Comm. ACM, vol. 27, no. 11, pp. 1134-1142, 1984.
- 10. A. Blumer, A. Ehrenfeucht, D. Haussler, and M. Warmuth, "Learnability and the Vapnik-Chervonenkis dimension," JACM, vol. 36, no. 4, pp. 929-965, 1989.
- 11. N. Glick, "Sample-based classification procedures derived from density estimators," J. American Statistical Association, vol. 67, 1972.
- 12. S. J. Yakowitz and J. D. Spragins, "On the identifiability of finite mixtures," Annals of Mathematical Statistics, vol. 39, pp. 209-214, 1968.
Index Terms
- Learning from a mixture of labeled and unlabeled examples with parametric side information