- Ali, K., & Pazzani, M. (1996). Error reduction through learning multiple descriptions. Machine Learning, 24, 173-202.
- Alpaydin, E. (1993). Multiple networks for function learning. In Proceedings of the 1993 IEEE International Conference on Neural Networks, Vol. I, pp. 27-32, San Francisco, CA.
- Arbib, M. (Ed.). (1995). The Handbook of Brain Theory and Neural Networks. Cambridge, MA: MIT Press.
- Asker, L., & Maclin, R. (1997a). Ensembles as a sequence of classifiers. In Proceedings of the Fifteenth International Joint Conference on Artificial Intelligence, pp. 860-865, Nagoya, Japan.
- Asker, L., & Maclin, R. (1997b). Feature engineering and classifier selection: A case study in Venusian volcano detection. In Proceedings of the Fourteenth International Conference on Machine Learning, pp. 3-11, Nashville, TN.
- Bauer, E., & Kohavi, R. (1999). An empirical comparison of voting classification algorithms: Bagging, boosting, and variants. Machine Learning, 36, 105-139.
- Baxt, W. (1992). Improving the accuracy of an artificial neural network using multiple differently trained networks. Neural Computation, 4, 772-780.
- Breiman, L. (1996a). Bagging predictors. Machine Learning, 24(2), 123-140.
- Breiman, L. (1996b). Bias, variance, and arcing classifiers. Technical Report 460, UC-Berkeley, Berkeley, CA.
- Breiman, L. (1996c). Stacked regressions. Machine Learning, 24(1), 49-64.
- Clemen, R. (1989). Combining forecasts: A review and annotated bibliography. International Journal of Forecasting, 5, 559-583.
- Drucker, H., & Cortes, C. (1996). Boosting decision trees. In Touretzky, D., Mozer, M., & Hasselmo, M. (Eds.), Advances in Neural Information Processing Systems, Vol. 8, pp. 479-485. Cambridge, MA: MIT Press.
- Drucker, H., Cortes, C., Jackel, L., LeCun, Y., & Vapnik, V. (1994). Boosting and other machine learning algorithms. In Proceedings of the Eleventh International Conference on Machine Learning, pp. 53-61, New Brunswick, NJ.
- Efron, B., & Tibshirani, R. (1993). An Introduction to the Bootstrap. New York: Chapman and Hall.
- Fisher, D., & McKusick, K. (1989). An empirical comparison of ID3 and back-propagation. In Proceedings of the Eleventh International Joint Conference on Artificial Intelligence, pp. 788-793, Detroit, MI.
- Freund, Y., & Schapire, R. (1996). Experiments with a new boosting algorithm. In Proceedings of the Thirteenth International Conference on Machine Learning, pp. 148-156, Bari, Italy.
- Friedman, J. (1996). On bias, variance, 0/1-loss, and the curse-of-dimensionality. Data Mining and Knowledge Discovery, 1.
- Friedman, J., Hastie, T., & Tibshirani, R. (1998). Additive logistic regression: A statistical view of boosting. Technical report, Department of Statistics, Stanford University (http://www-stat.stanford.edu/jhf).
- Geman, S., Bienenstock, E., & Doursat, R. (1992). Neural networks and the bias/variance dilemma. Neural Computation, 4, 1-58.
- Granger, C. (1989). Combining forecasts: Twenty years later. Journal of Forecasting, 8, 167-173.
- Grove, A., & Schuurmans, D. (1998). Boosting in the limit: Maximizing the margin of learned ensembles. In Proceedings of the Fifteenth National Conference on Artificial Intelligence, pp. 692-699, Madison, WI.
- Hampshire, J., & Waibel, A. (1989). The meta-pi network: Building distributed knowledge representations for robust pattern recognition. Technical Report CMU-CS-89-166, Carnegie Mellon University, Pittsburgh, PA.
- Hansen, L., & Salamon, P. (1990). Neural network ensembles. IEEE Transactions on Pattern Analysis and Machine Intelligence, 12, 993-1001.
- Hashem, S. (1997). Optimal linear combinations of neural networks. Neural Networks, 10(4), 599-614.
- Jacobs, R., Jordan, M., Nowlan, S., & Hinton, G. (1991). Adaptive mixtures of local experts. Neural Computation, 3, 79-87.
- Kohavi, R., & Wolpert, D. (1996). Bias plus variance decomposition for zero-one loss functions. In Proceedings of the Thirteenth International Conference on Machine Learning, pp. 275-283, Bari, Italy.
- Kong, E., & Dietterich, T. (1995). Error-correcting output coding corrects bias and variance. In Proceedings of the Twelfth International Conference on Machine Learning, pp. 313-321, Tahoe City, CA.
- Krogh, A., & Vedelsby, J. (1995). Neural network ensembles, cross validation, and active learning. In Tesauro, G., Touretzky, D., & Leen, T. (Eds.), Advances in Neural Information Processing Systems, Vol. 7, pp. 231-238. Cambridge, MA: MIT Press.
- Lincoln, W., & Skrzypek, J. (1989). Synergy of clustering multiple back propagation networks. In Touretzky, D. (Ed.), Advances in Neural Information Processing Systems, Vol. 2, pp. 650-659. San Mateo, CA: Morgan Kaufmann.
- Maclin, R. (1998). Boosting classifiers regionally. In Proceedings of the Fifteenth National Conference on Artificial Intelligence, pp. 700-705, Madison, WI.
- Maclin, R., & Opitz, D. (1997). An empirical evaluation of bagging and boosting. In Proceedings of the Fourteenth National Conference on Artificial Intelligence, pp. 546-551, Providence, RI.
- Maclin, R., & Shavlik, J. (1995). Combining the predictions of multiple classifiers: Using competitive learning to initialize neural networks. In Proceedings of the Fourteenth International Joint Conference on Artificial Intelligence, pp. 524-530, Montreal, Canada.
- Mani, G. (1991). Lowering variance of decisions by using artificial neural network portfolios. Neural Computation, 3, 484-486.
- Mooney, R., Shavlik, J., Towell, G., & Gove, A. (1989). An experimental comparison of symbolic and connectionist learning algorithms. In Proceedings of the Eleventh International Joint Conference on Artificial Intelligence, pp. 775-780, Detroit, MI.
- Murphy, P. M., & Aha, D. W. (1994). UCI repository of machine learning databases (machine-readable data repository). University of California, Irvine, Department of Information and Computer Science.
- Nowlan, S., & Sejnowski, T. (1992). Filter selection model for generating visual motion signals. In Hanson, S., Cowan, J., & Giles, C. (Eds.), Advances in Neural Information Processing Systems, Vol. 5, pp. 369-376. San Mateo, CA: Morgan Kaufmann.
- Opitz, D., & Shavlik, J. (1996a). Actively searching for an effective neural-network ensemble. Connection Science, 8(3/4), 337-353.
- Opitz, D., & Shavlik, J. (1996b). Generating accurate and diverse members of a neural-network ensemble. In Touretzky, D., Mozer, M., & Hasselmo, M. (Eds.), Advances in Neural Information Processing Systems, Vol. 8, pp. 535-541. Cambridge, MA: MIT Press.
- Perrone, M. (1992). A soft-competitive splitting rule for adaptive tree-structured neural networks. In Proceedings of the International Joint Conference on Neural Networks, pp. 689-693, Baltimore, MD.
- Perrone, M. (1993). Improving Regression Estimation: Averaging Methods for Variance Reduction with Extension to General Convex Measure Optimization. Ph.D. thesis, Brown University, Providence, RI.
- Quinlan, J. R. (1993). C4.5: Programs for Machine Learning. San Mateo, CA: Morgan Kaufmann.
- Quinlan, J. R. (1996). Bagging, boosting, and C4.5. In Proceedings of the Thirteenth National Conference on Artificial Intelligence, pp. 725-730, Portland, OR.
- Rumelhart, D., Hinton, G., & Williams, R. (1986). Learning internal representations by error propagation. In Rumelhart, D., & McClelland, J. (Eds.), Parallel Distributed Processing: Explorations in the Microstructure of Cognition. Volume 1: Foundations, pp. 318-363. Cambridge, MA: MIT Press.
- Schapire, R. (1990). The strength of weak learnability. Machine Learning, 5(2), 197-227.
- Schapire, R., Freund, Y., Bartlett, P., & Lee, W. (1997). Boosting the margin: A new explanation for the effectiveness of voting methods. In Proceedings of the Fourteenth International Conference on Machine Learning, pp. 322-330, Nashville, TN.
- Sollich, P., & Krogh, A. (1996). Learning with ensembles: How over-fitting can be useful. In Touretzky, D., Mozer, M., & Hasselmo, M. (Eds.), Advances in Neural Information Processing Systems, Vol. 8, pp. 190-196. Cambridge, MA: MIT Press.
- Tresp, V., & Taniguchi, M. (1995). Combining estimators using non-constant weighting functions. In Tesauro, G., Touretzky, D., & Leen, T. (Eds.), Advances in Neural Information Processing Systems, Vol. 7, pp. 419-426. Cambridge, MA: MIT Press.
- Wolpert, D. (1992). Stacked generalization. Neural Networks, 5, 241-259.
- Zhang, X., Mesirov, J., & Waltz, D. (1992). Hybrid system for protein secondary structure prediction. Journal of Molecular Biology, 225, 1049-1063.