MACHINE LEARNING
3. Dasarathy, B.V., Nearest Neighbor Pattern Classification Techniques, (edited collection), Los Alamitos, CA: IEEE Press, 1991.
4. Draper, N.R. and Smith, H., Applied Regression Analysis, (second edition), New York, NY: John Wiley & Sons, 1981.
5. Fahlman, S.E. and Lebiere, C., The Cascade-Correlation Learning Architecture, (technical report), CMU-CS-90-100, Pittsburgh, PA: Carnegie-Mellon University, 1991.
6. Katz, A.J., Gately, M.T., and Collins, D.R., “Robust classifiers without robust features”, Neural Computation, 2, pp. 472-479, 1990.
7. Turney, P.D. and Halasz, M., “Contextual normalization applied to aircraft gas turbine engine diagnosis”, (in press), Journal of Applied Intelligence, 1993.
8. Deterding, D., Speaker Normalization for Automatic Speech Recognition, (Ph.D. thesis), Cambridge, UK: University of Cambridge, Department of Engineering, 1989.
9. Robinson, A.J., Dynamic Error Propagation Networks, (Ph.D. thesis), Cambridge, UK: University of Cambridge, Department of Engineering, 1989.
10. Murphy, P.M. and Aha, D.W., UCI Repository of Machine Learning Databases, Irvine, CA: University of California, Department of Information and Computer Science, 1991.
11. Diaconis, P. and Efron, B., “Computer-intensive methods in statistics”, Scientific American, 248, (May), pp. 116-131, 1983.
12. Cestnik, G., Konenenko, I., and Bratko, I., “Assistant-86: a knowledge-elicitation tool for sophisticated users”, in Progress in Machine Learning, edited by I. Bratko and N. Lavrac, pp. 31-45, Wilmslow, England: Sigma Press, 1987.