[1]. FEIGENBAUM, E. A., “Expert Systems in the 1980s”, in Infotech State of the Art Report on Machine Intelligence, Ed: A. Bond, Maidenhead, Pergamon Infotech, 1981.
[2]. BOSE, I., MAHAPATRA, R.K., “Business Data Mining – A Machine Learning Perspective”, Information & Management, 39, 211-225, 2001.
[3]. VAPNIK, V.N., CHERVONENKIS, A.Y., “On the uniform convergence of relative frequencies of events to their probabilities”, Theory of Probability and Its Applications, 16(2), 264-280, 1971.
[4]. VALIANT, L.G., “A theory of the learnable”, Communications of the ACM, Vol. 27, 1134-1142, 1984; reproduced in Readings in Machine Learning, Shavlik, J.W. and Dietterich, T.G. (Eds), Morgan Kaufmann, San Mateo, CA, 192-200, 1990.
[5]. OBLOW, E.M., “Implementing Valiant’s learnability
theory using random sets”, Machine Learning, 8(1),
Kluwer Academic Publishers, Boston, 45-73, 1992.
[6]. CIOS, K.J., “An Algorithm Which Learns Multiple Covers Via Integer Linear Programming, Part I: The CLILP2 Algorithm”, Kybernetes, Vol. 24, No. 2, 29-50, MCB University Press, ISSN 0368-492X, 1995.
[7]. MICHALSKI, R.S., “A theory and methodology of inductive learning”, Machine Learning - An Artificial Intelligence Approach, Michalski, R.S., Carbonell, J.G. and Mitchell, T.M. (Eds), Morgan Kaufmann, Los Altos, CA, 83-134, 1983.
[8]. MITCHELL, T.M., “The need for biases in learning
generalizations”, Readings in Machine Learning,
Shavlik, J.W. and Dietterich, T.G. (Eds), Morgan
Kaufmann, San Mateo, CA, 184-191, 1990.
[9]. QUINLAN, J.R., “Learning efficient classification procedures and their application to chess end games”, Machine Learning - An Artificial Intelligence Approach, Michalski, R.S., Carbonell, J.G. and Mitchell, T.M. (Eds), Tioga Publishing Co, Palo Alto, CA, 463-482, 1983.
[10]. DIETTERICH, T.G., “Limitations on inductive learning”, Proceedings of the 6th International Workshop on Machine Learning (ML-89), Ithaca, NY, Segre, A.M. (Ed), Morgan Kaufmann, San Mateo, CA, 124-128, 1989.
[11]. BOLC, L., CYTOWSKI, J., “Search Methods for
Artificial Intelligence”, Academic Press, London,
1992.
[12]. HAUSSLER, D., “Quantifying inductive bias: AI learning algorithms and Valiant’s learning framework”, Artificial Intelligence, 36, 177-221, 1988; reproduced in Readings in Machine Learning, 1990.
[13]. WANG, X., “Inductive Learning Algorithms”, PhD Thesis, University of Wales, Cardiff, 1997.
[14]. MICHALSKI, R.S., KODRATOFF, Y., “Research in machine learning: recent progress, classification of methods, and future directions”, Machine Learning - An Artificial Intelligence Approach, Vol. 3, Morgan Kaufmann, San Mateo, CA, 3-30, 1990.
[15]. CENDROWSKA, J., “Knowledge Acquisition for
Expert Systems: Inducing Modular Rules from
Examples”, PhD Thesis, The Open University, 1990.
[16]. BRAMER, M.A., “Automatic Induction of Classification Rules from Examples Using N-Prism”, Research and Development in Intelligent Systems XVI, Springer-Verlag, pp. 99-121, 2000.
[17]. HUNT, E. B., MARIN, J., STONE, P. J., “Experiments in Induction”, Academic Press, New York, 1966.
[18]. QUINLAN, J.R., “Induction of decision trees”, Machine Learning, Vol. 1, Kluwer Academic Publishers, Boston, 81-106, 1986; reproduced in Readings in Machine Learning, 1990.
[19]. CHENG, J., et al., “Improved decision trees: A generalized version of ID3”, Proceedings of the Fifth International Conference on Machine Learning, Ann Arbor, Michigan, 100-106, 1988.
[20]. SCHLIMMER, J.C., FISHER, D.H., “A case study of incremental concept induction”, AAAI-86: Proceedings of the 5th National Conference on Artificial Intelligence, Philadelphia, PA, 496-501, 1986.
[21]. UTGOFF, P.E., “ID5: An incremental ID3”, Proceedings of the Fifth International Conference on Machine Learning, The University of Michigan, 107-120, 1988.
[22]. UTGOFF, P.E., “Incremental induction of decision trees”, Machine Learning, Vol. 4, 161-186, 1989.
[23]. QUINLAN, J.R., “C4.5: Programs for Machine
Learning”, Morgan Kaufmann, San Mateo, CA,
1993.
[24]. BREIMAN, L., et al., “Classification and Regression Trees”, Wadsworth International Group, Belmont, California, 1984.
[25]. CRAWFORD, S.L., “Extensions to the CART algorithm”, Machine Learning and Uncertain Reasoning - Knowledge-Based Systems, Vol. 3, Gaines, B. and Boose, J. (Eds), Academic Press, London, 15-35, 1990.
[26]. ZHONG, N., DONG, J., OHSUGA, S., “Rule discovery by soft induction techniques”, Neurocomputing, 36, 171-204, 2001.
[27]. MICHALSKI, R.S., “Synthesis of optimal and quasi-optimal variable-valued logic formulas”, Proceedings of the 1975 Int. Symposium on Multiple-Valued Logic, Bloomington, Indiana, 76-87, 1975.
[28]. AKSOY, M.S., “New Algorithms for Machine Learning”, PhD Thesis, University of Wales, Cardiff, United Kingdom, 1993.
[29]. TOLUN, M. R., ABU-SOUD, S.M., “ILA: An Inductive Learning Algorithm for Rule Extraction”, Expert Systems with Applications, Vol. 14, 361-370, 1998.
[30]. TOLUN, M. R., et al., “Improved Rule Discovery
Performance on Uncertainty”, The Second Pacific-
Asia Conference on Knowledge Discovery and Data
Mining (PAKDD-98), Melbourne, Australia, 15-17
April 1998.
[31]. AKGÖBEK, Ö., “New Algorithms for Knowledge Acquisition in Inductive Learning” (in Turkish), PhD Thesis, Sakarya Üniversitesi, Adapazarı, 2003.
[32]. BLAKE, C.L., MERZ, C.J., “UCI Repository of Machine Learning Databases”, [http://ftp.ics.uci.edu/pub/ml-repos/machine-learning-databases/], 1998.
[33]. SGI Standard Template Library Programmer’s Guide, Silicon Graphics Inc., http://www.sgi.com/tech/mcl/db, 1996.
[34]. BRAMER, M.A., “Using J-pruning to reduce overfitting in classification trees”, Knowledge-Based Systems, Vol. 15, 301-308, 2002.
[35]. WU, X., “Rule Induction with Extension Matrices”, Journal of the American Society for Information Science, Vol. 49, No. 5, 435-454, 1998.
[36]. HAMILTON, H. J., et al., “RIAC: A Rule Induction Algorithm Based on Approximate Classification”, Technical Report CS-96-06, University of Regina, Regina, Saskatchewan S4S 0A2, ISSN 0828-3494, ISBN 0-7731-0321-X, 35-37, 1996.
[37]. THRUN, S.B., BALA, J., “The MONK’s Problems: A Performance Comparison of Different Learning Algorithms”, Carnegie Mellon University, CMU-CS-91-197, 1991.
[38]. AN, A., “Learning Classification Rules From Data”, Computers & Mathematics with Applications, Vol. 45, Issues 4-5, 737-748, 2003.
[39]. FOURNIER, D., CREMILLEUX, B., “A Quality Index for Decision Tree Pruning”, Knowledge-Based Systems, 15, 37-43, 2002.