[1] H. Bozdogan, ICOMP: A new model-selection criterion. In Classification and Related
Methods of Data Analysis, H.H. Bock (Ed.), Elsevier Science Publishers,
Amsterdam, 1988, pp. 599-608.
[2] H. Bozdogan, On the information-based measure of covariance complexity and its
application to the evaluation of multivariate linear models. Communications in
Statistics, Theory and Methods. 19, 221-278 (1990).
[3] H. Bozdogan, Mixture-model cluster analysis using a new informational complexity
and model selection criteria. In Multivariate Statistical Modeling, H. Bozdogan (Ed.),
Vol. 2, Proceedings of the First US/Japan Conference on the Frontiers of Statistical
Modeling: An Informational Approach, Kluwer Academic Publishers, Dordrecht, the
Netherlands, 1994, pp. 69-113.
[4] H. Bozdogan, Akaike's information criterion and recent developments in information
complexity. Journal of Mathematical Psychology. 44, 62-91 (2000).
[5] H. Bozdogan, Statistical Modeling and Model Evaluation: A New Informational
Approach. To appear (2004).
[6] H. Bozdogan, Information Complexity and Multivariate Learning in High Dimensions
in Data Mining. To appear (2011).
[7] H. Akaike, Information theory and an extension of the maximum likelihood
principle. In B.N. Petrov and F. Csáki (Eds.), Second International Symposium on
Information Theory, Akadémiai Kiadó, Budapest, 267-281 (1973).
[8] M.H. van Emden, An Analysis of Complexity. Mathematical Centre Tracts 35,
Amsterdam, 1971.
[9] J. Rissanen, Minmax entropy estimation of models for vector processes. In System
Identification, R.K. Mehra and D.G. Lainiotis (Eds.), Academic Press, New York,
1976, pp. 97-119.
[10] H. Bozdogan, Model selection and Akaike's Information Criterion (AIC): The general
theory and its analytical extensions. Psychometrika. 52, 3, 345-370 (1987).
[11] H. Cramér, Mathematical Methods of Statistics. Princeton University Press,
Princeton, NJ, 1946.
[12] C.R. Rao, Information and accuracy attainable in the estimation of statistical
parameters. Bull. Calcutta Math. Soc. 37, 81 (1945).
[13] C.R. Rao, Minimum variance and the estimation of several parameters. Proc. Cam.
Phil. Soc. 43, 280 (1947).
[14] C.R. Rao, Sufficient statistics and minimum variance estimates. Proc. Cam. Phil.
Soc. 45, 213 (1948).
[15] H. Bozdogan and D.M.A. Haughton, Informational complexity criteria for regression
models. Computational Statistics and Data Analysis. 28, 51-76 (1998).
[16] H. Bozdogan and M. Ueno, A unified approach to information-theoretic and
Bayesian model selection criteria. Invited paper presented in the Technical Session
Track C on: Information Theoretic Methods and Bayesian Modeling at the 6th World
Meeting of the International Society for Bayesian Analysis (ISBA), May 28-June 1,
2000, Hersonissos-Heraklion, Crete (2000).
[17] H. Bozdogan and P. M. Bearse, Subset selection in vector autoregressive models
using the genetic algorithm with informational complexity as the fitness function.
Systems Analysis, Modeling, Simulation (SAMS) (1998).
[18] S. Kullback, Information Theory and Statistics. Dover, New York, 1968.
[19] C.J. Harris, An information theoretic approach to estimation. In M.J. Gregson (Ed.),
Recent Theoretical Developments in Control, Academic Press, London, 1978,
pp. 563-590.
[20] H. Theil and D.G. Fiebig, Exploiting Continuity: Maximum Entropy Estimation of
Continuous Distributions. Ballinger Publishing Company, Cambridge, MA, 1984.
[21] S. Kullback and R.A. Leibler, On information and sufficiency. Ann. Math. Statist. 22,
79-86 (1951).
[22] C.E. Shannon, A mathematical theory of communication. Bell System Technical
Journal. 27, 379-423 (1948).
[23] S. Watanabe, Pattern Recognition: Human and Mechanical. John Wiley and Sons,
New York, 1985.
[24] J. Rissanen, Stochastic Complexity in Statistical Inquiry. World Scientific Publishing
Company, Teaneck, NJ, 1989.
[25] R.E. Blahut, Principles and Practice of Information Theory. Addison-Wesley
Publishing Company, Reading, MA, 1987.
[26] C.R. Rao, Linear Statistical Inference and Its Applications. John Wiley & Sons, New
York, 1965, p. 532.
[27] S.A. Mulaik, Linear Causal Modeling with Structural Equations. CRC Press, A
Chapman and Hall Book, 2009, p. 368.
[28] S. Mustonen, A measure of total variability in multivariate normal distribution.
Comp. Statist. and Data Ana. 23, 321-334 (1997).
[29] S.D. Morgera, Information theoretic covariance complexity and its relation to
pattern recognition. IEEE Trans. on Syst., Man, and Cybernetics. SMC 15, 608-619
(1985).
[30] J.B. Conway, Functions of One Complex Variable I, Second edition, Springer-Verlag,
1995.
[31] L. Ljung and J. Rissanen, On canonical forms, parameter identifiability and the
concept of complexity. In Identification and System Parameter Estimation, N. S.
Rajbman (Ed.), North-Holland, Amsterdam, 1415-1426 (1978).
[32] M.S. Maklad and T. Nichols, A new approach to model structure discrimination. IEEE
Trans. on Syst., Man, and Cybernetics. SMC 10, 78-84 (1980).
[33] D.S. Poskitt, Precision, complexity and Bayesian model determination. J. Roy.
Statist. Soc. 49, 199-208 (1987).
[34] B.R. Frieden, Physics from Fisher Information. Cambridge University Press, 1998.
[35] J. Rissanen, Modeling by shortest data description. Automatica, 14, 465-471 (1978).
[36] G. Schwarz, Estimating the dimension of a model. Ann. Statist., 6, 461-464 (1978).
[37] A.D.R. McQuarrie and C.-L. Tsai, Regression and Time Series Model Selection. World
Scientific Publishing Company, Singapore, 1998.
[38] K.P. Burnham and D. R. Anderson, Model Selection and Inference: A Practical
Information-Theoretic Approach. Springer, New York, 1998.
[39] D.V. Lindley, On a measure of information provided by an experiment, The Annals
of Mathematical Statistics 27, 4, 986-1005 (1956).
[40] K. Chaloner and I. Verdinelli, Bayesian experimental design: a review. Statistical
Science. 10, 3, 273-304 (1995).
[41] R.E. Kass, L. Tierney, and J.B. Kadane, The validity of posterior expansions based
on Laplace's method. In S. Geisser et al. (Eds.), Bayesian and Likelihood Methods
in Statistics and Econometrics: Essays in Honor of George A. Barnard, North-Holland,
Amsterdam, 1990, pp. 473-488.
[42] X. Chen, Model Selection in Nonlinear Regression Analysis. Unpublished Ph.D.
Thesis, The University of Tennessee, Knoxville, TN, 1996.
[43] K. Takeuchi, Distribution of information statistics and a criterion of model fitting.
Suri-Kagaku (Mathematical Sciences). 153, 12-18 (1976).
[44] J.R.M. Hosking, Lagrange-multiplier tests of time-series models. Journal of the
Royal Statistical Society. Series B, 42, 170-181 (1980).
[45] R. Shibata, Statistical aspects of model selection. In J.C. Willems (Ed.), From Data
to Model, Springer-Verlag, Berlin, 1989, pp. 216-240.
[46] A. Howe and H. Bozdogan, Regularized SVM classification with information
complexity and the genetic algorithm. To appear in a forthcoming edited volume on
multivariate high-dimensional data mining, 2011.
[47] V. Vapnik, The Nature of Statistical Learning Theory. Springer-Verlag, New York,
1995.
[48] C. Hsu and C. Lin, A comparison of methods for multiclass support vector machines.
IEEE Transactions on Neural Networks. 13, 2 (2002).
[49] F. Camillo, Personal correspondence, 2007.
[50] F. Camillo, C. Liberati and K.A. Athappilly, Profiling of customer data base through a
sample survey, unpublished report, 2009.
[51] E. Wegman, Hyperdimensional data analysis using parallel coordinates. Technical
Report No. 1, George Mason University Center for Computational Statistics (1986).
[52] S.H. Baek and H. Bozdogan, Multi-class support vector machine recursive feature
elimination using information complexity, working paper (2011).