Genetik Algoritma Kullanılarak İleri Beslemeli Bir Sinir Ağında Etkinlik Fonksiyonlarının Belirlenmesi

Determination of Activation Functions in A Feedforward Neural Network by using Genetic Algorithm

Abstract (English): 
In this study, the activation functions of all layers of a multilayer feedforward neural network have been determined by using a genetic algorithm. The main criterion of a neural network's efficiency is how closely it can approximate the desired output with the same number of nodes and connection weights. One of the important parameters determining this performance is the choice of a proper activation function. In classical neural network design, a network is built by choosing one of the generally known activation functions. In the presented study, a table of activation functions has been generated, and the most suitable activation function for each node has been chosen from this table by the genetic algorithm. A set of two-dimensional regression problems has been used to compare the performance of a classical static neural network with that of the genetic-algorithm-based neural network. Test results reveal that the proposed method has a high approximation capacity.
Abstract (Original Language, translated from Turkish): 
In this study, the activation functions in all layers of a multilayer feedforward neural network have been determined using a genetic algorithm. The basic criterion of a neural network's effectiveness is how closely it can approximate the desired output with the same number of nodes and connection weights. One of the most important parameters determining this performance is the selection of suitable activation functions. In classical neural network design, the network is usually built by selecting one of the well-known activation functions. In this study, a table of activation functions has been created, and the most suitable activation function for each node has been selected from this table by a genetic algorithm. A set of two-dimensional regression problems has been used to compare the performance of a classical fixed-structure neural network with that of the proposed genetic-algorithm-based neural network. Test results have shown that the presented method has a very high approximation capacity.
Pages: 395-403
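The approach the abstract describes can be sketched as a small genetic algorithm in which each node's gene is an index into a table of candidate activation functions, and fitness is the regression error. This is a minimal illustration under assumed simplifications (a tiny fixed-weight network with one hidden layer, whereas the paper also involves trained connection weights); the names `ACTIVATIONS`, `forward`, and `evolve`, the table contents, and all GA parameters are illustrative, not taken from the paper:

```python
import math
import random

# Hypothetical activation-function table; a chromosome stores one index per node.
ACTIVATIONS = [
    math.tanh,
    lambda x: 1.0 / (1.0 + math.exp(-max(min(x, 60.0), -60.0))),  # clamped sigmoid
    lambda x: max(0.0, x),                                        # ReLU
    lambda x: x,                                                  # identity
    math.sin,
]

def forward(x, y, weights, genes):
    """Tiny 2-input, H-hidden, 1-output net; genes pick each node's activation."""
    w_in, w_out = weights
    hidden = [ACTIVATIONS[genes[h]](wx * x + wy * y)
              for h, (wx, wy) in enumerate(w_in)]
    s = sum(w * v for w, v in zip(w_out, hidden))
    return ACTIVATIONS[genes[-1]](s)  # last gene selects the output activation

def mse(genes, weights, samples):
    """Mean squared error over (x, y, target) regression samples."""
    return sum((forward(x, y, weights, genes) - t) ** 2
               for x, y, t in samples) / len(samples)

def evolve(weights, samples, hidden=6, pop_size=30, generations=40,
           p_mut=0.15, rng=random.Random(0)):
    """Evolve activation-function indices: elitism, one-point crossover, mutation."""
    n_genes = hidden + 1  # one gene per hidden node plus one for the output node
    pop = [[rng.randrange(len(ACTIVATIONS)) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: mse(g, weights, samples))
        elite = pop[:pop_size // 2]           # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_genes)
            child = a[:cut] + b[cut:]         # one-point crossover
            for i in range(n_genes):          # per-gene mutation
                if rng.random() < p_mut:
                    child[i] = rng.randrange(len(ACTIVATIONS))
            children.append(child)
        pop = elite + children
    return min(pop, key=lambda g: mse(g, weights, samples))
```

Because the genes are discrete table indices rather than real-valued parameters, standard crossover and mutation apply directly, which is what makes a lookup-table encoding convenient for this kind of structural search.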

REFERENCES

Angeline, P. J., Saunders, G. M. and Pollack, J. B. 1994. An evolutionary algorithm that constructs recurrent neural networks. IEEE Transactions on Neural Networks. 5 (1), 54-65.
Arifovic, J. and Gencay, R. 2001. Using genetic algorithms to select architecture of a feedforward artificial neural network. Physica A. 289 (3-4), 574-594.
Blanco, A., Delgado, M. and Pegalajar, M. C. 2001. A real-coded genetic algorithm for training recurrent neural networks. Neural Networks. 14 (1), 93-105.
Daqi, G. and Genxing, Y. 2003. Influences of variable scales and activation functions on the performances of multilayer feedforward neural networks. Pattern Recognition. 36 (4), 869-878.
Ferentinos, K. P. 2005. Biological engineering applications of feedforward neural networks designed and parameterized by genetic algorithms. Neural Networks. 18 (7), 934-950.
Guarnieri, S., Piazza, F. and Uncini, A. 1999. Multilayer feedforward networks with adaptive spline activation function. IEEE Transactions on Neural Networks. 10 (3), 672-683.
Hwang, J. N., Lay, S. R., Maechler, M., Martin, R. D. and Schimert, J. 1994. Regression modeling in back-propagation and projection pursuit learning. IEEE Transactions on Neural Networks. 5 (3), 342-353.
Kwok, T. Y. and Yeung, D. Y. 1997. Objective functions for training new hidden units in constructive neural networks. IEEE Transactions on Neural Networks. 8 (5), 1131-1148.
Leung, F. H. F., Lam, H. K., Ling, S. H. and Tam, P. K. S. 2003. Tuning of the structure and parameters of a neural network using an improved genetic algorithm. IEEE Transactions on Neural Networks. 14 (1), 79-88.
Ma, L. and Khorasani, K. 2005. Constructive feedforward neural networks using Hermite polynomial activation functions. IEEE Transactions on Neural Networks. 16 (4), 821-833.
Marwala, T. 2007. Bayesian training of neural networks using genetic programming. Pattern Recognition Letters. 28 (12), 1452-1458.
Mayer, H. A. and Schwaiger, R. 2002. Differentiation of neuron types by evolving activation function templates for artificial neural networks. In Proceedings of the International Joint Conference on Neural Networks, 12-17 May 2002, Honolulu, Hawaii. Vol. 2, 1773-1778.
Oh, S. K. and Pedrycz, W. 2006. Multi-layer self-organizing polynomial neural networks and their development with the use of genetic algorithms. Journal of the Franklin Institute. 343 (2), 125-136.
Pedrajas, N. G., Boyer, D. O. and Martinez, C. H. 2006. An alternative approach for neural network evolution with a genetic algorithm: crossover by combinatorial optimization. Neural Networks. 19 (4), 514-528.
Sexton, R. S. and Gupta, J. N. D. 2000. Comparative evaluation of genetic algorithm and backpropagation for training neural networks. Information Sciences. 129 (1-4), 45-59.
Ustun, O. 2009a. A nonlinear full model of switched reluctance motor with artificial neural network. Energy Conversion and Management. 50, 2413-2421.
Ustun, O. 2009b. Measurement and real-time modeling of inductance and flux linkage in switched reluctance motors. IEEE Transactions on Magnetics. (Accepted paper).
Wang, C., Qin, S. Y. and Wan, B. W. 1991. A novel neural network structure with fast convergence based on optimizing combination of different activation function. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 31 Oct-3 Nov 1991, Orlando, Florida. Vol. 13 (3), 1399-1400.
Wong, K. W., Leung, C. S. and Chang, S. J. 2002. Use of periodic and monotonic activation functions in multilayer feedforward neural networks trained by extended Kalman filter algorithm. IEE Proc. Vis. Image Signal Process. 149 (4), 217-224.
Xu, S. and Zhang, M. 2001. A novel adaptive activation function. In Proceedings of the International Joint Conference on Neural Networks, 15-19 July 2001, Washington, DC. Vol. 4, 2779-2782.
Yao, X. 1999. Evolving artificial neural networks. Proceedings of the IEEE. 87 (9), 1423-1447.