
Reducing the complexity of an adaptive radial basis function network with a histogram algorithm

  • Review
  • Published in: Neural Computing and Applications

Abstract

In this paper, a constructive training technique known as the dynamic decay adjustment (DDA) algorithm is combined with an information density estimation method to develop a new variant of the radial basis function (RBF) network. The RBF network trained with the DDA algorithm (RBFNDDA) learns information incrementally by creating new hidden units whenever necessary. However, RBFNDDA exhibits a greedy insertion behaviour that absorbs both useful and non-useful information during learning, thereby increasing its network complexity unnecessarily. As such, we propose to integrate RBFNDDA with a histogram (HIST) algorithm to reduce the network complexity. The HIST algorithm is used to compute the distribution of information in the trained RBFNDDA network; hidden nodes carrying non-useful information are then identified and pruned. The effectiveness of the proposed model, namely RBFNDDA-HIST, is evaluated using a number of benchmark data sets, and a performance comparison between RBFNDDA-HIST and other classification methods is conducted. The proposed RBFNDDA-HIST model is also applied to a real-world condition monitoring problem in a power generation plant. The results indicate that RBFNDDA-HIST not only reduces the number of hidden nodes significantly without requiring a long training time but also produces promising accuracy rates.
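The pruning idea described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' exact RBFNDDA-HIST procedure: it assumes each trained hidden node carries a weight equal to the number of training samples it covers (as in DDA training), bins those weights with a common rule such as Sturges', and treats nodes whose weights fall in the lowest histogram bin as non-useful. The function names and the lowest-bin cutoff are illustrative assumptions.

```python
import math

def sturges_bins(n):
    """Sturges' rule for choosing the number of histogram bins."""
    return max(1, int(math.ceil(math.log2(n) + 1)))

def prune_low_density_nodes(node_weights):
    """Return indices of hidden nodes to keep.

    node_weights[i] is the coverage count of hidden node i, i.e. how
    many training samples it absorbed during DDA training. As an
    illustrative pruning rule (an assumption, not the paper's exact
    criterion), nodes whose weight falls into the lowest histogram
    bin are treated as carrying non-useful information and pruned.
    """
    n = len(node_weights)
    if n == 0:
        return []
    lo, hi = min(node_weights), max(node_weights)
    if lo == hi:                       # degenerate case: keep all nodes
        return list(range(n))
    width = (hi - lo) / sturges_bins(n)
    cutoff = lo + width                # upper edge of the lowest bin
    return [i for i, w in enumerate(node_weights) if w >= cutoff]

# Example: ten hidden nodes, two of which cover almost no samples
weights = [40, 35, 2, 28, 1, 33, 30, 25, 31, 27]
kept = prune_low_density_nodes(weights)   # nodes 2 and 4 are pruned
```

The single cutoff at the lowest bin keeps the sketch simple; a fuller treatment would inspect the population of every bin before deciding which nodes to discard.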



Author information

Corresponding author

Correspondence to Pey Yun Goh.

About this article


Cite this article

Goh, P.Y., Tan, S.C., Cheah, W.P. et al. Reducing the complexity of an adaptive radial basis function network with a histogram algorithm. Neural Comput & Applic 28 (Suppl 1), 365–378 (2017). https://doi.org/10.1007/s00521-016-2350-4

