Abstract
This study investigates whether feedforward neural networks with two hidden layers generalise better than those with one. In contrast to the existing literature, a method is proposed which allows these networks to be compared empirically on a hidden-node-by-hidden-node basis. This is applied to ten public domain function approximation datasets. Networks with two hidden layers were found to be better generalisers in nine of the ten cases, although the actual degree of improvement is case dependent. The proposed method can be used to rapidly determine whether it is worth considering two hidden layers for a given problem.
Original language | English
---|---
Title of host publication | EANN: International Conference on Engineering Applications of Neural Networks
Place of Publication | Switzerland
Publisher | Springer International Publishing
Pages | 279-290
Number of pages | 12
Volume | 744
ISBN (Electronic) | 9783319651729
ISBN (Print) | 9783319651712
DOIs | 10.1007/978-3-319-65172-9_24
Publication status | Published - 2 Aug 2017
Event | EANN: International Conference on Engineering Applications of Neural Networks, Athens, Greece, 25-27 August 2017
Publication series

Name | Communications in Computer and Information Science
---|---
Conference

Conference | EANN: International Conference on Engineering Applications of Neural Networks
---|---
Period | 2/08/17 → …
Bibliographical note

The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-319-65172-9_24
Profiles
- Robert Morgan
  - School of Arch, Tech and Eng - Professor of Thermal Propulsion Systems
  - Advanced Engineering Centre
  - Person: Academic