Two Hidden Layers are Usually Better than One

Alan Thomas, Miltiadis Petridis, Simon Walters, Mohammad Malekshahi Gheytassi, Robert Morgan

Research output: Chapter in Book/Conference proceeding with ISSN or ISBN › Conference contribution with ISSN or ISBN

Abstract

This study investigates whether feedforward neural networks with two hidden layers generalise better than those with one. In contrast to the existing literature, a method is proposed which allows these networks to be compared empirically on a hidden-node-by-hidden-node basis. This is applied to ten public domain function approximation datasets. Networks with two hidden layers were found to be better generalisers in nine of the ten cases, although the actual degree of improvement is case dependent. The proposed method can be used to rapidly determine whether it is worth considering two hidden layers for a given problem.
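The comparison the abstract describes can be illustrated with a small sketch: train a one-hidden-layer and a two-hidden-layer feedforward network with the same total number of hidden nodes on a function approximation task, and compare validation error. This is not the authors' experimental protocol; it is a minimal NumPy illustration, and the toy dataset, layer widths, and training settings are all assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy function-approximation dataset: y = sin(3x), with a train/validation split.
X = rng.uniform(-1, 1, size=(400, 1))
y = np.sin(3 * X)
Xtr, ytr, Xva, yva = X[:300], y[:300], X[300:], y[300:]

def init(layers):
    """He-style random initialisation for a list of layer widths."""
    return [(rng.normal(0, np.sqrt(2 / m), (m, n)), np.zeros(n))
            for m, n in zip(layers[:-1], layers[1:])]

def forward(params, X):
    """Forward pass: tanh hidden units, linear output; returns all activations."""
    acts = [X]
    for i, (W, b) in enumerate(params):
        z = acts[-1] @ W + b
        acts.append(z if i == len(params) - 1 else np.tanh(z))
    return acts

def train(layers, lr=0.05, epochs=2000):
    """Full-batch gradient descent on MSE; returns validation MSE."""
    params = init(layers)
    for _ in range(epochs):
        acts = forward(params, Xtr)
        grad = 2 * (acts[-1] - ytr) / len(Xtr)       # dMSE / d(output)
        for i in reversed(range(len(params))):
            W, b = params[i]
            gW, gb = acts[i].T @ grad, grad.sum(0)
            # Backpropagate through tanh (acts[i] is the layer's input activation).
            grad = (grad @ W.T) * (1 - acts[i] ** 2) if i else None
            params[i] = (W - lr * gW, b - lr * gb)
    out = forward(params, Xva)[-1]
    return float(np.mean((out - yva) ** 2))

# Node-matched comparison: 8 hidden nodes in total in each architecture.
mse_one = train([1, 8, 1])      # one hidden layer of 8 nodes
mse_two = train([1, 4, 4, 1])   # two hidden layers of 4 nodes each
print(f"1 hidden layer : validation MSE = {mse_one:.4f}")
print(f"2 hidden layers: validation MSE = {mse_two:.4f}")
```

Which architecture wins on this toy problem is, as the abstract notes for the real datasets, case dependent; the point of the sketch is only the node-matched comparison itself.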
Original language: English
Title of host publication: EANN: International Conference on Engineering Applications of Neural Networks
Place of publication: Switzerland
Publisher: Springer International Publishing
Pages: 279-290
Number of pages: 12
Volume: 744
ISBN (Electronic): 9783319651729
ISBN (Print): 9783319651712
DOI: 10.1007/978-3-319-65172-9_24
Publication status: Published - 2 Aug 2017
Event: EANN: International Conference on Engineering Applications of Neural Networks - Athens, Greece, 25-27 August 2017
Duration: 2 Aug 2017 → …

Publication series

Name: Communications in Computer and Information Sciences

Conference

Conference: EANN: International Conference on Engineering Applications of Neural Networks
Period: 2/08/17 → …

Fingerprint

Feedforward neural networks

Bibliographical note

The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-319-65172-9_24

Cite this

Thomas, A., Petridis, M., Walters, S., Malekshahi Gheytassi, M., & Morgan, R. (2017). Two Hidden Layers are Usually Better than One. In EANN: International Conference on Engineering Applications of Neural Networks (Vol. 744, pp. 279-290). (Communications in Computer and Information Sciences). Switzerland: Springer International Publishing. https://doi.org/10.1007/978-3-319-65172-9_24
@inproceedings{81462a90caa14496a4eeb21782372da2,
title = "Two Hidden Layers are Usually Better than One",
abstract = "This study investigates whether feedforward neural networks with two hidden layers generalise better than those with one. In contrast to the existing literature, a method is proposed which allows these networks to be compared empirically on a hidden-node-by-hidden-node basis. This is applied to ten public domain function approximation datasets. Networks with two hidden layers were found to be better generalisers in nine of the ten cases, although the actual degree of improvement is case dependent. The proposed method can be used to rapidly determine whether it is worth considering two hidden layers for a given problem.",
author = "Alan Thomas and Miltiadis Petridis and Simon Walters and {Malekshahi Gheytassi}, Mohammad and Robert Morgan",
note = "The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-319-65172-9_24",
year = "2017",
month = "8",
day = "2",
doi = "10.1007/978-3-319-65172-9_24",
language = "English",
isbn = "9783319651712",
volume = "744",
series = "Communications in Computer and Information Sciences",
publisher = "Springer International Publishing",
pages = "279--290",
booktitle = "EANN: International Conference on Engineering Applications of Neural Networks",
}
