Intelligent optical otolith classification for species recognition of osteichthyes.

D. Lefkaditis, G. Awcock, R.J. Howlett, A. Kallianiotis, E. Koutrakis

Research output: Chapter in Book/Conference proceeding with ISSN or ISBN › Chapter › peer-review


Otolith study is a well-established source of information about the life of fish and fish populations. Identifying fish species from otolith samples found in the stomach contents of marine animals has applications in dietary studies and in stock monitoring and management. Species identification can also produce useful data for climatology and archaeology, as otoliths can be recovered from geological sediments and archaeological excavations. Automated otolith classification can therefore assist a wide variety of scientific research. This paper presents the development of an automated fish species identification system, focusing on the commercially important fish of the Northern Aegean Sea. The methodology exploits the inherent variability of otolith shape between species. It is based on the processing and analysis of images acquired using a stereoscopic microscope fitted with a digital camera. A feature vector is constructed that describes both the morphology and the image statistics of each otolith. Recognition is carried out by an intelligent classifier composed of artificial neural networks; several configurations of multi-layer perceptron networks are tested in pursuit of a practical and expandable classification system.
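The pipeline the abstract describes (a feature vector combining morphology and image statistics, fed to a multi-layer perceptron classifier) can be sketched in Python. This is an illustrative sketch rather than the authors' implementation: the particular features, network sizes, and training scheme are assumptions, and the data below are synthetic stand-ins for real otolith feature vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

def otolith_features(image):
    """Hypothetical feature vector: simple image statistics plus a rough
    shape measure (foreground-pixel fraction). The paper's actual
    morphological descriptors are not specified here."""
    fg = image > image.mean()
    return np.array([image.mean(), image.std(), fg.mean()])

class MLP:
    """Minimal one-hidden-layer perceptron trained with batch gradient
    descent (sigmoid activations, squared-error loss)."""
    def __init__(self, n_in, n_hidden, n_out, lr=0.5):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    @staticmethod
    def _sig(x):
        return 1.0 / (1.0 + np.exp(-x))

    def forward(self, X):
        self.h = self._sig(X @ self.W1 + self.b1)   # hidden activations
        self.o = self._sig(self.h @ self.W2 + self.b2)  # output activations
        return self.o

    def train(self, X, Y, epochs=2000):
        for _ in range(epochs):
            out = self.forward(X)
            # backpropagation for squared-error loss with sigmoid units
            d_o = (out - Y) * out * (1 - out)
            d_h = (d_o @ self.W2.T) * self.h * (1 - self.h)
            self.W2 -= self.lr * self.h.T @ d_o / len(X)
            self.b2 -= self.lr * d_o.mean(axis=0)
            self.W1 -= self.lr * X.T @ d_h / len(X)
            self.b1 -= self.lr * d_h.mean(axis=0)

    def predict(self, X):
        return self.forward(X).argmax(axis=1)

# Synthetic stand-in data: two "species" with well-separated feature vectors.
X = np.vstack([rng.normal(0.2, 0.05, (30, 3)),
               rng.normal(0.8, 0.05, (30, 3))])
Y = np.zeros((60, 2))
Y[:30, 0] = 1
Y[30:, 1] = 1
labels = Y.argmax(axis=1)

net = MLP(n_in=3, n_hidden=5, n_out=2)
net.train(X, Y)
accuracy = (net.predict(X) == labels).mean()
```

In practice the paper tests several perceptron configurations; varying `n_hidden` (and stacking further hidden layers) in the sketch above is the analogous experiment.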
Original language: English
Title of host publication: Proceedings of the International Topical Meeting in Optical Sensing and Artificial Vision
Place of publication: St Petersburg, Russia
Number of pages: 8
ISBN (Print): 9875757703336
Publication status: Published - 2008


