Phyllometric parameters and artificial neural networks for the identification of Banksia accessions

G. Messina, C. Pandolfi, S. Mugnai, E. Azzarello, Kingsley Dixon, S. Mancuso

    Research output: Contribution to journal › Article › peer-review

    3 Citations (Scopus)

    Abstract

    Taxonomic identification is traditionally carried out with dichotomous keys or, more recently, with computer-based identification keys, often on the basis of subjective visual assessment and frequently unable to detect small differences at subspecies and varietal ranks. The aims of the present work were to (1) clearly discriminate a wide group of accessions (species, subspecies and varieties) belonging to the genus Banksia on the basis of 14 phyllometric parameters determined by image analysis of the leaves, and (2) unequivocally identify the accessions with a relatively simple back-propagation neural network (BPNN) architecture (a single hidden layer), in order to develop a complementary method for fast botanical identification. The results indicate that such a network can be used effectively to discriminate among Banksia accessions: the BPNN achieved 93% unequivocal and correct simultaneous identification. The BPNN had the further advantages of resolving subtle associations between characters and of making incomplete data (i.e. the absence of Banksia flower characters such as the colour or size of styles) useful in species diagnostics. The method is also practical: it is easy to execute, as no particular expertise is required; the equipment is low cost (a scanner connected to a PC, with software available as freeware); and data acquisition is fast and effective.
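
    The classification setup described above — 14 phyllometric parameters per leaf feeding a back-propagation network with a single hidden layer — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the data are random stand-ins for real leaf measurements, and the class count, hidden-layer size, and use of scikit-learn's MLPClassifier are all assumptions.

    ```python
    # Hedged sketch of a single-hidden-layer back-propagation network
    # classifying accessions from 14 phyllometric parameters.
    # Synthetic data only; hidden-layer size and training settings are
    # illustrative assumptions, not the paper's configuration.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n_accessions = 5     # hypothetical number of accessions (classes)
    n_leaves = 40        # hypothetical number of leaves per accession
    n_features = 14      # phyllometric parameters per leaf (from the abstract)

    # Synthetic stand-in data: each accession gets its own feature means.
    X = np.vstack([
        rng.normal(loc=i, scale=0.5, size=(n_leaves, n_features))
        for i in range(n_accessions)
    ])
    y = np.repeat(np.arange(n_accessions), n_leaves)

    # Standardise the inputs before training the network.
    X_scaled = StandardScaler().fit_transform(X)

    # Single hidden layer, as in the abstract; 10 units is an assumption.
    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    clf.fit(X_scaled, y)
    print(f"training accuracy: {clf.score(X_scaled, y):.2f}")
    ```

    On real data one would of course hold out a test set and report accuracy on unseen leaves, which is how a figure such as the 93% identification rate would be obtained.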
    Original language: English
    Pages (from-to): 31-38
    Journal: Australian Systematic Botany
    Volume: 22
    Issue number: 1
    DOIs
    Publication status: Published - 2009
