Open Access Repository

The construction of training signals from incomplete information for use with sequential input classifiers



Lewis, Carl Simon 1999 , 'The construction of training signals from incomplete information for use with sequential input classifiers', PhD thesis, University of Tasmania.

Full text restricted
Available under University of Tasmania Standard License.


Input and training signals are presented to recurrent artificial neural networks in an inherently sequential manner. However, of the two broad categories of practical problems to which recurrent neural networks are applied, signal prediction/reproduction and supervised signal classification, the information necessary to construct a training signal that can be presented over the full length of the input signal is usually present only in the first category. It is this absence of information in the second group that gives rise to differing problems, techniques and nomenclature in what could otherwise be considered a subset of the signal prediction problem.
The only information usually available for use as a training signal with classification tasks is the desired final classification of each training example. Normal practice is to present this piece of information, as a training signal of length one, concurrently with the last piece of input data, ignoring the possibility of providing a training signal prior to that point. The sequential nature of the training signal for classification tasks has been, to all intents and purposes, forgotten.
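The standard practice described above can be sketched in a few lines. This is a minimal, hypothetical illustration (network shapes and names are assumptions, not taken from the thesis): an Elman-style recurrent network is run over a sequence, but an error signal exists only at the final timestep, because the desired classification is the only target available.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_forward(xs, W_in, W_rec, W_out):
    """Run a simple recurrent network over a sequence,
    returning the output at every timestep."""
    h = np.zeros(W_rec.shape[0])
    outputs = []
    for x in xs:
        h = np.tanh(W_in @ x + W_rec @ h)  # recurrent state update
        outputs.append(W_out @ h)
    return np.array(outputs)

# A sequence of 10 input vectors of width 4; one output unit.
xs = rng.normal(size=(10, 4))
W_in = rng.normal(scale=0.1, size=(8, 4))
W_rec = rng.normal(scale=0.1, size=(8, 8))
W_out = rng.normal(scale=0.1, size=(1, 8))

outputs = rnn_forward(xs, W_in, W_rec, W_out)

# Conventional practice: the target (the desired final classification)
# is presented only alongside the last input, so the error is defined
# only there; earlier outputs receive no training signal at all.
target = np.array([1.0])
final_error = target - outputs[-1]
```

The point of the sketch is the last two lines: a training signal of length one, concurrent with the final input, leaving every earlier timestep untrained.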
This work explores this deficiency by examining the nature of suitable values, whether they can be constructed from the available information, and whether standard neural network training algorithms can be (or need to be) modified to make use of such values. It finds that such methods do exist and develops a family of them, signal melding, which demonstrates the usefulness and possibilities inherent in filling in the missing values in the training signal. The signal melding technique makes use of the dynamic information seen at a network's outputs during the presentation of input. As an added benefit, this information offers some important clues about the diversity of examples within an apparently homogeneous class.
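The general idea of filling in the missing training signal can be illustrated with a toy sketch. To be clear, the thesis's actual signal-melding methods are not specified in this abstract; the blending scheme below (a linear ramp from the network's own output trajectory toward the final target) is purely a hypothetical stand-in showing how a length-one target might be extended over the full input length.

```python
import numpy as np

def fill_training_signal(outputs, final_target, blend=None):
    """Construct a full-length training signal from a single final target.

    Each step's target is a mix of the network's own output (early steps)
    and the final classification target (late steps). The blend schedule
    here is an assumption for illustration, not the thesis's method.
    """
    T = len(outputs)
    if blend is None:
        blend = np.linspace(0.0, 1.0, T)  # trust the target more over time
    return np.array([(1.0 - b) * y + b * final_target
                     for y, b in zip(outputs, blend)])

outputs = np.array([0.1, 0.3, 0.2, 0.6])   # network outputs over 4 steps
targets = fill_training_signal(outputs, 1.0)
# targets now spans the whole sequence, and its last entry equals the
# desired final classification exactly.
```

Note that the filled-in signal depends on the network's own outputs, which is consistent with the abstract's remark that the dynamic information at the outputs is what the technique exploits.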
To make full use of these values, a modification to back propagation through time is developed, attentive back propagation through time, which shows clear advantages over the standard algorithm when training signals span the full length of the input signal. Simulations representing over three years of run-time are presented, showing that on simple classification tasks, when signal melding and attentive back propagation through time are combined, performance levels at least comparable with existing methods are achieved, together with much improved stability.

Item Type: Thesis - PhD
Authors/Creators: Lewis, Carl Simon
Keywords: Neural circuitry, Neural networks (Neurobiology)
Copyright Holders: The Author
Copyright Information:

Copyright 1999 the Author - The University is continuing to endeavour to trace the copyright owner(s) and in the meantime this item has been reproduced here in good faith. We would be pleased to hear from the copyright owner(s).

Additional Information:

Thesis (Ph.D.)--University of Tasmania, 1999. Includes bibliographical references

