Mixture density networks

Abstract

Minimization of a sum-of-squares or cross-entropy error function leads to network outputs which approximate the conditional averages of the target data, conditioned on the input vector. For classification problems, with a suitably chosen target coding scheme, these averages represent the posterior probabilities of class membership, and so can be regarded as optimal. For problems involving the prediction of continuous variables, however, the conditional averages provide only a very limited description of the properties of the target variables. This is particularly true for problems in which the mapping to be learned is multi-valued, as often arises in the solution of inverse problems, since the average of several correct target values is not necessarily itself a correct value. In order to obtain a complete description of the data, for the purposes of predicting the outputs corresponding to new input vectors, we must model the conditional probability distribution of the target data, again conditioned on the input vector. In this paper we introduce a new class of network models obtained by combining a conventional neural network with a mixture density model. The complete system is called a Mixture Density Network, and can in principle represent arbitrary conditional probability distributions in the same way that a conventional neural network can represent arbitrary functions. We demonstrate the effectiveness of Mixture Density Networks using both a toy problem and a problem involving robot inverse kinematics.
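The abstract describes the basic construction: a conventional feed-forward network whose outputs parameterize a mixture model, trained by maximizing the likelihood of the target data under that conditional mixture. The following is a minimal sketch, not taken from the report, of how such a network with isotropic Gaussian components might be set up in a modern framework (PyTorch is assumed here; the class and function names MDN and mdn_loss are illustrative).

```python
import math
import torch
import torch.nn as nn

class MDN(nn.Module):
    """Maps an input vector to the parameters of a Gaussian mixture over the targets."""
    def __init__(self, in_dim, hidden_dim, out_dim, n_components):
        super().__init__()
        self.out_dim = out_dim
        self.n_components = n_components
        self.hidden = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.Tanh())
        # One set of outputs per mixture component.
        self.pi = nn.Linear(hidden_dim, n_components)             # mixing coefficients
        self.mu = nn.Linear(hidden_dim, n_components * out_dim)   # component centres
        self.log_sigma = nn.Linear(hidden_dim, n_components)      # component widths

    def forward(self, x):
        h = self.hidden(x)
        log_pi = torch.log_softmax(self.pi(h), dim=-1)            # softmax keeps the priors summing to one
        mu = self.mu(h).view(-1, self.n_components, self.out_dim)
        sigma = torch.exp(self.log_sigma(h))                      # exponential keeps the widths positive
        return log_pi, mu, sigma

def mdn_loss(log_pi, mu, sigma, t):
    """Negative log-likelihood of targets t under the predicted conditional mixture."""
    t = t.unsqueeze(1)                                            # (batch, 1, out_dim) for broadcasting
    d = t.size(-1)
    log_component = (
        -0.5 * d * torch.log(2 * math.pi * sigma ** 2)
        - ((t - mu) ** 2).sum(-1) / (2 * sigma ** 2)
    )
    return -torch.logsumexp(log_pi + log_component, dim=-1).mean()
```

Minimizing this negative log-likelihood by gradient descent plays the role that the sum-of-squares error plays for a conventional network; at test time the predicted mixture can be summarized by, for example, the mean of the most probable component, which is what makes the approach suitable for multi-valued inverse mappings.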

Divisions: Aston University (General)
Uncontrolled Keywords: NCRG sum-of-squares cross-entropy error function classification problems coding scheme conditional probability distribution network models neural network mixture density model Mixture Density Network inverse kinematics
Report Number: NCRG/94/004
Last Modified: 19 Mar 2024 08:39
Date Deposited: 21 Jul 2009 11:38
PURE Output Type: Technical report
Published Date: 1994
Authors: Bishop, Christopher M.
