Neural network with semi-localized non-linear mapping of the input space
Abstract
A neural network includes an input layer comprising a plurality of input units (24) interconnected to a hidden layer, with a plurality of hidden units (26) disposed therein, through an interconnection matrix (28). Each of the hidden units (26) has a single output that is connected to output units (32) in an output layer through an interconnection matrix (30). Each of the interconnections between one of the hidden units (26) and one of the output units (32) has a weight associated therewith. Each of the hidden units (26) has an activation function that is localized in the i-th dimension and extends across all the other dimensions in a non-localized manner, in accordance with the following equation:

a_h = Σ (i = 1 to n) exp(−(x_i − μ_hi)² / σ_hi²)

The network learns by the Back Propagation method, varying the output weights and the activation-function parameters μ_hi and σ_hi.
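The semi-localized mapping described in the abstract can be sketched in code. This is an illustrative reading, not an implementation from the patent: the Gaussian form of the per-dimension term follows the abstract's μ_hi and σ_hi parameters, and all names here (`gaussian_bar_activation`, `forward`, `W`) are hypothetical.

```python
import numpy as np

def gaussian_bar_activation(x, mu, sigma):
    """Semi-localized hidden-unit activations.

    Each hidden unit h sums one-dimensional Gaussians, one per input
    dimension i:
        a_h = sum_i exp(-(x_i - mu_hi)**2 / sigma_hi**2)
    so the unit is localized along each x_i axis taken alone but
    non-localized across the remaining dimensions.
    """
    # x: (n,) input vector; mu, sigma: (m, n) per-unit centers and widths
    return np.exp(-((x - mu) ** 2) / sigma ** 2).sum(axis=1)

def forward(x, mu, sigma, W):
    """Map an input vector to the M outputs through the hidden layer."""
    a = gaussian_bar_activation(x, mu, sigma)  # (m,) hidden-unit outputs
    return W @ a                               # (M,) weighted output units

# tiny example: n = 3 inputs, m = 2 hidden units, M = 1 output
rng = np.random.default_rng(0)
x = np.array([0.5, -1.0, 2.0])
mu = rng.normal(size=(2, 3))
sigma = np.ones((2, 3))
W = rng.normal(size=(1, 2))
y = forward(x, mu, sigma, W)
```

At an input equal to a unit's centers, each of the n one-dimensional Gaussians contributes 1, so that unit's activation is n; moving far away along any single axis decays only that axis's term, which is what makes the unit semi-localized rather than fully localized.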
17 Claims
1. A neural network for effecting non-linear mapping of an input space, comprising:
an input layer for receiving a multi-dimensional input vector x_i, for i ranging from 1 to n, n being an integer;

a hidden layer, having m hidden units, m being an integer, each of said hidden units receiving a plurality of inputs and providing a single output and having an activation function:

a_h = Σ (i = 1 to n) C_hi f_hi (x_i, μ_hi, {P_hi})

where:
h is the number of the associated one of said hidden units,
μ_hi is the center of a_h,
f_hi is a localized function in the x_i dimension and non-localized in at least one other of said x_i dimensions,
C_hi is a constant, and
{P_hi} is a set of parameters for f_hi;

a connection matrix for interconnecting the output of select ones of the input vectors x_i to the inputs of said hidden units in accordance with an interconnection scheme;

an output layer comprising M output units, M being an integer, each of said output units for providing a single output Y_i for i ranging from 1 to M, each of said output units having a predetermined transfer function g for i ranging from 1 to M; and

an output interconnection matrix for interconnecting the output of select ones of said hidden units to the inputs of select ones of said M output units in accordance with an interconnection scheme, each of said interconnections between the output of said hidden units and the associated one of said M output units having an output weight associated therewith.

Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9
10. A method for mapping an input space in the neural network, comprising the steps of:
receiving a multi-dimensional input vector x_i, for i ranging from 1 to n, n being an integer;

providing m hidden units, m being an integer, each of the hidden units providing a single output value and having an activation function:

a_h = Σ (i = 1 to n) f_hi (x_i, μ_hi)

where:
h is the number of the associated one of the hidden units,
μ_hi is the localized center of a_h in the x_i dimension, and
f_hi is a localized function in the x_i dimension and non-localized in at least one other of the x_i dimensions;

interconnecting the output of select ones of the input vectors x_i to the inputs of the hidden units in accordance with an interconnection scheme;

providing a plurality of output units in an output layer, each of the output units for providing a single output y_i for i ranging from 1 to M, M being an integer, each of the output units having associated therewith a predetermined control function g; and

interconnecting the output of select ones of the hidden units to the inputs of predetermined ones of the M output units in accordance with the predetermined interconnection scheme, each of the interconnections between the output of the hidden units and the associated one of the M output units having an output weight associated therewith.

Dependent claims: 11, 12, 13, 14, 15, 16, 17
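The learning recited in the abstract, Back Propagation varying the output weights and the activation parameters μ_hi and σ_hi, can be sketched as plain gradient descent on a squared-error loss. The Gaussian activation form, the loss, the learning rate, and every name below are assumptions for illustration, not the patent's procedure:

```python
import numpy as np

def forward(x, mu, sigma, W):
    # g[h, i] = exp(-(x_i - mu_hi)^2 / sigma_hi^2); a_h = sum_i g[h, i]
    g = np.exp(-((x - mu) ** 2) / sigma ** 2)  # (m, n) per-dimension terms
    a = g.sum(axis=1)                          # (m,) hidden activations
    return g, a, W @ a                         # outputs y = W a

def backprop_step(x, t, mu, sigma, W, lr=0.05):
    """One gradient-descent step on E = 0.5 * ||y - t||^2, adjusting the
    output weights W and the activation parameters mu and sigma."""
    g, a, y = forward(x, mu, sigma, W)
    err = y - t                                 # (M,) dE/dy
    dW = np.outer(err, a)                       # dE/dW
    da = W.T @ err                              # (m,) dE/da_h
    # chain rule through the per-dimension Gaussian terms
    dmu = da[:, None] * g * 2 * (x - mu) / sigma ** 2          # dE/dmu
    dsigma = da[:, None] * g * 2 * (x - mu) ** 2 / sigma ** 3  # dE/dsigma
    return W - lr * dW, mu - lr * dmu, sigma - lr * dsigma

# fit a single training pair (x, t): n = 2 inputs, m = 3 hidden, M = 1 output
rng = np.random.default_rng(1)
x, t = np.array([0.2, -0.4]), np.array([1.0])
mu = rng.normal(size=(3, 2))
sigma = np.ones((3, 2))
W = 0.1 * rng.normal(size=(1, 3))
for _ in range(500):
    W, mu, sigma = backprop_step(x, t, mu, sigma, W)
error = 0.5 * ((forward(x, mu, sigma, W)[2] - t) ** 2).sum()
```

Because the hidden activations are sums of one-dimensional Gaussians, the gradients with respect to μ_hi and σ_hi factor through the single per-dimension term g[h, i], so each parameter update is local to one input dimension of one hidden unit.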
Specification