Systems and Methods for Neural Language Modeling
Abstract
In some aspects, the present disclosure relates to neural language modeling. In one embodiment, a computer-implemented neural network includes a plurality of neural nodes, where each of the neural nodes has a plurality of input weights corresponding to a vector of real numbers. The neural network also includes an input neural node corresponding to a linguistic unit selected from an ordered list of a plurality of linguistic units, and an embedding layer with a plurality of embedding node partitions. Each embedding node partition includes one or more neural nodes. Each of the embedding node partitions corresponds to a position in the ordered list relative to a focus term, is configured to receive an input from an input node, and is configured to generate an output. The neural network also includes a classifier layer with a plurality of neural nodes, each configured to receive the embedding outputs from the embedding layer, and configured to generate an output corresponding to a probability that a particular linguistic unit is the focus term.
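The architecture summarized above (an embedding layer split into one partition per relative position, feeding a classifier that scores every vocabulary unit as the candidate focus term) can be sketched minimally as follows. This is an illustrative sketch, not the patented implementation; the sizes N, P, and D, the random initialization, and the use of a softmax classifier are all demonstration assumptions.

```python
import numpy as np

# Illustrative sizes (assumptions, not from the patent):
# N linguistic units in the vocabulary, P relative positions around the
# focus term, D-dimensional weight vector per embedding partition.
N, P, D = 10, 4, 3
rng = np.random.default_rng(0)

# Embedding layer: a separate D-wide partition of node weights for each
# relative position, one row of weights per linguistic unit.
embed = rng.normal(size=(P, N, D))

# Classifier layer: one node per vocabulary unit; each node receives the
# concatenated outputs of all P embedding partitions.
classifier = rng.normal(size=(N, P * D))

def focus_probabilities(context_ids):
    """Given one context unit id per relative position, return a
    probability for each vocabulary unit being the focus term."""
    # Each embedding partition receives the input unit at its position.
    h = np.concatenate([embed[p, w] for p, w in enumerate(context_ids)])
    logits = classifier @ h
    e = np.exp(logits - logits.max())  # softmax over the vocabulary
    return e / e.sum()

probs = focus_probabilities([1, 4, 7, 2])  # one probability per unit
```

The key point the sketch illustrates is that the same context unit is embedded differently depending on where it sits relative to the focus term, because each relative position owns its own weight partition.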
20 Claims
1. A computer-implemented neural network, comprising:
a plurality of neural nodes, each of the neural nodes having a plurality of input weights corresponding to a vector of real numbers;
an input neural node corresponding to a linguistic unit selected from an ordered list of a plurality of linguistic units;
an embedding layer comprising a plurality of embedding node partitions, each embedding node partition comprising one or more neural nodes, wherein each of the embedding node partitions corresponds to a position in the ordered list relative to a focus term, is configured to receive an input from an input node, and is configured to generate an output; and
a classifier layer comprising a plurality of neural nodes, each neural node in the classifier layer configured to receive the embedding outputs from the embedding layer, and configured to generate an output corresponding to a probability that a particular linguistic unit is the focus term.
Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9
10. A computer-implemented neural network, comprising:
a computer-implemented data structure including:
an input matrix syn0 of N rows and M columns, wherein each row corresponds to a linguistic unit and the columns are partitioned into P partitions, wherein each partition is a vector of an equal number of numerical values; and
an output matrix syn1 of M rows and N columns, wherein each column corresponds to a linguistic unit and the rows are partitioned into P partitions equal in number and size to the partitions in syn0, wherein each partition is a vector of an equal number of numerical values;
wherein the neural network is trained to perform functions that include predicting a linguistic unit given its surrounding context linguistic units within a sentence, wherein each context linguistic unit relates to the partition of its corresponding row in syn0 that corresponds to its position relative to the focus term, and that partition is multiplied by the corresponding partition in syn1 for the same relative position of the focus term.
Dependent claims: 11, 12, 13, 14
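The syn0/syn1 data structure recited in claim 10 can be sketched as follows. The sizes N, P, and D and the `score` helper are demonstration assumptions; the sketch only shows the position-matched partition product the claim describes, not a trained model.

```python
import numpy as np

# Assumed demonstration sizes: N linguistic units, P partitions of width D,
# so each embedding vector has M = P * D numerical values.
N, P, D = 8, 4, 3
M = P * D
rng = np.random.default_rng(1)

# syn0: N rows (one per linguistic unit) x M columns, columns split into
# P equal partitions, one partition per relative position.
syn0 = rng.normal(size=(N, M))
# syn1: M rows x N columns (one per linguistic unit), rows split into
# P partitions of the same number and size as syn0's column partitions.
syn1 = rng.normal(size=(M, N))

def score(focus_id, context_ids):
    """Score a candidate focus term: each context unit contributes the dot
    product of its syn0 partition at its relative position with the focus
    term's syn1 partition at that same relative position."""
    s = 0.0
    for p, w in enumerate(context_ids):   # p = relative position of unit w
        part = slice(p * D, (p + 1) * D)  # partition p of columns and rows
        s += syn0[w, part] @ syn1[part, focus_id]
    return s
```

As a sanity check on the partition bookkeeping: if the same unit occupies every context position, the partitioned sum collapses to the full dot product of that unit's syn0 row with the focus term's syn1 column.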
15. A system having one or more processors configured to implement:
a plurality of neural nodes, each of the neural nodes having a plurality of input weights corresponding to a vector of real numbers;
an input neural node corresponding to a linguistic unit selected from an ordered list of a plurality of linguistic units;
an embedding layer comprising a plurality of embedding node partitions, each embedding node partition comprising one or more neural nodes, wherein each of the embedding node partitions corresponds to a position in the ordered list relative to a focus term, is configured to receive an input from an input node, and is configured to generate an output; and
a classifier layer comprising a plurality of neural nodes, each neural node in the classifier layer configured to receive the embedding outputs from the embedding layer, and configured to generate an output corresponding to a probability that a particular linguistic unit is the focus term.
Dependent claims: 16, 17, 18, 19, 20
Specification