Generative text using a personality model
Abstract
A personality model is created for a population and used as an input to a text generation system. Alternative texts are created based upon the emotional effect of the generated text. Certain words or phrases are "pinned" in the output, reducing the variability of the generated text so as to preserve required information content, and a number of tests provide input to a discriminator network so that proposed outputs match an outside objective regarding information content, emotional affect, and grammatical acceptability. A feedback loop provides new "ground truth" data points for refining the personality model and associated generated text.
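The "pinning" described in the abstract can be sketched as forcing required tokens into fixed output positions while the generator varies the rest. This is a minimal illustrative sketch, not the patent's neural decoder; the function name, the stand-in draft, and the pinned positions are all hypothetical.

```python
# Minimal sketch of "pinning": fixed tokens override the generator's
# candidates at required positions, preserving the information content
# while the remaining tokens are free to vary.
def generate_with_pins(candidate_tokens, pins):
    """pins: dict mapping output position -> required token."""
    out = []
    for i, tok in enumerate(candidate_tokens):
        out.append(pins.get(i, tok))  # pinned token overrides the candidate
    return out

draft = ["our", "product", "ships", "on", "some", "date"]
pinned = {1: "Acme-3000", 5: "March-1"}  # hypothetical required content
print(generate_with_pins(draft, pinned))
# → ['our', 'Acme-3000', 'ships', 'on', 'some', 'March-1']
```

A discriminator, as the abstract describes, would then test each pinned candidate for information content, affect, and grammaticality before it is emitted.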
33 Citations
20 Claims
1. A system for generating natural language, the system comprising:

a prospect modeling component, operable to correlate known information about a target prospect personality with a quantitative personality model, the quantitative personality model being expressed as a vector indicating a relative expression of a plurality of relatively mutually orthogonal personality traits; and

a neural sequence-to-sequence encoder-decoder, wherein the encoder is operable to deconstruct a source text and represent it as a sequence of weights on a pre-built conditional text model, and wherein the decoder is operable to create a generated text with approximately equal semantic content but differing syntax and word choice, wherein the quantitative personality model provides an input to the decoder, and wherein the syntax and word choice of the generated text varies as a function of the relative expression of the quantitative personality model.

Dependent claims: 2, 3, 4, 5, 6, 7
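The quantitative personality model of claim 1 is a vector of scores over mutually orthogonal traits that biases the decoder's word choice. The sketch below uses the Big Five traits as a plausible example (the claim does not name a specific trait set), and a toy synonym table stands in for the neural decoder; both are assumptions for illustration only.

```python
# Sketch of claim 1's conditioning idea: a personality vector over
# orthogonal traits biases word choice during decoding. The Big Five
# trait names and the synonym table are illustrative assumptions.
TRAITS = ("openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism")

def choose_word(options, personality, trait="extraversion"):
    """Pick the 'expressive' synonym when the given trait score is high."""
    score = personality[TRAITS.index(trait)]
    return options["expressive"] if score > 0.5 else options["plain"]

personality = [0.2, 0.9, 0.8, 0.5, 0.1]  # hypothetical prospect model
word = choose_word({"plain": "good", "expressive": "fantastic"}, personality)
print(word)  # extraversion score 0.8 selects the expressive synonym
```

In the claimed system this conditioning happens inside a neural decoder rather than via a lookup table, but the dependency is the same: identical semantic content, word choice varying with the trait vector.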
8. A method of generating natural language, the method comprising:

collecting representative text information attributable to one or more natural persons;
preprocessing and tokenizing the representative text information;
converting the tokenized text information into at least one corresponding input personality embedding, wherein the at least one corresponding input personality embedding indicates a relative expression of a plurality of mutually orthogonal personality traits of a quantitative personality model expressed in the representative text information;
providing a text prompt and the at least one corresponding input personality embedding as inputs to a text prediction layer;
sampling a maximum likelihood next token from the text prediction layer;
outputting a natural language element corresponding to the maximum likelihood next token; and
collecting the output natural language element into a generated text.

Dependent claims: 9, 10, 11, 12, 13, 14, 15, 16
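The pipeline of claim 8 (tokenize representative text, then repeatedly sample the maximum-likelihood next token from a prediction layer into a generated text) can be sketched with a toy model. A bigram count table stands in for the neural text prediction layer, and the personality-embedding step is omitted; both simplifications are assumptions, not the patent's method.

```python
# Toy sketch of claim 8's generation loop: tokenize a representative
# text, build a prediction table, then greedily emit the maximum-
# likelihood next token until the requested length is reached.
from collections import Counter, defaultdict

def train_bigrams(text):
    tokens = text.lower().split()          # preprocess and tokenize
    model = defaultdict(Counter)
    for a, b in zip(tokens, tokens[1:]):
        model[a][b] += 1
    return model

def generate(model, prompt, n=4):
    out = prompt.lower().split()
    for _ in range(n):
        nxt = model.get(out[-1])
        if not nxt:
            break
        out.append(nxt.most_common(1)[0][0])  # maximum-likelihood next token
    return " ".join(out)                      # collect into a generated text

model = train_bigrams("the cat sat on the mat and the cat slept")
print(generate(model, "the"))
```

In the claim, the prediction layer is conditioned on the input personality embedding; here the table plays that role purely structurally.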
17. A system, including a processor and instructions included in a non-volatile computer-readable medium, wherein the instructions when interpreted by the processor cause the processor to:

collect representative text information attributable to one or more natural persons;
impute a strength of one or more mutually orthogonal personality traits as expressed in the representative text;
divide the one or more natural persons into one or more modeled classes based upon shared relative values of the imputed personality trait;
receive an input text including a first semantic content and a first expressive content, wherein the first expressive content is measured by the imputed personality trait; and
generate an output text that includes the first semantic content and a second expressive content.

Dependent claims: 18, 19, 20
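The first two steps of claim 17, imputing a trait strength from each person's text and dividing persons into modeled classes by shared relative values, can be sketched as follows. The lexicon-based scoring and the two-class threshold are stand-ins for the patent's imputation and class-division steps; the word list and names are hypothetical.

```python
# Sketch of claim 17's imputation and class-division steps: score each
# person's representative text on a trait, then group persons whose
# imputed values fall on the same side of a threshold.
POSITIVE_WORDS = {"great", "love", "excited"}   # hypothetical trait lexicon

def impute_trait(text):
    """Imputed trait strength: fraction of tokens matching the lexicon."""
    tokens = text.lower().split()
    return sum(t in POSITIVE_WORDS for t in tokens) / max(len(tokens), 1)

def model_classes(texts_by_person, threshold=0.2):
    """Divide persons into modeled classes by shared relative trait values."""
    classes = {"high": [], "low": []}
    for person, text in texts_by_person.items():
        key = "high" if impute_trait(text) >= threshold else "low"
        classes[key].append(person)
    return classes

samples = {"alice": "love this great launch", "bob": "the report is attached"}
print(model_classes(samples))  # → {'high': ['alice'], 'low': ['bob']}
```

The remaining steps of the claim then rewrite an input text so that its expressive content, as measured by the same imputed trait, changes while the semantic content is preserved.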
Specification