
Methods for spontaneously generating behavior in two and three-dimensional images and mechanical robots, and of linking this behavior to that of human users

  • US 10,207,405 B2
  • Filed: 01/22/2015
  • Issued: 02/19/2019
  • Est. Priority Date: 03/31/2014
  • Status: Active Grant
First Claim

1. A method of creating apparent emotions or moods in a computer-controlled artificial character or robot using a computer or a computer-controlled device, the method comprising:

  • (a) using one or more processors to select one or more variable oscillator mathematical formulas/equations, where each of these generates an adjustable oscillation which is controlled by changing a value of one or more parameters contained within the variable oscillator mathematical formulas/equations, in the manner of a spring equation which calculates the motion of a virtual spring, wherein the parameter known as a spring constant, k, determines the virtual spring's strength, and therefore the virtual spring's speed of oscillation, or bouncing back and forth, and wherein the additional parameter, known as damping, d, determines the rapidity with which the virtual spring ceases bouncing and comes to rest;

    creating one or more of an array of physiologically derived behaviors (PDB) using the variable oscillator mathematical formulas/equations;

    (b) running the one or more variable oscillator mathematical formulas/equations on the one or more processors of a computer, whereby each of the variable oscillator mathematical formulas/equations calculates a numerical output in real time, in the manner in which the spring equation calculates the numerical output which represents the displacement of the virtual spring being modeled;

    (c) sending the numerical output of the one or more mathematical equations from the one or more computer processors and receiving the numerical output into the one or more processors of the computer-controlled artificial character (artificial character) or robot, which affects some element of a behavior of the artificial character or the robot;

    in the case of the artificial character, the numerical output is sent to a display where the artificial character is shown to a user;

    in the case of the robot, the numerical output is imported into the one or more processors which modify the position or sound of the robot;

    the one or more processors manipulate the robot or the artificial character in one or more of the following manners:

    Facial Expression—

    in the case of the artificial character, the numerical output (derived within the computer processor) is applied to the artificial character controls running within the processor that modify the artificial character's face;

    in the case of the robot, the numerical output regulates the devices that deform the robot's face to show emotion;

    Body Movements—

    in the case of the artificial character, the numerical output is applied to the artificial character controls running within the processor that modify a degree of one or more body movements, whereas, in the case of the robot, the numerical output regulates the degree of the mechanical manipulation of the robot;

    Vocal Patterns—

    the numerical output produced by the processor is applied within a device modifying characteristics of an artificial voice to control one or more of the following:

    the artificial voice's pitch, the artificial voice's volume, and the rate at which the artificial voice is speaking, via separate oscillators;

    (f) using one or more of a sensing device while the user is interacting with the computer-generated artificial character, computer-controlled device or robot to measure natural patterns of the user's behavior which resemble the variable oscillation mathematical equations/formulas which generate an adjustable oscillation, wherein the sensing devices include one or more of the following:

    A Touchpad—

    used to record patterns of user input which reflect emotional state, such as fast, jittery movements or slow, sluggish movements;

    A User-Facing Camera—

    used in combination with computer vision software to extract the user's facial expression, as well as the user's movements;

    (g) transmitting the natural patterns of the user's behavior measured by the sensing device to the one or more computer processors;

    (h) mathematically linking, via the one or more computer processors, the natural patterns of the user's behavior to the variable oscillation mathematical equations/formulas which generate adjustable oscillations, in the manner of coupled oscillators, whereby patterns and apparent moods induced in the artificial character or robot appear to mirror changing moods of the user as sensed by one or more of the sensing devices.
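
Element (a) describes each behavior generator as a damped virtual spring: the spring constant k sets how quickly it oscillates, and the damping d sets how quickly the bouncing dies out. A minimal sketch of such a variable oscillator follows; the class name, the semi-implicit Euler integration step, and the excite() helper are illustrative assumptions rather than details drawn from the patent.

    # Sketch of the "variable oscillator" of element (a): a damped spring whose
    # stiffness k sets oscillation speed and damping d sets how quickly it
    # comes to rest. Names and integration scheme are assumptions.
    class SpringOscillator:
        def __init__(self, k: float, d: float, rest: float = 0.0):
            self.k = k        # spring constant: higher k means faster bouncing
            self.d = d        # damping: higher d means the motion settles sooner
            self.rest = rest  # rest position the spring is pulled toward
            self.x = 0.0      # current displacement (the claim's "numerical output")
            self.v = 0.0      # current velocity

        def step(self, dt: float) -> float:
            """Advance the spring one time step and return its displacement."""
            accel = -self.k * (self.x - self.rest) - self.d * self.v
            self.v += accel * dt   # semi-implicit Euler integration
            self.x += self.v * dt
            return self.x

        def excite(self, impulse: float) -> None:
            """Kick the spring so it starts oscillating, e.g. on a mood change."""
            self.v += impulse

A small k with light damping reads as slow, loose swaying, while a large k with heavy damping reads as quick, tight motion that settles fast; changing k and d at run time is what makes the oscillation "adjustable" in the sense of elements (a) and (b).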
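
Element (c) routes each oscillator's numerical output into whatever controls the character or robot exposes: facial controls, the degree of a body movement, and, via separate oscillators, the pitch, volume, and speaking rate of an artificial voice. The sketch below assumes the SpringOscillator above and a hypothetical character interface; set_blendshape, set_joint_offset, and set_voice are stand-ins, not a real API.

    # Sketch of element (c): one oscillator output per behavior channel.
    # The character interface below (set_blendshape, set_joint_offset,
    # set_voice) is a hypothetical stand-in for whatever the target display
    # or robot actually exposes.
    def apply_behavior(character, face_osc, body_osc, pitch_osc, volume_osc, rate_osc, dt):
        # Facial expression: displacement drives how far a facial control moves.
        character.set_blendshape("brow_raise", face_osc.step(dt))

        # Body movement: displacement scales the degree of a body motion,
        # e.g. a sway or breathing offset on a torso joint.
        character.set_joint_offset("torso_sway", body_osc.step(dt))

        # Vocal patterns: separate oscillators modulate pitch, volume, and
        # speaking rate around nominal values.
        character.set_voice(
            pitch=1.0 + 0.10 * pitch_osc.step(dt),
            volume=0.8 + 0.20 * volume_osc.step(dt),
            rate=1.0 + 0.15 * rate_osc.step(dt),
        )

Calling apply_behavior once per frame keeps the face, body, and voice channels moving on their own rhythms, since each channel carries its own oscillator with its own k and d.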
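
Elements (f) through (h) sense the user's own rhythms (touchpad strokes, camera-tracked expression and movement), transmit them to the processors, and couple them to the character's oscillators so the character's apparent mood mirrors the user's. One possible form of that coupling is sketched below, assuming the sensing pipeline has already reduced the user's behavior to a scalar, displacement-like signal; the coupling gain and the read_user_signal() source are assumptions for illustration.

    # Sketch of elements (f)-(h): couple the character's oscillator to a
    # measured user signal, in the manner of coupled oscillators. The user
    # signal source (touchpad jitter, camera-tracked motion) and the
    # coupling gain are illustrative assumptions.
    def coupled_step(character_osc, user_signal: float, coupling: float, dt: float) -> float:
        # Nudge the oscillator's velocity toward the user's measured rhythm in
        # proportion to the gap between the user signal and the character's
        # current displacement, then advance the spring as usual.
        character_osc.v += coupling * (user_signal - character_osc.x) * dt
        return character_osc.step(dt)

    # Example loop: fast, jittery input gradually excites the character's
    # motion; slow, sluggish input calms it down.
    def mirror_user(character_osc, read_user_signal, dt=1/60, coupling=2.0):
        while True:
            yield coupled_step(character_osc, read_user_signal(), coupling, dt)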
