
Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects

  • US 9,507,417 B2
  • Filed: 01/07/2015
  • Issued: 11/29/2016
  • Est. Priority Date: 01/07/2014
  • Status: Expired due to Fees
First Claim

1. A method of rendering a user interface on a real-time gesture based interactive system comprising an image capture system including at least two cameras, an image processing system and a display device, the method comprising:

  • rendering an initial user interface comprising a set of interface objects using the image processing system, where each interface object comprises:

    a graphical element that is rendered when the interface object is rendered for display;

    a target zone that defines at least one region in the user interface in which a targeting three-dimensional (3D) gesture targets the interface object; and

    a description of a set of permitted interactions;

  • displaying the rendered user interface using the display;

  • capturing image data using the image capture system;

  • detecting an input via a 3D head gesture input modality from the captured image data using the image processing system;

  • changing the manner in which the initial user interface is rendered in response to detection of an input via a 3D head gesture input modality using the image processing system;

  • displaying the rendered user interface using the display;

  • identifying a 3D interaction zone within the captured image data that maps to the user interface;

  • determining the location of at least a portion of a human head within the 3D interaction zone from the captured image data;

  • identifying a first pose of the at least a portion of a human head within the 3D interaction zone that corresponds to a targeting 3D head gesture;

  • mapping the location of the at least a portion of a human head within the 3D interaction zone to a location within the user interface;

  • determining that the mapped location within the user interface falls within the target zone of a specific interface object in the user interface;

  • identifying the specific interface object as a targeted interface object in response to an identification of the first pose as a targeting head gesture and a determination that the mapped location of the at least a portion of the human head falls within the target zone of the specific interface object in the user interface; and

  • changing the rendering of at least the targeted interface object within the user interface in response to the targeting 3D head gesture using the image processing system;

  • displaying the user interface via the display;

  • capturing additional image data using the image capture system;

  • determining that the targeting 3D head gesture targets the targeted interface object for a predetermined period of time, where the determination considers the targeting 3D head gesture to be targeting the targeted interface object during any period of time in which the targeting 3D head gesture does not target the targeted interface object that is less than a hysteresis threshold;

  • enabling a set of one or more interaction 3D head gestures for the targeted interface object in response to the detection of the targeting 3D head gesture using the image processing system, wherein each of the one or more interaction gestures is associated with a permitted interaction in a set of permitted interactions allowed for the targeted interface object and each permitted interaction is an action performed via the user interface to manipulate the targeted interface object;

  • displaying an interaction element indicating the time remaining to interact with the targeted interface object in response to a determination that the targeting 3D head gesture has targeted the interface object for a predetermined period of time using the image processing system;

  • tracking the motion of at least a portion of a human head within the 3D interaction zone in additional captured image data captured within a predetermined time period from the detection of the targeting 3D head gesture input using the image processing system;

  • identifying a change in pose of the at least a portion of a human head within the 3D interaction zone from the first pose to a second pose during the motion of the at least a portion of the human head using the image processing system;

  • determining that the motion of the at least a portion of the human head corresponds to a specific interaction 3D head gesture from the set of one or more interaction gestures enabled for the targeted interface object that identifies a specific interaction with the targeted interface object using the image processing system;

  • verifying that the specific interaction 3D head gesture is associated with a specific interaction within the set of permitted interactions for the interface object using the image processing system;

  • modifying the user interface in response to the specific interaction with the targeted interface object identified by the specific interaction 3D head gesture using the image processing system;

  • rendering the modified user interface using the image processing system; and

  • displaying the rendered user interface using the display.
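
The claim above is one long sentence, but the computational loop it recites is compact: map a tracked head position from a 3D interaction zone into user-interface coordinates, test the mapped location against each interface object's target zone, require the targeting gesture to dwell on an object for a predetermined period while tolerating off-target gaps shorter than a hysteresis threshold, and only then enable that object's permitted interactions. The Python sketch below illustrates just the mapping, hit-testing, and dwell/hysteresis steps; every name, data structure, and threshold is a hypothetical reading aid, not the patent's disclosed implementation, and pose classification (the first "targeting" pose versus the second "interaction" pose) is omitted.

```python
# Minimal sketch of the targeting loop recited in claim 1. All names,
# structures, and thresholds are hypothetical illustrations; the patent
# does not disclose this code.
from dataclasses import dataclass

@dataclass
class InterfaceObject:
    graphical_element: str             # what is drawn when the object is rendered
    target_zone: tuple                 # (x, y, w, h) region in UI coordinates
    permitted_interactions: frozenset  # e.g. frozenset({"select", "scroll"})

def map_to_ui(head_xyz, interaction_zone, ui_size):
    """Map a 3D head location inside the interaction zone to a 2D UI location."""
    (x0, x1), (y0, y1) = interaction_zone  # zone extents along x and y
    x, y, _depth = head_xyz
    return ((x - x0) / (x1 - x0) * ui_size[0],
            (y - y0) / (y1 - y0) * ui_size[1])

def in_target_zone(ui_point, zone):
    """Hit-test a mapped UI location against an object's target zone."""
    x, y, w, h = zone
    return x <= ui_point[0] <= x + w and y <= ui_point[1] <= y + h

def dwell_targets(samples, obj, dwell_s=1.0, hysteresis_s=0.2):
    """Return True once the mapped head location has stayed in obj.target_zone
    for dwell_s seconds, where off-target gaps shorter than hysteresis_s still
    count as targeting (the claim's hysteresis threshold)."""
    start = last_on = None
    for t, ui_point in samples:        # (timestamp in seconds, mapped UI location)
        if in_target_zone(ui_point, obj.target_zone):
            if start is None:
                start = t              # dwell begins on the first on-target sample
            last_on = t
        elif last_on is not None and t - last_on >= hysteresis_s:
            start = last_on = None     # gap exceeded the hysteresis: reset the dwell
        if start is not None and last_on - start >= dwell_s:
            return True                # targeted long enough to enable interactions
    return False
```

Once a function like dwell_targets fires, a system along the claim's lines would enable only the gestures tied to obj.permitted_interactions, display an interaction element counting down the remaining time to interact, and match subsequent head motion against those enabled gestures before modifying and re-rendering the interface.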
