Systems and Methods for Implementing Three-Dimensional (3D) Gesture Based Graphical User Interfaces (GUI) that Incorporate Gesture Reactive Interface Objects

  • US 20140298273A1
  • Filed: 08/12/2013
  • Published: 10/02/2014
  • Est. Priority Date: 04/02/2013
  • Status: Active Grant
First Claim
1. A method of rendering a user interface on a computing device, comprising:

    rendering an initial user interface comprising a set of interface objects using a computing device, where each interface object in the set of interface objects includes a graphical element that is rendered when the interface object is rendered for display and a target zone;

    detecting a targeting 3D gesture in captured image data that identifies a targeted interface object within the user interface using the computing device by:

    identifying a 3D interaction zone within the captured image data that maps to the user interface;

    determining the location of at least a portion of a human hand within the 3D interaction zone;

    identifying a pose of the at least a portion of a human hand corresponding to a targeting 3D gesture;

    mapping the location of the at least a portion of a human hand within the 3D interaction zone to a location within the user interface;

    determining that the mapped location within the user interface falls within the target zone of a specific interface object; and

    detecting the occurrence of a targeting 3D gesture targeting the specific interface object; and

    enabling a set of one or more interaction gestures for the targeted interface object in response to the detection of the targeting 3D gesture using the computing device wherein each of the one or more interaction gestures is associated with a permitted interaction in a set of permitted interactions allowed for the targeted interface object and each permitted interaction is an action performed via the user interface to manipulate the targeted interface object;

    changing the rendering of at least the targeted interface object within the user interface in response to the targeting 3D gesture that targets the interface object using the computing device;

    detecting an interaction 3D gesture from the set of one or more interaction gestures for the targeted interface object in additional captured image data that identifies a specific interaction from the set of permitted interactions with the targeted interface object using the computing device, where the detection of the interaction 3D gesture comprises:

    tracking the motion of at least a portion of a human hand within the 3D interaction zone; and

    determining that the pose of at least a portion of a human hand within the 3D interaction zone has changed and corresponds to an interaction 3D gesture from the set of one or more interaction gestures for the targeted interface object irrespective of the location of the at least a portion of a human hand within the 3D interaction zone; and

    modifying the user interface in response to the specific interaction with the targeted interface object identified by the detected interaction 3D gesture using the computing device; and

    rendering the modified user interface using the computing device.
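
The targeting steps recited in the claim above (identifying a 3D interaction zone, locating a hand within it, checking the pose, mapping the location to the user interface, and hit-testing against target zones) can be illustrated with a short sketch. This is a minimal, hypothetical example, not the patent's implementation; all class names, the "pointing" pose label, and the rectangle-based target zones are assumptions made for illustration.

```python
# Hypothetical sketch of the claimed targeting step: map a hand position
# inside a 3D interaction zone to 2D user-interface coordinates and hit-test
# the result against each interface object's target zone. Names are
# illustrative, not taken from the patent.

from dataclasses import dataclass
from typing import Optional, Sequence, Tuple


@dataclass
class InterfaceObject:
    name: str
    # Target zone as a UI-space rectangle: (x, y, width, height).
    target_zone: Tuple[float, float, float, float]


@dataclass
class InteractionZone:
    # Axis-aligned 3D box in camera space: (min corner, max corner), meters.
    minimum: Tuple[float, float, float]
    maximum: Tuple[float, float, float]

    def contains(self, point: Tuple[float, float, float]) -> bool:
        return all(lo <= p <= hi
                   for p, lo, hi in zip(point, self.minimum, self.maximum))

    def to_ui(self, point, ui_size) -> Tuple[float, float]:
        # Normalize x/y inside the zone, then scale to UI pixel coordinates;
        # depth (z) is ignored for targeting in this sketch.
        nx = (point[0] - self.minimum[0]) / (self.maximum[0] - self.minimum[0])
        ny = (point[1] - self.minimum[1]) / (self.maximum[1] - self.minimum[1])
        return nx * ui_size[0], ny * ui_size[1]


def find_targeted_object(
    hand_point: Tuple[float, float, float],
    hand_pose: str,
    zone: InteractionZone,
    objects: Sequence[InterfaceObject],
    ui_size: Tuple[float, float],
) -> Optional[InterfaceObject]:
    """Return the interface object targeted by a 'pointing' pose, if any."""
    if hand_pose != "pointing" or not zone.contains(hand_point):
        return None
    ux, uy = zone.to_ui(hand_point, ui_size)
    for obj in objects:
        x, y, w, h = obj.target_zone
        if x <= ux <= x + w and y <= uy <= y + h:
            return obj
    return None


if __name__ == "__main__":
    zone = InteractionZone(minimum=(-0.2, -0.15, 0.3), maximum=(0.2, 0.15, 0.8))
    icons = [InterfaceObject("photos", (100, 100, 200, 150)),
             InterfaceObject("music", (400, 100, 200, 150))]
    hit = find_targeted_object((-0.14, -0.08, 0.5), "pointing", zone,
                               icons, (1280, 720))
    print(hit.name if hit else "no target")  # -> photos
```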

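A second sketch, under the same caveats, covers the interaction steps: once an object is targeted, only the gestures bound to that object's permitted interactions are enabled, and an interaction gesture is recognized from a change of hand pose alone, irrespective of hand location within the 3D interaction zone. The gesture names and interaction table are invented for illustration.

```python
# Hypothetical follow-on sketch for the claimed interaction step. The
# per-object gesture-to-interaction mapping below is an assumption.

from typing import Dict, Optional

# Each permitted interaction for an object is bound to one interaction gesture.
PERMITTED_INTERACTIONS: Dict[str, Dict[str, str]] = {
    "photos": {"pinch": "zoom", "grab": "drag"},
    "music":  {"grab": "drag", "swipe": "scroll"},
}


class GestureSession:
    """Tracks the targeted object and detects pose-change interaction gestures."""

    def __init__(self) -> None:
        self.targeted: Optional[str] = None
        self.enabled: Dict[str, str] = {}
        self.last_pose: Optional[str] = None

    def on_target(self, object_name: str) -> None:
        # Targeting 3D gesture detected: enable that object's gesture set
        # (a real renderer would also change how the targeted object is drawn).
        self.targeted = object_name
        self.enabled = PERMITTED_INTERACTIONS.get(object_name, {})
        print(f"targeted {object_name}; enabled gestures: {sorted(self.enabled)}")

    def on_pose(self, pose: str) -> Optional[str]:
        # An interaction gesture is a *change* of pose into one of the enabled
        # gestures; hand location is deliberately not consulted here.
        changed = pose != self.last_pose
        self.last_pose = pose
        if self.targeted and changed and pose in self.enabled:
            interaction = self.enabled[pose]
            print(f"{interaction} applied to {self.targeted}")
            return interaction
        return None


if __name__ == "__main__":
    session = GestureSession()
    session.on_target("photos")   # targeting gesture hit the photos icon
    session.on_pose("pointing")   # still targeting; no interaction
    session.on_pose("pinch")      # pose change -> 'zoom' on photos
```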