
Systems and methods for gesture based interaction with viewpoint dependent user interfaces

  • US 9,619,105 B1
  • Filed: 01/30/2014
  • Issued: 04/11/2017
  • Est. Priority Date: 01/30/2014
  • Status: Active Grant
First Claim

1. A real-time gesture based interactive system configured to enable gesture based interaction with user interfaces rendered from a 3D object model in a viewpoint dependent manner, comprising:

  • a processor;

  • a camera system configured to capture image data;

  • memory containing:

      an operating system including a 3D object model that describes three dimensional spatial relationships between a set of user interface objects comprising a first user interface object and a second user interface object;

      a head tracking application; and

      an object tracking application;

    wherein the operating system configures the processor to:

    capture image data using the camera system;

    detect first physical coordinates of a user's head by processing at least a portion of the image data using the head tracking application;

    determine a user viewpoint from which to render a user interface display based on the first physical coordinates of the user's head such that a portion of the first user interface object is occluded by the second user interface object in the rendered user interface display;

    determine an object location by processing at least a portion of the captured image data using the object tracking application;

    map the object location to a cursor location comprising three dimensional coordinates;

    render a user interface display from the 3D object model and the cursor location based upon the user viewpoint determined based on the first physical coordinates of the user's head;

    capture additional image data using the camera system;

    detect second physical coordinates of the user's head by processing at least a portion of the additional image data using the head tracking application, the second physical coordinates being different from the first physical coordinates;

    determine an updated user viewpoint from which to render a user interface display based on the second physical coordinates of the user's head, the updated user viewpoint being different from the user viewpoint such that the portion of the first user interface object is not occluded by the second user interface object in the updated user interface display;

    determine an updated object location by processing at least a portion of the additional captured image data using the object tracking application;

    map the updated object location to an updated cursor location comprising three dimensional coordinates; and

    render an updated user interface display from the 3D object model and the updated cursor location based upon the updated user viewpoint determined based on the second physical coordinates of the user's head and the updated object location, where the updated user interface display is rendered to simulate motion parallax based upon depth of the user interface objects and the updated cursor location in the 3D object model.
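The viewpoint-dependent rendering and motion-parallax behaviour recited in the claim can be sketched as a projection of the 3D object model onto the display plane along rays from the tracked head position: when the head moves from the first to the second physical coordinates, objects at different depths shift by different amounts on screen, which can reveal a previously occluded object. The following is a minimal illustrative sketch, not the patented implementation; all names, the coordinate convention (viewer in front of a display plane at z = 0, objects behind it), and the geometry are assumptions.

```python
# Illustrative sketch of head-coupled, viewpoint-dependent projection.
# Assumed convention: the display plane is z = 0, the viewer's eye sits at
# z < 0 in front of it, and user interface objects lie at z > 0 behind it.
from dataclasses import dataclass


@dataclass
class UIObject:
    name: str
    x: float  # horizontal position in the 3D object model
    y: float  # vertical position in the 3D object model
    z: float  # depth behind the display plane; larger z = deeper


def project(obj: UIObject, eye: tuple) -> tuple:
    """Intersect the ray from the eye through the object with the z = 0 plane."""
    ex, ey, ez = eye  # ez < 0: tracked head position in front of the display
    t = (0.0 - ez) / (obj.z - ez)
    return (ex + t * (obj.x - ex), ey + t * (obj.y - ey))


def render_positions(objects: list, head_coords: tuple) -> dict:
    """Screen-plane positions of all objects for one tracked head position."""
    return {o.name: project(o, head_coords) for o in objects}


# Two objects share the same (x, y) but differ in depth, so from a centered
# viewpoint the nearer ("second") object occludes the deeper ("first") one.
first = UIObject("first", 0.0, 0.0, 2.0)   # deeper object
second = UIObject("second", 0.0, 0.0, 1.0)  # nearer, occluding object
centered = render_positions([first, second], (0.0, 0.0, -1.0))

# Moving the head sideways (the "second physical coordinates") shifts the
# deeper object's projection farther than the nearer one's, separating the
# two on screen -- a simple form of the claimed motion parallax.
shifted = render_positions([first, second], (0.5, 0.0, -1.0))
```

In this toy geometry the centered viewpoint projects both objects to the same screen point (occlusion), while the shifted viewpoint projects them to distinct points, with the deeper object's projection tracking the head motion more strongly.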

  • 3 Assignments