
Depth-based user interface gesture control

  • US 9,389,779 B2
  • Filed: 03/14/2013
  • Issued: 07/12/2016
  • Est. Priority Date: 03/14/2013
  • Status: Active Grant
First Claim

1. A computing device for depth-based gesture control, the computing device comprising:

  • a display to define a surface normal;

  • a depth sensor to:

    generate depth sensor data indicative of a depth relative to the display of an input gesture performed by a user of the computing device in front of the display;

    generate second depth sensor data indicative of a second depth relative to the display of a second input gesture performed by a second user of the computing device in front of the display; and

    generate third depth sensor data indicative of a third depth relative to the display of a third input gesture performed by the second user of the computing device in front of the display;

  • a gesture recognition module to recognize the input gesture, the second input gesture, and the third input gesture;

  • a depth recognition module to:

    receive the depth sensor data, the second depth sensor data, and the third depth sensor data from the depth sensor;

    determine the depth of the input gesture as a function of the depth sensor data;

    determine the second depth of the second input gesture as a function of the second depth sensor data;

    determine the third depth of the third input gesture as a function of the third depth sensor data;

    assign a depth plane to the input gesture as a function of the depth of the input gesture;

    assign a second depth plane different from the depth plane to the second input gesture as a function of the second depth of the second input gesture; and

    assign a third depth plane different from the second depth plane to the third input gesture as a function of the third depth of the third input gesture;

    wherein each depth plane is positioned parallel to the display and intersects the surface normal; and

  • a user command module to:

    designate the second depth plane as an accessible depth plane for the second user;

    execute a user interface command based on the input gesture and the assigned depth plane;

    execute a second user interface command based on the second input gesture and the assigned second depth plane;

    determine whether the third depth is associated with the accessible depth plane for the second user; and

    reject the third input gesture in response to a determination that the third depth is not associated with the accessible depth plane for the second user, wherein to execute the second user interface command comprises to:

    determine whether the assigned second depth plane comprises a secondary virtual touch plane of the computing device; and

    execute a secondary user interface command in response to a determination that the assigned second depth plane comprises the secondary virtual touch plane, wherein to execute the secondary user interface command comprises to display a contextual command menu on the display.
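The claim's core logic can be summarized as: map each gesture's measured depth to a depth plane (a region parallel to the display along its surface normal), reject gestures from a user whose assigned plane is not accessible to them, and execute a secondary command (e.g. a contextual menu) when the gesture lands in a secondary virtual touch plane. A minimal sketch of that decision flow, assuming illustrative plane names, boundary depths, and function names that are not from the patent itself:

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Dict, Optional, Set


class DepthPlane(Enum):
    NEAR = auto()    # e.g. primary touch plane closest to the display
    MIDDLE = auto()  # e.g. a secondary virtual touch plane
    FAR = auto()


# Hypothetical plane boundaries, in meters from the display surface.
# Each plane is a slab parallel to the display intersecting its normal.
PLANE_BOUNDARIES = [
    (0.00, 0.10, DepthPlane.NEAR),
    (0.10, 0.30, DepthPlane.MIDDLE),
    (0.30, 1.00, DepthPlane.FAR),
]


def assign_depth_plane(depth_m: float) -> Optional[DepthPlane]:
    """Assign a depth plane as a function of the measured gesture depth."""
    for lower, upper, plane in PLANE_BOUNDARIES:
        if lower <= depth_m < upper:
            return plane
    return None  # gesture outside all tracked planes


@dataclass
class Gesture:
    user_id: int
    depth_m: float  # depth reported by the depth sensor


def handle_gesture(
    gesture: Gesture,
    accessible: Dict[int, Set[DepthPlane]],
    secondary_plane: DepthPlane = DepthPlane.MIDDLE,
) -> str:
    """Dispatch a recognized gesture based on its assigned depth plane."""
    plane = assign_depth_plane(gesture.depth_m)
    # Reject gestures whose plane is not accessible to this user.
    if plane is None or plane not in accessible.get(gesture.user_id, set()):
        return "rejected"
    # A gesture in the secondary virtual touch plane triggers the
    # secondary command (displaying a contextual command menu).
    if plane is secondary_plane:
        return "show contextual command menu"
    return "execute primary command"
```

For example, if user 2's accessible planes contain only `DepthPlane.MIDDLE`, a gesture at 0.2 m is assigned the middle plane and yields the contextual menu, while a gesture at 0.5 m is assigned the far plane and is rejected, mirroring the claim's third-gesture rejection step.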

  • 1 Assignment