Gesture control system capable of interacting with 3D images

  • US 9,405,378 B2
  • Filed: 10/07/2014
  • Issued: 08/02/2016
  • Est. Priority Date: 09/03/2014
  • Status: Expired due to Fees
First Claim
1. A gesture control system capable of interacting with three-dimensional (3D) images, comprising:

  • a 3D image display device to display a 3D stereoscopic or auto-stereoscopic image;

  • a gesture recognition device, comprising a hand(s) and finger(s) image detection module and a hand(s) and finger(s) coordinate tracking module, the hand(s) and finger(s) image detection module being electrically connected to the hand(s) and finger(s) coordinate tracking module, the hand(s) and finger(s) image detection module capturing a hand(s) and finger(s) image of a user, and the hand(s) and finger(s) coordinate tracking module calculating a hand(s) and finger(s) coordinate based on image changes generated by the user's hand(s) and finger(s) movements; and

  • a data processing unit electrically connected to the 3D image display device and the gesture recognition device, and comprising a central processing unit and a graphics processing unit, the central processing unit being electrically connected to the graphics processing unit, the central processing unit comprising a built-in software algorithm program, and one of the central processing unit and the graphics processing unit comprising a built-in image coordinate software program for establishing an image 3-dimensional spatial coordinate based on an image data to be displayed on the 3D image display device, a sensing functional area being defined by positions and orientations of the gesture recognition device, wherein when the user's hand(s) and finger(s) move within the sensing functional area, the gesture recognition device captures the hand(s) and finger(s) image of the user, and calculates a hand(s) and finger(s) coordinate based on sensed data of the hand(s) and finger(s) image detection module, the hand(s) and finger(s) coordinate is then transmitted to the data processing unit for calculation and is matched with the image 3-dimensional spatial coordinates by superimposition of the hand(s) and finger(s) coordinate and the image 3-dimensional spatial coordinate for being output via the 3D image display device, and by adjusting a virtual position of the 3D image via changing a depth map of the image data to be displayed according to 3D positions of finger(s) and hand(s) of the user in the real world so that a depth of the image data to be displayed by the 3D image display device in the real world matches z-coordinates of the finger(s) and hand(s) in the real world;

    wherein the sensing functional area is defined by positions and coordinates of the gesture recognition device, and a size of the sensing functional area varies based on at least one of sensing element types, quantity, and deployment configuration.
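The superimposition and depth-map adjustment recited in the claim can be sketched in code. The following is a minimal, illustrative Python sketch, not the patented implementation: it assumes a simple linear transform maps real-world finger coordinates into the image's 3D spatial coordinate system, and models the depth map as a 2D grid whose values near the finger position are set to the finger's z-coordinate. All function names (`map_to_image_space`, `adjust_depth_map`) and parameters are hypothetical.

```python
def map_to_image_space(finger_xyz, scale=1.0, offset=(0.0, 0.0, 0.0)):
    """Superimpose a real-world hand/finger coordinate onto the image's
    3-dimensional spatial coordinate system (assumed linear mapping)."""
    x, y, z = finger_xyz
    ox, oy, oz = offset
    return (x * scale + ox, y * scale + oy, z * scale + oz)

def adjust_depth_map(depth_map, finger_img_xyz, radius=1):
    """Change the depth map near the finger position so the displayed
    depth matches the finger's z-coordinate, per the claim's final step."""
    fx, fy, fz = finger_img_xyz
    h, w = len(depth_map), len(depth_map[0])
    cx, cy = int(round(fx)), int(round(fy))
    # Clamp the update window to the depth-map bounds.
    for y in range(max(0, cy - radius), min(h, cy + radius + 1)):
        for x in range(max(0, cx - radius), min(w, cx + radius + 1)):
            depth_map[y][x] = fz
    return depth_map

# Toy usage: a 4x4 depth map, one tracked finger at (2, 1, 0.5) in the world.
depth = [[0.0] * 4 for _ in range(4)]
finger_world = (2.0, 1.0, 0.5)
finger_img = map_to_image_space(finger_world)
adjust_depth_map(depth, finger_img, radius=0)
print(depth[1][2])  # depth at the finger's image position now equals its z: 0.5
```

In a real system the mapping would come from calibrating the gesture recognition device against the display volume, and the depth-map update would drive the stereoscopic renderer rather than a plain grid.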
