Wearable sensor for tracking articulated body-parts
Abstract
A wearable sensor for tracking articulated body parts is described, such as a wrist-worn device which enables 3D tracking of fingers, and optionally also the arm and hand, without the need to wear a glove or markers on the hand. In an embodiment a camera captures images of an articulated part of a body of a wearer of the device and an articulated model of the body part is tracked in real time to enable gesture-based control of a separate computing device such as a smart phone, laptop computer or other computing device. In examples the device has a structured illumination source and a diffuse illumination source for illuminating the articulated body part. In some examples an inertial measurement unit is also included in the sensor to enable tracking of the arm and hand.
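The abstract and the claims below repeatedly refer to a kinematic model of the articulated body part that the tracking module updates, without fixing how that model is parameterised. The sketch below is therefore only one plausible representation, not the patent's: each digit is a planar chain of two hinge joints with assumed segment lengths, and forward kinematics gives a fingertip position that could be compared against the triangulated laser points described in the claims.

```python
# Illustrative kinematic hand model (one plausible parameterisation; the
# patent record does not fix the model's form). Each digit is a planar
# two-segment chain of hinge joints; segment lengths are assumed values
# in metres.
from dataclasses import dataclass, field
import numpy as np

@dataclass
class Digit:
    proximal_len: float = 0.045   # assumed segment length (metres)
    distal_len: float = 0.030     # assumed segment length (metres)
    mcp_angle: float = 0.0        # joint angles in radians, 0 = fully extended
    pip_angle: float = 0.0

    def fingertip(self, base: np.ndarray) -> np.ndarray:
        """Forward kinematics: fingertip position, flexing in the y-z plane."""
        a1 = self.mcp_angle
        a2 = self.mcp_angle + self.pip_angle
        return base + np.array([
            0.0,
            self.proximal_len * np.sin(a1) + self.distal_len * np.sin(a2),
            self.proximal_len * np.cos(a1) + self.distal_len * np.cos(a2),
        ])

@dataclass
class HandModel:
    digits: dict = field(default_factory=lambda: {
        name: Digit() for name in
        ("thumb", "index", "middle", "ring", "little")})

    def fingertips(self, knuckle_positions: dict) -> dict:
        """Fingertip positions given per-digit knuckle (MCP) positions."""
        return {name: digit.fingertip(knuckle_positions[name])
                for name, digit in self.digits.items()}
```

A tracker in the spirit of the claims would adjust the joint angles of such a model so that the predicted fingertip positions agree with the 3D points triangulated from the laser line.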
20 Claims
1. A wearable sensing device for gesture-based control of a computing device, the wearable sensing device comprising:
    a camera that captures images of an articulated body part of a user;
    a structured illumination source comprising a laser line projector positioned at a specified distance from the camera to illuminate the articulated body part with a known structured illumination pattern comprising a laser line;
    a tracking module configured to:
        receive the captured images of the articulated body part,
        receive the laser line, a laser image, and a known baseline distance relative to the camera,
        receive a kinematic model of the articulated body part, and
        track the kinematic model of the articulated body part based on the captured images by calculating 3D positions of segments of the laser line of the captured images, including by performing binarization of the laser image, segmenting laser line segments, using a digit separation process to separate any merged laser line segments, and obtaining the 3D position of each laser line segment by triangulation to define a 3D point for each digit relative to the camera; and
    a communication interface that sends the tracked kinematic model of the articulated body part to the computing device to control the computing device in accordance with a gesture associated with the tracked kinematic model of the articulated body part.

Dependent claims: 2-11.
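Claim 1's tracking module recovers one laser line segment per finger by binarizing the laser image, segmenting the line into connected pieces, and separating any segments that merge where adjacent digits touch. The claim does not publish code, so the following is a minimal Python sketch assuming a grayscale uint8 laser image; the thresholds, the minimum-area filter, and the width-based splitting heuristic are illustrative assumptions rather than the patent's method.

```python
# Minimal sketch (not the patent's implementation): binarize a laser image,
# extract laser-line segments, and split segments that appear to span two
# merged digits. Thresholds and heuristics are illustrative assumptions.
import cv2
import numpy as np

def extract_digit_segments(laser_image, intensity_threshold=200, max_digits=5):
    """Return one binary mask per candidate digit's laser line segment."""
    # 1. Binarization: keep only bright laser-line pixels.
    _, binary = cv2.threshold(laser_image, intensity_threshold, 255,
                              cv2.THRESH_BINARY)

    # 2. Segmentation: connected components give candidate laser line segments.
    num_labels, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
    segments = []
    for label in range(1, num_labels):               # label 0 is background
        if stats[label, cv2.CC_STAT_AREA] < 20:      # drop speckle noise
            continue
        mask = (labels == label).astype(np.uint8)
        x = stats[label, cv2.CC_STAT_LEFT]
        w = stats[label, cv2.CC_STAT_WIDTH]
        segments.append((mask, x, w))
    if not segments:
        return []

    # 3. Digit separation: a segment much wider than the median width is
    #    assumed to be two merged digits and is split at its dimmest column.
    median_width = float(np.median([w for _, _, w in segments]))
    digit_masks = []
    for mask, x, w in segments:
        if w > 1.7 * median_width and len(segments) < max_digits:
            column_sums = (laser_image * mask).sum(axis=0)[x:x + w]
            split_col = x + int(np.argmin(column_sums[1:-1])) + 1
            left, right = mask.copy(), mask.copy()
            left[:, split_col:] = 0
            right[:, :split_col] = 0
            digit_masks.extend([left, right])
        else:
            digit_masks.append(mask)
    return digit_masks
```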
12. A method of tracking a 3D articulated model of an articulated body part comprising:
    receiving light emitting diode (LED) images of a hand and associated depth information from a sensing device worn on a wrist associated with the hand, the sensing device comprising a structured illumination source positioned at a specified distance from a camera to illuminate the hand with a known structured illumination pattern comprising a laser line, the associated depth information including a known baseline distance relative to the camera;
    performing background subtraction of the LED images of the hand;
    tracking a kinematic model of the hand based on the received LED images by calculating 3D positions of segments of the laser line of the received LED images, including by performing binarization of the LED images, segmenting laser line segments, using a digit separation process to separate any merged laser line segments, and obtaining the 3D position of each laser line segment by triangulation to define a 3D point for each digit relative to the camera; and
    determining a gesture of the hand based on the tracked kinematic model.

Dependent claims: 13-17.
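Claim 12 adds two steps around the pipeline of claim 1: background subtraction of the wrist camera's LED images, and triangulation of each laser segment against the known camera-to-projector baseline to obtain one 3D point per digit. The sketch below is illustrative only: it assumes a pinhole camera with known intrinsics, models the laser line projector as a calibrated plane of light whose position encodes the baseline, and uses a crude frame-difference for background subtraction; the function names, thresholds, and calibration numbers are assumptions, not the patent's implementation.

```python
# Illustrative background subtraction and structured-light triangulation
# (assumed geometry, not the patent's exact method). The laser projector is
# modelled as a calibrated plane n . X = d in camera coordinates, where d
# encodes the camera-to-projector baseline.
import numpy as np

def subtract_background(led_image, reference_image, diff_threshold=25):
    """Keep pixels that differ from a reference LED frame without the hand."""
    diff = np.abs(led_image.astype(np.int16) - reference_image.astype(np.int16))
    return np.where(diff > diff_threshold, led_image, 0).astype(np.uint8)

def pixel_to_ray(u, v, fx, fy, cx, cy):
    """Back-project a pixel through a pinhole camera into a unit ray."""
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return ray / np.linalg.norm(ray)

def triangulate_segment(segment_mask, plane_normal, plane_offset,
                        fx, fy, cx, cy):
    """3D point (camera frame) for one laser line segment via ray-plane
    intersection with the calibrated laser plane."""
    ys, xs = np.nonzero(segment_mask)
    u, v = xs.mean(), ys.mean()          # representative pixel of the segment
    ray = pixel_to_ray(u, v, fx, fy, cx, cy)
    denom = float(np.dot(plane_normal, ray))
    if abs(denom) < 1e-9:
        raise ValueError("Ray is parallel to the laser plane")
    t = plane_offset / denom
    return t * ray                        # 3D point of the digit

if __name__ == "__main__":
    # Assumed calibration: 25 mm baseline, laser plane tilted 10 degrees.
    theta = np.radians(10)
    n = np.array([0.0, -np.cos(theta), np.sin(theta)])
    d = 0.025 * np.cos(theta)
    mask = np.zeros((120, 160), np.uint8)
    mask[60:63, 40:70] = 1                # fake digit segment for the demo
    print(triangulate_segment(mask, n, d, fx=320, fy=320, cx=80, cy=60))
```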
18. A tracking system for tracking a 3D articulated model of an articulated body part comprising:
    an input/output controller that receives light emitting diode (LED) images and corresponding depth data from a sensing device configured to be worn on a body comprising the articulated body part, the sensing device comprising:
        a camera that captures LED images of the articulated body part, and
        a structured illumination source comprising a laser line projector affixed to a bridge arm positioned at a specified distance from the camera to illuminate the articulated body part with a known structured illumination pattern comprising a laser line;
    a processor arranged to:
        perform background subtraction of the LED images of the articulated body part, and
        receive a kinematic model of the articulated body part; and
    a tracking module configured to:
        receive the laser line, the LED images, and the specified distance from the camera;
        track the kinematic model of the articulated body part by calculating 3D positions of segments of the laser line of the captured images, including by performing binarization of the laser image, segmenting laser line segments, using a digit separation process to separate any merged laser line segments, and obtaining the 3D position of each laser line segment by triangulation to define a 3D point for each digit relative to the camera, and
        determine a gesture of the articulated body part associated with the tracked kinematic model.

Dependent claims: 19 and 20.
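Claim 18's tracking module ends by determining a gesture associated with the tracked kinematic model, but none of the independent claims spell out how a gesture is recognised. The following is a deliberately simple, hypothetical sketch: each digit's triangulated 3D point is reduced to a flexion value against a calibrated open-hand reference, and the flexion pattern is matched to a small set of named gestures. The depth-ratio proxy, thresholds, and gesture names are all assumptions made for illustration.

```python
# Hypothetical gesture step (the claims leave the classifier unspecified):
# reduce each digit's triangulated 3D point to a flexion estimate against a
# calibrated open-hand pose, then map the flexion pattern to a named gesture
# that the communication interface could send to the host computing device.
import numpy as np

DIGITS = ("thumb", "index", "middle", "ring", "little")

def flexion_estimates(digit_points, open_hand_points):
    """Per-digit flexion in [0, 1]: 0 = fully extended, 1 = fully curled.

    Both arguments map digit name -> 3D point in the camera frame. A digit's
    laser intersection moves closer to the camera as the digit curls, so the
    depth ratio is used as a crude flexion proxy (an assumption, not the
    patent's kinematic-model fit).
    """
    flexion = {}
    for name in DIGITS:
        z_now = digit_points[name][2]
        z_open = open_hand_points[name][2]
        flexion[name] = float(np.clip(1.0 - z_now / z_open, 0.0, 1.0))
    return flexion

def classify_gesture(flexion, curl_threshold=0.5):
    """Map a flexion pattern to one of a few illustrative gestures."""
    curled = {name for name, f in flexion.items() if f > curl_threshold}
    if len(curled) == len(DIGITS):
        return "fist"
    if not curled:
        return "open_hand"
    if curled == {"middle", "ring", "little"}:
        return "pinch_ready"   # thumb and index extended toward each other
    return "unknown"

if __name__ == "__main__":
    open_pose = {n: np.array([0.0, 0.0, 0.15]) for n in DIGITS}
    current = {n: np.array([0.0, 0.0, 0.07]) for n in DIGITS}  # all curled
    print(classify_gesture(flexion_estimates(current, open_pose)))  # "fist"
```

In a fuller implementation the flexion estimates would first drive the kinematic hand model of the claims, with gestures classified from the fitted model rather than from raw depth ratios.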
Specification