Fine-motion virtual-reality or augmented-reality control using radar
First Claim
1. A computer-implemented method for fine-motion virtual-reality (VR) or augmented-reality (AR) control, the computer-implemented method comprising:
presenting a VR/AR object within a VR world or an AR viewport;
tracking a user interaction with the VR/AR object over time, a hand of a user that performs the user interaction being viewable by the user through the AR viewport or represented within the VR world, the tracking comprising:
transmitting, by a radar system, a radar field over time;
receiving, by the radar system, a radar signal representing a superposition of reflections of the radar field off two or more spatially separated points of the hand of the user over time;
determining, based on the received radar signal, relative velocities between the spatially separated points of the hand over time;
determining, based on the relative velocities, movement of the hand of the user over time; and
determining, based on the received radar signal, locations of the hand of the user over time, the tracking based on the movement and the locations of the hand of the user over time; and
altering the VR/AR object in real time corresponding to the user interaction.
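The relative-velocity step of this claim can be illustrated with a minimal sketch (not the patent's implementation): two scattering points on the hand moving at different radial speeds produce distinct Doppler tones in the superposed return, and each tone's frequency maps to a radial velocity via v = f_d·λ/2. The 60 GHz carrier, pulse-repetition frequency, and two-peak picking below are illustrative assumptions.

```python
import numpy as np

C = 3e8              # speed of light, m/s
F_CARRIER = 60e9     # assumed millimeter-wave carrier, Hz
WAVELENGTH = C / F_CARRIER
PRF = 2000.0         # assumed slow-time (pulse-repetition) rate, Hz
N = 512              # pulses per coherent processing interval

def doppler_to_velocity(f_doppler):
    """Radial velocity from a Doppler shift: v = f_d * lambda / 2."""
    return f_doppler * WAVELENGTH / 2.0

def relative_velocity(slow_time_signal):
    """Find the two strongest Doppler peaks in the superposed return
    and report the velocity difference between the scattering points."""
    spectrum = np.fft.fftshift(np.fft.fft(slow_time_signal * np.hanning(N)))
    freqs = np.fft.fftshift(np.fft.fftfreq(N, d=1.0 / PRF))
    # Indices of the two largest spectral magnitudes, one per point.
    top_two = np.argsort(np.abs(spectrum))[-2:]
    v1, v2 = (doppler_to_velocity(freqs[i]) for i in top_two)
    return abs(v1 - v2)

# Synthetic superposed return: two points on the hand moving at
# 0.02 m/s and 0.06 m/s toward the radar (Doppler tone at 2*v/lambda each).
t = np.arange(N) / PRF
sig = (np.exp(2j * np.pi * (2 * 0.02 / WAVELENGTH) * t)
       + np.exp(2j * np.pi * (2 * 0.06 / WAVELENGTH) * t))
print(round(relative_velocity(sig), 3))  # 0.039, i.e. the true 0.04 m/s quantized to the Doppler-bin grid
```

With a 5 mm wavelength, a 0.04 m/s difference between two points is resolvable here because it separates the tones by several Doppler bins; finer separation would need a longer coherent interval.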
Abstract
This document describes techniques for fine-motion virtual-reality or augmented-reality control using radar. These techniques enable small motions and displacements to be tracked at millimeter or sub-millimeter scale, supporting user control actions even when those actions are small, fast, or obscured by darkness or varying light. Further, these techniques enable fine resolution and real-time control, unlike conventional RF-tracking or optical-tracking techniques.
42 Claims
1. A computer-implemented method for fine-motion virtual-reality (VR) or augmented-reality (AR) control, the computer-implemented method comprising:

presenting a VR/AR object within a VR world or an AR viewport;
tracking a user interaction with the VR/AR object over time, a hand of a user that performs the user interaction being viewable by the user through the AR viewport or represented within the VR world, the tracking comprising:
transmitting, by a radar system, a radar field over time;
receiving, by the radar system, a radar signal representing a superposition of reflections of the radar field off two or more spatially separated points of the hand of the user over time;
determining, based on the received radar signal, relative velocities between the spatially separated points of the hand over time;
determining, based on the relative velocities, movement of the hand of the user over time; and
determining, based on the received radar signal, locations of the hand of the user over time, the tracking based on the movement and the locations of the hand of the user over time; and
altering the VR/AR object in real time corresponding to the user interaction.

Dependent claims: 2–21.
22. A computer-implemented method for fine-motion virtual-reality (VR) or augmented-reality (AR) control, the computer-implemented method comprising:

presenting a VR/AR object within a VR world or an AR viewport;
tracking a user interaction with the VR/AR object over time, a portion of a user that performs the user interaction being viewable by the user through the AR viewport or represented within the VR world, the tracking comprising:
transmitting, by a radar system, a radar field over time;
receiving, by the radar system, a radar signal representing a superposition of reflections of the radar field off two or more spatially separated points of the portion of the user over time;
distinguishing the reflections of the two or more spatially separated points by determining respective Doppler centroids within the radar signal for the two or more spatially separated points;
spatially resolving, based on the respective Doppler centroids, the two or more spatially separated points over time; and
determining, based on the spatially resolving, locations and movement of the portion of the user over time, the tracking based on the determined locations and movement of the portion of the user over time; and
altering the VR/AR object in real time corresponding to the user interaction.

Dependent claims: 23–42.
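The Doppler-centroid step recited in claim 22 can be sketched in a toy example: a Doppler centroid is the power-weighted mean Doppler frequency of a region of the spectrum, and two well-separated reflectors yield two distinct centroids. The slow-time parameters and the two-region split of the spectrum below are illustrative assumptions for the example, not the patent's method.

```python
import numpy as np

PRF = 2000.0   # assumed slow-time (pulse-repetition) rate, Hz
N = 512        # pulses per coherent processing interval

def doppler_centroids(slow_time_signal):
    """Split the Doppler spectrum midway between its two strongest
    peaks and return the power-weighted centroid frequency of each side."""
    spectrum = np.fft.fftshift(np.fft.fft(slow_time_signal))
    power = np.abs(spectrum) ** 2
    freqs = np.fft.fftshift(np.fft.fftfreq(N, d=1.0 / PRF))
    # The two strongest bins mark the two scattering points.
    peaks = np.sort(np.argsort(power)[-2:])
    split = (peaks[0] + peaks[1]) // 2
    return [float(np.sum(freqs[a:b] * power[a:b]) / np.sum(power[a:b]))
            for a, b in [(0, split), (split, N)]]

# Two reflectors with on-bin Doppler frequencies, so each appears
# as a clean tone in the superposed return.
t = np.arange(N) / PRF
f1, f2 = -20 * PRF / N, 30 * PRF / N   # -78.125 Hz and 117.1875 Hz
sig = np.exp(2j * np.pi * f1 * t) + np.exp(2j * np.pi * f2 * t)
print(doppler_centroids(sig))  # ≈ [-78.125, 117.1875]
```

Each centroid then stands in for one scattering point, which is the basis for spatially resolving the points and tracking their locations and movement over time.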
Specification