Spatially-correlated multi-display human-machine interface
Abstract
A human-machine interface involves plural spatially-coherent visual presentation surfaces, at least some of which are movable by a person. Plural windows or portholes into a virtual space, at least some of which are handheld and movable, are provided by using handheld and other display devices. Aspects of the multi-dimensional spatiality of the movable window (e.g., relative to another window) are determined and used to generate images. As one example, the movable window can present a first-person perspective “porthole” view into the virtual space, this porthole view changing based on aspects of the movable window's spatiality in multi-dimensional space relative to a stationary window. A display can present an image of a virtual space, and an additional, movable display can present an additional image of the same virtual space.
249 Citations
19 Claims
1. A handheld device for displaying images of a virtual 3D object from different viewing perspectives, the handheld device comprising:

a handheld housing freely movable in free space and dimensioned to be grasped and supported by the hands of a user, the handheld housing being configured and dimensioned to change attitude in free space with movement of the user's hands;

a gyroscopic sensor disposed within the housing, the gyroscopic sensor being configured to sense rotation in multiple axes;

an acceleration sensor disposed within the housing, the acceleration sensor being configured to sense acceleration in multiple axes;

a direction sensor disposed within the housing, the direction sensor being configured to sense direction;

a wireless transceiver disposed within the housing and operatively coupled to the gyroscopic sensor, the acceleration sensor and the direction sensor, the wireless transceiver configured to (a) wirelessly transmit information related to the sensed rotation, the sensed acceleration and the sensed direction, and (b) wirelessly receive a video signal encoding images of the virtual 3D object from viewing perspectives that continually update in response to sensed rotation, acceleration and direction of the handheld housing; and

a handheld display disposed on the handheld housing, the handheld display operatively coupled to the wireless transceiver and configured to display the virtual 3D object with continually updating viewing perspectives to spatially correlate 3D viewing perspective of the displayed virtual 3D object with free space attitude of the housing.

- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11)
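Claim 1 recites fusing gyroscope, accelerometer, and direction-sensor readings so that the displayed viewing perspective tracks the attitude of the housing. One conventional way to combine a gyroscope's angular rate with an accelerometer's gravity reference is a complementary filter; the sketch below is a minimal single-axis illustration of that idea, not the patent's disclosed implementation, and all names are hypothetical.

```python
import math

def estimate_pitch(prev_pitch, gyro_rate, accel, dt, alpha=0.98):
    """Single-axis complementary filter (illustrative sketch only).

    prev_pitch: previous pitch estimate, radians
    gyro_rate:  angular rate about the pitch axis, rad/s
    accel:      (ax, ay, az) accelerometer sample, in g
    dt:         sample interval, seconds
    alpha:      blend factor; values near 1 trust the gyro short-term
    """
    # Integrate the gyro for a smooth, drift-prone short-term estimate...
    gyro_pitch = prev_pitch + gyro_rate * dt
    # ...and anchor it long-term to the gravity direction sensed by the
    # accelerometer, which is noisy but drift-free.
    accel_pitch = math.atan2(-accel[0], math.hypot(accel[1], accel[2]))
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch
```

In a device of the kind claimed, an estimate like this (extended to all axes, with the direction sensor supplying heading) would be transmitted over the wireless link so the rendered perspective can continually update.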
12. A display device comprising:
a handheld housing graspable and supportable by the hands of a user, the housing being configured and dimensioned to move in free space;

a MARG sensor array disposed within the housing, the MARG sensor array sensing attitude of the housing;

a wireless radio disposed within the housing and coupled to the MARG sensor array, the wireless radio (a) wirelessly transmitting the sensed attitude of the housing and (b) wirelessly receiving a video signal encoding images of a virtual world viewed from a viewpoint responsive to the transmitted sensed attitude; and

a display disposed on the housing, the display coupled to the wireless radio and configured to display the images of the virtual world responsive to the received video signal to provide spatial coherence between the attitude of the housing in the real world and the viewpoint of the virtual world displayed by the display.

- View Dependent Claims (13, 14, 15, 16, 17, 18, 19)
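Claim 12 describes a round trip: the device transmits its MARG-sensed attitude, and a remote renderer returns video of the virtual world viewed from a viewpoint responsive to that attitude. On the rendering side, the received yaw and pitch would typically be turned into a camera rotation; the sketch below shows one standard way to build such a rotation matrix, offered purely as an assumption-laden illustration rather than the claimed system's actual pipeline.

```python
import math

def view_rotation(yaw, pitch):
    """3x3 camera rotation from housing yaw and pitch, in radians.

    Illustrative sketch: a host renderer could apply this rotation to the
    virtual-world camera so the displayed viewpoint stays spatially
    coherent with the attitude reported over the wireless link.
    Row-major, composed as R = Rx(pitch) @ Ry(yaw).
    """
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    # Rotate about the y (up) axis for yaw, then the x (right) axis for pitch.
    return [
        [cy,       0.0,  sy     ],
        [sp * sy,  cp,  -sp * cy],
        [-cp * sy, sp,   cp * cy],
    ]
```

With zero yaw and pitch the matrix is the identity, i.e., the virtual camera is aligned with the housing's reference attitude, matching the claim's notion of spatial coherence between real-world attitude and the displayed viewpoint.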
Specification