Movable audio/video communication interface system
Abstract
A system that includes a desktop assembly of a display and sensors mounted on a robotic arm. The arm moves the assembly so that it remains within position and orientation tolerances relative to the user's head as the user looks around. Near-field speaker arrays supply audio and a microphone array senses the user's voice. Filters are applied to head motion to reduce latency in the arm's tracking of the head. The system is full duplex with other systems, allowing immersive collaboration. Lighting and sound generation take place close to the user's head. A haptic interface device allows the user to grab the display/sensor array and move it about. Motion acts as a planar selection device for 3D data. Planar force feedback allows a user to “feel” the data. Users not only see each other through display windows, but can also see the positions and orientations of each other's planar selections of shared 3D models or data.
15 Claims
1. A system, comprising:
multiple input/output systems coupled together to provide a view of a common scene from perspectives of each of the systems, each system comprising:
a display/sensor assembly presenting the view to a viewer and sensing a user position and user viewpoint;
a robotic arm coupled to the assembly and providing display position and orientation information;
a computer determining the view responsive to the user position and viewpoint, producing a display responsive to the position and viewpoint, comparing the user position to position range limits and producing robot motion control information to keep the user position within the range limits, the robotic arm moving and orienting the assembly responsive to the motion control information. (Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9)
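The control behavior recited in claim 1, comparing the sensed user position against per-axis range limits and producing motion commands that re-center the user, can be sketched as follows. The function name, the proportional gain, and the limit values are illustrative assumptions, not from the patent.

```python
# Illustrative sketch of the claim-1 control loop: compare the sensed user
# position to the assembly's range limits and emit an arm displacement that
# drives the user back inside the limits. Names and values are assumptions.

def motion_command(user_pos, range_min, range_max, gain=0.5):
    """Return a per-axis arm displacement command; zero while the user
    position is within [range_min, range_max] on that axis."""
    cmd = []
    for p, lo, hi in zip(user_pos, range_min, range_max):
        if p < lo:
            cmd.append(gain * (p - lo))   # user past lower limit: move toward user
        elif p > hi:
            cmd.append(gain * (p - hi))   # user past upper limit
        else:
            cmd.append(0.0)               # inside limits: hold position
    return cmd
```

With a dead zone defined by the limits, the arm stays still during small head motions and only repositions when the user approaches the edge of the sensors' working range, which matches the "keep the user position within the range limits" language of the claim.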
10. An input/output interface, comprising:
a display providing a three dimensional view of a scene;
speakers attached to the display and providing a stereo sound;
tracking sensors attached to the display and tracking viewer head motion and eye position;
sound sensors attached to the display and detecting sound direction;
a handle attached to the display and allowing a user to control position and orientation of the display; and
an I/O control interface attached to the handle.
11. A process, comprising:
sensing a position of a user relative to a virtual scene; and
adjusting a view into the virtual scene responsive to the position using a computer.
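One common way to realize claim 11, adjusting a view into a virtual scene responsive to a sensed user position, is a head-coupled off-axis (asymmetric) viewing frustum, so the scene appears fixed in space behind the display as the head moves. The sketch below is an assumption about one possible implementation; the screen dimensions, coordinate convention, and function name are illustrative, not from the patent.

```python
# Minimal sketch of claim 11: map the sensed head position to an off-axis
# viewing frustum. Coordinates are screen-centred, with z the head's
# distance from the display plane. All names/values are illustrative.

def off_axis_frustum(eye, screen_w, screen_h, near):
    """Return (left, right, bottom, top) of the near-plane frustum for a
    viewer at eye = (x, y, z), as consumed by e.g. OpenGL's glFrustum."""
    x, y, z = eye
    scale = near / z                       # project screen edges onto the near plane
    left   = (-screen_w / 2 - x) * scale
    right  = ( screen_w / 2 - x) * scale
    bottom = (-screen_h / 2 - y) * scale
    top    = ( screen_h / 2 - y) * scale
    return left, right, bottom, top
```

For a centered head the frustum is symmetric; as the head moves right, the frustum shifts left, which is exactly the parallax behavior that makes the virtual scene appear stationary.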
12. A system, comprising:
a communication system;
first and second display and capture systems each locally capturing images and sound and transmitting the locally captured images and sound over the communication system, and receiving remotely captured images and sound and displaying/playing the remotely captured images and sound to a viewer and where each display and capture system comprises;
a desktop robotic movable arm having three degrees of freedom;
a movable display connected to an end of the movable arm, having three degrees of freedom and movable independently of the arm and displaying the remotely captured images and a common stereo image;
a stereo/autostereo image projection system associated with the display for projecting a stereo image of the captured images to a viewer of the display and having a preferred viewing angle;
near field speakers producing stereo sound from the remotely captured sound;
video sensors including cameras mounted on the display and for capturing a stereo image of a head of a viewer viewing the display;
light sources in association with the video sensors;
sound sensors including microphones mounted on the display for capturing stereo sound from the head of the viewer viewing the display;
a touch sensitive handle attached to the display/arm allowing a user to move the display and providing direction and movement amount outputs; and
a computer system, communicating with the communication system;
processing the locally captured stereo image using a Kalman filter to determine a head position and head orientation of the head of the viewer;
processing the locally captured stereo image to determine an eye position of the viewer;
adjusting a position of the movable arm and the movable display, when the handle is not being touched, to maintain the head of the viewer within the viewing angle and responsive to an environmental constraint map indicating objects within the movement range of the display and arm;
adjusting a position of the movable arm and the movable display responsive to the direction and movement amount outputs when the handle is being touched;
transmitting the locally captured images and sound, the head position and orientation, the eye position and the display/arm position through the communication system;
processing remotely captured images for display through the stereo image projection system by the movable display;
processing remotely captured sound and providing the stereo sound to the speakers;
processing the remotely captured images to determine a viewing frustum of a remote viewer responsive to the remotely determined head position and orientation, eye position and the remote display/arm position and displaying the viewing frustum on the display associated with a view of the remote viewer showing an orientation of the remote viewer;
maintaining a 3D object in a common world coordinate system being viewed by the first and second systems;
determining a cut plane view of the 3D object on the display responsive to a position of the display with respect to the common world coordinate system, displaying a view of the 3D object on the display;
displaying the frustum of the remote viewer relative to the 3D object on the display; and
displaying a representation of the cut plane view of the remote viewer on the display.
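The Kalman-filtering step recited in claim 12 smooths noisy head-position measurements so the arm can track the head with less jitter and latency. A minimal sketch for a single axis, assuming a constant-velocity motion model and scalar noise parameters that are not specified in the patent, might look like this:

```python
# Sketch of the claim-12 Kalman-filtering step: smooth noisy head-position
# measurements (one axis shown) with a constant-velocity model. The class
# name and the noise parameters q and r are assumed values for illustration.

class HeadAxisKalman:
    def __init__(self, q=1e-3, r=1e-2):
        self.x, self.v = 0.0, 0.0               # state: position and velocity
        self.P = [[1.0, 0.0], [0.0, 1.0]]       # state covariance
        self.q, self.r = q, r                   # process / measurement noise

    def update(self, z, dt):
        """Fold in one position measurement z taken dt seconds after the last."""
        # Predict with the constant-velocity motion model.
        x = self.x + self.v * dt
        P = self.P
        p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + self.q
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1] + self.q
        # Correct with the measured position.
        s = p00 + self.r                        # innovation covariance
        k0, k1 = p00 / s, p10 / s               # Kalman gain
        y = z - x                               # innovation
        self.x = x + k0 * y
        self.v = self.v + k1 * y
        self.P = [[(1 - k0) * p00, (1 - k0) * p01],
                  [p10 - k1 * p00, p11 - k1 * p01]]
        return self.x
```

Because the filter also estimates velocity, its prediction step can be run slightly ahead of the measurements, which is one way the abstract's "filters are applied to head motion to reduce latency" can be realized.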
13. A system, comprising:
an autostereo display;
a mechanical arm coupled to the display and providing display position and orientation information; and
a computer determining autostereo views responsive to the display position and viewpoint.
14. A system, comprising:
a display/sensor assembly presenting a view to a viewer and sensing a user position and user viewpoint;
a robotic arm coupled to the assembly and providing display position and orientation information; and
a computer determining the view responsive to the user position and viewpoint, producing a display responsive to the position and viewpoint, comparing the user position to position range limits of sensor and display components and producing robot motion control information to keep the user position within the range limits, the robotic arm moving and orienting the assembly responsive to the motion control information.
15. A system, comprising:
multiple input/output systems coupled together to provide a view of a common scene from perspectives of each of the systems, each system comprising:
a display/sensor assembly presenting the view to a viewer and sensing a user;
a mechanical arm coupled to the assembly and providing display position and orientation information; and
a computer determining the view responsive to the display position and orientation.
Specification