Interface control system, interface control apparatus, interface control method, and program
Abstract
An interface control system (2000) includes a first marker detection unit (2020) and a display control unit (2040). The first marker detection unit (2020) detects a first marker (50) attached to the body of a user based on a captured image obtained from a camera (20) worn by the user. The display control unit (2040) displays an operation image on a display screen (32) of a head mounted display (30) mounted on the user based on the first marker detected by the first marker detection unit (2020). The display control unit (2040) displays the operation image so that, when viewed with the eyes of the user, the operation image appears superimposed on the arm portion of the user.
33 Citations
17 Claims
1. An interface control system comprising:

memory configured to store instructions; and

a processor configured to execute the instructions to:

detect a first marker attached to the body of a user based on an image obtained from a camera attached to the user;

display an operation image on a display screen of a head mounted display mounted on the user based on the detected first marker so that the operation image is superimposed on the body of the user and is visually recognized by the user; and

determine whether or not the operation image is operated by the user,

wherein the first marker is an image displayed on a display screen of a device attached to the arm portion of the user, and

wherein the processor is further configured to extract a feature of the background in an image obtained from the camera, and determine a shape, a pattern, or a color of the first marker displayed on the display screen of the device based on the extracted feature.
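The final clause of claim 1 can be sketched in code: extract a feature of the background from the camera image and choose a marker appearance based on it. This is a hypothetical illustration only, not the patent's implementation; the mean-color "feature", the complementary-color rule, and all function names are assumptions.

```python
# Hypothetical sketch: pick a first-marker color that contrasts with the
# background of the camera image, as in the final clause of claim 1.

def mean_background_color(pixels):
    """pixels: list of (r, g, b) tuples sampled from the camera image.
    The mean color stands in for the claimed 'feature of the background'."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return (r, g, b)

def choose_marker_color(background_rgb):
    """Return the complementary color so the displayed marker stands out
    against the extracted background feature (an assumed rule)."""
    return tuple(255 - int(c) for c in background_rgb)

bg = mean_background_color([(200, 200, 200), (240, 240, 240)])
print(choose_marker_color(bg))  # a dark marker color against a light background
```

The same structure would apply if the determined attribute were a shape or a pattern rather than a color: the extracted feature selects among candidate marker appearances.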
9. The interface control system according to claim 1, wherein the processor is further configured to determine that the operation image is operated by the user in a case where predetermined vibration is detected by a sensor detecting vibration of the arm portion of the user.
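Claim 9's determination can be sketched as a threshold test on a vibration sensor's samples: a spike above a predetermined level is treated as a tap on the arm, i.e. an operation of the operation image. The threshold value, units, and sample format below are assumptions for illustration.

```python
# Hypothetical sketch of claim 9: report the operation image as operated
# when the arm-worn vibration sensor records a sample above a
# predetermined threshold (an assumed tap signature).

TAP_THRESHOLD = 2.5  # assumed acceleration magnitude, in g

def is_operated(vibration_samples, threshold=TAP_THRESHOLD):
    """Return True if any sample's magnitude reaches the tap threshold."""
    return any(abs(s) >= threshold for s in vibration_samples)

print(is_operated([0.1, 0.2, 3.1, 0.3]))  # True: the 3.1 g spike counts as a tap
```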
10. The interface control system according to claim 1, wherein the processor is further configured to detect an operation body from a part of regions defined based on the first marker among regions included in an image obtained from the camera, and determine that the operation image is operated by the user in a case where the operation body is detected from the part of regions.
11. The interface control system according to claim 1, wherein the processor is further configured to determine an operation target which overlaps the hand of the user with an area of a predetermined proportion or higher as an operation target operated by the user among a plurality of operation targets, in a case where the plurality of operation targets are included in the operation image, and where the hand of the user performing an operation on the operation image overlaps the plurality of operation targets.
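The selection rule of claim 11 can be sketched as follows: among several operation targets the user's hand overlaps, pick one whose area is covered by the hand at or above a predetermined proportion. The axis-aligned rectangles and the 0.5 proportion are assumptions for illustration, not the patent's implementation.

```python
# Hypothetical sketch of claim 11: choose the operation target covered by
# the hand at a predetermined proportion or higher of the target's area.

def overlap_area(a, b):
    """Intersection area of axis-aligned rectangles (x1, y1, x2, y2)."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0) * max(h, 0)

def pick_target(targets, hand, min_proportion=0.5):
    """Return the first target whose area the hand covers at or above
    min_proportion, or None if no target qualifies."""
    for rect in targets:
        area = (rect[2] - rect[0]) * (rect[3] - rect[1])
        if area and overlap_area(rect, hand) / area >= min_proportion:
            return rect
    return None

keys = [(0, 0, 10, 10), (12, 0, 22, 10)]   # two operation targets
hand = (6, 0, 20, 10)                      # hand region overlapping both
print(pick_target(keys, hand))  # (12, 0, 22, 10): 80% covered vs. 40%
```

Claim 12's alternative rule (pick the leftmost or rightmost overlapped target) would replace the proportion test with a comparison of the targets' x-coordinates.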
12. The interface control system according to claim 1, wherein the processor is further configured to determine an operation target which is located on a leftmost part or a rightmost part as an operation target operated by the user among a plurality of operation targets, in a case where the plurality of operation targets are included in the operation image, and where the hand of the user performing an operation on the operation image overlaps the plurality of operation targets.
13. The interface control system according to claim 1, wherein the processor is further configured to, in a case where the operation image visually recognized by the user through the head mounted display overlaps the hand of the user operating the operation image, display an operation image in which an image of the hand is superimposed on a position where the hand of the user is located, on the display screen of the head mounted display.
14. The interface control system according to claim 1, wherein the processor is further configured to display only a part of the operation image according to a posture of the body of the user.
15. An interface control apparatus comprising the processor and the memory according to claim 1.
16. An interface control method executed by a computer, the method comprising:
detecting a first marker attached to the body of a user based on an image obtained from a camera attached to the user;

displaying an operation image on a display screen of a head mounted display mounted on the user based on the detected first marker so that the operation image is superimposed on the body of the user and is visually recognized by the user; and

determining whether or not the operation image is operated by the user,

wherein the first marker is an image displayed on a display screen of a device attached to the arm portion of the user, and

wherein the method further comprises extracting a feature of the background in an image obtained from the camera, and determining a shape, a pattern, or a color of the first marker displayed on the display screen of the device based on the extracted feature.
Specification