Method and array for providing a graphical user interface, in particular in a vehicle
First Claim
1. A method for providing a user interface coupled to a processor, in a vehicle, the method comprising:
- displaying at least one graphical object for interaction in a display area of the user interface outside the user's reach;
- detecting a user's gesture in a detection space which is spatially separate from the display area such that the detection space is within the user's reach;
- selecting a graphical object of the at least one graphical object for interaction via the processor; and
- carrying out an interaction associated with the gesture using the selected graphical object via the processor,
wherein one or more position(s) adopted by the selected graphical object in the display area during the interaction are outside the user's reach,
wherein the position of the selected object provided for interaction in the display area is independent of the area in the detection space in which the gesture was detected, and gestures are detected in the detection space without the user approaching the selected object,
wherein the method further comprises:
- detecting a first phase of the gesture on a touch-sensitive surface within the detection space in the user's reach;
- determining, in the first phase of the gesture, a contact zone based on the gesture on the touch-sensitive surface, and selecting the object based on the determined contact zone on the touch-sensitive surface; and
- contactlessly detecting a second phase of the gesture within the detection space, wherein the trajectories of the first and second phases of the gesture are associated with one another to interact with the display area of the user interface outside the user's reach.
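The two-phase gesture handling recited in claim 1 can be illustrated with a minimal sketch. The zone layout, object names, and the displacement-based interaction are illustrative assumptions, not details taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    """A single contact sample from the touch-sensitive surface (phase 1)."""
    x: float
    y: float

# Hypothetical mapping of contact zones on the surface to selectable objects.
CONTACT_ZONES = {"left": "media_list", "right": "map_view"}

def zone_for(event: TouchEvent) -> str:
    # Assumed layout: the touch surface is split into a left and a right half.
    return "left" if event.x < 0.5 else "right"

def handle_gesture(touch_phase, free_space_phase):
    """Phase 1 (touch): the contact zone selects the object.
    Phase 2 (contactless): the free-space trajectory is associated with the
    phase-1 trajectory and drives the interaction with the selected object,
    which itself stays in the display area outside the user's reach."""
    if not touch_phase:
        return None
    selected = CONTACT_ZONES[zone_for(touch_phase[0])]
    # Associate the trajectories of both phases into one continuous path.
    trajectory = [(e.x, e.y) for e in touch_phase] + list(free_space_phase)
    # Illustrative interaction: the net horizontal displacement of the gesture
    # is applied to the selected object in the (out-of-reach) display area.
    dx = trajectory[-1][0] - trajectory[0][0]
    return {"object": selected, "displacement": dx}
```

Note that in this sketch, as in the claim, the selected object's position in the display area does not depend on where in the detection space the gesture occurred; only the zone touched in phase 1 and the combined trajectory matter.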
Abstract
A method and an array for providing a user interface, in particular in a vehicle. In the method, at least one graphical object designated for interaction is depicted in a display area out of reach for the user. In a detection area that is spatially separated from the display area, a gesture of the user is captured and the graphical object is selected for interaction. An interaction assigned to the gesture is carried out by the selected graphical object, wherein the position(s) of the selected graphical object during the interaction is/are out of reach for the user.
25 Citations
14 Claims
1. (Reproduced in full as the First Claim above.) Dependent claims: 2, 3, 4, 5, 6, 7.
8. A vehicle operating system for providing a user interface coupled to a processor, the vehicle operating system comprising:
- an interface for receiving graphical objects which are stored using data technology and are provided for interaction;
- a display surface for displaying graphical objects provided for interaction in a display area that is outside a user's reach;
- a gesture detection device for detecting the user's gestures in a detection space, the detection space being determined by a reach of the user restricted within a firmly predefined user area and being spatially separate from the display area such that the detection space is within the user's reach; and
- the processor including a control unit connected to the interface, the display surface and the gesture detection device,
wherein the control unit receives signals for selecting graphical objects and controls interactions with selected graphical objects which are associated with the gestures,
wherein the control unit controls display of a selected graphical object for interaction in a display area outside the user's reach,
wherein the interaction is carried out such that at least one position adopted by the selected graphical object during the interaction is outside the user's reach,
wherein the position of the selected object provided for interaction in the display area is independent of the area in the detection space in which the gesture was detected, and gestures are detected in the detection space without the user approaching the selected object,
wherein the gesture detection device detects a first phase of the gesture on a touch-sensitive surface within the detection space in the user's reach, and contactlessly detects a second phase of the gesture within the detection space,
wherein the trajectories of the first and second phases of the gesture are associated with one another to interact with the display area that is outside the user's reach, and
wherein, in the first phase of the gesture, a contact zone is determined based on the gesture on the touch-sensitive surface, and the object is selected based on the determined contact zone on the touch-sensitive surface.

Dependent claims: 9, 10, 11, 12, 13, 14.
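The apparatus wiring of claim 8 — a control unit connected to the interface, the display surface and the gesture detection device — might be sketched as follows. The class, signal, and gesture names are assumptions for illustration only:

```python
class ControlUnit:
    """Sketch of the control unit of claim 8: it holds graphical objects
    received via the interface, accepts selection signals and gestures from
    the gesture detection device, and drives the display surface accordingly."""

    def __init__(self, objects):
        # Graphical objects received via the interface, keyed by an assumed id.
        self.objects = objects
        self.selected_id = None

    def on_selection_signal(self, object_id):
        # Phase-1 signal: the contact zone on the touch surface selects an object.
        if object_id in self.objects:
            self.selected_id = object_id

    def on_gesture(self, gesture):
        # Phase-2 gesture: carry out the associated interaction; the object
        # remains in the display area outside the user's reach.
        actions = {"swipe_right": "move_right", "swipe_left": "move_left"}
        if self.selected_id is None:
            return None
        return (self.objects[self.selected_id], actions.get(gesture, "ignore"))
```

In this sketch, selection and interaction arrive as separate signals, mirroring the claim's split between the touch phase (object selection via contact zone) and the contactless phase (carrying out the interaction).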
Specification