USER INTERFACE WITH PROXIMITY DETECTION FOR OBJECT TRACKING
1 Assignment
0 Petitions
Abstract
A system or method for tracking items proximate a user interface device includes a user interface device having at least one solid-state touch-sensitive region and a receiver for wirelessly receiving a signal from at least one item to determine proximity of the item relative to the user interface device. The device may also include a display screen for displaying controls and information. The user interface device may be permanently or removably mounted in a vehicle and used to interface with vehicle systems and personal electronic devices for control and information display. Tracked items or objects may include passive or active data tags and communicate identification information, and optionally position information, to the user interface device. The device may alert the user to movement of tracked objects and/or confirm presence of a group of objects intended for a particular task or project. The device may use various wired or wireless devices to control selections and/or a cursor on the display.
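The tracking behavior the abstract describes (items wirelessly reporting identification, the device deciding whether each is in proximity and confirming presence of a group intended for a task) can be sketched in Python. Everything below is an illustrative assumption, not part of the patent: the class names, the tag format, and the use of an RSSI signal-strength cutoff as the proximity test are all invented for demonstration.

```python
# Illustrative sketch of the tracking behavior described in the abstract.
# The RSSI threshold, tag format, and class names are assumptions for
# demonstration; the patent does not specify an implementation.

PROXIMITY_RSSI_DBM = -70  # assumed signal-strength cutoff for "in proximity"

class TrackedItem:
    def __init__(self, tag_id):
        self.tag_id = tag_id
        self.last_rssi = None  # most recent received signal strength, dBm

    def in_proximity(self):
        return self.last_rssi is not None and self.last_rssi >= PROXIMITY_RSSI_DBM

class UserInterfaceDevice:
    def __init__(self):
        self.items = {}

    def receive_signal(self, tag_id, rssi):
        # Register or update an item from its wirelessly received ID signal.
        item = self.items.setdefault(tag_id, TrackedItem(tag_id))
        item.last_rssi = rssi

    def missing_items(self, required_ids):
        # Confirm presence of a group of objects intended for a task:
        # return the IDs that are not currently within proximity.
        return [tid for tid in required_ids
                if tid not in self.items or not self.items[tid].in_proximity()]

device = UserInterfaceDevice()
device.receive_signal("keys", rssi=-55)
device.receive_signal("badge", rssi=-82)   # too weak: treated as out of proximity
print(device.missing_items(["keys", "badge", "wallet"]))  # → ['badge', 'wallet']
```

A real device would feed `receive_signal` from its radio receiver and could raise an alert whenever `missing_items` becomes non-empty for a watched group.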
46 Claims
1. A user interface device comprising:
a display surface having at least one touch-sensitive region;
a receiver for wirelessly receiving a signal having identification information for at least one selected item; and
a processor in communication with the at least one touch-sensitive region and the receiver, the processor detecting user input via the at least one touch-sensitive region and determining whether the at least one selected item is within proximity of the user interface device in response to the signal received from the at least one selected item.
(Dependent claims: 2-25)
26. A method comprising:
identifying at least one selected object using a touch-sensitive user interface device;
receiving a signal having identification information for the at least one selected object via the user interface device;
determining a position of the at least one selected object relative to the user interface device; and
determining whether the at least one selected object is within a desired proximity of the user interface device.
(Dependent claims: 27-32)
33. A computer readable storage medium having stored data representing instructions executable by a microprocessor, the computer readable storage medium comprising:
instructions for identifying at least one selected object using a touch-sensitive user interface device;
instructions for determining a position of the at least one selected object relative to the user interface device in response to a signal received from the at least one selected object, the signal including identification information regarding the at least one selected object; and
instructions for determining whether the at least one selected object is within a desired proximity of the user interface device.
(Dependent claim: 34)
35. A user interface comprising:
a display having at least one touch-sensitive region;
a processor in communication with the at least one touch-sensitive region;
an electronically conductive and substantially transparent layer positioned behind a panel surface; and
a remote device for at least one of controlling a selection or moving a cursor on the display.
(Dependent claims: 36-41)
42. A user interface comprising:
a processor for storing a plurality of pre-determined gesture combinations;
said processor in communication with at least one remote device;
said remote device comprising a camera for capturing an image of at least one of a user's finger, head, hand, or body movement;
said camera being mounted in a high mount display located in front of a driver where the camera can capture an image of the driver's eyes and at least one hand in proximity of a steering wheel; and
a control output signal being generated from said processor when said camera detects the driver's eyes are straight forward, and the driver is making a gesture with at least one hand or finger.
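Claim 42's control condition reduces to a simple gate: a control output is generated only when the driver's eyes are straight forward AND the detected hand or finger gesture matches a stored pre-determined combination. The Python sketch below is an illustrative reduction of that logic only; the gesture names are invented, and the boolean inputs stand in for a camera-based gaze and gesture detection pipeline the patent does not specify.

```python
# Illustrative reduction of claim 42's control condition. The gesture set
# is an assumption; eyes_straight_forward and detected_gesture would be
# supplied by a camera-based vision pipeline in a real system.

PREDETERMINED_GESTURES = {"swipe_left", "swipe_right", "two_finger_tap"}

def control_output(eyes_straight_forward, detected_gesture):
    """Generate a control output signal only when the driver's eyes are
    straight forward AND the gesture is a stored pre-determined combination."""
    if eyes_straight_forward and detected_gesture in PREDETERMINED_GESTURES:
        return {"signal": "control", "gesture": detected_gesture}
    return None  # no output: condition not satisfied

print(control_output(True, "swipe_left"))   # gesture accepted, signal generated
print(control_output(False, "swipe_left"))  # eyes off the road: no output
```

Note that both conditions must hold; a recognized gesture alone produces nothing while the driver's gaze is away.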
43. A user interface comprising:
a processor in communication with at least one remote device;
said remote device comprising a camera for capturing an image of a user's hand movement;
said camera being mounted on a vehicle dash in proximity to the user interface where the camera can compare an image of an individual in the vehicle; and
a control output being generated from said processor to disable functions of the user interface if a driver of the vehicle attempts to control the remote device while the vehicle is moving, and to enable the user interface functions if a vehicle passenger attempts to control the remote device while the vehicle is moving.
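Claim 43's lockout rule is likewise a small decision table: while the vehicle is moving, the interface is disabled for the driver but enabled for a passenger. The sketch below assumes the camera-based occupant classification is already done and represented as a simple label; that representation is an illustrative assumption.

```python
# Illustrative sketch of claim 43's lockout rule. The 'driver'/'passenger'
# label is assumed to come from the dash camera's comparison of the
# individual attempting control; classification itself is not shown.

def interface_enabled(occupant, vehicle_moving):
    """Return True if user interface functions are enabled for this attempt.

    occupant: 'driver' or 'passenger', as classified from the camera image.
    vehicle_moving: True while the vehicle is in motion.
    """
    if not vehicle_moving:
        return True                     # stationary: no restriction applied
    return occupant == "passenger"      # moving: passenger only

print(interface_enabled("driver", vehicle_moving=True))     # disabled
print(interface_enabled("passenger", vehicle_moving=True))  # enabled
```

The stationary case is not recited in the claim; allowing it here is an assumption for completeness of the sketch.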
44. A user interface comprising:
a processor of a motor vehicle, the processor being configured to recognize a plurality of pre-determined gesture combinations;
at least one remote device;
a camera situated for detecting a user that approaches the camera;
said camera being in communication with the processor to provide information to the processor regarding a user gesture detected by the camera;
a control output signal being generated from the processor when the camera detects the user making one of the pre-determined gestures;
said camera being mounted on an exterior of a motor vehicle;
a rechargeable power source; and
a solar cell for recharging said rechargeable power source.
45. A method comprising:
identifying at least one selected object using at least one camera in communication with a user interface device;
determining a gesture of the at least one selected object; and
generating a control output signal from the user interface device when the gesture corresponds to a pre-determined gesture.
(Dependent claim: 46)
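The final method step, generating a control output when an observed gesture corresponds to a pre-determined gesture, amounts to a lookup against a stored gesture set. The gesture-to-command table below is an invented illustration; the patent does not define specific gestures or commands.

```python
# Illustrative sketch of the matching step in the method of claim 45:
# a camera-observed gesture is compared against pre-determined gestures
# and a control output is generated on a match. The table is an assumption.

GESTURE_COMMANDS = {
    "palm_open": "pause_media",
    "point_up": "volume_up",
    "point_down": "volume_down",
}

def generate_control_output(observed_gesture):
    # Generate a control output only when the gesture corresponds to a
    # pre-determined gesture; otherwise produce no signal (None).
    return GESTURE_COMMANDS.get(observed_gesture)

print(generate_control_output("palm_open"))  # → pause_media
print(generate_control_output("wave"))       # → None (no pre-determined match)
```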
Specification