Orientation and navigation for a mobile device using inertial sensors
First Claim
1. A mobile device for enhanced navigation and orientation, comprising:
- a visualization interface;
- a first sensor for providing signals indicative of a movement of the mobile device, the first sensor including an inertial sensor;
- a second sensor for providing further signals indicative of a movement of the mobile device; and
- a processor receiving the signals from the first sensor and the second sensor, the processor calculating a three-dimensional position and a three-dimensional orientation of the mobile device from the received signals, and generating a real time simulation of an environment via the visualization interface based on the calculated position and orientation;
wherein the processor uses signals from the inertial sensor to activate or deactivate user control operations.
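The processor's two roles in claim 1 — deriving a pose from the two sensors' signals and activating or deactivating user controls on inertial input — can be pictured with a minimal sketch. This is purely illustrative, not the patented method: the planar (rather than three-dimensional) state, the yaw-rate/speed inputs, and the 15 m/s² lock threshold are all assumptions.

```python
import math

def update_pose(pose, yaw_rate, speed, dt):
    """Planar dead-reckoning step (illustrative stand-in for the claimed
    3-D position/orientation computation).

    pose     -- (x, y, heading_rad) current estimate
    yaw_rate -- turn rate from the inertial sensor, rad/s
    speed    -- speed derived from the second sensor (e.g. a GPS fix), m/s
    dt       -- sample interval, s
    """
    x, y, heading = pose
    heading += yaw_rate * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return (x, y, heading)

def controls_locked(accel_magnitude, threshold=15.0):
    """Gate user control operations on an inertial signal: report the
    controls as locked while a strong acceleration spike suggests the
    device is being jolted (threshold is an assumed tuning value, m/s^2)."""
    return accel_magnitude >= threshold
```

A real implementation would integrate full 3-D accelerometer and gyroscope data; this sketch only shows the shape of the sensor-to-pose-to-control data flow.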
1 Assignment
0 Petitions
Abstract
A mobile device for enhanced navigation and orientation including a visualization interface, a first sensor for providing signals indicative of a movement of the mobile device, a second sensor for providing further signals indicative of a movement of the mobile device, and a processor receiving signals from the first and second sensors, calculating a position and an orientation of the mobile device from the received signals, and generating a real time simulation of an environment via the visualization interface based on the position and orientation of the mobile device. According to an embodiment, the first and second sensors are implemented as an inertial sensor and a GPS receiver, respectively.
216 Citations
23 Claims
1. A mobile device for enhanced navigation and orientation (independent; set forth in full above as the First Claim). Dependent claims 2-12 not shown.
13. A method for navigating using a mobile device having a visual display, comprising:
- detecting a first set of signals indicative of movement of the mobile device;
- detecting a second set of signals indicative of movement of the mobile device;
- determining a position of the mobile device based on at least one of the first set of signals and the second set of signals;
- determining an orientation of the mobile device based on at least one of the first set of signals and the second set of signals;
- generating at least one of a two-dimensional and a three-dimensional view of an environment based on the position and orientation of the mobile device; and
- controlling user functions of the mobile device based on the first set of signals.
Dependent claims 14-20 not shown.
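The steps of claim 13 can be sketched as one pass of a toy pipeline. Everything concrete here is an assumption, not something the claim specifies: the dictionary shape of each signal set, the 30° tilt threshold for switching between a 2-D and a 3-D view, and the 15 m/s² control cutoff.

```python
import math

def choose_view(pitch_rad):
    # Assumed policy: device held flat -> 2-D map; tilted upright -> 3-D view.
    return "2D" if abs(pitch_rad) < math.radians(30) else "3D"

def navigate_step(first_signals, second_signals):
    """One pass over the method steps, with toy stand-ins.

    first_signals  -- e.g. inertial readings:
                      {"pitch": rad, "accel": m/s^2, "position": (x, y)}
    second_signals -- e.g. a GPS fix: {"position": (x, y)}
    """
    # Determine position from at least one of the two signal sets.
    position = second_signals.get("position") or first_signals.get("position")
    # Determine orientation (here reduced to pitch) from the first set.
    pitch = first_signals["pitch"]
    # Generate a 2-D or 3-D view from the position and orientation.
    view = choose_view(pitch)
    # Control user functions based on the first (inertial) set of signals.
    controls_enabled = abs(first_signals.get("accel", 0.0)) < 15.0
    return {"position": position, "view": view, "controls": controls_enabled}
```

Usage: `navigate_step({"pitch": 0.1, "accel": 0.5}, {"position": (3.0, 4.0)})` prefers the second set's position fix and, with the device nearly flat, selects the 2-D view.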
-
21. A mobile device for enhanced navigation and orientation, comprising:
- a user interface having a visual display;
- an inertial sensor, the inertial sensor detecting a movement of the mobile device, providing feedback signals for user control and for generating an image in the visual display, and generating signals from which a location and an attitude of the mobile device are derived;
- a processor; and
- a means for receiving local navigation image information;
wherein the processor computes a location and an attitude of the mobile device from the signals generated by the inertial sensor and, from the location, the attitude, and information from the receiving means, generates an image of a local environment, the image representing a depiction of the local environment viewed from the computed location and attitude of the mobile device; and wherein, in response to particular signals received from the inertial sensor, the processor activates or deactivates controls of the user interface.
Dependent claim 22 not shown.
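One way to picture claim 21's "depiction of the local environment viewed from the computed location and attitude" is a visibility test: given a pose, keep only the landmarks that fall inside the field of view. This is a toy stand-in, not the claimed image generation; the 90° field of view and any landmark coordinates are hypothetical.

```python
import math

def visible_landmarks(position, heading, landmarks, fov=math.radians(90)):
    """Return the names of landmarks within the field of view (fov)
    centred on the device's heading, as seen from its position.

    position  -- (x, y) computed location
    heading   -- computed attitude reduced to a planar heading, rad
    landmarks -- {name: (x, y)} local navigation information
    """
    x, y = position
    seen = []
    for name, (lx, ly) in landmarks.items():
        bearing = math.atan2(ly - y, lx - x)
        # Signed angular difference, wrapped to [-pi, pi).
        diff = (bearing - heading + math.pi) % (2 * math.pi) - math.pi
        if abs(diff) <= fov / 2:
            seen.append(name)
    return seen
```

An actual device would render textured geometry rather than a name list, but the dependence of the output on both location and attitude is the same.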
23. A visual navigation system comprising:
- a source of navigation information, the information including map information and image information, the source further including means for transmitting the navigation information; and
- a mobile device including:
  - means for receiving navigation information from the source,
  - an inertial sensor,
  - a GPS receiver, and
  - a processor coupled to the inertial sensor, the GPS receiver, and the means for receiving, the processor including:
    - a navigation module, the navigation module calculating a position of the mobile device using data from the inertial sensor and the GPS receiver, and generating map data using the received map information and the calculated position; and
    - a user interface generator module, the user interface generator module calculating an orientation of the mobile device using data from the inertial sensor, generating a three-dimensional simulation using the position calculated by the navigation module, the calculated orientation, and received image information, and controlling user functions in accordance with signals received via the inertial sensor.
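Claim 23's navigation module calculates a position from both the inertial sensor and the GPS receiver. A minimal sketch of such fusion is a complementary blend that dead-reckons between fixes and pulls the estimate toward each GPS fix; the 0.2 blend weight is only an assumed tuning value, not anything the claim recites.

```python
def fuse_position(inertial_pos, gps_pos, gps_weight=0.2):
    """Complementary blend of a dead-reckoned position with a GPS fix.

    inertial_pos -- (x, y) position propagated from inertial data
    gps_pos      -- (x, y) position reported by the GPS receiver
    gps_weight   -- how strongly to trust the fix (assumed value)
    """
    return tuple((1.0 - gps_weight) * i + gps_weight * g
                 for i, g in zip(inertial_pos, gps_pos))
```

A production navigation module would more likely run a Kalman filter, which adapts the weight to the measured noise of each sensor instead of fixing it.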
Specification