Method and System for Visualization Enhancement for Situational Awareness
First Claim
11-1. The method of claim 1, further comprising the steps of:
- providing a built-in 3D viewer;
- displaying a 3D map in addition to the navigation and camera images;
- locking the viewpoints of the three images to each other so that zooming and panning on the image affect the 3D view and vice versa;
- adjusting the 3D map, navigation map, and image to match the pan/tilt/zoom commands of the operator.
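The patent text contains no code; as an illustration only, the viewpoint-locking step above can be sketched as a shared pan/tilt/zoom state that broadcasts every operator command to all registered views. All class and method names here are hypothetical, not from the patent.

```python
# Hypothetical sketch of the "locked viewpoints" step: one shared
# pan/tilt/zoom state drives the camera image, navigation map, and 3D view.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Viewpoint:
    pan: float = 0.0   # degrees
    tilt: float = 0.0  # degrees
    zoom: float = 1.0  # magnification factor

class LockedViews:
    """Broadcasts operator pan/tilt/zoom commands to every registered view."""
    def __init__(self) -> None:
        self.state = Viewpoint()
        self.listeners: List[Callable[[Viewpoint], None]] = []

    def register(self, on_change: Callable[[Viewpoint], None]) -> None:
        self.listeners.append(on_change)

    def apply(self, dpan: float = 0.0, dtilt: float = 0.0,
              zoom_factor: float = 1.0) -> None:
        # One command adjusts the shared state, then every view is notified,
        # so zooming/panning one image affects the 3D view and vice versa.
        self.state.pan = (self.state.pan + dpan) % 360.0
        self.state.tilt = max(-90.0, min(90.0, self.state.tilt + dtilt))
        self.state.zoom = max(0.1, self.state.zoom * zoom_factor)
        for notify in self.listeners:
            notify(self.state)
```

Registering the camera view, navigation map, and 3D viewer as listeners means a single `apply` call keeps all three in sync.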
Abstract
An after-action mission review tool that displays camera and navigation sensor data, allowing a user to pan, tilt, and zoom through the images from the front and rear cameras on a vehicle while simultaneously viewing time/date information, along with any available navigation information such as the latitude and longitude of the vehicle at that time instant.
Also displayed is a visual representation of the path the vehicle traversed; when the user clicks on the path, the image automatically changes to the image corresponding to that position. If aerial images of the area are available, the path can be plotted on the geo-referenced image.
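The path-click behavior described in the abstract amounts to a nearest-neighbor lookup over the recorded track. The following is a minimal sketch under a flat, small-area assumption; the function name and data layout are illustrative, not from the patent.

```python
# Hypothetical sketch: given a click on the plotted path, find the nearest
# recorded position so the viewer can load the matching camera frame.
import math

def nearest_path_index(path, click_lat, click_lon):
    """path: list of (timestamp, lat, lon) tuples.
    Returns the index of the recorded point closest to the click."""
    def sq_dist(point):
        _, lat, lon = point
        # Small-area approximation: scale longitude by cos(latitude) so
        # degrees east-west and north-south are comparable distances.
        dlat = lat - click_lat
        dlon = (lon - click_lon) * math.cos(math.radians(click_lat))
        return dlat * dlat + dlon * dlon
    return min(range(len(path)), key=lambda i: sq_dist(path[i]))
```

The timestamp stored at the returned index identifies which camera image to display.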
13. A program storage device readable by a computer machine, tangibly embodying program instructions executable by the computer machine to perform method steps for visualization enhancement for situational awareness, said method steps comprising:
- providing a graphical user interface enabling a user to interact with the software program running on the computer machine;
- the computer machine providing a database of vehicle mission recordings;
- selecting one or more mission recordings;
- displaying a GUI window comprising: a navigation map associated with the mission recording, wherein the navigation map displays the path traveled by an Unmanned Ground Vehicle (UGV) in a first color and, if available, the current position of the vehicle is indicated by a shape in a second color; and a status display;
- taking one or more camera images from the mission recording;
- overlaying on top of the image display the cardinal direction of the displayed camera image;
- changing the direction with camera panning within the still-frame image;
- progressing through the timeline of a mission using a horizontal scroll bar slider, wherein each slider increment is equivalent to moving one second in time;
- allowing a user to jump to any point from the beginning to the end of the mission;
- quickly incrementing or decrementing the slider;
- allowing the user to toggle between the front camera and the rear camera of the vehicle; and
- providing a set of playback controls.

View Dependent Claims (14, 15, 16, 17, 18)
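The timeline scroll bar in claim 13 maps one slider increment to one second of mission time. A minimal sketch of that mapping follows; the class, frame-rate parameter, and method names are assumptions for illustration only.

```python
# Hypothetical sketch of the claim 13 timeline: a scroll bar slider where
# each increment moves one second through the mission recording.
class MissionTimeline:
    """Maps a one-second-per-increment scroll bar onto a mission recording."""
    def __init__(self, duration_seconds: int, fps: float = 30.0) -> None:
        self.duration = duration_seconds
        self.fps = fps
        self.slider = 0  # valid range: 0 .. duration_seconds

    def jump(self, slider_value: int) -> int:
        """Jump to any point in the mission; returns the camera frame index."""
        self.slider = max(0, min(self.duration, slider_value))
        return int(self.slider * self.fps)

    def step(self, increments: int = 1) -> int:
        """Quickly increment or decrement the slider by whole seconds."""
        return self.jump(self.slider + increments)
```

Clamping in `jump` keeps both direct jumps and repeated stepping inside the bounds of the recording.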
20. The method of claim 19, further comprising the steps of:
- selecting or clicking on a static object of interest (OOI) through two or more frames;
- using the navigation information of the robot in those frames, along with the positions clicked or selected in the image;
- triangulating the object to its global position;
- adjusting the virtual camera to this new focus point;
- moving forward and backward in the video stream;
- centering the camera image by panning and tilting the virtual camera around the object; and
- viewing the object from multiple angles.
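The triangulation step in claim 20 can be illustrated in two dimensions: each click, combined with the robot's pose in that frame, defines a sight ray, and the object lies at the intersection of two such rays. This is a simplified flat-ground sketch with hypothetical names, not the patent's actual implementation.

```python
# Hypothetical 2D sketch of the claim 20 triangulation step: two robot
# positions plus the bearing to the clicked object in each frame give two
# rays whose intersection is the object's global position.
import math

def triangulate(p1, bearing1_deg, p2, bearing2_deg):
    """p1, p2: (x, y) robot positions; bearings in degrees from the +x axis.
    Returns the (x, y) ray intersection, or None if the rays are parallel."""
    d1 = (math.cos(math.radians(bearing1_deg)), math.sin(math.radians(bearing1_deg)))
    d2 = (math.cos(math.radians(bearing2_deg)), math.sin(math.radians(bearing2_deg)))
    denom = d1[0] * d2[1] - d1[1] * d2[0]  # 2D cross product of directions
    if abs(denom) < 1e-9:
        return None  # parallel sight lines: the object cannot be triangulated
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom  # distance along the first ray
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```

The returned point would then become the focus for the virtual camera as it pans and tilts around the object.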
21. The method of claim 19, further comprising the steps of:
- displaying the paths of the robots from both missions on the map in two different colors;
- clicking the map on an overlapping location;
- navigating to that location using the buttons on the app, which brings up the two images side by side;
- using navigation information from the time the images were taken to align the orientation of the images so that the images shown on the screen show the same area of the environment;
- locking the images so that moving a finger on the screen changes the pan/tilt/zoom of both images; and
- comparing images from two separate missions.

View Dependent Claims (22)
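The orientation-alignment step in claim 21 reduces to offsetting one image's pan by the difference between the two recorded vehicle headings, so both panoramas face the same world direction. A minimal sketch, with an illustrative function name:

```python
# Hypothetical sketch of the claim 21 alignment step: offset image B's pan
# by the heading difference so both images show the same area.
def aligned_pan(pan_a_deg, heading_a_deg, heading_b_deg):
    """Return the pan (degrees) for image B that faces the same world
    direction as image A, given each vehicle's recorded heading."""
    world_direction = (heading_a_deg + pan_a_deg) % 360.0
    return (world_direction - heading_b_deg) % 360.0
```

Once aligned, locking both views means a single touch gesture updates the pan of each image by the same amount, preserving the alignment.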
Specification