SYSTEM AND METHOD FOR NAVIGATION OF A VIRTUAL ENVIRONMENT ON A HANDHELD DEVICE
Systems and methods for navigation within a virtual environment using a mobile device are disclosed. In embodiments of the invention, one or more novel user interfaces allow a user to navigate within a virtual environment simulated on the touch screen of a mobile device, exploring the virtual environment using navigational elements instead of having to physically move about the virtual environment.
- 1. A computing device, comprising:
a touch screen display; one or more processors; memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including: instructions for detecting one or more finger contacts with the touch screen display; instructions for applying a heuristic to the one or more finger contacts to determine a command for the device; and instructions for processing the command; wherein the heuristic comprises: a first navigational heuristic for movement within a virtual environment along an X-Y axis, wherein a first touch event within a first circular navigational area defined by a first circular navigational element shown on the touch screen display results in movement within the virtual environment; and a second navigational heuristic for rotation within the virtual environment along the X-Y axis direction, wherein a second touch event within a second circular navigational area defined by a second navigational element shown on the touch screen display results in rotational movement within the virtual environment.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 20)
- 9. A computing device, comprising:
a touch screen display; one or more processors; memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to: display on a first location of the touch screen display a first circular navigational element, wherein the first circular navigational element includes a first circular area within which a first touch event results in movement in a virtual environment, and display on a second location of the touch screen display a second circular navigational element, wherein the second circular navigational element includes a second circular area within which a second touch event results in rotational movement within the virtual environment.
- View Dependent Claims (10, 11, 12, 13, 14)
- 15. A computer-executable method for navigating within a virtual environment shown on a digital touch screen device, comprising:
displaying on a first location of a digital screen a first circular navigational element, wherein the first circular navigational element includes a first circular area within which a user's touch results in movement in the virtual environment, wherein when a touch screen device within a portable computing device receives a first touch event within the first circular navigational area, a movement results within the virtual environment, and displaying on a second location of the digital screen a second circular navigational element, wherein the second circular navigational element includes a second circular navigational area within which the user's touch results in rotation of a field of view in the virtual environment, wherein when the touch screen device receives a second touch event within the second circular navigational area, rotational movement results within the virtual environment.
- View Dependent Claims (16, 17, 18, 19)
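The two-element heuristic recited in the claims can be sketched as follows. This is an illustrative Python sketch only: the element centers, the circular radius, and the rotation scaling factor are hypothetical values chosen for the example, not part of the claimed implementation.

```python
import math

# Hypothetical sketch of the claimed two-element heuristic. The radius
# and the 0.5 deg/px rotation scaling below are assumed values.

def in_circle(touch, center, radius):
    """Return True if the touch point falls inside a circular area."""
    dx, dy = touch[0] - center[0], touch[1] - center[1]
    return math.hypot(dx, dy) <= radius

def apply_navigation(touch, move_center, rotate_center, radius=50.0):
    """Map a touch event to a translation or a rotation command.

    A touch inside the first circular element yields X-Y movement;
    a touch inside the second yields rotational movement.
    """
    if in_circle(touch, move_center, radius):
        # Offset from the first element's center sets the direction
        # and magnitude of movement within the virtual environment.
        dx = touch[0] - move_center[0]
        dy = touch[1] - move_center[1]
        return ("move", dx, dy)
    if in_circle(touch, rotate_center, radius):
        # Horizontal offset from the second element's center sets the
        # rotation angle in degrees (assumed scaling).
        dx = touch[0] - rotate_center[0]
        return ("rotate", dx * 0.5)
    return ("none",)

# Example: a touch 30 px to the right of the movement element's center.
cmd = apply_navigation((130.0, 400.0), (100.0, 400.0), (540.0, 400.0))
```

A touch outside both circular areas produces no navigation command, so the rest of the screen remains free for viewing the virtual environment.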
This application claims priority from provisional application No. 62/751,553, entitled “System and Method for Navigation of a Virtual Environment on a Handheld Device,” filed on Oct. 27, 2018, the entirety of which is incorporated herein by reference for all purposes.
One embodiment of the present invention provides a system and user interface for navigation of a virtual environment on handheld mobile devices. A mobile device may be used, for example, to view a virtual environment such as an augmented reality portal. When used to view an immersive virtual environment, the mobile device acts like a window to another world: as the user moves the device around, the screen shows different views of the world depicted by the virtual environment. In other words, just as virtual reality headsets allow a user to be immersed in a virtual environment, the same virtual environment can be simulated on a mobile device, and the user can use the mobile device to look around the virtual environment.
Previously, users had to navigate such an environment by physically moving their mobile device in various directions and orientations to view different areas of a virtual environment. Furthermore, users had to walk within the virtual environment to access various locations within it. However, there are cases where navigation within a virtual environment is difficult because the virtual environment is larger than the real environment in which the user is located. Furthermore, in some cases users may want to navigate a virtual environment without much maneuvering of their handheld device and/or walking around a virtual object. Embodiments of the present invention provide a novel user interface on mobile devices for navigation within virtual environments.
The novel user interface called the “free will” interface allows users to fully navigate a virtual environment without the need to physically move in the virtual environment or move around virtual objects. The user interface includes various buttons, wheels, and sliders that individually or in concert allow the user to efficiently move in all directions and view virtual objects from any angle.
In the figures, like reference numerals refer to the same figure elements.
The following description is presented to enable any person skilled in the art to make and use the invention, and is provided in the context of a particular application and its requirements. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present invention. Thus, the present invention is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
Embodiments of the invention provide a system and method to navigate a virtual environment on handheld devices. In some embodiments the navigational User Interface (UI) may also be used on a desktop computer to navigate a virtual environment. Before describing the operation of the navigational UI, an example of a touch screen device that can be used to show the navigational UI is described below.
An overview of a typical touch screen is provided below. It will be understood by those skilled in the art that the following overview is not limiting; it explains the basic method of operation of touch screen devices. Electronic devices can use different methods to detect a person's input on a touch screen. Most of them use sensors and circuitry to monitor changes in a particular state. Many, including the iPhone (designed and manufactured by Apple, Inc. in California), monitor changes in electrical current. Others monitor changes in the reflection of waves, which can be sound waves or beams of near-infrared light. Other systems may use transducers to measure changes in vibration caused when a finger hits the screen's surface, or cameras to monitor changes in light and shadow.
When a finger is placed on the screen, it may change the state that the device is monitoring. In screens that rely on sound or light waves, a finger physically blocks or reflects some of the waves. Capacitive touch screens, such as the iPhone's, use a layer of capacitive material to hold an electrical charge; touching the screen changes the amount of charge at the specific point of contact. In resistive screens, the pressure from a finger causes conductive and resistive layers of circuitry to touch each other, changing the circuits' resistance. In either case, the touch detected by the hardware may then be translated into data by one or more pieces of firmware, and that data is made available to the operating system, which in turn allows software to receive the data and use it as needed.
In some embodiments, heuristics are used to translate imprecise finger gestures into actions desired by the user. The heuristics may be controlled by the application software or by lower-level software within the operating system. For example, on the iPhone, software (or apps) receives touch data from a class called UIResponder. The hardware generates electronic data resulting from the finger touching the screen and provides that data to the operating system (iOS, in the case of the iPhone). The operating system then provides that data to higher-level software via one or more defined classes.
Attention is now directed towards embodiments of the device.
It should be appreciated that the device 100 is only one example of a portable multifunction device 100, and that the device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of the components. The various components shown in
Memory 102 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of the device 100, such as the processor(s) 110 and the peripherals interface 114, may be controlled by the memory controller 112.
The peripherals interface 114 couples the input and output peripherals of the device to the processor(s) 110 and memory 102. The processors(s) 110 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for the device 100 and to process data.
The I/O subsystem 124 couples input/output peripherals on the device 100, such as the touch screen 132 and other input/control devices 136, to the peripherals interface 114. The I/O subsystem 124 may include a display controller 126 and one or more input controllers 130 for other input or control devices. The input controllers 130 may receive/send electrical signals from/to the other input/control devices 136. The other input/control devices 136 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 130 may be coupled to any (or none) of the following: a keyboard, an infrared port, a USB port, and a pointer device such as a mouse.
The touch-sensitive touch screen 132 provides an input interface and an output interface between the device and a user. As explained above, the display controller 126 receives and/or sends electrical signals from/to the touch screen 132. The touch screen 132 displays visual output to the user. The visual output may include graphics, text, icons, video, and any combination thereof (collectively termed “graphics”, “electronic content”, and/or “electronic data”). In some embodiments, some or all of the visual output may correspond to user-interface objects, further details of which are described below.
A touch screen 132 has a touch-sensitive surface, sensor or set of sensors that accept input from the user based on haptic and/or tactile contact. The touch screen 132 and the display controller 126 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on the touch screen 132 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on the touch screen. In an exemplary embodiment, a point of contact between a touch screen 132 and the user corresponds to a finger of the user.
The touch screen 132 may use LCD (liquid crystal display) technology, or LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments. The touch screen 132 and the display controller 126 may detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with a touch screen 132.
A touch-sensitive display in some embodiments of the touch screen 132 may be analogous to the multi-touch sensitive tablets described in the following U.S. patents: U.S. Pat. No. 6,323,846 (Westerman et al.), U.S. Pat. No. 6,570,557 (Westerman et al.), and/or U.S. Pat. No. 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference in its entirety.
A touch-sensitive display in some embodiments of the touch screen 132 may be as described in the following applications: (1) U.S. patent application Ser. No. 11/381,313, “Multipoint Touch Surface Controller,” filed May 2, 2006; (2) U.S. patent application Ser. No. 10/840,862, “Multipoint Touchscreen,” filed May 6, 2004; (3) U.S. patent application Ser. No. 10/903,964, “Gestures For Touch Sensitive Input Devices,” filed Jul. 30, 2004; (4) U.S. patent application Ser. No. 11/048,264, “Gestures For Touch Sensitive Input Devices,” filed Jan. 31, 2005; (5) U.S. patent application Ser. No. 11/038,590, “Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices,” filed Jan. 18, 2005; (6) U.S. patent application Ser. No. 11/228,758, “Virtual Input Device Placement On A Touch Screen User Interface,” filed Sep. 16, 2005; (7) U.S. patent application Ser. No. 11/228,700, “Operation Of A Computer With A Touch Screen Interface,” filed Sep. 16, 2005; (8) U.S. patent application Ser. No. 11/228,737, “Activating Virtual Keys Of A Touch-Screen Virtual Keyboard,” filed Sep. 16, 2005; and (9) U.S. patent application Ser. No. 11/367,749, “Multi-Functional Hand-Held Device,” filed Mar. 3, 2006. All of these applications are incorporated by reference herein in their entirety.
The touch screen 132 may have a resolution in excess of 100 dpi. In an exemplary embodiment, the touch screen may have a resolution of approximately 326 to 401 dpi or more. The user may make contact with the touch screen 132 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which are much less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user, using various heuristics.
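One such heuristic can be sketched as snapping an imprecise finger contact to the nearest on-screen target. The target names, coordinates, and tolerance below are illustrative assumptions for the sketch, not the device's actual implementation.

```python
import math

# Hypothetical snapping heuristic: a rough contact point is resolved to
# the nearest target whose center lies within an assumed tolerance.

def snap_to_target(touch, targets, tolerance=40.0):
    """Return the name of the target nearest the contact point, or None.

    `targets` maps a target name to its (x, y) center; the tolerance
    accounts for the large contact area of a fingertip.
    """
    best, best_dist = None, tolerance
    for name, center in targets.items():
        dist = math.hypot(touch[0] - center[0], touch[1] - center[1])
        if dist <= best_dist:
            best, best_dist = name, dist
    return best

# Illustrative targets: the two circular navigational elements.
targets = {"move_wheel": (100, 400), "rotate_wheel": (540, 400)}
```

A contact that lands near neither element resolves to no target, so stray touches do not trigger navigation.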
In some embodiments, the software components stored in memory 102 may include a navigational UI 104 which depicts various shapes on the touch-sensitive display 132 for navigation within a virtual environment. Memory 102 may include other modules that store various other control logic, such as an operating system, a communication module (or set of instructions), a contact/motion module (or set of instructions), a graphics module (or set of instructions), a text input module (or set of instructions), a Global Positioning System (GPS) module (or set of instructions), and applications (or set of instructions).
The operating system (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
The contact/motion module may detect contact with the touch screen 132 (in conjunction with the display controller 126) and other touch sensitive devices (e.g., a touchpad or physical click wheel). The contact/motion module includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred, determining if there is movement of the contact and tracking the movement across the touch screen 132, and determining if the contact has been broken (i.e., if the contact has ceased). Determining movement of the point of contact may include determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations may be applied to single contacts (e.g., one finger contacts) or to multiple simultaneous contacts (e.g., “multitouch”/multiple finger contacts). In some embodiments, the contact/motion module and the display controller 126 also detect contact on a touchpad.
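The motion bookkeeping described above can be sketched as follows, assuming the module keeps timestamped (x, y, t) contact samples; the sample format is an illustrative assumption, not the module's actual data structure.

```python
# Hypothetical sketch of contact/motion calculations: velocity and
# acceleration of a tracked contact from timestamped (x, y, t) samples.

def velocity(p0, p1):
    """Velocity (vx, vy) between two (x, y, t) samples."""
    (x0, y0, t0), (x1, y1, t1) = p0, p1
    dt = t1 - t0
    return ((x1 - x0) / dt, (y1 - y0) / dt)

def acceleration(p0, p1, p2):
    """Change in velocity per unit time across three samples."""
    v0, v1 = velocity(p0, p1), velocity(p1, p2)
    dt = p2[2] - p1[2]
    return ((v1[0] - v0[0]) / dt, (v1[1] - v0[1]) / dt)

# Three samples of a contact speeding up along the x axis.
samples = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.1), (30.0, 0.0, 0.2)]
```

Speed, as a scalar, would simply be the magnitude of the velocity vector.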
In another embodiment, if a virtual object is in front of the field of view of the user, swiping a finger along the curved line defined by area 105 may rotate the field of view of the user around the virtual object. Therefore, the functionality of the area 105 may change depending on the location of the user within the virtual environment. Arrows 109 can be used to elevate the field of view of the user; specifically, arrows 109 may allow the user to “fly” within the virtual environment.
In one embodiment, the area 106 can be sensitive to the amount of force exerted by a user's finger on the digital display that shows the UI, such that more force can speed up the movement within the virtual environment. For example, in a large environment, once a “force touch” (i.e., greater pressure applied to the touch screen display) is detected, the movement of the user can be accelerated in that direction. Some mobile devices are equipped with sensors that provide haptic feedback to the user by making the device vibrate in a certain manner. The haptic feedback may also be used to indicate to the user that the acceleration mode has been activated.
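A hypothetical mapping from touch force to movement speed might look as follows; the normalized-force threshold and the maximum multiplier are assumed values for illustration, not taken from the disclosure.

```python
# Illustrative force-to-speed mapping (assumed thresholds). The second
# return value flags when the acceleration mode engages, at which point
# haptic feedback could be triggered.

FORCE_THRESHOLD = 0.6   # assumed normalized-force cutoff for "force touch"
MAX_MULTIPLIER = 3.0    # assumed top speed multiplier

def speed_multiplier(force):
    """Return (multiplier, accelerated) for a normalized force in [0, 1]."""
    force = max(0.0, min(1.0, force))
    if force < FORCE_THRESHOLD:
        return (1.0, False)  # light touch: normal movement speed
    # Scale linearly from 1x at the threshold up to MAX_MULTIPLIER.
    span = (force - FORCE_THRESHOLD) / (1.0 - FORCE_THRESHOLD)
    return (1.0 + span * (MAX_MULTIPLIER - 1.0), True)
```

With such a mapping, a light touch moves the user at normal speed, while pressing harder smoothly accelerates movement toward the touched direction.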
It is important to note that in the context of movement within a virtual environment, the screen of the portable device (for example, a mobile phone) can essentially act as a window into another world. As such, the user is free to move the device in any direction as if turning his head around, and while doing so, the screen shows the view from that specific angle. The novel design of the UI 104 allows the user to remain stationary while tilting the device up and down and from side to side to see the virtual environment around him, and to use the navigational elements to move about the environment. Note that the user can technically turn around to see what is behind him in the virtual environment instead of using the navigational area 105, but in a scenario where the user is sitting, turning around would be inconvenient. In that case the user can still freely move and tilt the phone up and down and side to side to see the virtual environment, and use the navigational areas 105 and 106 to move and rotate within the environment. The ability of the user to move the phone to see various angles, in concert with the novel navigational design of UI 104, allows a user to conveniently navigate through an immersive virtual environment on a portable device without having to physically move about that environment.
The data structures and code described in this detailed description are typically stored on a computer-readable storage medium, which may be any device or medium that can store code and/or data for use by a computer system. The computer-readable storage medium includes, but is not limited to, volatile memory, non-volatile memory, magnetic and optical storage devices such as disk drives, magnetic tape, CDs (compact discs), DVDs (digital versatile discs or digital video discs), or other media capable of storing code and/or data, now known or later developed.
The methods and processes described in the detailed description section can be embodied as code and/or data, which can be stored in a computer-readable storage medium as described above. When a computer system reads and executes the code and/or data stored on the computer-readable storage medium, the computer system performs the methods and processes embodied as data structures and code and stored within the computer-readable storage medium.
Furthermore, methods and processes described herein can be included in hardware modules or apparatus. These modules or apparatus may include, but are not limited to, an application-specific integrated circuit (ASIC) chip, a field-programmable gate array (FPGA), a dedicated or shared processor that executes a particular software module or a piece of code at a particular time, and/or other programmable-logic devices now known or later developed. When the hardware modules or apparatus are activated, they perform the methods and processes included within them.