Local augmented reality persistent sticker objects
First Claim
1. A method for local augmented reality (AR) tracking, the method comprising:
capturing, using an image sensor and one or more processors of a device, a first plurality of images of a scene;
displaying the first plurality of images on a display of the device;
receiving, at an input component of the device, a first user selection of an AR sticker object;
receiving, at the input component of the device, a second user selection placing the AR sticker object relative to a first image of the first plurality of images as displayed on the display of the device;
processing, using the one or more processors, one or more images of the first plurality of images to generate a local AR model of the scene; and
adding the AR sticker object to the local AR model of the scene for local tracking of the AR sticker object and presentation of the AR sticker object with AR images on the display of the device, wherein adding the AR sticker object to the local AR model of the scene for local tracking of the AR sticker object comprises:
identifying a target template associated with the AR sticker object described by Ji=I0(si);
wherein Ji is the target template associated with the AR sticker object, the target template comprising a set of color values sampled at a plurality of sample points si associated with a target and a target area in an initial AR model image I0 of the one or more images of the first plurality of images, and i is an integer set.
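The equation Ji=I0(si) says that the target template is simply the set of color values of the initial AR model image sampled at points si covering the target area. A minimal sketch in Python/NumPy (the function name, grid shape, and sample spacing are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def build_target_template(I0, center, half_size=8, step=2):
    """Build a target template J_i = I_0(s_i): color values of the initial
    AR model image I0 sampled at a grid of points s_i around the spot where
    the user placed the AR sticker.  The regular grid is an assumed choice;
    the claim only requires a plurality of sample points."""
    r0, c0 = center
    rows = np.arange(r0 - half_size, r0 + half_size + 1, step)
    cols = np.arange(c0 - half_size, c0 + half_size + 1, step)
    # s_i: (row, col) sample points covering the target area
    s = np.array([(r, c) for r in rows for c in cols])
    # J_i: the color value of I0 at each sample point s_i
    J = I0[s[:, 0], s[:, 1]]
    return s, J
```

Later frames can then be compared against J at (possibly shifted) sample points to track the target.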
Abstract
Systems and methods for local augmented reality (AR) tracking of an AR object are disclosed. In one example embodiment a device captures a series of video image frames. A user input is received at the device associating a first portion of a first image of the video image frames with an AR sticker object and a target. A first target template is generated to track the target across frames of the video image frames. In some embodiments, global tracking based on a determination that the target is outside a boundary area is used. The global tracking comprises using a global tracking template for tracking movement in the video image frames captured following the determination that the target is outside the boundary area. When the global tracking determines that the target is within the boundary area, local tracking is resumed along with presentation of the AR sticker object on an output display of the device.
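The local/global handoff in the abstract amounts to a small state machine: track the target locally while it stays inside the boundary area, fall back to global tracking of overall movement when it leaves, and resume local tracking (with presentation of the sticker) when it returns. A hedged sketch, where local_step, global_step, and in_boundary are assumed callbacks rather than interfaces from the patent:

```python
def run_tracking(frames, local_step, global_step, in_boundary):
    """Sketch of the local/global tracking handoff.  local_step and
    global_step each map a frame to an estimated target position (or None);
    in_boundary tests whether a position lies inside the boundary area.
    Returns, per frame, the target position while the sticker is displayed
    (local tracking active) or None while global tracking is in effect."""
    mode = "local"
    displayed = []
    for frame in frames:
        if mode == "local":
            pos = local_step(frame)
            if pos is None or not in_boundary(pos):
                # Target left the boundary area: switch to global tracking,
                # which follows movement via a global tracking template.
                mode = "global"
                pos = None
        else:
            pos = global_step(frame)
            if pos is not None and in_boundary(pos):
                # Target is back inside the boundary area: resume local
                # tracking and presentation of the AR sticker object.
                mode = "local"
            else:
                pos = None
        displayed.append(pos)
    return displayed
```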
20 Claims
1. A method for local augmented reality (AR) tracking (claim 1, set out in full under First Claim above). - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13)
14. A device comprising:
a display;
an input component coupled to the display;
a memory coupled to the display and the input component;
an image sensor; and
one or more processors coupled to the display, the image sensor, the input component, and the memory, the one or more processors configured to process video image frames captured by the image sensor and output local AR images using local AR tracking of an AR sticker object by:
processing a user input associating a first portion of a first image of the video image frames with the AR sticker object and a target;
generating, based on the user input and the first portion of the first image, a first target template associated with the target;
tracking the target across frames of the video image frames following the first image by calculating changes in the first portion of the first image using the first target template;
initiating a global tracking based on a determination that the target is outside a boundary area, the global tracking comprising using a global tracking template for tracking movement in the video image frames captured following the determination that the target is outside the boundary area; and
resuming tracking the target when the global tracking determines that the target is within the boundary area, and displaying the AR sticker object on the display based on the tracking of the target;
wherein tracking the target across the frames of the video image frames following the first image by calculating the changes in the first portion of the first image using the first target template comprises:
identifying the first target template associated with the AR sticker object described by Ji=I0(si);
wherein Ji is the first target template associated with the AR sticker object for integer set i, the first target template comprising a set of color values sampled at a plurality of sample points si associated with the target and a target area in an initial AR model image I0 of one or more images of the first plurality of images.
- View Dependent Claims (15)
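"Calculating changes in the first portion of the first image using the first target template" can be sketched as finding the shift d that minimizes the color difference between the stored template Ji and the current frame sampled at si + d. The brute-force integer-translation search below is an illustrative stand-in for whatever optimization the patent actually uses:

```python
import numpy as np

def template_error(frame, s, J, d):
    """Sum of squared color differences between the template J_i and the
    current frame sampled at the shifted points s_i + d."""
    pts = s + np.asarray(d)
    samples = frame[pts[:, 0], pts[:, 1]].astype(np.int64)
    return int(((samples - J.astype(np.int64)) ** 2).sum())

def local_track_step(frame, s, J, search=3):
    """Return the integer translation d, within +/- search pixels, that best
    aligns the stored template with the current frame.  (A real tracker
    would use a richer motion model and sub-pixel refinement.)"""
    best_d, best_e = None, None
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            e = template_error(frame, s, J, (dr, dc))
            if best_e is None or e < best_e:
                best_d, best_e = (dr, dc), e
    return best_d
```

Boundary checking then reduces to testing whether s + d has drifted outside the boundary area, which is the trigger for the global-tracking fallback described above.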
16. A non-transitory computer readable medium comprising instructions that, when performed by one or more processors of a device, cause the device to perform a method comprising:
processing a user input associating a first portion of a first image of video image frames captured by an image sensor with an AR sticker object and a target;
generating, based on the user input and the first portion of the first image, a first target template associated with the target; and
tracking the target across frames of the video image frames following the first image by calculating changes in the first portion of the first image using the first target template by identifying a target template associated with the AR sticker object described by Ji=I0(si);
wherein Ji is the target template associated with the AR sticker object, the target template comprising a set of color values sampled at a plurality of sample points si associated with a target and a target area in an initial AR model image I0 of the one or more images of the first plurality of images, and i is an integer set. - View Dependent Claims (17, 18, 19, 20)
Specification