VIRTUAL CAMERA INTERFACE AND OTHER USER INTERACTION PARADIGMS FOR A FLYING DIGITAL ASSISTANT
Abstract
Methods and systems are described for new paradigms of user interaction with an unmanned aerial vehicle (referred to as a flying digital assistant, or FDA) using a portable multifunction device (PMD) such as a smartphone. In some embodiments, a user may control image capture from an FDA by adjusting the position and orientation of a PMD. In other embodiments, a user may input a touch gesture via a touch display of a PMD that corresponds to a flight path to be autonomously flown by the FDA.
174 Citations
37 Claims
1. A computer-implemented method for providing an aerial view of a physical environment using a flying digital assistant (FDA) and a portable multifunction device (PMD), the method comprising:

detecting a first position and orientation of the PMD relative to a first point of reference, wherein the detected first position and orientation are based in part on sensor data gathered from sensors associated with the PMD;

transforming the detected first position and orientation of the PMD relative to the first point of reference into a second position and orientation relative to a second point of reference, wherein the transforming includes applying a scale factor to the first position; and

generating a representation of a field of view of the physical environment from the second position and orientation relative to the second point of reference, the generated representation configured for presentation via a display device, wherein the representation of the field of view is generated based in part on sensor data gathered by one or more sensors associated with the FDA in autonomous flight over the physical environment.

- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11)
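The transform recited in claim 1 — applying a scale factor to the PMD's position before re-anchoring it at a second point of reference, while carrying the orientation over unchanged — can be sketched as below. This is a minimal illustration only: the function name, the use of 3-tuples for positions, and the explicit second-origin parameter are assumptions, not part of the claim.

```python
def transform_pose(first_position, first_orientation, scale_factor, second_origin):
    """Map a PMD pose relative to a first point of reference into a pose
    relative to a second point of reference.

    The scale factor is applied to the position only, so a small hand
    movement of the PMD can map to a large displacement of the virtual
    camera; the orientation is passed through unchanged.
    """
    second_position = tuple(
        o + scale_factor * p for o, p in zip(second_origin, first_position)
    )
    return second_position, first_orientation
```

With a scale factor of 10, for example, moving the PMD 0.1 m along x displaces the virtual camera 1 m along x relative to the second point of reference.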
12. A system for providing an aerial view of a physical environment using a flying digital assistant (FDA) and a portable multifunction device (PMD), the system comprising:

one or more processors; and

one or more memory units having instructions stored thereon which, when executed by the one or more processors, cause the system to:

detect a first position and orientation of the PMD relative to a first point of reference, wherein the detected first position and orientation are based in part on sensor data gathered from sensors associated with the PMD;

transform the detected first position and orientation of the PMD relative to the first point of reference into a second position and orientation relative to a second point of reference, wherein the transforming includes applying a scale factor to the first position; and

generate a representation of a field of view of the physical environment from the second position and orientation relative to the second point of reference, the generated representation configured for presentation via a display of the PMD, wherein the representation of the field of view is generated based in part on sensor data gathered by one or more sensors associated with the FDA in autonomous flight over the physical environment.

- View Dependent Claims (13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23)
24. A computer-implemented method for providing an aerial view of a physical environment using a flying digital assistant (FDA) and a portable multifunction device (PMD), the method comprising:

presenting, via a touch display of the PMD, a visual representation of the aerial view of the physical environment, wherein the visual representation is generated based in part on sensor data gathered by one or more sensors associated with the FDA in flight over the physical environment;

receiving, via the touch display of the PMD, a touch gesture indicating a selection of a point or area in the visual representation of the aerial view;

identifying a point of reference in the physical environment corresponding to the selected point or area; and

generating control commands configured to cause the FDA to autonomously fly and/or adjust image capture relative to the point of reference.

- View Dependent Claims (25, 26, 27, 28, 29, 30, 31)
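One way to "identify a point of reference in the physical environment corresponding to the selected point or area" is to cast a ray from the camera through the touch point and intersect it with the ground. The sketch below assumes a pinhole camera looking straight down at a flat ground plane z = 0; the function name, the vertical-field-of-view parameter, and those geometric assumptions are illustrative only and not taken from the patent.

```python
import math

def touch_to_ground_point(touch_xy, screen_wh, camera_xyz, vfov_deg=60.0):
    """Project a touch on the PMD's display into a ground-plane point.

    Assumes the aerial view comes from a pinhole camera at altitude
    camera_xyz[2], looking straight down at flat ground z = 0, with a
    vertical field of view of vfov_deg degrees.
    """
    # Normalized device coordinates in [-1, 1]; screen y grows downward.
    nx = 2.0 * touch_xy[0] / screen_wh[0] - 1.0
    ny = 1.0 - 2.0 * touch_xy[1] / screen_wh[1]
    # View-frustum half-extents at unit depth (square pixels assumed).
    tan_half_v = math.tan(math.radians(vfov_deg) / 2.0)
    tan_half_h = tan_half_v * (screen_wh[0] / screen_wh[1])
    # The ray travels the camera's altitude straight down before hitting
    # the ground, so lateral offsets scale with that altitude.
    depth = camera_xyz[2]
    return (
        camera_xyz[0] + nx * tan_half_h * depth,
        camera_xyz[1] + ny * tan_half_v * depth,
        0.0,
    )
```

A touch at the screen center maps to the point directly below the camera; touches away from center map to ground points offset in proportion to altitude.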
32. A system for providing an aerial view of a physical environment using a flying digital assistant (FDA) and a portable multifunction device (PMD), the system comprising:

one or more processors; and

one or more memory units having instructions stored thereon which, when executed by the one or more processors, cause the system to:

present, via a touch display of the PMD, a visual representation of the aerial view of the physical environment, wherein the visual representation is generated based in part on sensor data gathered by one or more sensors associated with the FDA in flight over the physical environment;

receive, via the touch display of the PMD, a touch gesture indicating a selection of a point or area in the visual representation of the aerial view;

identify a point of reference in the physical environment corresponding to the selected point or area; and

generate control commands configured to cause the FDA to autonomously fly and/or adjust image capture relative to the point of reference.

- View Dependent Claims (33)
34. A computer-implemented method for controlling image capture by a flying digital assistant (FDA) in autonomous flight over a physical environment using a portable multifunction device (PMD), the method comprising:

capturing images by one or more image capture devices associated with the FDA, the image capture characterized by a line of sight;

tracking an orientation of the PMD;

adjusting the line of sight of image capture at the FDA to match the tracked orientation of the PMD; and

displaying the captured images on a display device.

- View Dependent Claims (35, 36, 37)
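Adjusting the line of sight to match the PMD's tracked orientation could be realized as a rate-limited gimbal step toward the PMD's yaw and pitch on each control tick. The following is a sketch under assumed names (axis dictionary, per-tick step limit); a real FDA would issue these commands through its flight-control stack.

```python
def step_line_of_sight(gimbal_deg, pmd_deg, max_step_deg=5.0):
    """Return the next gimbal yaw/pitch command (degrees), moving toward
    the PMD's tracked orientation by at most max_step_deg per axis per
    control tick.
    """
    command = {}
    for axis in ("yaw", "pitch"):
        error = pmd_deg[axis] - gimbal_deg[axis]
        # Wrap the error into [-180, 180) so the gimbal turns the short way.
        error = (error + 180.0) % 360.0 - 180.0
        # Clamp to the per-tick step limit for smooth motion.
        step = max(-max_step_deg, min(max_step_deg, error))
        command[axis] = gimbal_deg[axis] + step
    return command
```

Called repeatedly at the control rate, the gimbal converges on the PMD's orientation without abrupt jumps, taking the shorter yaw direction across the 0/360 boundary.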
Specification