Multiple unmanned aerial vehicle autonomous coordination
1 Assignment
0 Petitions
Abstract
At each of a plurality of unmanned aerial vehicles operating in a three-dimensional space, current relative locations of other ones of the plurality of unmanned aerial vehicles which are visible to each of the plurality of unmanned aerial vehicles are determined based on two-dimensional camera images from on-board dual-lens cameras. The current relative locations are wirelessly transmitted to a controller. At the plurality of unmanned aerial vehicles, specification of a path to be followed by each of the plurality of unmanned aerial vehicles is wirelessly received from the controller. At each of the plurality of unmanned aerial vehicles, on-board collision avoidance is carried out based on the two-dimensional camera images from the on-board dual-lens cameras, while following the path specification.
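The claims recite applying multidimensional scaling to determine relative locations. As an illustrative sketch only (not the patented implementation), classical MDS can recover relative 3-D coordinates, up to rotation and translation, from the pairwise inter-vehicle distances that a dual-lens (stereo) camera could estimate; the positions and distances below are hypothetical:

```python
import numpy as np

def classical_mds(D, dims=3):
    """Recover relative coordinates (up to rotation/translation)
    from an n x n matrix of pairwise distances via classical MDS."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered squared distances
    eigvals, eigvecs = np.linalg.eigh(B)     # ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:dims]   # keep the 'dims' largest
    scale = np.sqrt(np.maximum(eigvals[idx], 0.0))
    return eigvecs[:, idx] * scale           # n x dims relative coordinates

# Hypothetical example: four vehicles at assumed positions; in practice the
# distances would come from stereo depth estimates, not ground truth.
true_pos = np.array([[0., 0., 0.], [10., 0., 0.], [0., 10., 0.], [0., 0., 10.]])
D = np.linalg.norm(true_pos[:, None] - true_pos[None, :], axis=-1)
rel = classical_mds(D, dims=3)

# The recovered layout reproduces the measured pairwise distances exactly,
# even though the coordinate frame may be rotated or reflected.
D_rec = np.linalg.norm(rel[:, None] - rel[None, :], axis=-1)
```

Note that MDS fixes only the relative geometry; anchoring it to the observing vehicle's own frame (as the claims require) would additionally use that vehicle's known index in the distance matrix.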
28 Citations
20 Claims
1. A method comprising:

at each of a plurality of unmanned aerial vehicles operating in a three-dimensional space, determining, based on two-dimensional camera images from at least an on-board double lens camera of that unmanned aerial vehicle, current relative locations with reference to that same unmanned aerial vehicle of other ones of said plurality of unmanned aerial vehicles which are visible to that same unmanned aerial vehicle, wherein determining said current relative locations comprises applying multidimensional scaling to said two-dimensional camera images;

wirelessly transmitting said current relative locations from at least one of said plurality of unmanned aerial vehicles to a controller;

wirelessly receiving, at each of said plurality of unmanned aerial vehicles, from said controller, a specification of a path to be followed by that unmanned aerial vehicle; and

at each of said plurality of unmanned aerial vehicles, carrying out on-board collision avoidance based on said two-dimensional camera images, while following said path.

Dependent claims: 2, 3, 4, 5, 6, 7, 19.
8. A system comprising:

a plurality of unmanned aerial vehicles operating in a three-dimensional space, each of said unmanned aerial vehicles in turn comprising a memory, at least one processor coupled to said memory, an on-board double lens camera coupled to said processor, and a wireless interface to a remote controller;

wherein each of said processors of said plurality of unmanned aerial vehicles is configured to:

determine, based on two-dimensional camera images from at least said on-board double lens camera of the unmanned aerial vehicle carrying said processor, current relative locations of other ones of said plurality of unmanned aerial vehicles which are visible to that unmanned aerial vehicle, wherein determining said current relative locations comprises applying multidimensional scaling to said two-dimensional camera images;

wirelessly transmit said current relative locations to the remote controller;

wirelessly receive, from the remote controller, a specification of a path to be followed by the unmanned aerial vehicle carrying said processor; and

carry out on-board collision avoidance based on said two-dimensional camera images from said on-board double lens camera, while following said path.

Dependent claims: 9, 10, 11, 12, 13, 14, 20.
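Each claim's final step has the vehicle follow its controller-supplied path while avoiding collisions on-board. A minimal sketch of one way such a step could work, assuming hypothetical 3-D waypoints, neighbor positions from the vision step, and a made-up 5 m safety radius (none of these specifics appear in the claims):

```python
import math

def avoid_collisions(waypoint, neighbor_positions, min_sep=5.0):
    """Nudge the next path waypoint away from any visible neighbor closer
    than min_sep (an assumed safety radius), otherwise leave it unchanged."""
    x, y, z = waypoint
    for nx, ny, nz in neighbor_positions:
        dx, dy, dz = x - nx, y - ny, z - nz
        dist = math.sqrt(dx * dx + dy * dy + dz * dz)
        if 0.0 < dist < min_sep:
            # Push the waypoint out along the separation vector
            # until it sits on the safety radius.
            scale = (min_sep - dist) / dist
            x, y, z = x + dx * scale, y + dy * scale, z + dz * scale
    return (x, y, z)

# A neighbor 2 m away triggers an avoidance offset; a distant one does not.
adjusted = avoid_collisions((0.0, 0.0, 0.0), [(2.0, 0.0, 0.0), (50.0, 0.0, 0.0)])
# adjusted == (-3.0, 0.0, 0.0): pushed out to the 5 m safety radius
```

This greedy per-neighbor adjustment is only one possible policy; the claims leave the collision-avoidance algorithm itself unspecified beyond its reliance on the two-dimensional camera images.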
15. A non-transitory computer readable medium comprising computer executable instructions which when executed by a computer cause the computer to perform the method of:

at each of a plurality of unmanned aerial vehicles operating in a three-dimensional space, determining, based on two-dimensional camera images from at least an on-board double lens camera of that unmanned aerial vehicle, current relative locations of other ones of said plurality of unmanned aerial vehicles which are visible to that unmanned aerial vehicle, wherein determining said current relative locations comprises applying multidimensional scaling to said two-dimensional camera images;

wirelessly transmitting said current relative locations from each of said plurality of unmanned aerial vehicles to a controller;

wirelessly receiving, at each of said plurality of unmanned aerial vehicles, from said controller, a specification of a path to be followed by that unmanned aerial vehicle; and

at each of said plurality of unmanned aerial vehicles, carrying out on-board collision avoidance based on said two-dimensional camera images, while following said path.

Dependent claims: 16, 17, 18.
Specification