Methods of obtaining panoramic images using rotationally symmetric wide-angle lenses and devices thereof

0 Associated Cases · 0 Associated Defendants · 0 Accused Products · 32 Forward Citations · 0 Petitions · 1 Assignment
First Claim
1. A panoramic imaging system comprising:
an image acquisition means for acquiring a wide-angle image using a wide-angle imaging lens rotationally symmetric about an optical axis;
an image processing means for generating a panoramic image based on the said wide-angle image; and
an image display means for displaying the said panoramic image on a rectangular screen,
wherein a coordinate of an image point on the rectangular screen corresponding to an object point having a coordinate (X, Y, Z) in a world coordinate system, which has a nodal point of the wide-angle lens as an origin, a vertical line passing through the origin as the Y-axis, and the intersection line between a reference plane (containing the said Y-axis and the said optical axis of the lens) and a horizontal plane perpendicular to the said vertical line as the Z-axis, is given as (x″, y″), and
a horizontal incidence angle ψ, which an incident ray originating from the said object point makes with the said reference plane, is given as
Abstract
The present invention provides methods of obtaining panoramic images that appear most natural to the naked eye by executing a mathematically precise image processing operation on a wide-angle image acquired using a wide-angle lens that is rotationally symmetric about an optical axis, and devices using the methods. Imaging systems using these methods can be used not only in security and surveillance applications for indoor and outdoor environments, but also in diverse areas such as video phones for apartment entrance doors, rear-view cameras for vehicles, visual sensors for unmanned aerial vehicles and robots, and broadcasting cameras. They can also be used to obtain panoramic photographs with digital cameras.
43 Citations
OMNIDIRECTIONAL IMAGING OPTICS WITH 360° SEAMLESS TELESCOPIC RESOLUTION
Patent #
US 20110221767A1
Filed 08/30/2010

Current Assignee
Physical Optics Corporation

Sponsoring Entity
Physical Optics Corporation

Method and apparatus for correction of an image from a fisheye lens in a camera  
Patent #
US 20100194850A1
Filed 01/30/2009

Current Assignee
Panasonic Automotive Systems Company of America

Sponsoring Entity
Panasonic Automotive Systems Company of America

METHOD AND APPARATUS FOR OBTAINING PANORAMIC AND RECTILINEAR IMAGES USING ROTATIONALLY SYMMETRIC WIDE-ANGLE LENS
Patent #
US 20100208032A1
Filed 07/24/2008

Current Assignee
NANOPHOTONICS CO. LTD.

Sponsoring Entity
NANOPHOTONICS CO. LTD.

UNMANNED AERIAL VEHICLE AND METHOD FOR CONTROLLING THE UNMANNED AERIAL VEHICLE  
Patent #
US 20120296497A1
Filed 01/13/2012

Current Assignee
Beijing Jingdong Century Trading Co. Ltd.

Sponsoring Entity
Hon Hai Precision Industry Co. Ltd.

IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, STORAGE MEDIUM, AND IMAGE PROCESSING SYSTEM  
Patent #
US 20130057542A1
Filed 09/06/2012

Current Assignee
Ricoh Company Limited

Sponsoring Entity
Ricoh Company Limited

PANORAMIC STEREO CATADIOPTRIC IMAGING  
Patent #
US 20130208083A1
Filed 02/15/2012

Current Assignee
City University of Hong Kong

Sponsoring Entity
City University of Hong Kong

Method and apparatus for obtaining panoramic and rectilinear images using rotationally symmetric wide-angle lens
Patent #
US 8,553,069 B2
Filed 07/24/2008

Current Assignee
NANOPHOTONICS CO. LTD.

Sponsoring Entity
Gyeongil Kweon

Unmanned aerial vehicle and method for controlling the unmanned aerial vehicle  
Patent #
US 8,554,462 B2
Filed 01/13/2012

Current Assignee
Beijing Jingdong Century Trading Co. Ltd.

Sponsoring Entity
Hon Hai Precision Industry Co. Ltd.

WIDE FOV CAMERA IMAGE CALIBRATION AND DEWARPING  
Patent #
US 20140085409A1
Filed 03/15/2013

Current Assignee
GM Global Technology Operations LLC

Sponsoring Entity
GM Global Technology Operations LLC

Image correcting device, method for creating corrected image, correction table creating device, method for creating correction table, program for creating correction table, and program for creating corrected image  
Patent #
US 8,724,922 B2
Filed 02/25/2011

Current Assignee
Hitachi Information Telecommunication Engineering Ltd.

Sponsoring Entity
Hitachi Information Telecommunication Engineering Ltd.

Omnidirectional imaging optics with 360° seamless telescopic resolution
Patent #
US 8,743,199 B2
Filed 08/30/2010

Current Assignee
Physical Optics Corporation

Sponsoring Entity
Physical Optics Corporation

Image processing apparatus, image processing method, storage medium, and image processing system  
Patent #
US 8,854,359 B2
Filed 09/06/2012

Current Assignee
Ricoh Company Limited

Sponsoring Entity
Ricoh Company Limited

Method and apparatus for correction of an image from a fisheye lens in a camera  
Patent #
US 8,988,492 B2
Filed 01/30/2009

Current Assignee
Panasonic Automotive Systems Company of America

Sponsoring Entity
Panasonic Automotive Systems Company of America

IMAGING CONTROLLER AND IMAGING CONTROL METHOD AND PROGRAM  
Patent #
US 20150222816A1
Filed 09/03/2013

Current Assignee
Ricoh Company Limited

Sponsoring Entity
Hideaki Yamamoto, Tomonori Tanaka, Satoshi Sawaguchi, Hiroyuki Satoh, Kensuke Masuda, Yoichi Ito, Nozomi Imae, Daisuke Bessho, Makoto Shohara, Hirokazu Takenaka, Yoshiaki Irino

MULTIPLE CAMERA PANORAMIC IMAGE CAPTURE APPARATUS  
Patent #
US 20150304559A1
Filed 03/26/2015

Current Assignee
Kogeto Inc.

Sponsoring Entity
Kogeto Inc.

Panoramic stereo catadioptric imaging  
Patent #
US 9,250,510 B2
Filed 02/15/2012

Current Assignee
City University of Hong Kong

Sponsoring Entity
City University of Hong Kong

PANEL TRANSFORM  
Patent #
US 20170024851A1
Filed 07/24/2015

Current Assignee
Robert Bosch LLC, Robert Bosch GmbH

Sponsoring Entity
Robert Bosch LLC, Robert Bosch GmbH

Apparatus for optical see-through head mounted display with mutual occlusion and opaqueness control capability
Patent #
US 9,726,893 B2
Filed 09/27/2016

Current Assignee
Magic Leap Inc.

Sponsoring Entity
Magic Leap Inc.

Imaging controller and imaging control method and program  
Patent #
US 9,756,243 B2
Filed 09/03/2013

Current Assignee
Ricoh Company Limited

Sponsoring Entity
Ricoh Company Limited

Apparatus and method for correcting image distortion of a camera for vehicle  
Patent #
US 9,813,619 B2
Filed 12/09/2014

Current Assignee
Hyundai Motor Company

Sponsoring Entity
Hyundai Motor Company

Wide-field-of-view (FOV) imaging devices with active foveation capability
Patent #
US 9,851,563 B2
Filed 04/04/2013

Current Assignee
Magic Leap Inc.

Sponsoring Entity
Magic Leap Inc.

Apparatus for optical see-through head mounted display with mutual occlusion and opaqueness control capability
Patent #
US 9,874,752 B2
Filed 05/26/2017

Current Assignee
Magic Leap Inc.

Sponsoring Entity
Magic Leap Inc.

SYSTEM AND METHOD FOR MEASURING A DISPLACEMENT OF A MOBILE PLATFORM  
Patent #
US 20180112979A1
Filed 12/15/2017

Current Assignee
SZ DJI Technology Co. Ltd. dba DJI

Sponsoring Entity
SZ DJI Technology Co. Ltd. dba DJI

Apparatus for optical see-through head mounted display with mutual occlusion and opaqueness control capability
Patent #
US 10,048,501 B2
Filed 12/06/2017

Current Assignee
Magic Leap Inc.

Sponsoring Entity
Magic Leap Inc.

Wide-field-of-view (FOV) imaging devices with active foveation capability
Patent #
US 10,061,130 B2
Filed 11/13/2017

Current Assignee
Magic Leap Inc.

Sponsoring Entity
Magic Leap Inc.

Wide-field-of-view (FOV) imaging devices with active foveation capability
Patent #
US 10,162,184 B2
Filed 06/12/2018

Current Assignee
Magic Leap Inc.

Sponsoring Entity
Magic Leap Inc.

Apparatus for optical see-through head mounted display with mutual occlusion and opaqueness control capability
Patent #
US 10,175,491 B2
Filed 05/11/2018

Current Assignee
Magic Leap Inc.

Sponsoring Entity
Magic Leap Inc.

System, method, computer program and data signal for the registration, monitoring and control of machines and devices  
Patent #
US 10,272,570 B2
Filed 11/12/2013

Current Assignee
C2 Systems Limited

Sponsoring Entity
C2 Systems Limited

Apparatus for optical see-through head mounted display with mutual occlusion and opaqueness control capability
Patent #
US 10,451,883 B2
Filed 11/20/2018

Current Assignee
Magic Leap Inc.

Sponsoring Entity
Magic Leap Inc.

Panel transform  
Patent #
US 10,453,173 B2
Filed 07/24/2015

Current Assignee
Robert Bosch LLC, Robert Bosch GmbH

Sponsoring Entity
Robert Bosch GmbH

System and method for measuring a displacement of a mobile platform  
Patent #
US 10,527,416 B2
Filed 12/15/2017

Current Assignee
SZ DJI Technology Co. Ltd. dba DJI

Sponsoring Entity
SZ DJI Technology Co. Ltd. dba DJI

Computer-readable recording medium, information processing method, and information processing apparatus
Patent #
US 10,681,269 B2
Filed 02/06/2017

Current Assignee
Fujitsu Limited

Sponsoring Entity
Fujitsu Limited

Image processing device and monitoring system  
Patent #
US 7,161,616 B1
Filed 04/17/2000

Current Assignee
Matsushita Electric Industrial Company Limited

Sponsoring Entity
Matsushita Electric Industrial Company Limited

Method for generating and interactively viewing spherical image data  
Patent #
US 6,271,853 B1
Filed 07/08/1999

Current Assignee
Grandeye Limited

Sponsoring Entity
Ford Oxaal

Panoramic video system with real-time distortion-free imaging
Patent #
US 20060023105A1
Filed 01/15/2004

Current Assignee
Ilya Agurok, Mark Bennahmias, Sookwang Ro, Andrew A. Kostrzewski

Sponsoring Entity
Ilya Agurok, Mark Bennahmias, Sookwang Ro, Andrew A. Kostrzewski

Panoramic imaging and display system with canonical magnifier  
Patent #
US 20050259118A1
Filed 12/20/2004

Current Assignee
BIOTRONIX INC.

Sponsoring Entity
Michael Mojaver, Steven Branson

Method and device for obtaining a digital panoramic image of constant color  
Patent #
US 20040109078A1
Filed 08/15/2003

Current Assignee
IMMERVISION INTERNATIONAL

Sponsoring Entity
IMMERVISION INTERNATIONAL

Method and apparatus for displaying panoramas with streaming video  
Patent #
US 6,356,297 B1
Filed 01/15/1998

Current Assignee
Activision Publishing Incorporated

Sponsoring Entity
International Business Machines Corporation

Wide-angle image dewarping method and apparatus
Patent #
US 6,005,611 A
Filed 08/04/1998

Current Assignee
BH Image Co LLC

Sponsoring Entity
Be Here Corporation

Method and apparatus for performing perspective transformation on visible stimuli  
Patent #
US 5,684,937 A
Filed 06/07/1995

Current Assignee
Grandeye Limited

Sponsoring Entity
Ford Oxaal

System for omnidirectional image viewing at a remote location without the transmission of control signals to select viewing parameters
Patent #
US 5,384,588 A
Filed 01/31/1994

Current Assignee
Sony Corporation

Sponsoring Entity
TELEROBOTICS INTERNATIONAL INC.

Omniview motionless camera orientation system  
Patent #
US 5,185,667 A
Filed 05/13/1991

Current Assignee
Sony Corporation

Sponsoring Entity
TELEROBOTICS INTERNATIONAL INC. A CORPORATION OF TN

18 Claims
 1. A panoramic imaging system comprising:
an image acquisition means for acquiring a wide-angle image using a wide-angle imaging lens rotationally symmetric about an optical axis;
an image processing means for generating a panoramic image based on the said wide-angle image; and
an image display means for displaying the said panoramic image on a rectangular screen,
wherein a coordinate of an image point on the rectangular screen corresponding to an object point having a coordinate (X, Y, Z) in a world coordinate system, which has a nodal point of the wide-angle lens as an origin, a vertical line passing through the origin as the Y-axis, and the intersection line between a reference plane (containing the said Y-axis and the said optical axis of the lens) and a horizontal plane perpendicular to the said vertical line as the Z-axis, is given as (x″, y″), and
a horizontal incidence angle ψ, which an incident ray originating from the said object point makes with the said reference plane, is given as
Dependent claims: 2, 3, 4
 5. A method of obtaining a panoramic image, the method comprising:
acquiring an uncorrected image plane using a camera equipped with a rotationally symmetric wide-angle lens, where an optical axis of the said camera and a lateral side of an image sensor plane of the camera are made parallel to the ground plane; and
extracting a processed image plane based on the said uncorrected image plane,
wherein the said uncorrected image plane is a two-dimensional array with K_{max} rows and L_{max} columns, a pixel coordinate of the optical axis on the uncorrected image plane is (K_{o}, L_{o}), a real projection scheme of the said lens is an image height r obtained as a function of the zenith angle θ of a corresponding incident ray and given as r = r(θ), and a magnification ratio g of the said camera is given as
Dependent claims: 6, 7, 8
 9. A method of obtaining a panoramic image, the method comprising:
acquiring an uncorrected image plane using a camera equipped with a rotationally symmetric wide-angle lens, where an optical axis of the camera is made parallel to the ground plane; and
extracting a processed image plane based on the said uncorrected image plane,
wherein an angle between a lateral side of an image sensor plane of the said camera and the ground plane is γ, the uncorrected image plane is a two-dimensional array with K_{max} rows and L_{max} columns, a pixel coordinate of the optical axis on the uncorrected image plane is (K_{o}, L_{o}), a real projection scheme of the said lens is an image height r obtained as a function of the zenith angle θ of a corresponding incident ray and given as r = r(θ), and a magnification ratio g of the said camera is given as
Dependent claims: 10, 11
 12. A method of obtaining a panoramic image, the method comprising:
acquiring an uncorrected image plane using a camera equipped with a rotationally symmetric wide-angle lens; and
extracting a processed image plane based on the said uncorrected image plane,
wherein an angle between the said camera optical axis and the ground plane is α, an angle between a lateral side of an image sensor plane of the camera and the ground plane is γ, the uncorrected image plane is a two-dimensional array with K_{max} rows and L_{max} columns, a pixel coordinate of the optical axis on the uncorrected image plane is (K_{o}, L_{o}), a real projection scheme of the said lens is an image height r obtained as a function of the zenith angle θ of the corresponding incident ray and given as r = r(θ), and a magnification ratio g of the said camera is given as
Dependent claims: 13, 14
 15. A method of obtaining a panoramic image, the method comprising:
acquiring an uncorrected image plane using a camera equipped with a rotationally symmetric wide-angle lens, wherein the optical axis of the camera is made perpendicular to the ground plane; and
extracting a processed image plane based on the said uncorrected image plane,
wherein the uncorrected image plane is a two-dimensional array with K_{max} rows and L_{max} columns, a pixel coordinate of the optical axis on the uncorrected image plane is (K_{o}, L_{o}), a real projection scheme of the said lens is an image height r obtained as a function of the zenith angle θ of a corresponding incident ray and given as r = r(θ), and a magnification ratio g of the said camera is given as
Dependent claims: 16, 17, 18
Specification
The present invention generally relates to mathematically precise image processing methods of extracting panoramic images, which appear most natural to the naked eye, from images acquired using a camera equipped with a wide-angle lens that is rotationally symmetric about an optical axis, as well as devices using the methods.
A panoramic camera, which captures a 360° view of scenic places such as tourist resorts, is an example of a panoramic imaging system. A panoramic imaging system is an imaging system that captures the view one could get by making one complete turn-around from a given spot. An omnidirectional imaging system, on the other hand, captures the view in every possible direction from a given spot. An omnidirectional imaging system provides the view a person could observe from a given position by turning around as well as looking up and down. In mathematical terminology, the solid angle of the region that can be captured by such an imaging system is 4π steradians.
There have been many studies and developments of panoramic imaging systems, not only in traditional areas such as photographing buildings, nature scenes, and heavenly bodies, but also in security and surveillance systems using CCD (charge-coupled device) or CMOS (complementary metal-oxide-semiconductor) cameras, virtual tours of real estate, hotels, and tourist resorts, and navigational aids for mobile robots and unmanned aerial vehicles (UAVs).
As a viable method of obtaining panoramic images, researchers are actively investigating catadioptric panoramic imaging systems, which are imaging systems employing both mirrors and refractive lenses. Shown in
For the unwrapped panoramic image(334) in
Referring to
r=f tan θ [Math Figure 1]
For a panoramic lens following a rectilinear projection scheme, the height in the object plane(131), in other words, the distance Z measured parallel to the optical axis, is proportional to the distance r on the sensor plane. The axial radius of the point M on the panoramic mirror surface(111), whereon the reflection has occurred, is ρ, and the height is z, and the axial radius of the corresponding point(104) on the object plane(131) is S, and the height is Z. Since the altitude angle of the said incident ray(105) is δ, the height Z of the said object is given by Eq. 2.
Z=z+(S−ρ)tan δ [Math Figure 2]
If the distance from the camera to the object plane is large compared to the size of the camera (i.e., S>>ρ, Z>>z), then Eq. 2 can be approximated as Eq. 3.
Z≅S tan δ [Math Figure 3]
Therefore, if the radius S of the object plane is fixed, then the height of the object (i.e., the object size) is proportional to tan δ, and the axial radius of the corresponding image point (i.e., the image size) on the focal plane is proportional to tan θ. If tan δ is proportional to tan θ in this manner, then the image of the object on the object plane is captured on the image sensor with its vertical proportions preserved. Incidentally, referring to
Therefore, a most natural panoramic image can be obtained when the panoramic lens implements the rectilinear projection scheme given by Eq. 4. One disadvantage of such a panoramic imaging system is that there are considerable numbers of unused pixels in the image sensor.
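The vertical-proportion argument behind Eqs. 1 through 4 can be checked numerically. The sketch below is illustrative only: the focal length f, the object-plane radius S, and the constant c relating tan δ to tan θ are assumed values, not values from the text.

```python
import math

# With r = f * tan(theta) on the sensor (Math Figure 1) and Z ~ S * tan(delta)
# in the far-field object plane (Math Figure 3), object height and image
# height stay proportional whenever tan(delta) is proportional to tan(theta).

f = 2.0      # focal length in mm (assumed)
S = 1000.0   # radius of the object plane in mm (assumed)
c = 0.5      # assumed lens constant: tan(theta) = c * tan(delta)

def image_height(theta_deg):
    """Image-side radial distance r = f * tan(theta)  (Math Figure 1)."""
    return f * math.tan(math.radians(theta_deg))

def object_height(delta_deg):
    """Far-field object height Z ~ S * tan(delta)  (Math Figure 3)."""
    return S * math.tan(math.radians(delta_deg))

# If the lens maps delta -> theta so that tan(theta) = c * tan(delta), the
# ratio Z / r equals the constant S / (f * c) at every altitude angle, i.e.
# vertical proportions are preserved in the panoramic image.
for delta in (10.0, 20.0, 30.0):
    theta = math.degrees(math.atan(c * math.tan(math.radians(delta))))
    ratio = object_height(delta) / image_height(theta)
    print(round(ratio, 6))  # the same value on every iteration
```

The printed ratio is S/(f·c) regardless of δ, which is exactly the "vertical proportions preserved" condition of the text.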
On such an image sensor plane, the panoramic image(633) is formed between the outer rim(633b) and the inner rim(633a) of an annular region, where the two rims constitute concentric circles. Here, the said sensor plane(613) coincides with a part of the focal plane(632) of the lens, and the said panoramic image(633) exists on a part of the sensor plane(613). In
A_{2}=π(r_{2}^{2}−r_{1}^{2}) [Math Figure 6]
Referring to
Therefore, the ratio between the area A_{2 }of the panoramic image(633) and the area A_{1 }of the image sensor plane(613) is given by Eq. 8.
Thus, the percentage of pixel utilization is less than 50%, and prior-art panoramic imaging systems have the disadvantage that pixels are not used efficiently.
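The low pixel utilization of an annular panoramic image can be illustrated with a quick calculation. The sensor resolution and rim radii below are assumptions for illustration, not values from the text.

```python
import math

# Rough pixel-utilization estimate for an annular panoramic image using
# A2 = pi * (r2^2 - r1^2)  (Math Figure 6).

width_px, height_px = 640, 480   # assumed sensor resolution
r2 = height_px / 2               # outer rim inscribed in the sensor height (assumed)
r1 = r2 / 2                      # assumed inner rim radius

sensor_area = width_px * height_px          # A1: total pixel area
annulus_area = math.pi * (r2**2 - r1**2)    # A2: annular panoramic image area

utilization = annulus_area / sensor_area    # A2 / A1
print(f"{utilization:.1%}")
```

With these assumed numbers the annulus covers well under half of the sensor, consistent with the text's observation that utilization stays below 50%.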
Another method of obtaining a panoramic image is to employ a fisheye lens with a wide field of view (FOV). For example, the entire sky and the horizon can be captured in a single image by pointing a camera equipped with a fisheye lens having a 180° FOV toward the zenith (i.e., aligning the optical axis of the camera perpendicular to the ground plane). For this reason, fisheye lenses have often been referred to as "all-sky lenses". In particular, a high-end fisheye lens by Nikon, namely the 6 mm f/5.6 Fisheye-Nikkor, has a FOV of 220°. Therefore, a camera mounted with this lens can capture even a portion of the scene behind the camera as well as the scene in front of it. A panoramic image can then be obtained from the fisheye image thus acquired by the same methods as illustrated in
In many cases, imaging systems are installed on vertical walls. Imaging systems installed on the outside walls of a building for the purpose of monitoring the surroundings, or a rear-view camera for monitoring the area behind a passenger car, are such examples. In such cases, it is inefficient if the horizontal field of view is significantly larger than 180°. This is because a wall, which does not need to be monitored, takes up a large portion of the monitor screen; pixels are wasted, and the screen appears dull. Therefore, a horizontal FOV of around 180° is more appropriate for such cases. Nevertheless, a fisheye lens with a 180° FOV is not desirable for such applications. This is because the barrel distortion that accompanies a fisheye lens evokes psychological discomfort and is disliked by consumers.
An example of an imaging system which can be installed on an interior wall for the purpose of monitoring the entire room is given by a pan•tilt•zoom camera. Such a camera comprises a video camera, equipped with an optical zoom lens, mounted on a pan•tilt stage. Pan is an operation of rotating in the horizontal direction by a given angle, and tilt is an operation of rotating in the vertical direction by a given angle. In other words, if we assume that the camera is at the center of a celestial sphere, then pan is an operation of changing the longitude, and tilt is an operation of changing the latitude. Therefore, the theoretical range of the pan operation is 360°, and the theoretical range of the tilt operation is 180°. The shortcomings of a pan•tilt•zoom camera include high price, large size, and heavy weight. An optical zoom lens is large, heavy, and expensive due to the difficulty of its design and its complicated structure. Also, a pan•tilt stage is an expensive device, no cheaper than a camera. Therefore, it costs a considerable sum of money to install a pan•tilt•zoom camera. Furthermore, since a pan•tilt•zoom camera is large and heavy, this can become a serious impediment in certain applications. Examples of such cases include airplanes, where the weight of the payload is of critical importance, or installations where a strict size limitation exists because the camera must fit in a confined space. Furthermore, a pan•tilt•zoom operation takes time because it is a mechanical operation. Therefore, depending on the particular application at hand, such a mechanical response may not be fast enough.
References 1 and 2 provide fundamental technologies for extracting an image having a particular viewpoint or projection scheme from an image having a different viewpoint or projection scheme. Specifically, reference 2 provides an example of a cubic panorama. In short, a cubic panorama is a special technique of illustration wherein the observer is assumed to be located at the very center of an imaginary cubic room made of glass, and the outside view from the center of the glass room is directly transcribed onto the region of the glass wall whereon the ray vector from the object to the observer meets the glass wall. Furthermore, an example of a more advanced technology is provided in the above reference, with which reflections from an arbitrarily shaped mirrored surface can be calculated. Specifically, the author of reference 2 created an imaginary lizard having a highly reflective mirror-like skin, as if made of a metal surface, then set up an observer's viewpoint separated from the lizard, and calculated the view of the imaginary environment reflected on the lizard's skin from the viewpoint of the imaginary observer. However, the environment was not a real environment captured by an optical lens, but a computer-created imaginary environment captured with an imaginary distortion-free pinhole camera.
On the other hand, an imaging system is described in reference 3 that is able to perform pan•tilt•zoom operations without a physically moving part. The said invention uses a camera equipped with a fisheye lens with more than 180° FOV in order to take a picture of the environment. Then, the user designates a principal direction of vision using a device such as a joystick, upon which the computer extracts from the fisheye image the rectilinear image that would be obtained by pointing a distortion-free camera in that particular direction. The main difference between this invention and the prior art is that this invention creates a rectilinear image corresponding to the particular direction the user has designated using a device such as a joystick or a computer mouse. Such a technology is essential in the field of virtual reality, or when it is desirable to replace a mechanical pan•tilt•zoom camera, and the keyword is "interactive picture". In this technology, there are no physically moving parts in the camera. As a consequence, the system response is fast, and there is less chance of mechanical failure.
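The software pan•tilt extraction described above can be sketched as follows, assuming the ideal equidistance projection r = f·θ that the text says reference 3 presumes. The image size, focal length, field of view, and nearest-neighbour resampling are all illustrative assumptions, not details taken from that invention.

```python
import numpy as np

def rectilinear_view(fisheye, f_pix, pan_deg, tilt_deg,
                     out_size=200, out_fov_deg=60.0):
    """Resample a rectilinear (pinhole-like) view pointing at (pan, tilt)
    from a fisheye image assumed to follow r = f * theta."""
    h, w = fisheye.shape[:2]
    cx, cy = w / 2.0, h / 2.0                 # assumed optical-axis pixel
    # Unit-ray grid of the virtual distortion-free camera.
    half = np.tan(np.radians(out_fov_deg) / 2.0)
    u = np.linspace(-half, half, out_size)
    xg, yg = np.meshgrid(u, -u)
    rays = np.stack([xg, yg, np.ones_like(xg)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)
    # Rotate the rays: tilt about the x-axis, then pan about the y-axis.
    t, p = np.radians(tilt_deg), np.radians(pan_deg)
    Rx = np.array([[1, 0, 0], [0, np.cos(t), -np.sin(t)], [0, np.sin(t), np.cos(t)]])
    Ry = np.array([[np.cos(p), 0, np.sin(p)], [0, 1, 0], [-np.sin(p), 0, np.cos(p)]])
    rays = rays @ (Ry @ Rx).T
    # Equidistance model: image radius proportional to zenith angle theta.
    theta = np.arccos(np.clip(rays[..., 2], -1.0, 1.0))
    phi = np.arctan2(rays[..., 1], rays[..., 0])
    r = f_pix * theta                          # r = f * theta
    src_x = np.clip(cx + r * np.cos(phi), 0, w - 1).astype(int)
    src_y = np.clip(cy - r * np.sin(phi), 0, h - 1).astype(int)
    return fisheye[src_y, src_x]               # nearest-neighbour lookup

# Smoke test on a synthetic "fisheye" image.
img = np.arange(512 * 512, dtype=np.float64).reshape(512, 512)
view = rectilinear_view(img, f_pix=512 / np.pi, pan_deg=20.0, tilt_deg=10.0)
print(view.shape)
```

Because the real lens deviates from r = f·θ, a mapping built this way still shows residual distortion, which is exactly the criticism the text raises next.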
Ordinarily, when an imaging system such as a security camera is installed, a cautionary measure is taken so that vertical lines perpendicular to the horizontal plane also appear vertical in the acquired image. In such a case, vertical lines still appear vertical even as mechanical pan•tilt•zoom operations are performed. On the other hand, in the said invention, vertical lines generally do not appear as vertical lines after a software pan•tilt•zoom operation has been performed. To remedy such an unnatural result, a rotate operation is additionally performed, which is not found in a mechanical pan•tilt•zoom camera. Furthermore, the said invention does not provide the exact rotation angle that is needed in order to display vertical lines as vertical lines. Therefore, the exact rotation angle must be found by trial and error in order to display vertical lines as vertical lines.
Furthermore, the said invention assumes that the projection scheme of the fisheye lens is an ideal equidistance projection scheme. But the real projection scheme of a fisheye lens generally shows a considerable deviation from an ideal equidistance projection scheme. Since the said invention does not take into account the distortion characteristics of a real lens, images obtained after image processing still show distortion.
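A common way to account for a real lens's deviation from the ideal equidistance law is to calibrate the measured image height r(θ) and fit it with a low-order odd polynomial. The sketch below uses synthetic stand-in data; the focal length and distortion coefficient are assumptions for illustration, not the method of any cited reference.

```python
import numpy as np

# Synthetic calibration data for a "real" fisheye: the ideal law r = f*theta
# perturbed by a cubic distortion term (both values assumed).
theta = np.radians(np.linspace(0.0, 90.0, 10))       # zenith angles of test rays
f = 1.5                                               # nominal focal length, mm
r_measured = f * theta * (1.0 - 0.08 * theta**2)      # stand-in measured image heights

# Fit r(theta) = a1*theta + a3*theta^3.  Only odd terms appear because the
# projection must vanish at theta = 0 and is rotationally symmetric.
A = np.stack([theta, theta**3], axis=1)
coeffs, *_ = np.linalg.lstsq(A, r_measured, rcond=None)
a1, a3 = coeffs

r_fit = a1 * theta + a3 * theta**3
print(np.max(np.abs(r_fit - r_measured)) < 1e-9)  # synthetic data lies in the model family
```

Once such a fitted r(θ) replaces the ideal f·θ in the pixel mapping, the residual distortion attributed above to the ideal-lens assumption is removed.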
The invention described in reference 4 remedies a shortcoming of the invention described in reference 3, namely its inability to take into account the real projection scheme of the fisheye lens used in image processing. Nevertheless, the defect of not showing vertical lines as vertical lines on the monitor screen has not been resolved.
From another point of view, all animals and plants, including humans, are bound to the surface of the earth by the gravitational pull, and most of the events that need attention or cautionary measures take place near the horizon. Therefore, even though it is necessary to monitor every direction through 360° on the horizon, it is not as important to monitor high along the vertical direction, for example, as high as the zenith or as deep down as the nadir. Distortion is unavoidable if we want to describe the scene in every direction through 360° on a two-dimensional plane. A similar difficulty exists in cartography, where the geography of the earth, which is a structure on the surface of a sphere, needs to be mapped onto a planar two-dimensional atlas. Among all the distortions, the distortion that appears most unnatural to people is the one where vertical lines appear as curved lines. Therefore, even if other kinds of distortion are present, it is important to make sure that this particular distortion is absent.
Reference 5 describes the well-known map projection schemes among the diverse map projection schemes, such as the equirectangular projection, the Mercator projection, and the cylindrical projection, and reference 6 provides a brief history of diverse map projection schemes. Among these, the equirectangular projection scheme is the one most familiar to us when we describe the geography of the earth, or when we draw the celestial sphere in order to make a map of the constellations.
Referring to
x=cψ [Math Figure 9]
Here, c is a proportionality constant. Also, the longitudinal coordinate y is proportional to the latitude, with the same proportionality constant as the lateral coordinate.
y=cδ [Math Figure 10]
The span of the longitude is 360° ranging from −180° to +180°, and the span of the latitude is 180° ranging from −90° to +90°. Therefore, a map drawn according to the equirectangular projection scheme must have a width W:height H ratio of 360:180=2:1. Furthermore, if the proportionality constant c is given as the radius S of the earth, then the width of the said planar map is given as the perimeter of the earth measured along the equator as given in Eq. 11.
W=2πS [Math Figure 11]
Such an equirectangular projection scheme appears to be a natural projection scheme considering the fact that the earth's surface is close to the surface of a sphere. Nevertheless, it is disadvantageous in that the size of a geographical area is greatly distorted. For example, two very close points near the North Pole can appear as if they are on opposite sides of the earth in a map drawn according to the equirectangular projection scheme.
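The equirectangular relations of Eqs. 9 through 11 can be summarized in a few lines of code; the sphere radius below is an assumed value used only to exercise the formulas.

```python
import math

# Equirectangular projection (Math Figures 9-11): both map coordinates are
# simply proportional to longitude psi and latitude delta.  Taking c = S
# gives a map of width W = 2*pi*S covering 360 degrees of longitude and
# height pi*S covering 180 degrees of latitude, i.e. a 2:1 aspect ratio.

S = 100.0  # assumed sphere radius

def equirectangular(psi_deg, delta_deg, c=S):
    x = c * math.radians(psi_deg)    # x = c * psi    (Math Figure 9)
    y = c * math.radians(delta_deg)  # y = c * delta  (Math Figure 10)
    return x, y

W = 2 * math.pi * S          # map width = equatorial perimeter (Math Figure 11)
H = math.pi * S              # map height for the full 180-degree latitude span
print(W / H)                 # the 360:180 = 2:1 aspect ratio
```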
On the other hand, in a map drawn according to the Mercator projection scheme, the longitudinal coordinate is given as the more complicated function of the latitude in Eq. 12.
y=c ln tan(π/4+δ/2) [Math Figure 12]
On the other hand,
In this projection scheme, a hypothetical cylindrical plane(934) is assumed which contacts the celestial sphere at the equator(903). Then, for a point Q(ψ, δ) on the celestial sphere(931) having a given longitude ψ and latitude δ, a line segment connecting the center of the celestial sphere and the point Q is extended until it meets the said cylindrical plane. This intersection point is designated as P(ψ, δ). In this manner, the corresponding point P on the cylindrical plane(934) can be obtained for every point Q on the celestial sphere(931) within the said latitude range. Then, a map having a cylindrical projection scheme is obtained by cutting the cylindrical plane open and laying it flat on a planar surface. Therefore, the lateral coordinate x of the point P on the flattened-out cylindrical plane is given by Eq. 13, and the longitudinal coordinate y is given by Eq. 14.
x=Sψ [Math Figure 13]
y=S tan δ [Math Figure 14]
Such a cylindrical projection scheme is the natural projection scheme for a panoramic camera that produces a panoramic image by rotating in the horizontal plane. In particular, if the lens mounted on the rotating panoramic camera is a distortion-free rectilinear lens, then the resulting panoramic image exactly follows a cylindrical projection scheme. In principle, such a cylindrical projection scheme is the most accurate panoramic projection scheme. However, the panoramic image appears unnatural when the latitudinal range is large, and thus it is not widely used in practice.
The unwrapped panoramic image thus produced, having a cylindrical projection scheme, has a lateral width W given by Eq. 11. On the other hand, if the range of the latitude is from δ_{1} to δ_{2}, then the longitudinal height of the unwrapped panoramic image is given by Eq. 15.
H=S(tan δ_{2}−tan δ_{1}) [Math Figure 15]
Therefore, the following equation can be derived from Eq. 11 and Eq. 15.
W/H=2π/(tan δ_{2}−tan δ_{1}) [Math Figure 16]
Therefore, an unwrapped panoramic image following a cylindrical projection scheme must satisfy Eq. 16.
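The width-to-height constraint that follows from Eq. 11 and Eq. 15 can be verified directly; the cylinder radius and latitude range below are assumed values chosen only to exercise the formulas.

```python
import math

# Cylindrical projection (Math Figures 13-15): x = S*psi, y = S*tan(delta).
# For a latitude range delta1..delta2 the unwrapped panorama has
#   W = 2*pi*S  and  H = S*(tan(delta2) - tan(delta1)),
# so its aspect ratio W/H = 2*pi / (tan(delta2) - tan(delta1)),
# independent of the radius S.

S = 100.0                      # assumed cylinder (sphere) radius
delta1, delta2 = -30.0, 45.0   # assumed latitude range in degrees

W = 2 * math.pi * S
H = S * (math.tan(math.radians(delta2)) - math.tan(math.radians(delta1)))

lhs = W / H
rhs = 2 * math.pi / (math.tan(math.radians(delta2)) - math.tan(math.radians(delta1)))
print(abs(lhs - rhs) < 1e-12)  # both forms of the constraint agree
```

Since S cancels, the same aspect-ratio test applies to any cylindrical-projection panorama regardless of its scale.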
In the example of
It can be noticed that the unwrapped panoramic images given in
All animals, plants, and inanimate objects such as buildings on the earth are under the influence of gravity, and the direction of the gravitational force defines the upright or vertical direction. The ground plane is nearly perpendicular to the gravitational force, although, needless to say, this is not true on slanted ground. Therefore, the term “ground plane” actually refers to the horizontal plane, and the vertical direction is the direction perpendicular to the horizontal plane. For simplicity of argument we will speak of the ground plane, the lateral direction, and the longitudinal direction, but whenever the exact meaning of a term needs to be clarified, the ground plane must be understood as the horizontal plane, the vertical direction as the direction perpendicular to the horizontal plane, and the horizontal direction as a direction parallel to the horizontal plane.
Panoramic lenses described in references 7 and 8 take panoramic images in one shot, with the optical axes of the panoramic lenses aligned vertical to the ground plane. Incidentally, a cheaper alternative to the panoramic image acquisition method using the previously described camera with a horizontally rotating lens consists of taking an image with an ordinary camera whose optical axis is horizontally aligned, then horizontally rotating the optical axis by a certain amount and taking another picture, and so on. Four to eight pictures are taken in this way, and a panoramic image with a cylindrical projection scheme can be obtained by seamlessly joining the pictures consecutively. Such a technique is called stitching. QuickTime VR from Apple Computer Inc. is commercial software supporting this stitching technology. This method requires a complex, time-consuming, and elaborate operation of precisely joining several pictures and correcting the lens distortion.
According to reference 9, another method of obtaining a panoramic or an omnidirectional image is to take a hemispherical image by horizontally pointing a camera equipped with a fisheye lens with more than 180° FOV, and then to point the camera in the exact opposite direction and take another hemispherical image. By stitching the two images acquired by the camera using appropriate software, one omnidirectional image having the views of every direction (i.e., 4π steradian) can be obtained. By sending the image thus obtained to a geographically separated remote user over communication means such as the Internet, the user can select his own viewpoint from the received omnidirectional image according to his personal interest, and image processing software on the user's computing device can extract a partial image corresponding to the user-selected viewpoint, so that a perspectively correct planar image can be displayed on the computing device. Therefore, using the image processing software, the user can choose to turn around (pan), look up or down (tilt), or take a close (zoom in) or a remote (zoom out) view as if the user were actually present at the specific place in the image. This method has the distinctive advantage that multiple users accessing the same Internet site can look along the directions of their own choices. This advantage cannot be enjoyed in a panoramic imaging system employing a motion camera such as a pan-tilt camera.
References 10 and 11 describe a method of obtaining an omnidirectional image providing the views of every direction centered on the observer. Despite the lengthy description of the invention, however, the projection scheme provided by the said references is essentially a kind of equidistance projection scheme. In other words, the techniques described in the documents make it possible to obtain an omnidirectional image from a real environment or from a cubic panorama, but the obtained omnidirectional image follows only an equidistance projection scheme, and its usefulness is thus limited.
On the other hand, reference 12 provides an algorithm for projecting an Omnimax movie onto a semicylindrical screen using a fisheye lens. In particular, taking into account the fact that the projection scheme of a fisheye lens mounted on a movie projector deviates from an ideal equidistance projection scheme, a method is described for locating the position of the object point on the film corresponding to a certain point on the screen whereon an image point is formed. Therefore, it is possible to calculate what image has to be on the film in order to project a particular image on the screen, and such an image on the film is produced using a computer. Since the lens distortion is already reflected in the image-processing algorithm, a spectator near the movie projector can enjoy a satisfactory panoramic image. Nevertheless, the real projection scheme of the fisheye lens in the said reference is inconvenient to use, because it has been modeled with the real image height on the film plane as the independent variable and the zenith angle of the incident ray as the dependent variable. Furthermore, the real projection scheme of the fisheye lens has been unnecessarily modeled only with odd polynomials.
Reference 13 provides examples of stereo panoramic images produced by Professor Paul Bourke. Each of the panoramic images follows a cylindrical projection scheme, and a panoramic image of an imaginary scene produced by a computer as well as a panoramic image produced by a rotating slit camera are presented. For panoramic images produced by a computer or by the traditional method of a rotating slit camera, the lens distortion is not an important issue. However, a rotating slit camera cannot be used to take a real-time panoramic image (i.e., a movie) of the real world.
 [reference 1] J. F. Blinn and M. E. Newell, “Texture and reflection in computer generated images”, Communications of the ACM, 19, 542-547 (1976).
 [reference 2] N. Greene, “Environment mapping and other applications of world projections”, IEEE Computer Graphics and Applications, 6, 21-29 (1986).
 [reference 3] S. D. Zimmermann, “Omniview motionless camera orientation system”, U.S. Pat. No. 5,185,667, date of patent Feb. 9, 1993.
 [reference 4] E. Gullichsen and S. Wyshynski, “Wide-angle image dewarping method and apparatus”, U.S. Pat. No. 6,005,611, date of patent Dec. 21, 1999.
 [reference 5] E. W. Weisstein, “Cylindrical Projection”, http://mathworld.wolfram.com/CylindricalProjection.html.
 [reference 6] W. D. G. Cox, “An introduction to the theory of perspective—part 1”, The British Journal of Photography, 4, 628-634 (1969).
 [reference 7] G. Kweon, K. Kim, Y. Choi, G. Kim, and S. Yang, “Catadioptric panoramic lens with a rectilinear projection scheme”, Journal of the Korean Physical Society, 48, 554-563 (2006).
 [reference 8] G. Kweon, Y. Choi, G. Kim, and S. Yang, “Extraction of perspectively normal images from video sequences obtained using a catadioptric panoramic lens with the rectilinear projection scheme”, Technical Proceedings of the 10th World Multi-Conference on Systemics, Cybernetics, and Informatics, 67-75 (Orlando, Fla., USA, June, 2006).
 [reference 9] H. L. Martin and D. P. Kuban, “System for omnidirectional image viewing at a remote location without the transmission of control signals to select viewing parameters”, U.S. Pat. No. 5,384,588, date of patent Jan. 24, 1995.
 [reference 10] F. Oxaal, “Method and apparatus for performing perspective transformation on visible stimuli”, U.S. Pat. No. 5,684,937, date of patent Nov. 4, 1997.
 [reference 11] F. Oxaal, “Method for generating and interactively viewing spherical image data”, U.S. Pat. No. 6,271,853, date of patent Aug. 7, 2001.
 [reference 12] N. L. Max, “Computer graphics distortion for IMAX and OMNIMAX projection”, Proc. NICOGRAPH, 137-159 (1983).
 [reference 13] P. D. Bourke, “Synthetic stereoscopic panoramic images”, Lecture Notes in Computer Graphics (LNCS), Springer, 4270, 147-155 (2006).
 [reference 14] G. Kweon and M. Laikin, “Fisheye lens”, Korean patent application 1020080030184, date of filing Apr. 1, 2008.
 [reference 15] G. Kweon and M. Laikin, “Wide-angle lenses”, Korean patent 100826571, date of patent Apr. 24, 2008.
The purpose of the present invention is to provide image processing algorithms for extracting natural-looking panoramic images from digitized images acquired using a camera equipped with a wide-angle lens that is rotationally symmetric about an optical axis, and to provide devices implementing such algorithms.
The present invention provides image processing algorithms that are accurate in principle, being based on the geometrical optics of image formation by wide-angle lenses with distortion and on mathematical definitions of panoramic images.
Panoramic images, which appear most natural to the naked eye, can be obtained by accurately image-processing the images obtained using a rotationally symmetric wide-angle lens. Such panoramic imaging systems and devices can be used not only in security and surveillance applications for indoor and outdoor environments, but also in diverse areas such as video phones for apartment entrance doors, rear view cameras for vehicles, and visual sensors for robots, and they can also be used to obtain panoramic photographs using a digital camera.
Hereinafter, referring to
In a rigorous sense, the direction of the optical axis is the direction of the negative Z-axis of the world coordinate system. This is because, by the notational convention of imaging optics, the direction from the object (or an object point) to the image plane (or an image point) is the positive direction. Despite this fact, we will describe the optical axis as coinciding with the Z-axis of the world coordinate system for the sake of simplicity in argument. This is because the current invention is not an invention about a lens design but an invention using a lens, and from the viewpoint of a lens user, the description is easier to understand when the optical axis is taken as in the current embodiment of the present invention.
The image sensor plane(1213) is a plane having a rectangular shape and perpendicular to the optical axis, whereof the lateral dimension is B and the longitudinal dimension is V. Here, we assume a first rectangular coordinate system, wherein the nodal point N of the lens is taken as the origin, and the optical axis(1201) is taken as the negative (−) z-axis. In other words, the direction of the z-axis is the exact opposite of the direction of the Z-axis. The intersection point between the z-axis and the image sensor plane(1213) is O. The x-axis of the first rectangular coordinate system passes through the intersection point O and is parallel to the lateral side of the image sensor plane, and the y-axis passes through the intersection point O and is parallel to the longitudinal side of the image sensor plane. Like the world coordinate system, this first rectangular coordinate system is a right-handed coordinate system.
In the current embodiment, the X-axis of the world coordinate system is parallel to the x-axis of the first rectangular coordinate system, and points in the same direction. On the other hand, the Y-axis of the world coordinate system is parallel to the y-axis of the first rectangular coordinate system, but the direction of the Y-axis is the exact opposite of the direction of the y-axis. Therefore, in
The intersection point O between the zaxis of the first rectangular coordinate system and the sensor plane(1213)—hereinafter referred to as the first intersection point—is not generally located at the center of the sensor plane, and it can even be located outside the sensor plane. Such a case can happen when the center of the image sensor is moved away from the center position of the lens—i.e., the optical axis—on purpose in order to obtain an asymmetric vertical or horizontal field of view.
The lateral coordinate x of an arbitrary point P—hereinafter referred to as the first point—on the sensor plane(1213) has a minimum value x_{1 }and a maximum value x_{2 }(i.e., x_{1}≦x≦x_{2}). By definition, the difference between the maximum lateral coordinate and the minimum lateral coordinate is the lateral dimension of the sensor plane (i.e., x_{2}−x_{1}=B). In the same manner, the longitudinal coordinate y of the first point P has a minimum value y_{1 }and a maximum value y_{2 }(i.e., y_{1}≦y≦y_{2}). By definition, the difference between the maximum longitudinal coordinate and the minimum longitudinal coordinate is the longitudinal dimension of the sensor plane (i.e., y_{2}−y_{1}=V).
However, it is not desirable to use a raw image acquired using a fisheye lens in order to obtain a horizontal field of view of 180°. This is because a natural-looking panoramic image cannot be obtained due to the previously mentioned barrel distortion. A panoramic lens assuming object planes schematically shown in
An arbitrary rotationally symmetric lens including a fisheye lens, however, does not follow the said projection scheme. Therefore, to realize the said projection scheme, an image processing stage is inevitable.
The uncorrected image plane(1334) can be considered as the image displayed on the image display means without rectification of distortion, and is a magnified image of the real image on the image sensor plane by a magnification ratio g. For example, the image sensor plane of a ⅓-inch CCD sensor has a rectangular shape with a lateral dimension of 4.8 mm and a longitudinal dimension of 3.6 mm. On the other hand, if the size of a monitor is 48 cm wide and 36 cm high, then the magnification ratio g is 100. More conveniently, the side dimension of a pixel in a digital image is taken as 1. A VGA-grade ⅓-inch CCD sensor has pixels in a two-dimensional array having 640 columns and 480 rows. Therefore, each pixel has a square shape with both the width and the height measuring 4.8 mm/640=7.5 μm, and in this case, the magnification ratio g is given by 1 pixel/7.5 μm=133.3 pixel/mm. In recapitulation, the uncorrected image plane(1334) is a distorted digital image obtained by converting the real image formed on the image sensor plane into electrical signals.
The said first intersection point O is the intersection point between the optical axis(1201) and the image sensor plane(1213). Therefore, a ray entered along the optical axis forms an image point on the said first intersection point O. By definition, the horizontal incidence angle ψ and the vertical incidence angle δ of a ray entered along the optical axis are both zero. Therefore, the point O′ on the uncorrected image plane corresponding to the first intersection point O in the image sensor plane—hereinafter referred to as the second intersection point—corresponds to the image point by an incident ray having a horizontal incidence angle of 0 as well as a vertical incidence angle of 0.
A second rectangular coordinate system is assumed, wherein the x′-axis is taken as the axis that passes through the said second intersection point and is parallel to the lateral side of the uncorrected image plane(1334), and the y′-axis is taken as the axis that passes through the said second intersection point and is parallel to the longitudinal side of the uncorrected image plane. In
As has been described, a fisheye lens does not provide a naturallooking panoramic image as is schematically shown in
The real image(1433) of the objects on the object plane(1431) formed by the fisheye lens(1412) is converted by the image sensor(1413) into electrical signals, and displayed as an uncorrected image plane(1434) on the image display means(1415), wherein this uncorrected image plane contains a distortion aberration. If the said lens is a fisheye lens, then the distortion will be mainly a barrel distortion. This distorted image can be rectified by the image processing means(1416), and then displayed as a processed image plane(1435) on an image display means(1417) such as a computer monitor or a CCTV monitor. Said image processing can be a software image processing by a computer, or a hardware image processing by an FPGA (Field Programmable Gate Array). The following table 1 summarizes corresponding variables in the object plane, the image sensor plane, the uncorrected image plane, and the processed image plane.
The minimum value of the horizontal incidence angle is ψ_{1}, the maximum incidence angle is ψ_{2 }(i.e., ψ_{1}≦ψ≦ψ_{2}), and the horizontal FOV is Δψ=ψ_{2}−ψ_{1}. In general, if the horizontal FOV is 180°, then a desirable range of the horizontal incidence angle will be given by ψ_{2}=−ψ_{1}=90°. Since the radius of the object plane is S, the arc length of the said object plane is given by Eq. 17.
L=S(ψ_{2}−ψ_{1})=SΔψ [Math Figure 17]
Here, it has been assumed that the unit of the field of view Δψ is radian. This arc length L must be proportional to the lateral dimension W of the processed image plane. Therefore, if this proportionality constant is c, then the following equation 18 is satisfied.
L=cW [Math Figure 18]
On the other hand,
T=S(tan δ_{2}−tan δ_{1}) [Math Figure 19]
Furthermore, the height T of the object plane must satisfy the same proportionality relation with the height H of the processed image plane.
T=cH [Math Figure 20]
Equation 21 can be obtained from Eqs. 17 and 18, wherein A≡S/c is a constant.
W=A(ψ_{2}−ψ_{1})=AΔψ [Math Figure 21]
On the other hand, Eq. 22 can be obtained from Eqs. 19 and 20.
H=A(tan δ_{2}−tan δ_{1}) [Math Figure 22]
Therefore, from Eqs. 21 and 22, it can be seen that the following equation must be satisfied.
W/(ψ_{2}−ψ_{1})=H/(tan δ_{2}−tan δ_{1}) [Math Figure 23]
In most cases, it is desirable for the range of the horizontal incidence angle and the range of the vertical incidence angle to be symmetrical. Therefore, the horizontal FOV is given as Δψ=ψ_{2}−ψ_{1}=2ψ_{2}, and the vertical FOV is given as Δδ=δ_{2}−δ_{1}=2δ_{2}. When designing a lens or evaluating the characteristics of a lens, the horizontal FOV Δψ and the vertical FOV Δδ are important parameters. From Eq. 23, it can be seen that the vertical FOV must be given as in Eq. 24 as a function of the horizontal FOV.
Δδ=2 tan^{−1}(HΔψ/2W) [Math Figure 24]
For example, if we assume that the horizontal FOV of the imaging system is 180°, and an ordinary image sensor plane having a 4:3 ratio between the lateral and the longitudinal dimensions is employed, then the vertical FOV of a natural panoramic image is given by Eq. 25.
Δδ=2 tan^{−1}(3π/8)≅99.3° [Math Figure 25]
On the other hand, if we assume an image sensor having a 16:9 ratio, then the vertical FOV is given by Eq. 26.
Δδ=2 tan^{−1}(9π/32)≅82.9° [Math Figure 26]
Therefore, even if an image sensor having the 16:9 ratio is used, the vertical FOV corresponds to an ultra wideangle.
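The aspect-ratio computation above can be sketched as follows. This sketch derives the vertical FOV from the relation W/(ψ_{2}−ψ_{1})=H/(tan δ_{2}−tan δ_{1}) with symmetric angle ranges, so that tan δ_{2}=(H/W)(Δψ/2); the function name is ours:

```python
import math

def vertical_fov(horizontal_fov_deg, width, height):
    """Vertical FOV (degrees) of a natural panoramic image for a sensor
    of the given aspect ratio, assuming symmetric angle ranges:
    tan(d2) = (H/W) * (dpsi/2), so dd = 2*atan(H*dpsi/(2*W))."""
    dpsi = math.radians(horizontal_fov_deg)
    return math.degrees(2.0 * math.atan(height * dpsi / (2.0 * width)))
```

For a 180° horizontal FOV this gives roughly 99.3° on a 4:3 sensor and roughly 82.9° on a 16:9 sensor, matching the figures quoted in the text.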
More generally, when the procedure from Eq. 17 through Eq. 23 is repeated on an interval containing the third intersection point O″, then Eq. 27 can be obtained.
W/(ψ_{2}−ψ_{1})=H/(tan δ_{2}−tan δ_{1})=x″/ψ=y″/tan δ [Math Figure 27]
Therefore, when settingup the desirable size of the processed image plane and the FOV, it must be ensured that Eq. 27 is satisfied.
If the processed image plane in
Likewise, the vertical incidence angle of an incident ray corresponding to the third point having a longitudinal coordinate y″ is, from Eq. 27, given as Eq. 29.
δ=tan^{−1}{(ψ_{2}−ψ_{1})y″/W} [Math Figure 29]
Therefore, the signal value of a third point on the processed image plane having an ideal projection scheme must be given as the signal value of an image point on the image sensor plane formed by an incident ray originating from an object point on the object plane having a horizontal incidence angle (i.e., the longitude) given by Eq. 28 and a vertical incidence angle (i.e., the latitude) given by Eq. 29.
The location of the object point Q on the object plane having said horizontal and vertical incidence angles can be obtained by the following method. Referring to
In Eq. 30, {circumflex over (X)}=(1,0,0) is the unit vector along the Xaxis, and likewise, Ŷ=(0,1,0) and {circumflex over (Z)}=(0,0,1) are the unit vectors along the Yaxis and the Zaxis, respectively. On the other hand, the said vector
Here, R is the magnitude of the said vector
X=X̂·R⃗ [Math Figure 33]
Y=Ŷ·R⃗ [Math Figure 34]
Z=Ẑ·R⃗ [Math Figure 35]
In Eqs. 32 through 35, the dot (·) represents a scalar product.
On the other hand, the said direction vector can be given by Eq. 36 as a function of the two incidence angles describing the projection scheme of the current invention, namely the horizontal incidence angle ψ and the vertical incidence angle δ. Hereinafter, this coordinate system will be referred to as a cylindrical polar coordinate system.
R̂=cos δ sin ψX̂+sin δŶ+cos δ cos ψẐ [Math Figure 36]
Using these two incidence angles, the rectangular coordinate can be given as follows.
X=R cos δ sin ψ [Math Figure 37]
Y=R sin δ [Math Figure 38]
Z=R cos δ cos ψ [Math Figure 39]
Using Eqs. 37 through 39, the horizontal and the vertical incidence angles can be obtained from the rectangular coordinate (X, Y, Z) of the object point as in Eqs. 40 and 41.
ψ=tan^{−1}(X/Z) [Math Figure 40]
δ=tan^{−1}{Y/√(X²+Z²)} [Math Figure 41]
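The forward and inverse relations between the incidence angles and the rectangular coordinate can be sketched as a pair of helper functions (names are ours; the inverse forms ψ=tan⁻¹(X/Z) and δ=tan⁻¹(Y/√(X²+Z²)) follow directly from Eqs. 37 through 39):

```python
import math

def angles_to_xyz(psi, delta, R=1.0):
    """Rectangular coordinate of an object point at distance R from its
    horizontal incidence angle psi and vertical incidence angle delta."""
    X = R * math.cos(delta) * math.sin(psi)   # Eq. 37
    Y = R * math.sin(delta)                   # Eq. 38
    Z = R * math.cos(delta) * math.cos(psi)   # Eq. 39
    return X, Y, Z

def xyz_to_angles(X, Y, Z):
    """Inverse relations derived from Eqs. 37-39; atan2 resolves the
    quadrant where a plain arctangent would be ambiguous."""
    psi = math.atan2(X, Z)
    delta = math.atan2(Y, math.hypot(X, Z))
    return psi, delta
```

The two functions are mutual inverses for any object point in front of the camera.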
On the other hand, since the coordinates given in the spherical polar coordinate system and in the cylindrical polar coordinate system must agree, the following relations given in Eqs. 42 through 44 must hold.
sin θ cos φ=cos δ sin ψ [Math Figure 42]
sin θ sin φ=sin δ [Math Figure 43]
cos θ=cos δ cos ψ [Math Figure 44]
Eq. 45 can be obtained by dividing Eq. 43 by Eq. 42.
tan φ=tan δ/sin ψ [Math Figure 45]
Therefore, the azimuth angle φ is given by Eq. 46.
φ=tan^{−1}(tan δ/sin ψ) [Math Figure 46]
On the other hand, from Eq. 44, the zenith angle θ is given by Eq. 47.
θ=cos^{−1}(cos δ cos ψ) [Math Figure 47]
In the reverse direction, to convert from the spherical polar coordinate to the cylindrical polar coordinate, Eq. 48 can be obtained by dividing Eq. 42 by Eq. 44.
tan ψ=tan θ cos φ [Math Figure 48]
Therefore, the horizontal incidence angle is given by Eq. 49.
ψ=tan^{−1}(tan θ cos φ) [Math Figure 49]
On the other hand, from Eq. 43, the vertical incidence angle is given by Eq. 50.
δ=sin^{−1}(sin θ sin φ) [Math Figure 50]
Therefore, an incident ray having a horizontal incidence angle ψ and a vertical incidence angle δ is an incident ray in the spherical polar coordinate system having a zenith angle θ given by Eq. 47 and an azimuth angle φ given by Eq. 46. In order to process an image, the position on the image sensor plane corresponding to an incident ray having such a zenith angle θ and an azimuth angle φ must be determined.
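The two coordinate conversions above can be sketched as a pair of functions (names ours), one per direction:

```python
import math

def cyl_to_sph(psi, delta):
    """Zenith angle theta and azimuth angle phi of an incident ray from
    its horizontal (psi) and vertical (delta) incidence angles."""
    theta = math.acos(math.cos(delta) * math.cos(psi))  # Eq. 47
    phi = math.atan2(math.tan(delta), math.sin(psi))    # Eq. 46, via Eq. 45
    return theta, phi

def sph_to_cyl(theta, phi):
    """Inverse conversion (Eqs. 49 and 50); the arctangent form is
    valid for zenith angles below 90 degrees."""
    psi = math.atan(math.tan(theta) * math.cos(phi))    # Eq. 49
    delta = math.asin(math.sin(theta) * math.sin(phi))  # Eq. 50
    return psi, delta
```

Round-tripping any ray within the front hemisphere recovers the original incidence angles.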
For a fisheye lens with an ideal equidistance projection scheme, the image height r is given by Eq. 51.
r(θ)=fθ [Math Figure 51]
In Eq. 51, the unit of the incidence angle θ is radian, and f is the effective focal length of the fisheye lens. The unit of the image height r is identical to the unit of the effective focal length f. The difference between the ideal equidistance projection scheme given by Eq. 51 and the real projection scheme of the lens is the f−θ distortion. Nevertheless, it is considerably difficult for a fisheye lens to faithfully implement the projection scheme given by Eq. 51, and the discrepancy can be as large as 10%. Furthermore, the applicability of the present image processing algorithm is not limited to a fisheye lens with an equidistance projection scheme. Therefore, it is assumed that the projection scheme of a lens is given as a general function of the zenith angle θ of the incident ray as given in Eq. 52.
r=r(θ) [Math Figure 52]
This function is a monotonically increasing function of the zenith angle θ of the incident ray.
Such a real projection scheme of a lens can be experimentally measured using an actual lens, or can be calculated from the lens prescription using dedicated lens design software such as Code V or Zemax. For example, the yaxis coordinate y of an image point on the focal plane by an incident ray having given horizontal and vertical incidence angles can be calculated using a Zemax operator REAY, and the xaxis coordinate x can be similarly calculated using an operator REAX.
r(θ)=α_{1}θ+α_{2}θ^{2}+α_{3}θ^{3}+α_{4}θ^{4}+α_{5}θ^{5} [Math Figure 53]
Table 2 shows the polynomial coefficients in Eq. 53.
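The polynomial of Eq. 53 can be evaluated as below. Note that the actual coefficients of Table 2 are not reproduced in this excerpt; the values used here are placeholders for illustration only:

```python
# Hypothetical coefficients standing in for Table 2; the patent's actual
# measured values are not reproduced in this excerpt.
ALPHA = [1.56, -0.02, 0.12, -0.01, 0.002]  # alpha_1 .. alpha_5

def real_image_height(theta, alpha=ALPHA):
    """Eq. 53: r(theta) = a1*theta + a2*theta^2 + ... + a5*theta^5.
    The real projection scheme must be a monotonically increasing
    function of the zenith angle theta (in radians)."""
    return sum(a * theta ** (n + 1) for n, a in enumerate(alpha))
```

Any coefficient set fitted to a real lens should preserve the monotonicity required of a projection scheme.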
x′=gr(θ)cos φ [Math Figure 54]
y′=gr(θ)sin φ [Math Figure 55]
Using Eqs. 27 through 55, a panoramic image having an ideal projection scheme can be extracted from an image acquired using a fisheye lens exhibiting a distortion aberration. First, depending on the user's need, a desirable size (W, H) of the panoramic image and the location of the third intersection point O″ are determined. The said third intersection point can even be located outside the processed image plane. In other words, the range of the lateral coordinate (x″_{1}≦x″≦x″_{2}) on the processed image plane as well as the range of the longitudinal coordinate (y″_{1}≦y″≦y″_{2}) can take arbitrary real numbers. Also, the horizontal FOV Δψ of this panoramic image (i.e., the processed image plane) is determined. Then, the horizontal incidence angle ψ and the vertical incidence angle δ of an incident ray corresponding to the third point in the panoramic image having a rectangular coordinate (x″, y″) can be obtained using Eqs. 28 and 29. Then, the zenith angle θ and the azimuth angle φ of an incident ray having the said horizontal and vertical incidence angles are calculated using Eqs. 47 and 46. Next, the real image height r corresponding to the zenith angle θ of the incident ray is obtained using Eq. 52. Utilizing the real image height r, the magnification ratio g, and the azimuth angle φ of the incident ray, the rectangular coordinate (x′, y′) of the image point on the uncorrected image plane is obtained using Eqs. 54 and 55. In this procedure, the coordinate of the second intersection point on the uncorrected image plane, or equivalently the location of the first intersection point on the sensor plane, has to be accurately determined. Such a location can be easily found using various methods, including image processing methods. Since such techniques are well known to people in this field, they will not be described in this document.
Finally, the video signal (i.e., RGB signal) from the image point by the fisheye lens having this rectangular coordinate is given as the video signal for the image point on the panoramic image having the rectangular coordinate (x″, y″). A panoramic image having an ideal projection scheme can be obtained by image processing for all the image points on the processed image plane by the abovedescribed method.
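The chain of steps above can be sketched as a single mapping function. This is a sketch under stated assumptions: the function name and the proportionality constant c are ours, and the forms ψ=x″/c and δ=tan⁻¹(y″/c) with c=W/Δψ are assumed for Eqs. 28 and 29, consistent with the cylindrical projection described:

```python
import math

def panorama_map(x2, y2, c, g, proj):
    """Map a point (x'', y'') on the processed image plane (measured from
    the third intersection point O'') to the point (x', y') on the
    uncorrected image plane.

    c    -- proportionality constant, c = W / (horizontal FOV in radians)
    g    -- magnification ratio of the uncorrected image plane
    proj -- callable r(theta): the real projection scheme of the lens (Eq. 52)
    """
    psi = x2 / c                     # horizontal incidence angle (assumed Eq. 28)
    delta = math.atan(y2 / c)        # vertical incidence angle (assumed Eq. 29)
    theta = math.acos(math.cos(delta) * math.cos(psi))  # zenith angle, Eq. 47
    phi = math.atan2(math.tan(delta), math.sin(psi))    # azimuth angle, Eq. 46
    r = proj(theta)                  # real image height, Eq. 52
    return g * r * math.cos(phi), g * r * math.sin(phi)  # Eqs. 54-55
```

The video signal found at (x′, y′) on the uncorrected image plane is then assigned to (x″, y″) on the processed image plane.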
However, in reality, a complication arises due to the fact that all the image sensors and display devices are digitized devices.
There is an image point—i.e., the first point—on the image sensor plane corresponding to a pixel P″ on the said processed image plane(2335). The horizontal incidence angle of an incident ray in the world coordinate system forming an image at this first point can be written as ψ_{I,J}≡ψ(I, J). Also, the vertical incidence angle can be written as δ_{I,J}≡δ(I, J). Incidentally, the location of this first point does not generally coincide with the exact location of any one pixel.
Here, if the said screen(2335) corresponds to a panoramic image, then as given by Eq. 56, the horizontal incidence angle must be a sole function of the lateral pixel coordinate J.
ψ_{I,J}=ψ_{J}≡ψ(J) [Math Figure 56]
Likewise, the vertical incidence angle must be a sole function of the longitudinal pixel coordinate I.
δ_{I,J}=δ_{I}≡δ(I) [Math Figure 57]
Furthermore, if an equidistance projection scheme is satisfied in the lateral direction, and a rectilinear projection scheme is satisfied in the longitudinal direction, then the range of the horizontal incidence angle and the range of the vertical incidence angle must satisfy the relation given in Eq. 58.
Compared with the image correction method described previously, the image correction method for a digitized image goes through the following procedure. First, the real projection scheme of the wide-angle lens that is meant to be used in the image processing is obtained either by experiment or from the accurate lens prescription. Herein, when an incident ray having a zenith angle θ with respect to the optical axis forms a sharp image point on the image sensor plane by the image-forming properties of the lens, the real projection scheme of the lens refers to the distance r from the intersection point O between the said image sensor plane and the optical axis to the said image point, obtained as a function of the zenith angle θ of the incident ray.
r=r(θ) [Math Figure 59]
Said function is a monotonically increasing function of the zenith angle θ. Next, the location of the optical axis on the uncorrected image plane, in other words, the location of the second intersection point O′ corresponding to the first intersection point O on the image sensor plane, is obtained. The pixel coordinate of this second intersection point is assumed as (K_{o}, L_{o}). In addition to this, the magnification ratio g of the pixel distance r′ on the uncorrected image plane over the real image height r on the image sensor plane is obtained. This magnification ratio g is given by Eq. 60.
g=r′/r [Math Figure 60]
Once such a series of preparatory stages has been completed, a camera mounted with the said fisheye lens is installed with its optical axis aligned parallel to the ground plane, and a raw image (i.e., an uncorrected image plane) is acquired. Next, the desirable size of the processed image plane and the location (I_{o}, J_{o}) of the third intersection point are determined, and then the horizontal incidence angle ψ_{J} and the vertical incidence angle δ_{I} given by Eqs. 61 and 62 are computed for all the pixels (I, J) on the said processed image plane.
ψ_{J}=(Δψ/W)(J−J_{o}) [Math Figure 61]
δ_{I}=tan^{−1}{(Δψ/W)(I−I_{o})} [Math Figure 62]
From these horizontal and vertical incidence angles, the zenith angle θ_{I,J} and the azimuth angle φ_{I,J} of the incident ray in the first rectangular coordinate system are obtained using Eqs. 63 and 64.
θ_{I,J}=cos^{−1}(cos δ_{I} cos ψ_{J}) [Math Figure 63]
φ_{I,J}=tan^{−1}(tan δ_{I}/sin ψ_{J}) [Math Figure 64]
Next, the image height r_{I,J }on the image sensor plane is obtained using Eqs. 63 and 59.
r_{I,J}=r(θ_{I,J}) [Math Figure 65]
Next, using the location (K_{o}, L_{o}) of the second intersection point on the uncorrected image plane and the magnification ratio g, the location of the second point(2407) on the uncorrected image plane is obtained using Eqs. 66 and 67.
x′_{I,J}=L_{o}+gr_{I,J }cos φ_{I,J} [Math Figure 66]
y′_{I,J}=K_{o}+gr_{I,J }sin φ_{I,J} [Math Figure 67]
The location of the said second point does not exactly coincide with the location of any one pixel. Therefore, (x′_{I,J}, y′_{I,J}) can be considered as the coordinate of a virtual pixel on the uncorrected image plane corresponding to the third point on the processed image plane, and is in general given by real numbers.
Since the said second point does not coincide with any one pixel, an interpolation method must be used for image processing. The coordinate of a pixel(2408) that is nearest to the location of the said second point can be obtained using Eqs. 68 and 69.
K=int(y′_{I,J}) [Math Figure 68]
L=int(x′_{I,J}) [Math Figure 69]
Here, int(x) is a function whose output is the integer closest to the real number x. Then, the signal value P(K, L) stored at that pixel is copied and assigned as the signal value S(I, J) for the corresponding pixel in the unwrapped panoramic image.
S(I,J)=P(K,L) [Math Figure 70]
Such a geometrical transformation is very simple, but it has the advantage of fast execution time even for a case where there is a large number of pixels in the panoramic image.
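The nearest-neighbour procedure of Eqs. 63 through 70 can be sketched as a double loop over the processed image plane. This is a sketch under stated assumptions: the function name is ours, the third intersection point is taken at the image centre, and the forms ψ_{J}=(Δψ/W)(J−J_{o}) and δ_{I}=tan⁻¹{(Δψ/W)(I−I_{o})} are assumed for Eqs. 61 and 62:

```python
import math

def extract_panorama(raw, W_out, H_out, dpsi, g, Ko, Lo, proj):
    """Nearest-neighbour extraction of an unwrapped panoramic image.

    raw          -- uncorrected image plane as a list of rows of pixel values
    W_out, H_out -- size of the processed image plane in pixels
    dpsi         -- horizontal FOV of the panorama in radians
    g            -- magnification ratio (pixels per unit of real image height)
    Ko, Lo       -- pixel coordinate of the second intersection point O'
    proj         -- callable r(theta): real projection scheme of the lens
    """
    Io, Jo = H_out / 2.0, W_out / 2.0  # third intersection point (assumed centred)
    out = [[0] * W_out for _ in range(H_out)]
    for I in range(H_out):
        delta = math.atan((I - Io) * dpsi / W_out)              # assumed Eq. 62
        for J in range(W_out):
            psi = (J - Jo) * dpsi / W_out                       # assumed Eq. 61
            theta = math.acos(math.cos(delta) * math.cos(psi))  # Eq. 63
            phi = math.atan2(math.tan(delta), math.sin(psi))    # Eq. 64
            r = proj(theta)                                     # Eq. 65
            xp = Lo + g * r * math.cos(phi)                     # Eq. 66
            yp = Ko + g * r * math.sin(phi)                     # Eq. 67
            K, L = int(round(yp)), int(round(xp))               # Eqs. 68-69
            if 0 <= K < len(raw) and 0 <= L < len(raw[0]):
                out[I][J] = raw[K][L]                           # Eq. 70
    return out
```

Pixels whose source point falls outside the raw image are simply left blank.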
The unwrapped panoramic image that is image-processed by this simplest method has a shortcoming in that the boundary between two different objects appears jagged like a saw tooth when the number of pixels in the image sensor is not sufficient, or when the unwrapped panoramic image is magnified. To remedy this shortcoming, a bilinear interpolation method can be employed. Referring to
S(I,J)=(1−Δ_{y})(1−Δ_{x})P(K,L)+Δ_{y}(1−Δ_{x})P(K+1,L)+(1−Δ_{y})Δ_{x}P(K,L+1)+Δ_{y}Δ_{x}P(K+1,L+1) [Math Figure 71]
When such a bilinear interpolation method is used, the image becomes sharper. However, since the computational load is increased, it can become an obstacle in an imaging system operating in real time such as the one generating live video signals. On the other hand, if interpolation methods such as bicubic interpolation or spline interpolation methods are used, then a more satisfactory image can be obtained, but the computational load is increased still more. To prevent the decrease in speed (i.e., frame rate) due to such an increase in the computational load, the image processing can be done by hardware such as FPGA chips.
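The weighting of Eq. 71 can be sketched as a small sampling function (name ours); here K and L are the integer parts and (Δx, Δy) the fractional parts of the virtual pixel coordinate:

```python
def bilinear_sample(raw, xp, yp):
    """Bilinear interpolation (Eq. 71) at the real-valued point (xp, yp)
    on the uncorrected image plane. Valid when the four surrounding
    pixels exist, i.e. away from the last row and column."""
    K, L = int(yp), int(xp)          # integer parts
    dy, dx = yp - K, xp - L          # fractional parts (Delta_y, Delta_x)
    return ((1 - dy) * (1 - dx) * raw[K][L]
            + dy * (1 - dx) * raw[K + 1][L]
            + (1 - dy) * dx * raw[K][L + 1]
            + dy * dx * raw[K + 1][L + 1])
```

Replacing the nearest-neighbour lookup of Eq. 70 with this call is what smooths the jagged boundaries, at the cost of the extra arithmetic discussed above.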
On the other hand, still other problems may occur when the image is magnified or reduced, in other words, when the image is zoomed in or zoomed out by software. For example, the image can appear blurry when it is excessively scaled up, and a moiré effect can occur when it is scaled down. Furthermore, as a result of image processing, different regions of the same screen can be scaled up and scaled down at the same time, and the two adverse effects can appear simultaneously. To mitigate these problems, filtering operations are generally undertaken.
On the other hand,
If the maximum FOV of this fisheye lens is given as 2θ_{2}, then the image height of an incident ray at the image sensor plane having the maximum zenith angle is given as r_{2}≡r(θ_{2}). Here, the desirable image height is given by Eq. 73.
Therefore, the image circle(2833) contacts the left edge(2813L) and the right edge(2813R) of the image sensor plane(2813). In this configuration, the imaging system makes the most of the pixels on the image sensor plane and provides a satisfactory processed image plane.
r(θ)=α_{1}θ+α_{2}θ^{2}+α_{3}θ^{3} [Math Figure 74]
Here, the unit of the zenith angle is radian. Table 3 shows the coefficients of the third order polynomial.
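The polynomial of Math Figure 74 can be evaluated as below; the coefficient values are hypothetical placeholders, since Table 3 is not reproduced here (the actual values come from a fit to the measured projection scheme of the lens).

```python
# Coefficients a1, a2, a3 of Math Figure 74. The values below are
# hypothetical placeholders standing in for Table 3.
A1, A2, A3 = 1.0, -0.05, 0.002

def image_height(theta):
    """r(theta) = a1*theta + a2*theta**2 + a3*theta**3, theta in radians."""
    return A1 * theta + A2 * theta ** 2 + A3 * theta ** 3
```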
Due to the experimental measurement error, it is conjectured that the discrepancy between the approximate projection scheme given by Table 3 and the real projection scheme of the lens is no larger than 3 pixels.
This algorithm can be verified using a scientific computation program such as MatLab. The following is the algorithm for extracting a panoramic image having a horizontal FOV of 180° from the image given in
As has been stated previously, a cylindrical projection scheme in the strict sense of the word is not widely used. Although it provides a mathematically most precise panoramic image, the image does not appear natural to the naked eye when the vertical FOV (i.e., Δδ=δ_{2}−δ_{1}) is large.
The cylindrical projection scheme of the first embodiment can be generalized as follows. The lateral coordinate x″ on the processed image plane is proportional to the horizontal incidence angle. Therefore, as in the first embodiment, a relation given in Eq. 75 holds.
On the other hand, the longitudinal coordinate y″ on the processed image plane is proportional to a monotonic function of the vertical incidence angle as given in Eq. 76.
y″∝F(δ) [Math Figure 76]
Here, F(δ) is a continuous and monotonic function of the vertical incidence angle δ. Therefore, a relation given by Eq. 77, corresponding to Eq. 22, holds as follows.
Therefore, the span of the horizontal incidence angle, the span of the vertical incidence angle, and the size of the processed image plane satisfy the following relation.
Also, the horizontal incidence angle corresponding to the third point on the processed image plane having a lateral coordinate x″ and a longitudinal coordinate y″ is given by Eq. 79, and the vertical incidence angle is given by Eq. 80.
Here, F^{−1} is the inverse function of the function F. The said cylindrical projection scheme in the first embodiment is a case where the function F is given by Eq. 81.
F(δ)=tan δ [Math Figure 81]
On the other hand, if the said general projection scheme is specifically an equirectangular projection scheme, then the said function is given by Eq. 82.
F(δ)=δ [Math Figure 82]
Therefore, the ranges of the horizontal and the vertical incidence angles and the size of the processed image plane satisfy the following relation.
Also, the vertical incidence angle is given by Eq. 84.
On the other hand, if the said general projection scheme is specifically a Mercator projection scheme, then the said function is given by Eq. 85.
Also, the ranges of the horizontal and the vertical incidence angles and the size of the processed image plane satisfy the following relation.
On the other hand, the vertical incidence angle is given by Eq. 87.
As in the first embodiment of the present invention, considering the fact that the image sensor plane, the uncorrected image plane and the processed image plane are all digitized, the image processing methods of the first and the second embodiments comprise a stage of acquiring an uncorrected image plane while the optical axis of a camera mounted with a rotationally-symmetric wideangle lens and the lateral sides of the image sensor plane are aligned parallel to the ground plane, and an image processing stage of extracting a processed image plane from the uncorrected image plane. The said uncorrected image plane is a two-dimensional array having K_{max} rows and L_{max} columns, the pixel coordinate of the optical axis on the said uncorrected image plane is (K_{o}, L_{o}), and the real projection scheme of the said lens is given as the function in Eq. 88.
r=r(θ) [Math Figure 88]
Here, the real projection scheme of the lens refers to the image height r obtained as a function of the zenith angle θ of the incident ray, and the magnification ratio g of the said camera is given by Eq. 89, wherein r′ is a pixel distance on the uncorrected image plane corresponding to the image height r.
The said processed image plane is a two-dimensional array having I_{max} rows and J_{max} columns, the pixel coordinate of the optical axis on the processed image plane is (I_{o}, J_{o}), and the horizontal incidence angle ψ_{I,J}≡ψ(I, J)=ψ_{J} of an incident ray corresponding to a pixel having a coordinate (I, J) on the said processed image plane is given by Eq. 90 as a sole function of the said pixel coordinate J.
Here, ψ_{1} is the horizontal incidence angle corresponding to J=1, and ψ_{Jmax} is the horizontal incidence angle corresponding to J=J_{max}. On the other hand, the vertical incidence angle δ_{I,J}≡δ(I, J)=δ_{I} of the said incident ray is given by Eq. 91 as a sole function of the said pixel coordinate I.
Here, F^{−1 }is the inverse function of a continuous and monotonically increasing function F(δ) of the incidence angle δ, and the signal value of a pixel having a coordinate (I, J) on the said processed image plane is given by the signal value of a virtual pixel having a coordinate (x′_{I,J},y′_{I,J}) on the uncorrected image plane, wherein the said coordinate (x′_{I,J},y′_{I,J}) of the virtual pixel is obtained from the series of equations given in Eqs. 92 through 96.
If the projection scheme of the panoramic image is a cylindrical projection scheme, then the said function F is given by Eq. 97, and the said vertical incidence angle is given by Eq. 98.
If the projection scheme of the panoramic image is an equirectangular projection scheme, then the said function F is given by Eq. 99, and the said vertical incidence angle is given by Eq. 100.
If the projection scheme of the panoramic image is a Mercator projection scheme, then the said function F is given by Eq. 101, and the said vertical incidence angle is given by Eq. 102.
The procedure for finding the location of the image point corresponding to given horizontal and vertical incidence angles and interpolating the image is identical to the procedure given in Eqs. 63 through 71.
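The three choices of F(δ) and their inverses, used to recover the vertical incidence angle, can be collected in one sketch. Since Eqs. 101 and 102 are not reproduced above, the standard Mercator form F(δ)=ln(tan δ+sec δ)=asinh(tan δ) is assumed here; the dictionary and function names are illustrative.

```python
import math

# F(delta) and its inverse for the three projection schemes. Cylindrical
# (Eq. 97) and equirectangular (Eq. 99) follow the text; for Mercator the
# standard form F(delta) = asinh(tan(delta)) is assumed.
PROJECTIONS = {
    "cylindrical": (math.tan, math.atan),
    "equirectangular": (lambda d: d, lambda y: y),
    "mercator": (lambda d: math.asinh(math.tan(d)),
                 lambda y: math.atan(math.sinh(y))),
}

def vertical_incidence_angle(scheme, y_value):
    """delta = F^{-1}(y): recover the vertical incidence angle from the
    suitably scaled longitudinal coordinate on the processed image plane."""
    _, f_inverse = PROJECTIONS[scheme]
    return f_inverse(y_value)
```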
The field of view of typical wideangle lenses used in the security and surveillance area is 90° at the maximum, and a security lens having this much FOV generally exhibits a considerable degree of distortion. A lens having a FOV larger than this is not widely used, because such a lens exhibits excessive distortion and causes psychological discomfort.
Usually, lighting(3471) is installed on an interior ceiling. Since a typical wideangle camera is installed facing down toward the interior at an angle, the said lighting is out of the field of view(3463) of the camera, and the intense rays(3473) from the lighting are not captured by the camera.
In order to acquire a natural-looking panoramic image, the optical axis(3601) of the image acquisition means(3610) must be parallel to the ground plane. Also, the security camera must be installed at a high place that is out of people's reach. Therefore, as schematically illustrated in
Under this physical environment, a panoramic image with a horizontal FOV around 180° can be obtained, such as the ones shown in the first and the second embodiments of the present invention, while the vertical FOV is made asymmetrical. A cylindrical projection scheme is desirable when the vertical FOV is near the standard FOV of around 60°, and a Mercator projection scheme is desirable when the FOV is larger. Also, in an application where no dead zone in the security monitoring can be tolerated, either at the ceiling or at the floor, an equirectangular projection scheme can be used. In this case, the horizontal FOV as well as the vertical FOV will both be 180°.
It is desirable to install the image acquisition means(3710) of the fourth embodiment of the present invention on top of the trunk of a passenger car, and to align the optical axis at a certain angle with the ground plane. Furthermore, a fisheye lens with a FOV larger than 180° and following an equidistance projection scheme is most preferable, and the image display means is desirably installed next to the driver seat.
Using a wideangle camera with its optical axis inclined toward the ground plane, it is possible to obtain a panoramic image such as those shown in the first and the second embodiments of the present invention. The world coordinate system of this embodiment takes the nodal point N of the imaging system(3710) as the origin, and takes a vertical line that is perpendicular to the ground plane(3717) as the Yaxis, and the Zaxis is set parallel to the car(3751) axle. According to the convention of right handed coordinate system, the positive direction of the Xaxis is the direction directly plunging into the paper in
Regarding the rotation of coordinate system, it is convenient to use the Euler matrices. For this, the coordinate of an object point Q in a threedimensional space is designated as a threedimensional vector as given below.
Here,
Likewise, the matrix given in Eq. 105 can be used to find the coordinate of a new point which is obtainable by rotating the point Q by angle −β around the Yaxis, and the matrix given in Eq. 106 can be used to find the coordinate of a new point which is obtainable by rotating the point Q by angle −γ around the Zaxis.
Matrices in Eqs. 104 through 106 describe the case where the coordinate system is fixed and the point in space has been rotated, but the same matrices can also describe the case where the point in space is fixed and the coordinate system has been rotated in the reverse direction. These two cases are mathematically equivalent. Therefore, the coordinate of a point Q in the first world coordinate system that is obtained by rotating the world coordinate system by angle α around the Xaxis as indicated in
Using the matrix given in Eq. 104, the coordinate in the first world coordinate system can be given as follows in terms of the coordinate in the world coordinate system.
X′=X [Math Figure 108]
Y′=Y cos α+Z sin α [Math Figure 109]
Z′=−Y sin α+Z cos α [Math Figure 110]
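Eqs. 108 through 110 amount to a single rotation about the Xaxis, which can be sketched as follows (the function name is illustrative).

```python
import numpy as np

def rotate_about_x(point, alpha):
    """Math Figures 108-110: coordinate of a point in the first world
    coordinate system, obtained by rotating the world coordinate system
    by angle alpha (radians) around the Xaxis."""
    X, Y, Z = point
    return np.array([X,                                       # X' = X
                     Y * np.cos(alpha) + Z * np.sin(alpha),   # Y' (Eq. 109)
                     -Y * np.sin(alpha) + Z * np.cos(alpha)]) # Z' (Eq. 110)
```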
Referring to
Next, it is assumed that an incident ray having these horizontal and vertical incidence angles has been originated from an object point on a hemisphere with a radius 1 and having its center at the nodal point of the lens. Then, the coordinate of the said object point in the world coordinate system is given by Eqs. 114 through 116.
X=cos δ sin ψ [Math Figure 114]
Y=sin δ [Math Figure 115]
Z=cos δ cos ψ [Math Figure 116]
The coordinate of this object point in the first world coordinate system is given by Eqs. 108 through 110. The X′, Y′ and Z′axes of this first world coordinate system are parallel to the x, y, and zaxes of the first rectangular coordinate system, respectively. Therefore, the zenith and the azimuth angles of the incident ray are given by Eqs. 117 and 118 by the same method in the first embodiment.
Finally, the location of the first point on the image sensor plane having these zenith and azimuth angles can be obtained by the same methods as in the first and the second embodiments.
Identical to the previous examples, considering the fact that all the image sensors and display devices are digital devices, image processing procedure must use the following set of equations. After a series of preparatory stages have been taken as in the first and the second embodiments, the desirable size of the processed image plane and the location (I_{o}, J_{o}) of the third intersection point are determined, and then the horizontal incidence angle ψ_{J }and the vertical incidence angle δ_{I }given by Eqs. 119 and 120 are computed for all the pixels (I, J) on the said processed image plane.
From these horizontal and vertical incidence angles, the coordinate of an imaginary object point in the world coordinate system is calculated using Eqs. 121 through 123.
X_{I,J}=cos δ_{I }sin ψ_{J} [Math Figure 121]
Y_{I,J}=sin δ_{I} [Math Figure 122]
Z_{I,J}=cos δ_{I }cos ψ_{J} [Math Figure 123]
From this coordinate of the object point in the world coordinate system, the coordinate of the object point in the first world coordinate system is obtained using Eqs. 124 through 126.
X′_{I,J}=X_{I,J} [Math Figure 124]
Y′_{I,J}=Y_{I,J }cos α+Z_{I,J }sin α [Math Figure 125]
Z′_{I,J}=−Y_{I,J }sin α+Z_{I,J }cos α [Math Figure 126]
From this coordinate, the zenith angle θ_{I,J }and the azimuth angle Φ_{I,J }of the incident ray are computed using Eqs. 127 and 128.
Next, the image height r_{u }on the image sensor plane is calculated using Eq. 129.
r_{I,J}=r(θ_{I,J}) [Math Figure 129]
Then, the position (K_{o}, L_{o}) of the second intersection point on the uncorrected image plane and the magnification ratio g are used to find the position of the second point on the uncorrected image plane.
x′_{I,J}=L_{o}+gr_{I,J }cos(φ_{I,J}) [Math Figure 130]
y′_{I,J}=K_{o}+gr_{I,J }sin(φ_{I,J}) [Math Figure 131]
Once the position of the corresponding second point has been found, then interpolation methods such as described in the first and the second embodiments can be used to obtain a panoramic image.
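The whole per-pixel mapping of this embodiment (Eqs. 121 through 131) can be sketched as follows. The zenith and azimuth angles of Eqs. 127 and 128 are not reproduced above; θ=cos⁻¹(Z′) and an arctangent of (X′, Y′) are assumed here, consistent with Eqs. 163 and 173. All function arguments (the incidence-angle callables, the lens projection, g, K_o, L_o) are assumptions supplied by the caller.

```python
import numpy as np

def panorama_maps(I_max, J_max, psi_of_J, delta_of_I, alpha, r_func, g, K_o, L_o):
    """Map every panoramic pixel (I, J) to a position (x', y') on the
    uncorrected image plane for a camera tilted by angle alpha (radians)."""
    J = np.arange(1, J_max + 1, dtype=float)
    I = np.arange(1, I_max + 1, dtype=float)
    psi = psi_of_J(J)[np.newaxis, :]      # one horizontal angle per column (Eq. 119)
    delta = delta_of_I(I)[:, np.newaxis]  # one vertical angle per row (Eq. 120)
    # Object point on the unit hemisphere (Eqs. 121-123)
    X = np.cos(delta) * np.sin(psi)
    Y = np.sin(delta) * np.ones_like(psi)
    Z = np.cos(delta) * np.cos(psi)
    # Coordinates in the first world coordinate system (Eqs. 124-126)
    Xp = X
    Yp = Y * np.cos(alpha) + Z * np.sin(alpha)
    Zp = -Y * np.sin(alpha) + Z * np.cos(alpha)
    # Zenith and azimuth angles of the incident ray (assumed forms of Eqs. 127-128)
    theta = np.arccos(Zp)
    phi = np.arctan2(Yp, Xp)
    # Image height and second-point position (Eqs. 129-131)
    r = r_func(theta)
    x_map = L_o + g * r * np.cos(phi)
    y_map = K_o + g * r * np.sin(phi)
    return x_map, y_map
```

The resulting maps can then be fed to an interpolating remapper such as the nearest-neighbor or bilinear sketches above.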
One point which needs special attention when using such an imaging system as a car rear view camera is the following: for a device (i.e., a car) whose moving direction is the exact opposite of the optical axis direction of the image acquisition means, a panoramic image obtained by the methods in the first through the fourth embodiments can cause great confusion to the driver if it is displayed without any further processing. Since a car rear view camera points toward the backside of the car, the right end of the car appears as the left end on a monitor showing the images captured by the rear view camera. However, the driver can deceive himself into thinking that the image shows the left end of the car from his own viewpoint of looking at the front end of the car, and thus there is a great danger of possible accidents. To prevent such a perilous confusion, it is important to switch the left and the right sides of the image obtained using a car rear view camera before displaying it on the monitor. The video signal S′(I, J) for the pixel in the mirrored (i.e., the left and the right sides are exchanged) processed image plane with a coordinate (I, J) is given by the video signal S(I, J_{max}−J+1) of the pixel in the processed image plane with a coordinate (I, J_{max}−J+1).
S′(I,J)=S(I,J_{max}−J+1) [Math Figure 132]
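Math Figure 132 is a simple left-right flip of the processed image plane; a minimal sketch:

```python
import numpy as np

def mirror_left_right(S):
    """Math Figure 132: S'(I, J) = S(I, J_max - J + 1), i.e. exchange the
    left and the right sides of the processed image plane before display."""
    return S[:, ::-1]
```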
On the other hand, an identical system can be installed near the room mirror, the front bumper, or the radiator grill in order to be used as a recording camera connected to a car black box for the purpose of recording the vehicle's driving history.
The above embodiment has been described in relation to a car rear view camera, but it must be obvious that the usefulness of the invention described in this embodiment is not limited to a car rear view camera.
For large buses and trucks, it is necessary to monitor the lateral sides of the vehicle as well as its rear side. Such a side-monitoring imaging system will be especially useful when making a turn in a narrow alley, or when changing lanes on a highway. Also, it will be useful as a security-monitoring camera for preventing accidents while passengers are boarding or getting off the bus.
If the maximum FOV of this fisheye lens is given as 2θ_{2}, then the image height of an incident ray on the image sensor plane having the maximum zenith angle is given as r_{2}≡r(θ_{2}). Here, the desirable image height is given by Eq. 133.
Therefore, the image circle(4433) contacts the top edge(4413T) and the bottom edge(4413B) of the image sensor plane(4413). In this configuration, the same horizontal FOV can always be obtained even if the imaging system is slanted at an arbitrary angle with respect to the ground plane.
Referring to
x′″=x″ cos γ+y″ sin γ [Math Figure 134]
y′″=−x″ sin γ+y″ cos γ [Math Figure 135]
Therefore, the horizontal and the vertical incidence angles of an incident ray corresponding to this third point must be given by Eq. 136 and Eq. 137, respectively.
Therefore, the signal value of the third point on the processed image plane having an ideal projection scheme must be the signal value of an image point on the image sensor plane formed by an incident ray originated from an object point on the object plane having a horizontal incidence angle (i.e., the longitude) given by Eq. 136 and a vertical incidence angle (i.e., the latitude) given by Eq. 137. The zenith angle of this incident ray is given by Eq. 138, the azimuth angle is given by Eq. 139, and the image height is given by Eq. 140.
The image point corresponding to an object point having these zenith and azimuth angles has a twodimensional rectangular coordinate given by Eqs. 141 and 142 in the second rectangular coordinate system, which is a coordinate system with the axes rotated by angle γ with respect to the Yaxis.
x′=gr(θ)cos(γ+φ) [Math Figure 141]
y′=gr(θ)sin(γ+φ) [Math Figure 142]
Therefore, it suffices to assign the signal value of an image point on the uncorrected image plane having this rectangular coordinate as the signal value of the third point on the processed image plane.
Identical to the previous examples, considering the fact that all the image sensors and display devices are digital devices, image processing procedure must use the following set of equations. After a series of preparatory stages have been taken as in the first through the fifth embodiments, the desirable size of the processed image plane and the location (I_{o}, J_{o}) of the third intersection point are determined, and then the horizontal incidence angle ψ_{I,J }and the vertical incidence angle δ_{I,J }given by Eqs. 143 and 144 are computed for all the pixels (I, J) on the said processed image plane.
From these horizontal and vertical incidence angles, the zenith angle θ_{I,J }and the azimuth angle Φ_{I,J }of an incident ray in the world coordinate system are calculated using Eqs. 145 and 146.
Next, the image height r_{I,J }on the image sensor plane is calculated using Eq. 147.
r_{I,J}=r(θ_{I,J}) [Math Figure 147]
Next, the position (K_{o}, L_{o}) of the second intersection point on the uncorrected image plane and the magnification ratio g are used to find the position of the second point on the uncorrected image plane.
x′_{I,J}=L_{o}+gr_{I,J }cos(γ+φ_{I,J}) [Math Figure 148]
y′_{I,J}=K_{o}+gr_{I,J }sin(γ+φ_{I,J}) [Math Figure 149]
Once the position of the corresponding second point has been found, then interpolation methods such as described in the first through the third embodiments can be used to obtain a panoramic image.
The desirable examples of the monotonically increasing function F(δ) of the incidence angle in this embodiment can be given by Eqs. 97, 99 and 101.
When an airplane takes off from the ground, or when it is landing, or when it is making a turn, the airplane body leans sideways as well as toward the moving direction.
Regarding the rotation of coordinate system, it is convenient to use the Euler matrices as in the fourth embodiment. The coordinate of the said one point in the first world coordinate system, which is a coordinate system that has been rotated by angle α around the Xaxis as shown in
On the other hand, referring to
Here, a rotational operation by angle γ around the Z′axis is a completely different operation from a rotational operation by angle γ around the Zaxis. However, using the Euler matrices, a rotational matrix referring to the axes in the first world coordinate system can be written in terms of rotational matrices in the world coordinate system.
M_{Z′}(γ)=M_{X}(α)M_{Z}(γ)M_{X}(−α) [Math Figure 152]
Therefore, Eq. 151 can be simplified as follows.
Using the rotational matrices given in Eqs. 104 and 106, the coordinate in the second world coordinate system can be written in terms of the coordinate in the world coordinate system as follows.
X″=X cos γ+Y sin γ [Math Figure 154]
Y″=−X cos α sin γ+Y cos α cos γ+Z sin α [Math Figure 155]
Z″=X sin α sin γ−Y sin α cos γ+Z cos α [Math Figure 156]
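Eqs. 154 through 156 can be collected in a single sketch (the function name is illustrative):

```python
import numpy as np

def to_second_world(point, alpha, gamma):
    """Math Figures 154-156: coordinates in the second world coordinate
    system, i.e. a rotation by gamma around the Z'-axis composed with a
    rotation by alpha around the Xaxis (angles in radians)."""
    X, Y, Z = point
    Xpp = X * np.cos(gamma) + Y * np.sin(gamma)                # Eq. 154
    Ypp = (-X * np.cos(alpha) * np.sin(gamma)                  # Eq. 155
           + Y * np.cos(alpha) * np.cos(gamma)
           + Z * np.sin(alpha))
    Zpp = (X * np.sin(alpha) * np.sin(gamma)                   # Eq. 156
           - Y * np.sin(alpha) * np.cos(gamma)
           + Z * np.cos(alpha))
    return np.array([Xpp, Ypp, Zpp])
```

Setting gamma to zero recovers the single-rotation case of Eqs. 108 through 110.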
Referring to
If a wideangle imaging system is installed parallel to the body of a device such as an airplane or a ship as illustrated in
In this case, the following algorithm can be used to obtain a panoramic image that is parallel to the ground plane irrespective of the inclination angles of the device. First, such a system needs a direction sensing means as in the sixth embodiment, and this direction sensing means must provide two angular values α and γ to the image processing means.
Then, under the assumption that the said imaging system is parallel to the ground plane, the size of the processed image plane, the location of the intersection point, and the field of view are determined by the same method as in the examples of the previous embodiments. Therefore, the horizontal incidence angle corresponding to the lateral coordinate x″ on the processed image plane and the vertical incidence angle corresponding to the longitudinal coordinate y″ are given by Eqs. 157 through 159.
Next, it is assumed that an incident ray having these horizontal and vertical incidence angles has been originated from an object point on a hemisphere with a radius 1 and having its center at the nodal point of the said lens. Then, the coordinate of the said object point in the world coordinate system is given by Eqs. 160 through 162.
X=cos δ sin ψ [Math Figure 160]
Y=sin δ [Math Figure 161]
Z=cos δ cos ψ [Math Figure 162]
The coordinate of this object point in the second world coordinate system is given by Eqs. 154 through 156. The X″, Y″ and Z″axes of this second world coordinate system are parallel to the x, y, and zaxes of the first rectangular coordinate system, respectively. Therefore, the zenith and the azimuth angles of the incident ray are given by Eqs. 163 and 164 by the same method of the first embodiment.
θ=cos^{−1}(Z″) [Math Figure 163]
Finally, the location of the first point on the image sensor plane having these zenith and azimuth angles can be obtained by the same methods as in the first through the fourth embodiments.
Identical to the previous examples, considering the fact that all the image sensors and display devices are digital devices, image processing procedure must use the following set of equations. After a series of preparatory stages have been taken as in the first through the fourth embodiments, the desirable size of the processed image plane and the location (I_{o}, J_{o}) of the third intersection point are determined, and then the horizontal incidence angle ψ_{J }and the vertical incidence angle δ_{I }given by Eqs. 165 and 166 are computed for all the pixels (I, J) on the said processed image plane.
From these horizontal and vertical incidence angles, the coordinate of the imaginary object point in the world coordinate system is calculated using Eqs. 167 through 169.
X_{I,J}=cos δ_{I }sin ψ_{J} [Math Figure 167]
Y_{I,J}=sin δ_{I} [Math Figure 168]
Z_{I,J}=cos δ_{I }cos ψ_{J} [Math Figure 169]
From this coordinate of the object point in the world coordinate system, the coordinate of the object point in the second world coordinate system is obtained using Eqs. 170 through 172.
X″_{I,J}=X_{I,J }cos γ+Y_{I,J }sin γ [Math Figure 170]
Y″_{I,J}=−X_{I,J }cos α sin γ+Y_{I,J }cos α cos γ+Z_{I,J }sin α [Math Figure 171]
Z″_{I,J}=X_{I,J }sin α sin γ−Y_{I,J }sin α cos γ+Z_{I,J }cos α [Math Figure 172]
From this coordinate, the zenith angle θ_{I,J }and the azimuth angle Φ_{I,J }of the incident ray, are computed using Eqs. 173 and 174.
θ_{I,J}=cos^{−1}(Z″_{I,J}) [Math Figure 173]
Next, the image height r_{I,J }on the image sensor plane is calculated using Eq. 175.
r_{I,J}=r(θ_{I,J}) [Math Figure 175]
Then, the position (K_{o}, L_{o}) of the second intersection point on the uncorrected image plane and the magnification ratio g are used to find the position of the second point on the uncorrected image plane.
x′_{I,J}=L_{o}+gr_{I,J }cos(φ_{I,J}) [Math Figure 176]
y′_{I,J}=K_{o}+gr_{I,J }sin(φ_{I,J}) [Math Figure 177]
Once the position of the corresponding first point has been found, then interpolation methods such as described in the first and the second embodiments can be used to obtain a panoramic image.
The desirable examples of the monotonically increasing function F(δ) of the incidence angle in this embodiment can be given by Eqs. 97, 99 and 101.
It can be seen that the fourth embodiment is a special case of the seventh embodiment. In other words, the fourth embodiment is the seventh embodiment with a constraint of γ=0°.
In this embodiment, a celestial sphere(5230) with a radius S is assumed that takes the nodal point N of the lens as the center of the sphere. When all the points(5209) having a latitude angle χ from the ground plane(5217), i.e., the XY plane, are marked, the collection of these points forms a small circle(5239) on the celestial sphere. A cone is further assumed, whose tangential points with this celestial sphere comprise the said small circle. Then, the vertex halfangle of this cone is also χ, and the rotational symmetry axis of the cone coincides with the Zaxis. Hereinafter, this vertex halfangle is referred to as the reference angle.
The elevation angle and the azimuth angle of an incident ray originating from the said object point(5504) on the object plane can be obtained by the following method. Identical to the first embodiment, a vector from the origin N in the world coordinate system to the said object point(5504) on the object plane can be written as
In Eq. 178, X̂=(1,0,0) is the unit vector along the Xaxis direction, and likewise, Ŷ=(0,1,0) and Ẑ=(0,0,1) are the unit vectors along the Yaxis and the Zaxis directions, respectively. R̂ is the direction vector of the said vector, and R is the magnitude of the said vector. Then, the following relations hold between the rectangular coordinate and the polar coordinate.
X=X̂·R⃗ [Math Figure 179]
Y=Ŷ·R⃗ [Math Figure 180]
Z=Ẑ·R⃗ [Math Figure 181]
Therefore, using Eqs. 179 through 181, the zenith angle θ and the azimuth angle φ of the incident ray can be obtained from the rectangular coordinate (X, Y, Z) of the object point.
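Since Eqs. 182 through 184 are not reproduced above, the following sketch assumes the standard spherical-coordinate relations with the Zaxis (the rotational symmetry axis of the cone) as the polar axis.

```python
import numpy as np

def zenith_azimuth(X, Y, Z):
    """Zenith and azimuth angles of the incident ray from the rectangular
    coordinate (X, Y, Z) of the object point. Taking the Zaxis as the
    polar axis is an assumed convention (Eqs. 182-184 are not shown)."""
    R = np.sqrt(X ** 2 + Y ** 2 + Z ** 2)  # magnitude of the vector (Eq. 178)
    theta = np.arccos(Z / R)               # zenith angle
    phi = np.arctan2(Y, X)                 # azimuth angle
    return theta, phi
```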
Furthermore, referring to
G(μ)=tan μ [Math Figure 185]
Eq. 185 has the same geometrical meaning as in the first embodiment of the present invention.
The distance from the said tangential point(5509) to the optical axis, in other words, the axial radius(5537), is given as S cos χ. In this embodiment, this distance is considered as the radius of the object plane. Therefore, the lateral dimension of the object plane must satisfy the following Eq. 186, where c is a proportionality constant.
2πS cos χ=cW [Math Figure 186]
Furthermore, considering the range of the said elevation angle, the following relation given in Eq. 187 must hold.
Here, B is a constant. On the other hand, another monotonic function F(μ) is defined as in Eq. 189.
Therefore, Eq. 190 can be obtained from Eqs. 188 and 189.
The following equation can be obtained from Eqs. 185, 189 and 190.
Therefore, the elevation angle corresponding to the said third point is given by Eq. 192.
Furthermore, the zenith angle can be easily obtained from the elevation angle of the incident ray.
On the other hand, the azimuth angle corresponding to the said third point is given by Eq. 194.
Therefore, using equations 190 through 194, the zenith and the azimuth angles of the incident ray corresponding to the third point on the processed image plane can be obtained, and using this, image processing can be done as in the previous embodiments.
Identical to the previous examples, considering the fact that all the image sensors and display devices are digital devices, the image processing procedure must use the following set of equations. First, the desirable size (I_{max}, J_{max}) of the processed image plane, the location (I_{o}, J_{o}) of the reference point, and the vertex halfangle of the cone, in other words, the reference angle χ, are set up. In this embodiment, the reference angle χ takes a value that is larger than −90° and smaller than 90°. Then, the said constant A is given by Eq. 195.
By using J_{max}−1 as the numerator as in this case, the first column (i.e., the column with J=1) of the processed image plane exhibits the same information as the last column (i.e., the column with J=J_{max}). In a panoramic image exhibiting the view of 360° directions, it appears natural when the left edge and the right edge match. However, if such a duplicate display of information is not preferred, then it is only necessary to change the numerator in Eq. 195 to J_{max}. Next, for all the pixels (I, J) on the said processed image plane, the elevation angle μ_{I} and the azimuth angle φ_{J} given by Eqs. 196 and 197 are computed.
On the other hand, the zenith angle of the incident ray is given by Eq. 198.
The image height r_{I} on the image sensor plane is obtained using Eq. 199.
r_{I}=r(θ_{I}) [Math Figure 199]
Then, the position (K_{o}, L_{o}) of the second intersection point on the uncorrected image plane and the magnification ratio g are used to find the position of the second point on the uncorrected image plane.
x′_{I,J}=L_{o}+gr_{I }cos φ_{J} [Math Figure 200]
y′_{I,J}=K_{o}+gr_{I }sin φ_{J} [Math Figure 201]
Once the position of the corresponding second point has been found, then interpolation methods such as described in the first through the third embodiments can be used to obtain a panoramic image. Such a processed image plane satisfies the relation given in Eq. 202 or Eq. 203.
The conceptual drawing of the uncorrected image plane according to the ninth embodiment of the present invention is identical to
F(μ)=μ [Math Figure 205]
The radius of the object plane of the present embodiment is taken as the radius of the celestial sphere. Therefore, the lateral dimension of the object plane must satisfy the relation given in Eq. 207, where c is a proportionality constant.
2πS=cW [Math Figure 207]
Furthermore, considering the range of the elevation angle, the relation given in Eq. 208 must be satisfied.
SF(μ)=c(y″−y″_{o}) [Math Figure 208]
Therefore, the relation given in Eq. 209 must hold true.
Therefore, the elevation angle corresponding to the said third point is given by Eq. 210.
Furthermore, the zenith angle can be easily obtained from the elevation angle of the incident ray.
In this embodiment, the reference angle χ takes a value that is larger than −90° and smaller than 90°. On the other hand, the azimuth angle corresponding to the third point is given by Eq. 212.
Therefore, using equations 208 through 212, the zenith and the azimuth angles of the incident ray corresponding to the third point on the processed image plane can be obtained, and using this, image processing can be done as in the previous embodiments.
Identical to the previous examples, considering the fact that all the image sensors and display devices are digital devices, image processing procedure must use the following set of equations. First, the desirable size (I_{max}, J_{max}) of the processed image plane, the location (I_{o}, J_{o}) of the reference point, and the reference angle χ are setup. Then, the said constant A is given either by Eq. 213 or 214.
By using J_{max}−1 as the numerator as in Eq. 213, the first column of the processed image plane exhibits the same information as the last column. On the other hand, by using J_{max} as the numerator as in Eq. 214, all the columns correspond to different azimuth angles. Next, for all the pixels (I, J) on the said processed image plane, the elevation angle μ_{I} and the azimuth angle φ_{J} given by Eqs. 215 and 216 are computed.
On the other hand, the zenith angle of the incident ray is given by Eq. 217.
The image height r_{I} on the image sensor plane is calculated using Eq. 218.
r_{I}=r(θ_{I}) [Math Figure 218]
Next, the position (K_{o}, L_{o}) of the second intersection point on the uncorrected image plane and the magnification ratio g are used to find the position of the second point on the uncorrected image plane.
x′_{I,J}=L_{o}+g r_{I} cos φ_{J} [Math Figure 219]
y′_{I,J}=K_{o}+g r_{I} sin φ_{J} [Math Figure 220]
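The per-pixel procedure culminating in Eqs. 219 and 220 can be sketched as follows. This is only an illustration under stated assumptions: F(μ) = μ, a full 360° azimuth range so that A = (J_{max} − 1)/2π (the Eq. 213 variant), a sign convention in which the row index I increases downward, and a caller-supplied real projection function r(θ) standing in for Eq. 218; the function name and parameters are hypothetical.

```python
import math

def panorama_to_fisheye_map(I_max, J_max, I_o, J_o, chi, K_o, L_o, g, r_func):
    """For each processed-image pixel (I, J), compute the position (x', y')
    of the corresponding second point on the uncorrected (fisheye) image
    plane. Assumptions (not fixed by this excerpt): F(mu) = mu, and
    A = (J_max - 1) / (2*pi) as in the Eq. 213 variant."""
    A = (J_max - 1) / (2.0 * math.pi)
    coords = {}
    for I in range(1, I_max + 1):
        mu = (I_o - I) / A                  # elevation angle (sign convention assumed)
        delta = mu + chi                    # altitude angle, inverting Eq. 225
        theta = math.pi / 2.0 - delta       # zenith angle of the incident ray
        r = r_func(theta)                   # image height via the lens projection, cf. Eq. 218
        for J in range(1, J_max + 1):
            phi = (J - J_o) / A             # azimuth angle
            x = L_o + g * r * math.cos(phi)  # cf. Eq. 219
            y = K_o + g * r * math.sin(phi)  # cf. Eq. 220
            coords[(I, J)] = (x, y)
    return coords
```

An equidistant-projection fisheye, for instance, would supply r_func = lambda theta: f * theta for some focal length f.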
Once the position of the corresponding second point has been found, interpolation methods such as those described in the first and the second embodiments can be used to obtain a panoramic image. Such a processed image plane satisfies the relation given in Eq. 221 or Eq. 222.
Desirable examples of the monotonically increasing function F(μ) of the elevation angle in this embodiment are given by Eqs. 97, 99 and 101.
Most of the imaging systems in the present invention, excluding those of the fifth and the sixth embodiments, share many common features.
The world coordinate system of the tenth embodiment of the present invention takes the nodal point N of the said lens as the origin, and a vertical line passing through the origin as the Y-axis. Here, a vertical line is a line perpendicular to the ground plane, or more precisely, to the horizontal plane (6317). The X-axis and the Z-axis of the world coordinate system are contained within the ground plane. The optical axis (6301) of the said wide-angle lens generally does not coincide with the Y-axis; it may be contained within the ground plane (i.e., parallel to the ground plane), or it may not be contained within the ground plane. Herein, the plane (6304) containing both the said Y-axis and the said optical axis (6301) is referred to as the reference plane. The intersection line (6302) between this reference plane (6304) and the ground plane (6317) coincides with the Z-axis of the world coordinate system. On the other hand, an incident ray (6305) originating from an object point Q having a rectangular coordinate (X, Y, Z) in the world coordinate system has an altitude angle δ from the ground plane, and an azimuth angle ψ with respect to the reference plane. The plane containing both the said Y-axis and the said incident ray is the incidence plane (6306). The horizontal incidence angle ψ of the said incident ray with respect to the said reference plane is given by Eq. 223.
On the other hand, the vertical incidence angle (i.e., the altitude angle) δ subtended by the said incident ray and the XZ plane is given by Eq. 224.
Said elevation angle μ of the incident ray is given by Eq. 225, where the reference angle χ takes a value that is larger than −90° and smaller than 90°.
μ=δ−χ [Math Figure 225]
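The explicit forms of Eqs. 223 and 224 are not shown in this excerpt, but the stated geometry (ψ measured from the reference Y-Z plane, δ measured from the ground X-Z plane) fixes them up to sign convention; a sketch under that assumption, with a hypothetical function name:

```python
import math

def incidence_angles(X, Y, Z, chi=0.0):
    """Horizontal incidence angle psi, altitude angle delta, and elevation
    angle mu for an incident ray from object point (X, Y, Z) in the world
    coordinate system of the tenth embodiment. The arctangent forms below
    are inferred from the stated geometry, not quoted from the patent."""
    psi = math.atan2(X, Z)                    # angle from the reference (Y-Z) plane, cf. Eq. 223
    delta = math.atan2(Y, math.hypot(X, Z))   # angle from the ground (X-Z) plane, cf. Eq. 224
    mu = delta - chi                          # Eq. 225
    return psi, delta, mu
```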
On the other hand, if we assume that the coordinate of an image point on a screen corresponding to an object point with a coordinate (X, Y, Z) in the world coordinate system is (x″, y″), then the said image processing means processes the image so that the image point corresponding to an incident ray originating from the said object point appears on the said screen at the coordinate (x″, y″), wherein the lateral coordinate x″ of the image point is given by Eq. 226.
x″=cψ [Math Figure 226]
Here, c is a proportionality constant. Furthermore, the longitudinal coordinate y″ of the said image point is given by Eq. 227.
y″=cF(μ) [Math Figure 227]
Here, F(μ) is a monotonically increasing function passing through the origin. In mathematical terms, this means that Eqs. 228 and 229 are satisfied.
The above function F can take an arbitrary form, but the most desirable forms are given by Eqs. 230 through 232.
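The projection of Eqs. 226 and 227 can be illustrated as below. The default F and the math.tan alternative in the test are merely examples satisfying the stated conditions (monotonically increasing with F(0) = 0); the preferred forms of Eqs. 230 through 232 are not reproduced in this excerpt.

```python
import math

def project_to_screen(psi, mu, c, F=lambda m: m):
    """Screen coordinate (x'', y'') of the image point for an incident ray
    with horizontal incidence angle psi and elevation angle mu. F may be
    any monotonically increasing function with F(0) = 0; the identity
    default is illustrative only."""
    x = c * psi        # Eq. 226
    y = c * F(mu)      # Eq. 227
    return x, y
```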
The specification of the present invention is implicitly described with reference to the visible wavelength range, but the projection scheme of the present invention can be described by the same equations in the millimeter and microwave wavelength ranges, in the ultraviolet wavelength range, and in the near- and far-infrared wavelength ranges, as well as in the visible wavelength range. Accordingly, the present invention is not limited to imaging systems operating in the visible wavelength range.
Preferred embodiments of the current invention have been described in detail with reference to the accompanying drawings. However, the detailed description and the embodiments of the current invention are purely for illustrative purposes, and it will be apparent to those skilled in the art that variations and modifications are possible without deviating from the spirit and the scope of the present invention.
According to the present invention, mathematically precise image processing methods of extracting panoramic images that appear most natural to the naked eye from an image acquired using a camera equipped with a wide-angle lens rotationally symmetric about an optical axis, and devices using the methods, are provided.
panorama, distortion, projection scheme, equidistance projection, rectilinear projection, fisheye lens, panoramic lens, image processing, image correction, rear view camera, video phone