Image processing device, calibration method thereof, and image processing method
First Claim
1. An image processing device comprising:
- image pickup means for forming an image of a target using an optical system, then subsequently taking the image using an image pickup device, and obtaining image information including the target;
target-point detecting means for detecting the position in a field where a target point of the target exists as position information represented by information unrelated to the position where the image pickup means exists; and
relevant information generating means for obtaining relevant information representing the correlation between the position information detected by the target-point detecting means and the camera coordinates on the basis of the direction and/or field angle where the image pickup means takes an image.
Abstract
An image processing device includes: an image pickup unit for forming an image of a target with an optical system, taking the image with an image pickup device, and obtaining image information including the target; a target-point detecting unit for detecting the position where a target point of the target exists within a field as position information expressed by information unrelated to the position of the image pickup unit; and a relevant information generating unit for obtaining relevant information representing the correlation between the position information detected by the target-point detecting unit and camera coordinates, on the basis of the direction and/or view angle at which the image pickup unit takes an image of the target (i.e., for performing calibration). Thus, the position of the target within the three-dimensional field space, inside the image pickup region where the image pickup unit takes the image, can be calculated.
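The calibration the abstract describes can be sketched as a standard pinhole projection: a point given in field coordinates is first transformed into camera coordinates using the camera's position and orientation, then projected onto the image-pickup-device plane using the view angle (expressed here as a focal length in pixels). This is a minimal illustrative sketch, not the patent's own implementation; the function names and all numeric values are assumptions.

```python
import numpy as np

def field_to_camera(p_field, cam_pos, R):
    """Transform a field-coordinate point into camera coordinates.
    R rotates field axes into camera axes; cam_pos is the incident
    pupil center expressed in field coordinates."""
    return R @ (np.asarray(p_field, dtype=float) - np.asarray(cam_pos, dtype=float))

def camera_to_plane(p_cam, focal_px):
    """Pinhole projection onto the image-pickup-device plane (pixels).
    The camera z axis is the primary ray through the pupil center."""
    x, y, z = p_cam
    return focal_px * x / z, focal_px * y / z

# Illustrative numbers: camera at the field origin, axes aligned with
# the field axes, focal length equivalent to 1000 px.
R = np.eye(3)
u, v = camera_to_plane(field_to_camera([1.0, 0.5, 10.0], [0, 0, 0], R), focal_px=1000.0)
```

With the target 10 m down the primary ray and 1 m / 0.5 m off-axis, the projection lands at (100, 50) pixels from the plane center.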
86 Claims
-
1. An image processing device comprising:
-
image pickup means for forming an image of a target using an optical system, then subsequently taking the image using an image pickup device, and obtaining image information including the target;
target-point detecting means for detecting the position in a field where a target point of the target exists as position information represented by information unrelated to the position where the image pickup means exists; and
relevant information generating means for obtaining relevant information representing the correlation between the position information detected by the target-point detecting means and the camera coordinates on the basis of the direction and/or field angle where the image pickup means takes an image. - View Dependent Claims (2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26, 28, 30, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83)
-
2. An image processing device according to claim 1, further comprising focus control means for controlling the optical system such that the image of the target to be taken by the image pickup means focuses on the image pickup device plane.
-
4. An image processing device according to claim 1, wherein the coordinates of a position where the target point exists are field coordinates representing the absolute position where the target point exists within a field by means of coordinates.
-
6. An image processing device according to claim 4, the target-point detecting means comprising:
-
field coordinates detecting means for detecting the field coordinates of a target, in order to measure the field coordinates of the target;
field coordinates information transmitting means for transmitting the field coordinates information measured by the field coordinates detecting means; and
field coordinates information receiving means for receiving the field coordinates information transmitted by the field coordinates information transmitting means.
-
-
8. An image processing device according to claim 1, wherein the target-point detecting means comprises multiple target-point sensors, to each of which an address number is assigned, for detecting the position of the target point,
wherein the coordinates of the position where the target point exists are the address number of the target-point sensor which detected the target point, and wherein the relevant information generating means obtains the correlation between the position information and the camera coordinates using a conversion table indicating the correlation between the address number and field coordinates representing the absolute position where the target-point sensor exists within a field, by means of coordinates. -
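One way to read claim 8: each sensor's address number maps through a fixed conversion table to the field coordinates where that sensor is installed, so the detected position never needs to reference the camera. A minimal sketch; the addresses and coordinates below are hypothetical.

```python
# Hypothetical conversion table: sensor address number -> field
# coordinates (meters) of where that target-point sensor is installed.
ADDRESS_TO_FIELD = {
    0: (0.0, 0.0),
    1: (1.0, 0.0),
    2: (0.0, 1.0),
    3: (1.0, 1.0),
}

def position_from_address(address):
    """Resolve the detecting sensor's address number to the absolute
    field coordinates of the target point."""
    return ADDRESS_TO_FIELD[address]
```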
10. An image processing device according to claim 1, further comprising image cropping means for outputting the image information relating to a partial region of the image information obtained by the image pickup means based on the relevant information obtained by the relevant information generating means.
-
12. An image processing device according to claim 10, wherein the image cropping means outputs the image information relating to a partial region of the image information taken by the image pickup device.
-
14. An image processing device according to claim 10, wherein the image information output by the image cropping means is the image information with a predetermined area centered on a point corresponding to the target point detected by the target-point detecting means, of the image information obtained by the image pickup means.
-
16. An image processing device according to claim 14, further comprising target size information storing means for storing the size of the target within the field space,
wherein the image cropping means reads out the target size relating to the target point detected by the target-point detecting means from the target size information storing means, and converts this readout target size into image-pickup-device plane coordinates based on the relevant information obtained by the relevant information generating means, to obtain the size of the predetermined area. -
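Claim 16 can be illustrated with the pinhole relation: a target of physical size s at distance z spans roughly f·s/z pixels on the device plane, which fixes the side of the predetermined cropping area. A sketch under that assumption; the margin factor and all values are illustrative.

```python
def crop_size_px(target_size_m, distance_m, focal_px, margin=1.2):
    """Convert a stored physical target size into an on-plane crop side
    in pixels, using the pinhole relation size_px = f * size / distance.
    A small margin keeps the whole target inside the cropped region."""
    return int(round(margin * focal_px * target_size_m / distance_m))

# A 2 m target, 10 m from the camera, 1000 px focal length.
side = crop_size_px(2.0, 10.0, 1000.0)
```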
18. An image processing device according to claim 10, wherein the image information output by the image cropping means is the image information of the region surrounded by a polygon of which apexes are the target points detected by the target-point detecting means, of the image information obtained by the image pickup means.
-
20. An image processing device according to claim 10, wherein the image information output by the image cropping means is the image information of the region including all of the multiple target points detected by the target-point detecting means, out of the image information obtained by the image pickup means.
-
22. An image processing device according to claim 10, wherein the relevant information generating means generates the relevant information at the time of startup of the image processing device,
and wherein the image cropping means outputs the image information relating to a partial region of the image information obtained by the image pickup means based on the relevant information that the relevant information generating means obtains at the time of startup. -
24. An image processing device according to claim 4, wherein the relevant information generating means obtains the relevant information between the field coordinates and the image-pickup-device plane coordinates where the image pickup means takes an image based on the relevant information between the field coordinates detected by the target-point detecting means and the camera coordinates on the basis of the direction and/or field angle where the image pickup means takes an image.
-
26. An image processing device according to claim 24, wherein the camera coordinates are three-dimensional coordinates of which the origin is the center position of the incident pupil of the optical system, represented by one axis serving as a primary ray passing through the origin and the center of the image pickup device plane, and two axes orthogonal to each other and to that axis, the camera coordinates being different from the field coordinates.
-
28. An image processing device according to claim 26, wherein the relevant information generating means obtains the relevant information by using a conversion expression for converting the field coordinates into the camera coordinates.
-
30. An image processing device according to claim 28, wherein the conversion expression that the relevant information generating means employs is switched according to the magnification of the optical system.
-
33. An image processing device according to claim 26, wherein the relevant information generating means obtains the relevant information by using a conversion table for converting the field coordinates into the camera coordinates.
-
35. An image processing device according to claim 33, wherein the conversion table that the relevant information generating means employs is switched according to the magnification of the optical system.
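Claims 33 and 35 describe a lookup-table variant of the conversion, with a separate table selected per zoom magnification. The structure can be sketched as below; the magnification values and table entries are hypothetical, and a real table would be far denser (see the calibration method of claim 79).

```python
# Hypothetical per-magnification conversion tables: field coordinates
# (snapped to a grid) -> camera coordinates.
TABLES = {
    1.0: {(0, 0): (0.0, 0.0, 5.0), (1, 0): (1.0, 0.0, 5.0)},
    2.0: {(0, 0): (0.0, 0.0, 5.0), (1, 0): (2.0, 0.0, 5.0)},
}

def convert(field_xy, magnification):
    """Switch conversion tables according to the magnification of the
    optical system, then look up the grid entry."""
    return TABLES[magnification][field_xy]
```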
-
37. An image processing device according to claim 10, wherein the image-pickup-device plane coordinates divide the entire view angle where the image pickup means takes an image into multiple small view angles,
and wherein the image cropping means selects the view angle to be read out from the multiple small view angles based on the relevant information of the coordinates obtained by the relevant information generating means, and outputs the image information relating to the selected view angle, out of the image information obtained by the image pickup means. -
39. An image processing device according to claim 10, further comprising image information recording means for recording the field coordinates of the target point detected by the target-point detecting means or the image-pickup-device plane coordinates as well as the image information obtained by the image pickup means,
wherein the image cropping means additionally reads out the field coordinates value of the target point or the image-pickup-device plane coordinates at the time of reading out the image information recorded by the image information recording means, and outputs the image information relating to a partial region of the readout image information according to the readout field coordinates value or image-pickup-device plane coordinates. -
41. An image processing device according to claim 10, further comprising image information recording means for recording the image information obtained by the image pickup means, the field coordinates of the target point detected by the target-point detecting means, the camera coordinates, and the relevant information obtained by the relevant information generating means,
wherein the image cropping means additionally reads out the field coordinates value of the target point, the camera coordinates, and the relevant information at the time of reading out the image information recorded by the image information recording means, and outputs the image information relating to a partial region of the readout image information according to the readout field coordinates of the target point, camera coordinates, and relevant information. -
43. An image processing device according to claim 6, wherein the field coordinates detecting means is means capable of measuring the latitude, longitude, and altitude of the target point by means of a GPS (Global Positioning System),
and wherein the field coordinates are coordinates represented by at least two of the measured latitude, longitude, and altitude. -
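For claim 43, GPS latitude and longitude can be turned into planar field coordinates with a local equirectangular approximation, which is adequate over the extent of a typical field. This sketch is one common way to do it, not necessarily the patent's; the reference point and Earth radius are illustrative assumptions.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, meters

def latlon_to_field(lat_deg, lon_deg, ref_lat_deg, ref_lon_deg):
    """Approximate local east/north field coordinates (meters) relative
    to a reference point, via an equirectangular projection."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    ref_lat, ref_lon = math.radians(ref_lat_deg), math.radians(ref_lon_deg)
    east = EARTH_RADIUS_M * (lon - ref_lon) * math.cos(ref_lat)
    north = EARTH_RADIUS_M * (lat - ref_lat)
    return east, north

# One arc-second of latitude corresponds to roughly 31 m of northing.
e, n = latlon_to_field(35.0 + 1 / 3600, 139.0, 35.0, 139.0)
```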
45. An image processing device according to claim 4, wherein the target-point detecting means is means for measuring the field coordinates of the target point as to multiple base stations by means of triangulation based on the intensity difference or arrival time difference of airwaves emitted from the multiple base stations,
and wherein the field coordinates are coordinates indicating the position of the measured target point as to the multiple base stations. -
47. An image processing device according to claim 4, wherein the target-point detecting means is means for measuring the field coordinates of the target point as to multiple base stations by means of triangulation based on the intensity difference or arrival time difference of airwaves emitted from the target point,
and wherein the field coordinates are coordinates indicating the position of the measured target point as to the multiple base stations. -
49. An image processing device according to claim 6, wherein the field coordinates detecting means is a group of pressure-sensitive sensors disposed at equal intervals, and the pressure-sensitive sensors on which the target rides detect the target, thereby measuring the position of the target above the sensor group,
and wherein the field coordinates are coordinates indicating the position of the measured target above the pressure-sensitive sensor group. -
51. An image processing device according to claim 4, wherein the target has information transmitting means for transmitting information indicating its own present position,
and wherein the target-point detecting means measures the field coordinates of the information transmitting means as to the target-point detecting means based on the information transmitted by the information transmitting means. -
53. An image processing device according to claim 51, wherein the information transmitting means transmits airwaves having a predetermined frequency as information indicating its own present position,
wherein the target-point detecting means is an adaptive array antenna for receiving the transmitted airwaves, wherein multiple antennas making up the adaptive array antenna detect the phase difference of the airwaves transmitted by the information transmitting means, and wherein the direction in which the target point that has transmitted the airwaves exists within the field is detected based on the detected phase difference. -
55. An image processing device according to claim 53, wherein the target-point detecting means comprises multiple adaptive array antennas,
and wherein the field coordinates of the information transmitting means as to the target-point detecting means are measured by performing triangulation based on the direction in which the target point that has transmitted the airwaves exists in the field, detected by the multiple adaptive array antennas. -
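Claims 53 and 55 amount to bearings-only triangulation: each adaptive array antenna yields a direction (derived from the inter-element phase difference), and two such bearings cast from known antenna positions intersect at the target point. A planar sketch; the antenna positions and angles below are illustrative.

```python
import math

def triangulate(p1, bearing1, p2, bearing2):
    """Intersect two bearing rays (angles in radians, measured from the
    x axis) cast from antenna positions p1 and p2 in the field plane."""
    x1, y1 = p1
    x2, y2 = p2
    t1 = math.tan(bearing1)
    t2 = math.tan(bearing2)
    # Solve y - y1 = t1 * (x - x1) and y - y2 = t2 * (x - x2).
    x = (y2 - y1 + t1 * x1 - t2 * x2) / (t1 - t2)
    y = y1 + t1 * (x - x1)
    return x, y

# Target at (5, 5): seen at 45 deg from (0, 0) and 135 deg from (10, 0).
x, y = triangulate((0.0, 0.0), math.radians(45), (10.0, 0.0), math.radians(135))
```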
57. An image processing device according to claim 51, wherein the information transmitting means transmits ultrasonic waves having a predetermined frequency,
and wherein the target-point detecting means receives the ultrasonic waves transmitted by the information transmitting means at multiple points, performs triangulation, and measures the field coordinates of the information transmitting means as to the target-point detecting means. -
59. An image processing device according to claim 51, wherein the information transmitting means transmits infrared light at a predetermined flashing cycle,
and wherein the target-point detecting means receives the infrared light transmitted by the information transmitting means at multiple points, performs triangulation, and measures the field coordinates of the information transmitting means as to the target-point detecting means. -
61. An image processing device according to claim 4, further comprising at least one distance measurement camera of which the positional relation as to the image pickup means is known,
wherein the target-point detecting means measures the field coordinates of the target point as to the distance measurement camera and the image pickup means by performing triangulation on the target point with the distance measurement camera and the image pickup means. -
63. An image processing device according to claim 24, further comprising a position detection sensor for detecting the field coordinates of at least two points on the primary ray passing through the incident pupil center position of the optical system and the center of the image pickup device plane, of which the positional relation as to the image pickup means is known, and the field coordinates of at least one point other than on the line parallel to the primary ray,
wherein the relevant information generating means obtains the relevant information between the field coordinates detected by the target-point detecting means and the image-pickup-device plane coordinates where the image pickup means takes an image based on the correlation between the field coordinates values of the position detection sensors of at least three points and the camera coordinates. -
65. An image processing device according to claim 24, further comprising a position detection sensor for detecting the field coordinates of at least one point on the primary ray passing through the incident pupil center position of the optical system and the center of the image pickup device plane, of which the positional relation as to the image pickup means is known, the field coordinates of at least one point positioned within an image pickup region where the image pickup means takes an image and also positioned on the primary ray, and the field coordinates of at least one point other than on the primary ray,
wherein the relevant information generating means obtains a conversion expression for converting the field coordinates detected by the target-point detecting means into the image-pickup-device plane coordinates where the image pickup means takes an image based on the relevant information between the field coordinates values of the position detection sensors of at least three points and the camera coordinates, as the relevant information. -
67. An image processing device according to claim 10, wherein the image cropping means starts output of the image information relating to a partial region of the image information obtained by the image pickup means when the target-point detecting means detects the field coordinates of the target point within a predetermined specific region in a field.
-
69. An image processing device according to claim 10, wherein the image pickup means comprises multiple cameras which differ from each other in at least one of the region to be taken, the direction for image-taking, magnification, and depth of field within which an image can be picked up,
and wherein the image cropping means selects one camera from the multiple cameras according to the field coordinates of the target point detected by the target-point detecting means, and outputs the image information taken by the selected camera. -
71. An image processing device according to claim 69, wherein in the event that the target point exists on an overlapped region of the image pickup regions of the multiple cameras, the image cropping means selects a camera having a greater number of pixels to take an image of the target from the cameras corresponding to the overlapped region.
-
73. An image processing device according to claim 6, wherein the field coordinates information transmitting means transmits the ID information of the target as well as the field coordinates information of the target point relating to the target.
-
75. An image processing device according to claim 10, further comprising lens control means for controlling the optical status of the image pickup means,
wherein the image cropping means corrects the size of a region of the image information to be output according to an optical status controlled by the lens control means. -
77. An image processing device according to claim 4, further comprising lens control means for controlling the optical status of the image pickup means,
wherein, in the event that the image-pickup-device plane coordinates corresponding to the field coordinates of the target point detected by the target-point detecting means are outside the coordinates range where the image pickup means can take an image, the lens control means controls the optical status of the image pickup means so as to widen the view angle. -
79. A calibration method of an image processing device according to claim 33 for obtaining a conversion table, the calibration method comprising:
-
a first step for disposing target points at predetermined intervals within the field;
a second step for obtaining the field coordinates of the disposed target points;
a third step for taking an image of the target points disposed at the predetermined intervals by means of the image pickup means; and
a fourth step for generating the conversion table by correlating the field coordinates obtained in the second step with the image-pickup-device plane coordinates in the image taken in the third step for each target point disposed in the first step.
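The four steps of claim 79 can be sketched as collecting (field coordinates, plane coordinates) pairs for target points placed on a grid and storing them as the conversion table; at run time, the nearest tabulated field point yields the plane coordinates. A minimal sketch; all grid spacings and pixel values below are illustrative.

```python
def build_conversion_table(field_points, plane_points):
    """Fourth step: correlate each disposed target point's field
    coordinates with its image-pickup-device plane coordinates."""
    return dict(zip(field_points, plane_points))

def lookup(table, field_xy):
    """Nearest-neighbor lookup of plane coordinates for a field point."""
    key = min(table, key=lambda p: (p[0] - field_xy[0]) ** 2 + (p[1] - field_xy[1]) ** 2)
    return table[key]

# Targets disposed at 1 m intervals (first step), their field
# coordinates obtained (second step), imaged (third step), correlated.
table = build_conversion_table(
    [(0, 0), (1, 0), (0, 1)],
    [(320, 240), (420, 240), (320, 140)],
)
uv = lookup(table, (0.9, 0.1))
```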
-
-
81. A calibration method of an image processing device according to claim 63 for obtaining a conversion expression, the calibration method comprising:
-
a first step for disposing at least one target point on the primary ray passing through the incident pupil center position of the optical system and the center of the image pickup device plane, and at least one target point other than on the primary ray within an image pickup region where the image pickup means takes an image within the field;
a second step for obtaining the field coordinates of at least the two disposed target points;
a third step for taking images of at least the two target points disposed by means of the image pickup means; and
a fourth step for creating the conversion expression based on the relevant information between the field coordinates obtained from the field coordinates value of at least one target point on the primary ray of which positional relation as to the image pickup means is known, and the field coordinates values of at least two target points obtained in the second step, and the camera coordinates, and the relevant information between the field coordinates values of at least two target points in the image taken in the third step and the image-pickup-device plane coordinates.
-
-
83. A calibration method of an image processing device according to claim 65 for obtaining a conversion expression, the calibration method comprising:
-
a first step for disposing at least one target point on the primary ray passing through the incident pupil center position of the optical system and the center of the image pickup device plane, and at least one target point other than on the primary ray within an image pickup region where the image pickup means takes an image within the field;
a second step for obtaining the field coordinates of at least the two disposed target points;
a third step for taking images of at least the two target points disposed by means of the image pickup means; and
a fourth step for creating the conversion expression based on the relevant information between the field coordinates obtained from the field coordinates value of at least one target point on the primary ray of which positional relation as to the image pickup means is known, and the field coordinates values of at least two target points obtained in the second step, and the camera coordinates, and the relevant information between the field coordinates values of at least two target points in the image taken in the third step and the image-pickup-device plane coordinates.
-
-
3. An image processing device comprising:
-
image pickup means for forming an image of a target using an optical system, then subsequently taking the image using an image pickup device, and obtaining image information including the target;
target-point detecting means for detecting the position in a field where a target point of the target exists as position information represented by information unrelated to the position where the image pickup means exists; and
relevant information generating means for obtaining relevant information representing the correlation between the position information detected by the target-point detecting means and image-pickup-device plane coordinates where the image pickup means takes an image of the target. - View Dependent Claims (5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 31, 32, 34, 36, 38, 40, 42, 44, 46, 48, 50, 52, 54, 56, 58, 60, 62, 64, 66, 68, 70, 72, 74, 76, 78, 80, 82, 84)
-
5. An image processing device according to claim 3, wherein the coordinates of a position where the target point exists are field coordinates representing the absolute position where the target point exists within a field, by means of coordinates.
-
7. An image processing device according to claim 5, the target-point detecting means comprising:
-
field coordinates detecting means for detecting the field coordinates of a target, in order to measure the field coordinates of the target;
field coordinates information transmitting means for transmitting the field coordinates information measured by the field coordinates detecting means; and
field coordinates information receiving means for receiving the field coordinates information transmitted by the field coordinates information transmitting means.
-
-
9. An image processing device according to claim 3, wherein the target-point detecting means comprises multiple target-point sensors, to each of which an address number is assigned, for detecting the position of the target point,
wherein the coordinates of the position where the target point exists are the address number of the target-point sensor which detected the target point, and wherein the relevant information generating means obtains the correlation between the position information and the image-pickup-device plane coordinates using a conversion table indicating the correlation between the address number and the image-pickup-device plane coordinates where the target-point sensor is taken. -
11. An image processing device according to claim 3, further comprising image cropping means for outputting the image information relating to a partial region of the image information obtained by the image pickup means based on the relevant information obtained by the relevant information generating means.
-
13. An image processing device according to claim 11, wherein the image cropping means outputs the image information relating to a partial region of the image information taken by the image pickup device.
-
15. An image processing device according to claim 11, wherein the image information output by the image cropping means is the image information with a predetermined area centered on a point corresponding to the target point detected by the target-point detecting means, of the image information obtained by the image pickup means.
-
17. An image processing device according to claim 15, further comprising target size information storing means for storing the size of the target within the field space,
wherein the image cropping means reads out the target size relating to the target point detected by the target-point detecting means from the target size information storing means, and this readout target size is converted into image-pickup-device plane coordinates based on the relevant information of the coordinates obtained by the relevant information generating means to obtain the size of the predetermined area. -
19. An image processing device according to claim 11, wherein the image information output by the image cropping means is the image information of the region surrounded by a polygon of which apexes are the target points detected by the target-point detecting means, of the image information obtained by the image pickup means.
-
21. An image processing device according to claim 11, wherein the image information output by the image cropping means is the image information of the region including all of the multiple target points detected by the target-point detecting means, out of the image information obtained by the image pickup means.
-
23. An image processing device according to claim 11, wherein the relevant information generating means generates the relevant information at the time of startup of the image processing device,
and wherein the image cropping means outputs the image information relating to a partial region of the image information obtained by the image pickup means based on the relevant information that the relevant information generating means obtains at the time of startup. -
25. An image processing device according to claim 5, wherein the relevant information generating means obtains the relevant information between the field coordinates and the image-pickup-device plane coordinates where the image pickup means takes an image based on the relevant information between the field coordinates detected by the target-point detecting means and the camera coordinates on the basis of the direction and/or field angle where the image pickup means takes an image.
-
27. An image processing device according to claim 25, wherein the camera coordinates are three-dimensional coordinates of which the origin is the center position of the incident pupil of the optical system, represented by one axis serving as a primary ray passing through the origin and the center of the image pickup device plane, and two axes orthogonal to each other and to that axis, the camera coordinates being different from the field coordinates.
-
29. An image processing device according to claim 27, wherein the relevant information generating means obtains the relevant information by using a conversion expression for converting the field coordinates into the camera coordinates.
-
31. An image processing device according to claim 29, wherein the conversion expression that the relevant information generating means employs is switched according to the magnification of the optical system.
-
32. An image processing device according to claim 3, wherein the image-pickup-device plane coordinates are coordinates represented by two axes identifying a position within an image pickup device plane where the image pickup means takes an image.
-
34. An image processing device according to claim 27, wherein the relevant information generating means obtains the relevant information by using a conversion table for converting the field coordinates into the camera coordinates.
-
36. An image processing device according to claim 34, wherein the conversion table that the relevant information generating means employs is switched according to the magnification of the optical system.
-
38. An image processing device according to claim 11, wherein the image-pickup-device plane coordinates divide the entire view angle where the image pickup means takes an image into multiple small view angles,
and wherein the image cropping means selects the view angle to be read out from the multiple small view angles based on the relevant information of the coordinates obtained by the relevant information generating means, and outputs the image information relating to the selected view angle, out of the image information obtained by the image pickup means. -
40. An image processing device according to claim 11, further comprising image information recording means for recording the field coordinates of the target point detected by the target-point detecting means or the image-pickup-device plane coordinates as well as the image information obtained by the image pickup means,
wherein the image cropping means additionally reads out the field coordinates value of the target point or the image-pickup-device plane coordinates at the time of reading out the image information recorded by the image information recording means, and outputs the image information relating to a partial region of the readout image information according to the readout field coordinates value or image-pickup-device plane coordinates. -
42. An image processing device according to claim 11, further comprising image information recording means for recording the image information obtained by the image pickup means, the field coordinates of the target point detected by the target-point detecting means, the camera coordinates, and the relevant information obtained by the relevant information generating means,
wherein the image cropping means additionally reads out the field coordinates value of the target point, the camera coordinates, and the relevant information at the time of reading out the image information recorded by the image information recording means, and outputs the image information relating to a partial region of the readout image information according to the readout field coordinates of the target point, camera coordinates, and relevant information. -
44. An image processing device according to claim 7, wherein the field coordinates detecting means is means capable of measuring the latitude, longitude, and altitude of the target point by means of a GPS (Global Positioning System),
and wherein the field coordinates are coordinates represented by at least two of the measured latitude, longitude, and altitude. -
46. An image processing device according to claim 5, wherein the target-point detecting means is means for measuring the field coordinates of the target point as to multiple base stations by means of triangulation based on the intensity difference or arrival time difference of airwaves emitted from the multiple base stations,
and wherein the field coordinates are coordinates indicating the position of the measured target point as to the multiple base stations. -
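The base-station triangulation of claims 46 and 48 reduces, once the signal intensities or arrival-time differences have been converted to distances, to classical trilateration. A minimal planar sketch (function names and the three-station setup are assumptions for illustration):

```python
def trilaterate(stations, distances):
    """Locate a point in the plane from its distances to three base stations
    with known positions, by linearizing the three circle equations."""
    (x1, y1), (x2, y2), (x3, y3) = stations
    r1, r2, r3 = distances
    # Subtracting the first circle equation from the other two yields a
    # linear 2x2 system A @ (x, y) = b.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

With more than three stations the same linearization is typically solved in a least-squares sense, which also absorbs measurement noise in the distance estimates.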
48. An image processing device according to claim 5, wherein the target-point detecting means is means for measuring the field coordinates of the target point as to multiple base stations by means of triangulation based on the intensity difference or arrival time difference of airwaves emitted from the target point,
and wherein the field coordinates are coordinates indicating the position of the measured target point as to the multiple base stations. -
50. An image processing device according to claim 7, wherein the field coordinates detecting means is a group of pressure-sensitive sensors disposed with equal intervals, and the pressure-sensitive sensors on which the target rides detect the target, thereby measuring the position of the target above the sensor group,
and wherein the field coordinates are coordinates indicating the position of the measured target above the pressure-sensitive sensor group. -
52. An image processing device according to claim 5, wherein the target has information transmitting means for transmitting information indicating its own present position,
and wherein the target-point detecting means measures the field coordinates of the information transmitting means as to the target-point detecting means based on the information transmitted by the information transmitting means. -
54. An image processing device according to claim 52, wherein the information transmitting means transmits airwaves having a predetermined frequency as information indicating its own present position,
wherein the target-point detecting means is an adaptive array antenna for receiving the transmitted airwaves, wherein multiple antennas making up the adaptive array antenna detect the phase difference of the airwaves transmitted by the information transmitting means, and wherein the direction in which the target point that has transmitted the airwaves exists within the field is detected based on the detected phase difference. -
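The phase-difference direction finding of claim 54 follows the standard two-element relation: a plane wave arriving at angle θ from broadside produces a path difference d·sin θ between antennas spaced d apart, hence a phase difference Δφ = 2π·d·sin θ / λ. A sketch of the inversion (names are illustrative; a real adaptive array resolves the ±2π ambiguity across many elements):

```python
import math

def arrival_angle(phase_diff_rad, wavelength, antenna_spacing):
    """Estimate the direction of an incoming plane wave from the phase
    difference measured between two antennas a known distance apart.
    Returns the angle (radians) from the array broadside."""
    s = phase_diff_rad * wavelength / (2 * math.pi * antenna_spacing)
    return math.asin(s)  # valid while |s| <= 1 (no phase-wrap ambiguity)
```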
56. An image processing device according to claim 54, wherein the target-point detecting means comprises multiple adaptive array antennas,
and wherein the field coordinates of the information transmitting means as to the target-point detecting means are measured by performing triangulation based on the direction in which the target point that has transmitted the airwaves exists in the field, detected by the multiple adaptive array antennas. -
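Given the two bearings detected by the multiple adaptive array antennas, the triangulation of claim 56 is the intersection of two rays from known points. A planar sketch (the coordinate convention and names are assumptions):

```python
import math

def intersect_bearings(p1, theta1, p2, theta2):
    """Triangulate a transmitter position from two bearing angles (radians,
    measured from the +x axis) observed at two known antenna positions."""
    d1x, d1y = math.cos(theta1), math.sin(theta1)
    d2x, d2y = math.cos(theta2), math.sin(theta2)
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 (2x2 linear system).
    det = d2x * d1y - d1x * d2y  # zero when the two bearings are parallel
    t1 = (d2x * dy - d2y * dx) / det
    return p1[0] + t1 * d1x, p1[1] + t1 * d1y
```

The accuracy degrades as the two bearings approach parallel (small `det`), which is why widely separated antennas give better fixes.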
58. An image processing device according to claim 52, wherein the information transmitting means transmits ultrasonic waves having a predetermined frequency,
and wherein the target-point detecting means receives the ultrasonic waves transmitted by the information transmitting means at multiple points, performs triangulation, and measures the field coordinates of the information transmitting means as to the target-point detecting means. -
60. An image processing device according to claim 52, wherein the information transmitting means transmits infrared light at a predetermined flashing cycle,
and wherein the target-point detecting means receives the infrared light transmitted by the information transmitting means at multiple points, performs triangulation, and measures the field coordinates of the information transmitting means as to the target-point detecting means. -
62. An image processing device according to claim 5, further comprising at least one distance measurement camera of which the positional relation as to the image pickup means is known,
wherein the target-point detecting means measures the field coordinates of the target point as to the distance measurement camera and the image pickup means by performing triangulation on the target point with the distance measurement camera and the image pickup means. -
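In the simplest geometry for claim 62, with the distance measurement camera and the image pickup means horizontally displaced and their optical axes parallel, the triangulation reduces to the textbook disparity relation depth = f·B / (x_left − x_right). A sketch under that assumption (a general camera pair would need full calibrated triangulation):

```python
def stereo_depth(focal_px, baseline_m, x_left_px, x_right_px):
    """Depth of a target point seen by two horizontally displaced cameras
    with parallel optical axes: depth = focal_length * baseline / disparity."""
    disparity = x_left_px - x_right_px  # pixels; positive for points in front
    return focal_px * baseline_m / disparity
```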
64. An image processing device according to claim 25, further comprising a position detection sensor for detecting the field coordinates of at least two points on the primary ray passing through the incident pupil center position of the optical system and the center of the image pickup device plane, of which the positional relation as to the image pickup means is known, and the field coordinates of at least one point not on a line parallel to the primary ray,
wherein the relevant information generating means obtains the relevant information between the field coordinates detected by the target-point detecting means and the image-pickup-device plane coordinates where the image pickup means takes an image based on the correlation between the field coordinates values of the position detection sensors of at least three points and the camera coordinates. -
66. An image processing device according to claim 25, further comprising a position detection sensor for detecting the field coordinates of at least one point on the primary ray passing through the incident pupil center position of the optical system and the center of the image pickup device plane, of which the positional relation as to the image pickup means is known, the field coordinates of at least one point positioned within an image pickup region where the image pickup means takes an image and also positioned on the primary ray, and the field coordinates of at least one point not on the primary ray,
wherein the relevant information generating means obtains a conversion expression for converting the field coordinates detected by the target-point detecting means into the image-pickup-device plane coordinates where the image pickup means takes an image based on the relevant information between the field coordinates values of the position detection sensors of at least three points and the camera coordinates as the relevant information. -
68. An image processing device according to claim 11, wherein the image cropping means starts output of the image information relating to a partial region of the image information obtained by the image pickup means when the target-point detecting means detects the field coordinates of the target point within a predetermined specific region in a field.
-
70. An image processing device according to claim 11, wherein the image pickup means comprises multiple cameras which differ from each other in at least one of the region wherein an image can be picked up, the direction for image-taking, the power, and the depth of field,
and wherein the image cropping means selects one camera from the multiple cameras according to the field coordinates of the target point detected by the target-point detecting means, and outputs the image information taken by the selected camera. -
72. An image processing device according to claim 70, wherein in the event that the target point exists on an overlapped region of the image pickup regions of the multiple cameras, the image cropping means selects a camera having a greater number of pixels to take an image of the target from the cameras corresponding to the overlapped region.
-
74. An image processing device according to claim 7, wherein the field coordinates information transmitting means transmits the ID information of the target as well as the field information of the target point relating to the target.
-
76. An image processing device according to claim 11, further comprising lens control means for controlling the optical status of the image pickup means,
wherein the image cropping means corrects the size of a region of the image information to be output according to an optical status controlled by the lens control means. -
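One plausible reading of the size correction in claim 76 is that the readout window shrinks as optical magnification grows, so the crop keeps covering a constant fraction of the field. A sketch under that assumption (all names and the scaling rule are illustrative, not dictated by the claim):

```python
def crop_region(center_xy, base_size, zoom):
    """Return a square crop rectangle (x0, y0, x1, y1) around a target point,
    scaled down as optical magnification increases so the cropped view spans
    a constant field-of-view fraction."""
    half = base_size / (2 * zoom)  # assumed correction: size inversely
    cx, cy = center_xy             # proportional to magnification
    return (cx - half, cy - half, cx + half, cy + half)
```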
78. An image processing device according to claim 5, further comprising lens control means for controlling the optical status of the image pickup means,
wherein, in the event that the image-pickup-device plane coordinates corresponding to the field coordinates of the target point detected by the target-point detecting means are out of the coordinates range where the image pickup means can take an image, the lens control means controls the optical status of the image pickup means so as to obtain a wider view angle. -
80. A calibration method of an image processing device according to claim 34 for obtaining a conversion table, the calibration method comprising:
-
a first step for disposing target points at predetermined intervals within the field;
a second step for obtaining the field coordinates of the disposed target points;
a third step for taking an image of the target point disposed at the predetermined intervals by means of the image pickup means; and
a fourth step for generating the conversion table by correlating the field coordinates obtained in the second step with the image-pickup-device plane coordinates in the image taken in the third step for each target point disposed in the first step.
-
-
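The four calibration steps of claim 80 amount to pairing each target point's known field coordinates with the pixel position at which the camera observed it. A minimal sketch, where `detect_in_image` stands in (hypothetically) for steps 3-4, i.e. taking an image and locating the target point on the image-pickup-device plane:

```python
def build_conversion_table(field_points, detect_in_image):
    """Claim 80 calibration sketch: for each target point disposed in the
    field (step 1) with known field coordinates (step 2), record the pixel
    coordinates where the camera observed it (steps 3-4)."""
    table = {}
    for fp in field_points:
        table[fp] = detect_in_image(fp)  # observed image-plane coordinates
    return table
```

In use, `detect_in_image` would be replaced by an actual image capture plus marker detection; here a synthetic projection serves to exercise the structure.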
82. A calibration method of an image processing device according to claim 64 for obtaining a conversion expression, the calibration method comprising:
-
a first step for disposing at least one target point on the primary ray passing through the incident pupil center position of the optical system and the center of the image pickup device plane, and at least one target point other than on the primary ray within an image pickup region where the image pickup means takes an image within the field;
a second step for obtaining the field coordinates of at least the two disposed target points;
a third step for taking images of at least the two target points disposed by means of the image pickup means; and
a fourth step for creating the conversion expression based on the relevant information between the field coordinates obtained from the field coordinates value of at least one target point on the primary ray of which positional relation as to the image pickup means is known, and the field coordinates values of at least two target points obtained in the second step, and the camera coordinates, and the relevant information between the field coordinates values of at least two target points in the image taken in the third step and the image-pickup-device plane coordinates.
-
-
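Where claim 80 builds a lookup table, claims 82 and 84 derive a closed-form conversion expression from a few correspondences. As an illustration only (the claims do not fix the functional form), an affine map u = a·x + b·y + c, v = d·x + e·y + f can be determined exactly from three non-collinear field/image pairs:

```python
def solve3(M, r):
    """Solve a 3x3 linear system M @ s = r by Cramer's rule."""
    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    D = det3(M)
    out = []
    for j in range(3):
        Mj = [row[:] for row in M]
        for i in range(3):
            Mj[i][j] = r[i]
        out.append(det3(Mj) / D)
    return out

def fit_affine(field_pts, image_pts):
    """Fit u = a*x + b*y + c and v = d*x + e*y + f from three non-collinear
    field/image correspondences; returns the two coefficient triples."""
    M = [[x, y, 1.0] for (x, y) in field_pts]
    coef_u = solve3(M, [u for (u, v) in image_pts])
    coef_v = solve3(M, [v for (u, v) in image_pts])
    return coef_u, coef_v
```

With more than three points the same model would be fit by least squares; a perspective (projective) form needs four correspondences and a homography fit instead.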
84. A calibration method of an image processing device according to claim 66 for obtaining a conversion expression, the calibration method comprising:
-
a first step for disposing at least one target point on the primary ray passing through the incident pupil center position of the optical system and the center of the image pickup device plane, and at least one target point other than on the primary ray within an image pickup region where the image pickup means takes an image within the field;
a second step for obtaining the field coordinates of at least the two disposed target points;
a third step for taking images of at least the two target points disposed by means of the image pickup means; and
a fourth step for creating the conversion expression based on the relevant information between the field coordinates obtained from the field coordinates value of at least one target point on the primary ray of which positional relation as to the image pickup means is known, and the field coordinates values of at least two target points obtained in the second step, and the camera coordinates, and the relevant information between the field coordinates values of at least two target points in the image taken in the third step and the image-pickup-device plane coordinates.
-
-
5. An image processing device according to claim 3, wherein the coordinates of a position where the target point exists are field coordinates representing the absolute position where the target point exists within a field, by means of coordinates.
-
-
85. An image processing device comprising:
-
image picked-up data input means for inputting image information including an image of the target obtained by forming an image of the target using an optical system, and then taking an image of the target;
field coordinates input means for inputting the field coordinates of the position where the target point exists within the field; and
relevant information generating means for obtaining the relevant information between the field coordinates input from the field coordinates input means and the coordinates within an image plane in the image information input from the image picked-up data input means.
-
-
86. An image processing program for controlling a computer so as to function as:
-
image picked-up data input means for inputting image information including an image of the target obtained by forming an image of the target using an optical system, and then taking an image of the target;
field coordinates input means for inputting the field coordinates of the position where the target point exists within the field; and
relevant information generating means for obtaining the relevant information between the field coordinates input from the field coordinates input means and the coordinates within an image plane in the image information input from the image picked-up data input means.
-
Current Assignee: Olympus Corporation
Original Assignee: Olympus Corporation
Inventors: Matsui, Shinzo
Application Number: US11/001,331
US Class (Current): 348/239
CPC Class Codes: G01S 3/7864 (T.V. type tracking systems); H04N 5/262 (Studio circuits, e.g. for m...)