The joint transform correlator (JTC) has proven to be one of the most suitable techniques for real-time optical pattern recognition and target tracking. This paper proposes a new application of the JTC system: analyzing the blurring of optical images caused by a defocused lens. We present the relation among the correlation peak, the optical transfer function (OTF), and the amount of blurring caused by focusing error. Moreover, we show the possibility of calibrating the blurred image by simply measuring the correlation peak.
I. INTRODUCTION
Image blur due to the focusing error of camera systems (defocus blur) results mainly from the geometry of image formation and the finite depth of field of practical camera lens systems. A camera lens system with a focusing error defocuses objects and blurs the acquired images, and this kind of blur, whether caused by a focusing error or by an imperfect imaging lens with defocus aberration, can cause serious image degradation. Many image processing techniques have been developed to restore the original image [1-3]. Most of these techniques estimate the point-spread function (PSF) of the image acquisition system and then carry out space-invariant deconvolution based on the estimated blurring function. Beyond restoration and sharpening, defocus blur is also an important visual cue for image quality assessment [4, 5] and super-resolution image reconstruction [6, 7]. For images captured with a small depth of field, defocus blur can be used for image segmentation or region-of-interest detection [8, 9]. Moreover, depth can be recovered from a single camera by measuring the blur extent of the captured defocused image [10, 11]. In all of these applications, identification of the defocus blur parameters plays a central role, and the underlying techniques can identify the blur parameters for their specific application domains. However, blur estimation does not necessarily yield the true parameter values [12]. As a result, the analysis is often carried out simply by comparing the defocused (blurred) image with the original image.
In this paper, we adopt the joint transform correlator (JTC) to carry out the comparison between the defocused blurred image and the original image. The JTC has shown remarkable results and is a useful alternative to other optical systems [13-15] for pattern recognition and target tracking applications. Its typical advantage is that it is a real-time optical system that compares images quantitatively by measuring correlation peaks [16]. Recently, the JTC has found a unique application in color comparison and color difference measurement [17]; that work established a close relation between the color difference and the correlation peaks by decomposing the original images into red, green, and blue components. Here we present a simple technique for quantitatively estimating the extent of blurring and for restoring the original image by using the JTC. The correlation peaks obtained by the JTC serve as the blur parameters from which the extent of blurring is estimated quantitatively; they are used to calculate the focusing error, which in turn allows restoration of the original image by calibrating the defocused camera. Section II describes in detail the relation between the correlation peak and the focusing error in terms of the optical transfer function (OTF) and the JTC system. Section III presents simulation results and discusses the possibility of restoring the original image and calibrating a camera lens with defocus aberration, and the conclusion section offers some final comments.
II. BLURRING EXTENT AND CORRELATION PEAK
One of the easiest aberrations to treat mathematically is a simple focusing error of the lens. When a focusing error is present, the center of curvature of the spherical wavefront converging toward the image of an object lies either to the left or to the right of the image plane. Considering an on-axis point for simplicity, the phase distribution across the exit pupil is of the form

φ(x, y) = -(k/2f)(x² + y²),  k = 2π/λ,     (1)

where f is the focal length of the lens. Eq. (1) is a general phase function. If there is a focusing error, the phase difference Δφ(x, y) can be determined by subtracting the ideal phase distribution from the actual phase distribution. Thus, the phase error is given by

Δφ(x, y) = (k/2)(1/f_{i} - 1/f_{a})(x² + y²),     (2)

where f_{i} is the focal length of the lens that forms the ideal phase distribution, and f_{a} is the focal length of the lens that forms the actual phase distribution with the defocus error. Thus, the path-length error is given by

W(x, y) = Δφ(x, y)/k = (1/2)(1/f_{i} - 1/f_{a})(x² + y²),     (3)
which is seen to depend quadratically on the space variables in the exit pupil. Assuming a square aperture of width 2w, the maximum path-length error at the edge of the aperture along the x or y axis, which we denote W_{m}, is given by

W_{m} = (w²/2)(1/f_{i} - 1/f_{a}).     (4)

The number W_{m} is a convenient indication of the severity of the focusing error. If we let f_{a} = f_{i} ± Δf with Δf small, the path-length error W(x, y) can be expressed as

W(x, y) ≈ ±(Δf/2f_{i}²)(x² + y²),  W_{m} ≈ ±(w²/2f_{i}²)Δf.     (5)

If the path-length error W(x, y) given by Eq. (5) is used to obtain the OTF, it can be shown [18] that

H(f_{X}, f_{Y}) = Λ(f_{X}/2f_{0}) Λ(f_{Y}/2f_{0}) sinc[(8W_{m}/λ)(f_{X}/2f_{0})(1 - |f_{X}|/2f_{0})] sinc[(8W_{m}/λ)(f_{Y}/2f_{0})(1 - |f_{Y}|/2f_{0})],     (6)

where Λ(·) is the triangle function, sinc(x) = sin(πx)/(πx), and f_{0} = w/(λf) is the cutoff frequency of the diffraction-limited system. We want to know the relation between the OTF of Eq. (6), which is governed by the focusing error, and the extent of the image blurring caused by that error. To investigate this, we prepare a JTC system.
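The behavior of the defocused OTF of Eq. (6) is easy to check numerically. The sketch below is an illustrative reconstruction, not the paper's own code: the function names and frequency grid are our choices, and the formula is the square-aperture defocus OTF of Goodman [18]. It verifies that the OTF reduces to the diffraction-limited triangle profile when W_{m} = 0 and takes negative values once W_{m} exceeds λ/2.

```python
import numpy as np

def tri(x):
    """Triangle function: 1 - |x| for |x| <= 1, else 0."""
    return np.clip(1.0 - np.abs(x), 0.0, None)

def defocus_otf(fx, fy, wm_over_lambda, f0=1.0):
    """Square-aperture OTF with focusing error (Goodman [18], cf. Eq. (6)).

    fx, fy         : spatial frequencies
    wm_over_lambda : severity of the focusing error, W_m / lambda
    f0             : cutoff frequency of the diffraction-limited system
    """
    sx, sy = fx / (2 * f0), fy / (2 * f0)
    # np.sinc(x) = sin(pi*x)/(pi*x), matching the optics convention used here
    hx = np.sinc(8 * wm_over_lambda * sx * (1 - np.abs(sx)))
    hy = np.sinc(8 * wm_over_lambda * sy * (1 - np.abs(sy)))
    return tri(sx) * tri(sy) * hx * hy

fx = np.linspace(0.0, 2.0, 201)            # frequencies up to the cutoff 2*f0
h_focused = defocus_otf(fx, 0.0, 0.0)      # W_m = 0: diffraction-limited OTF
h_defocused = defocus_otf(fx, 0.0, 0.75)   # W_m = 0.75*lambda > lambda/2
```

With W_{m} = 0 every sinc factor is unity, leaving the triangle profile; with W_{m} = 0.75λ the OTF dips below zero near half the cutoff frequency, which is the contrast reversal discussed in Section III.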
Figure 1 shows the optical structure of the JTC system used to investigate the effect of defocus on the correlation peak and PSNR. The optical system of Fig. 1 is composed of two parts: the image-capture camera part and the JTC part. The image-capture part consists of a camera lens system that captures the original image displayed on an LCD driven by a PC. The captured image is blurred if the camera lens is defocused. The blurred image is sent to the PC and combined with the original image to form a joint input image on LCD 1. Let us call the reference image r(x, y) and the sample image s(x, y), and assume that the two inputs are separated by 2x_{0}. The joint input image g(x, y) can then be expressed as

g(x, y) = r(x - x_{0}, y) + s(x + x_{0}, y).     (7)
Optical structure of the JTC system for estimating and calibrating image blurring.
The joint power spectrum (JPS) on the Fourier plane, i.e., the intensity of the interference fringe pattern, can be expressed as

|G(u, v)|² = |R(u, v)|² + |S(u, v)|² + R(u, v)S*(u, v)exp(j2x_{0}u) + R*(u, v)S(u, v)exp(-j2x_{0}u).     (8)

Here * denotes the phase conjugate, and u and v are independent spatial-frequency variables scaled by a factor of 2π/λf, where λ is the wavelength of the input collimated light and f is the focal length of the Fourier-transforming lenses L1 and L2. Excluding the DC terms and applying the nonlinearity parameter k, the JPS becomes

P(u, v) = [2|R(u, v)||S(u, v)|cos(φ_{R} - φ_{S} + 2x_{0}u)]^{k}.     (9)

If the reference image perfectly matches the sample image and there is no phase error between the reference image and the sample image, the output can be expressed as

o(x, y) = [r(x, y) ⊗ r(x, y)] ∗ δ(x - 2x_{0}, y) + [r(x, y) ⊗ r(x, y)] ∗ δ(x + 2x_{0}, y),     (10)

where ⊗ denotes correlation and ∗ denotes convolution, i.e., two autocorrelation peaks appear at (±2x_{0}, 0) on the output plane.
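The JTC processing chain just described can be sketched digitally. The simulation below is our own minimal reconstruction, not the paper's optical setup: the image size, separation, and zero-order suppression window are assumptions, and the classical JPS is used without the nonlinearity parameter k. It forms the joint input, takes the squared magnitude of its Fourier transform, transforms again, and reads off the cross-correlation peak that appears away from the zero-order term.

```python
import numpy as np

def box_blur(img, k=3):
    """Crude k x k box blur, used here as a stand-in for defocus blur."""
    pad = k // 2
    p = np.pad(img, pad, mode='edge')
    h, w = img.shape
    return sum(p[dy:dy + h, dx:dx + w]
               for dy in range(k) for dx in range(k)) / (k * k)

def jtc_peak(ref, sam, sep=32, size=128):
    """Digital joint transform correlation: height of the cross-correlation
    peak between the reference and the sample image."""
    g = np.zeros((size, size))
    h, w = ref.shape
    top = size // 2 - h // 2
    # reference to the left of center, sample to the right (joint input)
    g[top:top + h, size // 2 - sep // 2 - w:size // 2 - sep // 2] = ref
    g[top:top + h, size // 2 + sep // 2:size // 2 + sep // 2 + w] = sam
    jps = np.abs(np.fft.fft2(g)) ** 2               # joint power spectrum
    corr = np.fft.fftshift(np.abs(np.fft.fft2(jps)))  # correlation plane
    c = size // 2
    corr[c - sep:c + sep, c - sep:c + sep] = 0.0    # suppress the zero-order term
    return corr.max()                                # surviving cross-term peak

rng = np.random.default_rng(0)
ref = rng.random((16, 16))
peak_match = jtc_peak(ref, ref)             # matched input: maximal peak
peak_blur = jtc_peak(ref, box_blur(ref))    # blurred sample: reduced peak
```

Blurring the sample lowers the cross-correlation peak relative to the matched case, which is exactly the effect the measurements of Section III exploit.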
III. RESULTS AND DISCUSSIONS
Figure 2 shows the computer-simulated OTF derived from Eq. (6) for various values of W_{m}/λ. Figure 2 shows that the diffraction-limited OTF is obtained when W_{m} = 0. Note also that for values of W_{m} > λ/2, the sign of the OTF is reversed in some regions of the spatial-frequency domain. Table 1 shows the focusing error calculated from Eq. (5) for various values of W_{m}/λ. In this paper we used a DSLR camera to capture the images for investigating the relation between the OTF of Eq. (6) and the image blurring caused by the focusing error. The lens diameter and focal length are both 50 mm. As mentioned in Section II, we assume a square aperture for simplicity in deriving the OTF. Because our purpose is to show the clear relation between the OTF with focusing error and the image blurring with that focusing error, the error introduced by this assumption, rather than using a circular aperture, has little effect on our study.
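From Eq. (5), the focusing error follows directly as Δf = 2f²W_{m}/w². The check below is our own, not from the paper; in particular, the wavelength is not stated in the paper, so a He-Ne value of λ = 632.8 nm is assumed, which reproduces the quoted Δf step of 1.27 μm per λ/4 increment of W_{m}.

```python
# Focusing error from the maximum path-length error, Eq. (5): dF = 2 f^2 Wm / w^2
LAMBDA = 632.8e-9   # assumed He-Ne wavelength [m] (not stated in the paper)
F = 50e-3           # focal length of the lens [m]
W = 25e-3           # half-width of the (square) aperture [m], diameter 50 mm

def delta_f(wm):
    """Focusing error dF [m] for a given maximum path-length error Wm [m]."""
    return 2 * F**2 * wm / W**2

step = delta_f(LAMBDA / 4)   # one lambda/4 increment of W_m
# step is approximately 1.27e-6 m, matching the increment reported in Table 1
```

With f = w (50 mm focal length, 25 mm half-aperture of a 50 mm pupil), the factor 2f²/w² equals 8, so each λ/4 step in W_{m} corresponds to 2λ of focusing error.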
Table 1 reveals that the focal error Δf increases by 1.27 μm for each λ/4 increment of W_{m}. Figure 3 shows 256 × 256 gray images of a portrait captured by defocusing the camera according to Table 1. Figure 3(a) is the reference image, taken with the camera focused on the object. Next, we defocused the camera step by step in increments of 1.27 μm, corresponding to λ/4 increments of W_{m}, which produced Figs. 3(b) through 3(e). Figure 3(f) is a mismatched image, used to check the discrimination between the matched blurred images and a mismatched image when compared with the reference. We used the JTC system to evaluate the proposed technique of calculating the amount of blurring caused by defocusing the camera.
Table 2 shows the correlation peaks of the sample images measured by the JTC system. First, consider the correlation peaks of the images obtained by defocusing the reference image. When the reference image of Fig. 3(a) is correlated with itself, i.e., the case without focusing error (Δf = 0), the measured correlation peak is 6.184 × 10^{8}. When the reference image of Fig. 3(a) is correlated with the defocused images from sample 1 (Fig. 3(b)) to sample 4 (Fig. 3(e)), the measured correlation peaks are 6.172 × 10^{8}, 6.157 × 10^{8}, 6.149 × 10^{8}, and 6.137 × 10^{8}, respectively. In Table 2, ΔCP denotes the difference (in units of 10^{8}) between the correlation peak of the reference image and that of the sample image. The values of ΔCP for the four blurred samples are 0.012, 0.027, 0.035, and 0.047, respectively. On the other hand, when the reference image is correlated with the mismatched image of Fig. 3(f), the measured correlation peak is 3.498 × 10^{8} and ΔCP is 2.686, which is far larger than for the matched images, even though they are blurred.
Figure 4 shows the two curves of ΔCP and Δf. It reveals that the two curves are approximately linear and nearly coincide. As stated in Section II, our goal was to relate the OTF of Eq. (6), governed by the focusing error, to the extent of the image blurring caused by that error, and the JTC system was prepared to investigate this effect. Considering Table 1, Table 2, and Fig. 4 together, we can conclude that the extent of image blurring caused by focusing error can be estimated by measuring correlation peaks with the JTC system. In addition, we can calibrate blurred images by measuring their correlation peaks and calculating the focusing error Δf from the linear approximation of Fig. 4. Figure 5 shows one example of the calibration of a blurred image. Figure 5(a) is a blurred image obtained by defocusing the reference object by a random amount, and Fig. 5(b) is the calibrated image obtained by finding the focusing error Δf. The focusing error Δf is calculated by measuring the correlation peak of the blurred image. The measured correlation peak of the blurred image is 6.127 × 10^{8}, so the correlation difference ΔCP is 0.057. Therefore the calibrated Δf is 6.03 μm, supposing that the slopes of the ΔCP and Δf curves are both approximately 2.0.
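The calibration step can be reproduced from the tabulated data. The sketch below is our own reconstruction: the fit parameters of Fig. 4 are not given in the paper, so an ordinary least-squares line through the (ΔCP, Δf) pairs of Tables 1 and 2 is used instead to map a measured correlation drop to a focusing-error estimate.

```python
import numpy as np

# (dCP, dF [um]) pairs from Tables 1 and 2, plus the in-focus point (0, 0)
dcp = np.array([0.0, 0.012, 0.027, 0.035, 0.047])
df_um = np.array([0.0, 1.27, 2.54, 3.81, 5.08])

# Least-squares linear fit dF = a*dCP + b (cf. the linear curves of Fig. 4)
a, b = np.polyfit(dcp, df_um, 1)

def calibrate(delta_cp):
    """Estimate the focusing error [um] from a measured correlation drop."""
    return a * delta_cp + b

est = calibrate(0.057)   # the blurred image of Fig. 5(a), dCP = 0.057
# est is close to the paper's calibrated value of 6.03 um
```

The fitted line yields an estimate near 6.1 μm for ΔCP = 0.057, in reasonable agreement with the 6.03 μm quoted for Fig. 5.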
Computer-simulated results of the OTF for various values of W_{m}/λ.
Results of the focusing error for various values of W_{m}/λ
256 × 256 gray images of a portrait captured by the defocused camera.
Correlation peaks of the sample images measured by the JTC system
Linear approximation and relation between the correlation peak and the focusing error.
Example of the calibration of the blurred image.
IV. CONCLUSION
In this paper we presented a new technique for estimating the extent of the blurring caused by defocusing a camera lens, using the JTC. In addition, we demonstrated the possibility of calibrating blurred images by finding the focusing error calculated from the measured correlation peak. We conclude that a camera lens system with defocus aberration can be calibrated by determining W_{m}/λ, and from it the focusing error Δf, simply by measuring the correlation peak with the JTC system.
[1] M. Banham and A. Katsaggelos, "Digital image restoration," IEEE Signal Process. Mag. 14, 24-41 (1997).
[2] M. A. Kutay and H. M. Ozaktas, "Optimal image restoration with the fractional Fourier transform," J. Opt. Soc. Am. A 15, 825-833 (1998). DOI: 10.1364/JOSAA.15.000825
[3] H.-Y. Lin and X.-H. Chou, "Defocus blur parameters identification by histogram matching," J. Opt. Soc. Am. A 29, 1694-1706 (2012).
[4] S. Wu, W. Lin, S. Xie, Z. Lu, E. P. Ong, and S. Yao, "Blind blur assessment for vision-based applications," J. Vis. Commun. Image Represent. 20, 231-241 (2009). DOI: 10.1016/j.jvcir.2009.03.002
[5] I. van Zyl Marais and W. H. Steyn, "Robust defocus blur identification in the context of blind image quality assessment," Signal Process. Image Commun. 22, 833-844 (2007). DOI: 10.1016/j.image.2007.06.003
[6] D. Rajan, S. Chaudhuri, and M. Joshi, "Multiobjective super resolution: concepts and examples," IEEE Signal Process. Mag. 20, 49-61 (2003). DOI: 10.1109/MSP.2003.1203209
[7] J. Yang and D. Schonfeld, "Virtual focus and depth estimation from defocused video sequences," IEEE Trans. Image Process. 19, 668-679 (2010). DOI: 10.1109/TIP.2009.2036708
[8] C. Swain and T. Chen, "Defocus-based image segmentation," in Proc. IEEE International Conference on Acoustics, Speech, and Signal Processing 4, 2403-2406 (1995).
[9] Z. Liu, W. Li, L. Shen, Z. Han, and Z. Zhang, "Automatic segmentation of focused objects from images with low depth of field," Pattern Recogn. Lett. 31, 572-581 (2010). DOI: 10.1016/j.patrec.2009.11.016
[10] K. Pradeep and A. Rajagopalan, "Improving shape from focus using defocus cue," IEEE Trans. Image Process. 16, 1920-1925 (2007). DOI: 10.1109/TIP.2007.899188
[11] S. Chaudhuri and A. Rajagopalan, Depth from Defocus: A Real Aperture Imaging Approach (Springer-Verlag, 1998).
[12] J. Lin, C. Zhang, and Q. Shi, "Estimating the amount of defocus through a wavelet transform approach," Pattern Recogn. Lett. 25, 407-411 (2004). DOI: 10.1016/j.patrec.2003.11.003
[13] C. J. Weaver and J. W. Goodman, "A technique for optically convolving two functions," Appl. Opt. 5, 1248-1249 (1966). DOI: 10.1364/AO.5.001248
[14] G. G. Mu, X. M. Wang, and Z. Q. Wang, "Amplitude-compensated matched filtering," Appl. Opt. 27, 3461-3463 (1988). DOI: 10.1364/AO.27.003461
[15] M. S. Alam, J. Khan, and A. Bai, "Heteroassociative multiple-target tracking by fringe-adjusted joint transform correlation," Appl. Opt. 43, 358-365 (2004). DOI: 10.1364/AO.43.000358
[17] M. H. Jeong, "Coded single input channel for color pattern recognition in joint transform correlator," Journal of the Optical Society of Korea 15(4), 335-339 (2011). DOI: 10.3807/JOSK.2011.15.4.335
[18] J. W. Goodman, Introduction to Fourier Optics, 2nd ed. (McGraw-Hill, 1996).