Precision Evaluation of Three-dimensional Feature Points Measurement by Binocular Vision
Journal of the Optical Society of Korea. 2011. Mar, 15(1): 30-37
Copyright ©2011, Optical Society of Korea
  • Received : October 10, 2010
  • Accepted : January 01, 2011
  • Published : March 25, 2011
About the Authors
Guan Xu
Mechanical Science and Engineering College, Jilin University, Renmin Street 5988#, Changchun, China
Xiaotao Li
Mechanical Science and Engineering College, Jilin University, Renmin Street 5988#, Changchun, China
lixiaotao@jlu.edu.cn
Jian Su
Traffic and Transportation College, Jilin University, Renmin Street 5988#, Changchun, China
Hongda Pan
Traffic and Transportation College, Jilin University, Renmin Street 5988#, Changchun, China
Guangdong Tian
Traffic and Transportation College, Jilin University, Renmin Street 5988#, Changchun, China
Abstract
Binocular-pair images obtained from two cameras can be used to calculate the three-dimensional (3D) world coordinates of a feature point. However, the measurement accuracy of binocular vision depends on several structural factors. This paper presents an experimental study of measurement distance, baseline distance, and baseline direction, and investigates their effects on camera reconstruction accuracy. The test set for the binocular model consists of a series of feature points in stereo-pair images and the corresponding 3D world coordinates. This paper discusses increasing the baseline distance of the two cameras as a method for enhancing the accuracy of a binocular vision system. Moreover, there is an inflexion point in the value and distribution of the measurement errors as the baseline distance is increased: once the baseline distance exceeds 1000 mm in this experiment, the accuracy benefit of a further increase is no longer obvious. Furthermore, it is observed that the direction errors of the set-up are lower when the main measurement direction is similar to the baseline direction.
I. INTRODUCTION
Visual navigation, robotics and vision-based measurement are just a few examples of industrial applications that depend on pose estimation and computation of the three-dimensional (3D) object location [1 - 7] . In the 3D reconstruction process with an optical technique, system accuracy is defined by the determination of 3D point positions, which are provided by 2D features gathered from two calibrated cameras. To achieve high system accuracy, camera calibration and reconstruction have to be performed accurately.
In the past many researchers have developed algorithms for camera calibration and reconstruction as an open topic in computer vision. Currently, several major calibration techniques are available. The most popular camera calibration method is the direct linear transformation (DLT) method originally reported by Abdel-Aziz and Karara [8] . The DLT method uses a set of control points whose object-space coordinates are already known. The control points are normally fixed to a rigid calibration frame. The flexibility of DLT-based calibration often depends on how easy it is to handle the calibration frame. The main problem of the DLT method is that the calibration parameters are not mutually independent. An alternative approach is reported by Hatze to ensure the orthogonality of the rotation matrix [9] . The direct nonlinear minimization technique builds the camera parameters directly to minimize the residual error through iterative computation [10 , 11] . The intermediate parameters can be computed by solving linear equations; however, lens distortions cannot be handled by this linear approach [12 , 13] . Lenz [14] and Tsai [15 , 16] introduce an improved two-step calibration solution that computes the distortion and skew parameters. To make the calibration more convenient and to avoid requiring 3D coordinates, Svoboda [17] made the technique more robust. Based on the analysis of several images of a 2D calibration board, usually with a chessboard pattern, Z. Zhang [18 , 19] describes an efficient method to improve the accuracy of camera calibration based on Tsai’s model. A similar technique is explained by Kato [20] , who focuses on retrieving the camera location using fiducial markers, which are placed in the 3D environment as squares of known size with high-contrast patterns in the centre. K. C. Kwon [21] proposes a binocular stereoscopic camera vergence control method using disparity information obtained by image processing and estimates the quantity of vergence control.
For evaluating the calibration accuracy, K. Zhang [22] explores a model of the binocular vision system focused on 3D reconstruction and describes an improved genetic algorithm aimed at estimating the camera system parameters. In order to enhance the calibration accuracy, many corners, distributed uniformly on the calibration block, should be treated as feature points. W. Sun [23] presents a study investigating the effects of training-data quantity and pixel-coordinate noise on camera calibration accuracy. H. H. Cui [24] discusses an improved method for accurate calibration of a measurement system, in which the system accuracy is improved by considering the nonlinear measurement error. Independent of the computed lens-distortion model or the number of camera parameters, C. Ricolfe-Viala [25] outlines a metric calibration technique that calculates the camera lens distortion in isolation from the camera calibration process. An accurate phase-height mapping algorithm is proposed by Z. W. Li [26] to improve the performance of a structured-light system with digital fringe projection; by means of a training network, the relationship between the 2D image coordinates and the 3D object coordinates can be obtained. However, these experiments involve fixed system structure parameters and provide neither a synthetic evaluation on separate test data nor flexible binocular system parameters to verify the calibrated results.
Considering the widely used DLT calibration method, this paper emphasizes the structural factors that influence the accuracy of a binocular measurement system. Baseline distance and measurement distance are evaluated and applied to the experimental binocular system to reveal the influence of each factor. In the experiment, the 3D-board calibration method is applied for both factors.
II. MEASUREMENT ALGORITHM
- 2.1. Camera Calibration
The calibration method requires a set of 3D coordinates and two sets of 2D coordinates of grid board images from two cameras. Then the intrinsic and extrinsic parameters are calculated by solving the projection equation.
$$ s\,X_{2D} = M\,X_{3D} \tag{1} $$

where s is a non-zero scale factor; M is the set of camera parameters, i.e. the 3×4 projective matrix; X_3D and X_2D are the homogeneous coordinates of the corresponding 3D and 2D grid corners, respectively. The parameter matrix M can be decomposed as
$$ M = K_1 K_2 \tag{2} $$
where K_1 is the intrinsic parameter matrix and K_2 is the extrinsic parameter matrix,

$$ K_1 = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} $$

where f_x and f_y are the focal lengths along the x and y axes of the image plane in pixel dimensions, and u_0 and v_0 are the x and y coordinates of the origin in the image plane.
$$ K_2 = \begin{bmatrix} R_{3\times 3} & T_{3\times 1} \end{bmatrix} \tag{3} $$

where R_{3×3} denotes the 3×3 rotation matrix and T_{3×1} denotes the 3×1 translation vector between the camera and world coordinate systems. Although the model contains no distortion or skew parameters, it is simple and widely used for calibration.
- 2.2. 3D Reconstruction
3D reconstruction is usually applied in binocular vision measurement, where the projection matrices of the two cameras are adopted to calculate the 3D world coordinates of a point viewed by both cameras. Suppose P_1 and P_2 , detected in the two images and matched to each other, are the 2D projections of an arbitrary 3D point P in space. Then
$$ s_1 \begin{bmatrix} u_1 \\ v_1 \\ 1 \end{bmatrix} = M_1 \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \tag{4} $$

$$ s_2 \begin{bmatrix} u_2 \\ v_2 \\ 1 \end{bmatrix} = M_2 \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \tag{5} $$
where ( u_1 , v_1 , 1) and ( u_2 , v_2 , 1) represent the homogeneous coordinates of P_1 and P_2 in the two images, respectively; ( X , Y , Z , 1) is the homogeneous coordinate of the 3D point P ; s_1 and s_2 are two non-zero scalars; M_1 and M_2 are the projective matrices of the two cameras, which are obtained from camera calibration and denote the two projective mappings from world coordinates to pixel coordinates.
Eliminating the two scalars s_1 and s_2 from Eqs. (4) and (5) transforms them into the following linear equations:
$$ \begin{cases} (u_1 m_{31}^{1} - m_{11}^{1})X + (u_1 m_{32}^{1} - m_{12}^{1})Y + (u_1 m_{33}^{1} - m_{13}^{1})Z = m_{14}^{1} - u_1 m_{34}^{1} \\ (v_1 m_{31}^{1} - m_{21}^{1})X + (v_1 m_{32}^{1} - m_{22}^{1})Y + (v_1 m_{33}^{1} - m_{23}^{1})Z = m_{24}^{1} - v_1 m_{34}^{1} \end{cases} \tag{6} $$

$$ \begin{cases} (u_2 m_{31}^{2} - m_{11}^{2})X + (u_2 m_{32}^{2} - m_{12}^{2})Y + (u_2 m_{33}^{2} - m_{13}^{2})Z = m_{14}^{2} - u_2 m_{34}^{2} \\ (v_2 m_{31}^{2} - m_{21}^{2})X + (v_2 m_{32}^{2} - m_{22}^{2})Y + (v_2 m_{33}^{2} - m_{23}^{2})Z = m_{24}^{2} - v_2 m_{34}^{2} \end{cases} \tag{7} $$

where m_{ij}^{k} denotes the ( i , j ) entry of M_k ( k = 1, 2).
Since each linear equation represents a plane in space and a pair of such equations represents the intersection line of two planes, Eqs. (6) and (7) define two lines in 3D space, which intersect at the 3D point P ( X , Y , Z , 1).
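In practice the four plane equations of Eqs. (6) and (7) are stacked and solved together as a homogeneous least-squares problem, so that measurement noise is averaged rather than making the two lines miss each other. A minimal sketch, not the authors' implementation, assuming NumPy and 3×4 projection matrices from calibration:

```python
import numpy as np

def triangulate(M1, M2, p1, p2):
    """Recover P = (X, Y, Z) from matched image points p1, p2 and
    projection matrices M1, M2: each view contributes two plane
    equations, and the planes intersect (in the least-squares sense)
    at the 3D point."""
    u1, v1 = p1
    u2, v2 = p2
    A = np.array([
        u1 * M1[2] - M1[0],   # first plane of Eq. (6)
        v1 * M1[2] - M1[1],   # second plane of Eq. (6)
        u2 * M2[2] - M2[0],   # first plane of Eq. (7)
        v2 * M2[2] - M2[1],   # second plane of Eq. (7)
    ])
    # Null vector of A is the homogeneous point; dehomogenize it.
    _, _, Vt = np.linalg.svd(A)
    P = Vt[-1]
    return P[:3] / P[3]
```

With noise-free projections the recovered point matches the original exactly; with pixel noise it is the algebraic least-squares intersection of the two viewing lines.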
III. EXPERIMENTAL METHODS AND RESULTS
- 3.1. Experimental Process
The experiment is carried out in our laboratory, equipped with a vertical rack raised about 2 m above the ground, as shown in Fig. 1 . Two DH-HW3102UC cameras with 8 mm focal-length lenses are placed along the rack in a line perpendicular to the horizontal plane. The image resolution is 2048×1536 pixels, with a frame rate of 6 frames/s and a pixel size of 3.2 μm×3.2 μm. To investigate the influences of the baseline distance and measurement distance illustrated in Fig. 2 , twenty-five configurations, combining baseline distances of 600 mm, 800 mm, 1000 mm, 1200 mm and 1400 mm with measurement distances of 1000 mm, 1500 mm, 2000 mm, 2500 mm and 3000 mm, are studied separately on the experimental setup.
The calibration data are generated by printing a chessboard pattern with 60×60 mm squares onto three 500×500 mm sheets, each attached to one face of a rigid cube. Binocular images captured by the two cameras are shown in Fig. 3 and Fig. 4 . This produces a 2D coordinate data set of 27 points for each camera.
Fig. 1. Experimental environment.
The 3D world coordinates of these points are measured relative to the intersection point of the three chessboard-pattern sheets, i.e. the origin of the world reference system. The 3D coordinate data set covers the calibration cube surface at 60 mm intervals. Because a normal ruler is accurate only to 1 mm, the maximum calibration-board error is 0.5 mm, which is approximately 0.83% of the 60 mm pattern size.
Fig. 2. Baseline distance and measurement distance.
Fig. 3. The image and feature points from the higher camera with 600 mm baseline distance and 1000 mm measurement distance.
Fig. 4. The image and feature points from the lower camera with 600 mm baseline distance and 1000 mm measurement distance.
- 3.2. Accuracy Evaluation Results
The calibration result is a set of camera parameters. In most applications, the parameters are used in stereo computation to reconstruct the 3D coordinates of feature points on the measured objects. As the baseline distance, the measurement distance and the measurement direction are the three essential factors for feature-point reconstruction precision, their influences should be evaluated by accuracy experiments on the stereo vision system.
In this evaluation, with given camera parameters, stereo computation reconstructs the 3D coordinates of feature points on the calibration board from the 2D camera images. Since the measurement and baseline distances of a binocular system influence the system accuracy, the difference between the reconstructed 3D coordinate of a feature point and its original coordinate in space is defined as the stereo error. The stereo error is related to the camera parameters and is restricted by the system structure. To draw conclusions, we create a binocular system and choose some synthetic test points to show the stereo-error distribution in 3D space. Furthermore, since different test points have different error values, we also analyze the error distribution and scope to show the relationship between the stereo error and the positioning parameters of the binocular system. The identical calibration and reconstruction methods mentioned above are adopted to ensure that the assessment results are independent of the computation method; therefore, different measurement results are appraised with the same evaluation procedure.
The stereo error varies when the camera positions are different. The 3D errors are defined as follows:
$$ E_x = \left| x_{or} - x_{re} \right| \tag{8} $$

$$ E_y = \left| y_{or} - y_{re} \right| \tag{9} $$

$$ E_z = \left| z_{or} - z_{re} \right| \tag{10} $$

$$ E = \sqrt{E_x^2 + E_y^2 + E_z^2} \tag{11} $$
where ( x_or , y_or , z_or ) are the original coordinates of a feature point on the calibration board and ( x_re , y_re , z_re ) are the reconstructed coordinates of the feature point from the two camera images; E_x , E_y , and E_z denote the measurement errors of the binocular system in the x , y , z directions, respectively; E represents the comprehensive error.
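The error definitions of Eqs. (8)-(11) are straightforward to vectorize; the small sketch below (assuming NumPy, and not taken from the authors' code) computes them for whole batches of original and reconstructed feature points at once:

```python
import numpy as np

def stereo_errors(P_orig, P_recon):
    """Per-axis errors E_x, E_y, E_z (Eqs. 8-10) and the comprehensive
    error E (Eq. 11) for original and reconstructed points of shape
    (..., 3); works for a single point or an array of points."""
    d = np.abs(np.asarray(P_recon, dtype=float) - np.asarray(P_orig, dtype=float))
    E = np.sqrt((d ** 2).sum(axis=-1))  # Euclidean distance per point
    return d[..., 0], d[..., 1], d[..., 2], E
```

For example, an original point at the world origin reconstructed at (3, 4, 0) mm gives E_x = 3, E_y = 4, E_z = 0 and a comprehensive error E = 5 mm.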
Fig. 5 (a)-(y) show the level of accuracy available for each combination of a constant baseline distance and a variable measurement distance. All the data are collected under similar conditions so that the experimental results can be compared.
Fig. 5. Feature-point reconstruction errors E_x, E_y, E_z and E. The experiment is performed with the following baseline distance and measurement distance, respectively: (a) 600 mm, 1000 mm. (b) 600 mm, 1500 mm. (c) 600 mm, 2000 mm. (d) 600 mm, 2500 mm. (e) 600 mm, 3000 mm. (f) 800 mm, 1000 mm. (g) 800 mm, 1500 mm. (h) 800 mm, 2000 mm. (i) 800 mm, 2500 mm. (j) 800 mm, 3000 mm. (k) 1000 mm, 1000 mm. (l) 1000 mm, 1500 mm. (m) 1000 mm, 2000 mm. (n) 1000 mm, 2500 mm. (o) 1000 mm, 3000 mm. (p) 1200 mm, 1000 mm. (q) 1200 mm, 1500 mm. (r) 1200 mm, 2000 mm. (s) 1200 mm, 2500 mm. (t) 1200 mm, 3000 mm. (u) 1400 mm, 1000 mm. (v) 1400 mm, 1500 mm. (w) 1400 mm, 2000 mm. (x) 1400 mm, 2500 mm. (y) 1400 mm, 3000 mm.
Within the error distributions of Fig. 5 (a)-(y), as the measurement distance increases from 1000 mm to 3000 mm, the maximum error range grows from [-2 mm, +2 mm] to [-4 mm, +6 mm], and the error values become more dispersed. From these results we conclude that the error value increases with the distance between the test object and the measurement system. However, when the baseline distance increases from 800 mm to 1400 mm at a constant measurement distance (1000 mm, 1500 mm, 2000 mm, 2500 mm, or 3000 mm; for example, 3000 mm), the error range shrinks from [-4 mm, +6 mm] to less than [-2 mm, +2 mm]. Thus, increasing the baseline distance reduces the measurement error of the binocular cameras. Since the baseline distance is confined by the system dimensions and the error distribution concentrates within [-2 mm, +2 mm] once the baseline distance reaches 1000 mm, the optimal configuration of the system is a baseline distance of 1000 mm and a measurement distance of 1500 mm, which gives the minimum error value and distribution scope. For a given binocular measurement system, beyond an obvious peak of the accuracy benefit from increasing the baseline distance, the system accuracy can no longer be improved effectively. In addition, we notice that the measurement error in the z direction is normally smaller than in the x and y directions, which means that the principal measurement dimension should be arranged along the baseline direction of the two cameras.
- 3.3. Experimental Results Discussion
A model of the binocular measurement system is constructed in Fig. 6 , where o , o ’ represent the optical centres of the two cameras [27] . P_1 ( X_1 , Y_1 ) and P_2 ( X_2 , Y_2 ) are the projected 2D points of a measured 3D point p ( x ’, y ’, z ’). L is the baseline distance of the system. f_1 and f_2 are the focal lengths of the two cameras. α , β are the measurement angles of p ( x ’, y ’, z ’) relative to o , o ’ in the o - x z ’ plane. α_0 , β_0 are the angles between the optical axes and the ox axis. γ_1 , γ_2 are the horizontal viewing angles of P_1 ( X_1 , Y_1 ) and P_2 ( X_2 , Y_2 ), and θ_1 , θ_2 are their vertical viewing angles. The coordinates of p ( x ’, y ’, z ’) are obtained from the geometrical relationships as follows:
$$ x' = \frac{L\tan\beta}{\tan\alpha + \tan\beta}, \qquad y' = z'\tan\theta_1, \qquad z' = \frac{L\tan\alpha\tan\beta}{\tan\alpha + \tan\beta} \tag{12} $$
For evaluating the measurement error caused by baseline-distance variation, the error transfer function with respect to the baseline distance L can be defined as
$$ \frac{\partial p}{\partial L} = \sqrt{\left(\frac{\partial x'}{\partial L}\right)^2 + \left(\frac{\partial y'}{\partial L}\right)^2 + \left(\frac{\partial z'}{\partial L}\right)^2} \tag{13} $$
Here, two typical situations are considered in this paper, α = β and β = 90º. According to the geometrical relationships in Fig. 6 , the error transfer function can be expressed as
$$ \frac{\partial p}{\partial L} = \begin{cases} \sqrt{\dfrac{1}{4} + \dfrac{y'^2 + z'^2}{L^2}}, & \alpha = \beta \\[2ex] \sqrt{1 + \dfrac{y'^2 + z'^2}{L^2}}, & \beta = 90^{\circ} \end{cases} \tag{14} $$

where L is the baseline distance of the system and y ’, z ’ are the coordinates of p ( x ’, y ’, z ’).
Fig. 6. Measurement model of a binocular vision system.
Fig. 7. Value of the error transfer function. The baseline distance varies from 0 mm to 1500 mm and the measurement distance varies from 0 mm to 3000 mm, respectively. (a) α = β. (b) β = 90º.
According to the simulation results in Fig. 7 , the value of the error transfer function decreases as the baseline distance L increases. The value is also lower when the measurement distance z ’ decreases. The simulation results are consistent with the experimental data.
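As a plausibility check (not the authors' simulation), the sensitivity of the measured depth to the baseline length can be evaluated numerically under a standard two-ray triangulation geometry, in which the viewing rays leave the baseline endpoints o and o' at angles α and β and intersect at p. The symmetric case α = β is sketched below; the angle-from-depth relation used is an assumption of this simplified model:

```python
import numpy as np

def depth_from_baseline(L, alpha, beta):
    """Depth z' of the intersection of two viewing rays that leave the
    baseline endpoints at angles alpha, beta to the baseline
    (simplified geometry of Fig. 6)."""
    return L * np.tan(alpha) * np.tan(beta) / (np.tan(alpha) + np.tan(beta))

def depth_error_sensitivity(L, z, dL=1e-3):
    """Numerical |dz'/dL| for the symmetric case alpha = beta at
    baseline L and target depth z (both in mm): the factor that maps a
    baseline error into a depth error."""
    alpha = np.arctan(2.0 * z / L)  # alpha = beta places the point at depth z
    return abs(depth_from_baseline(L + dL, alpha, alpha)
               - depth_from_baseline(L, alpha, alpha)) / dL

# The sensitivity falls as the baseline grows and rises with the
# measurement distance, matching the trend reported in Fig. 7.
```

For instance, under this model a 1000 mm baseline observing a point at 3000 mm gives a sensitivity of about 3, i.e. a 1 mm baseline error maps to roughly 3 mm of depth error, while widening the baseline to 1400 mm reduces the factor.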
IV. CONCLUSION
An experimental study on camera calibration and 3D reconstruction of a binocular measurement system is carried out to investigate how factors such as baseline distance, measurement distance and baseline direction affect the reconstruction accuracy. The most representative method, developed by Abdel-Aziz and Karara, is chosen for the experiments on the 2D data from the cameras. A typical criterion is applied to evaluate the measurement accuracy on the test sets.
With the same reference calibration board, we set a series of measurement distances from 1000 mm to 3000 mm based on the same calibration and reconstruction methods. According to the comprehensive error, a smaller measurement distance with grid-board calibration gives a more stable result than a larger one. Nevertheless, the measurement distance of a binocular system is determined by the detected object, the system's geometrical structure and the field of view. Therefore, adjusting the measurement distance between the cameras and the detected object is an inefficient way to enhance accuracy when that distance is constrained by the system structure and camera characteristics. If we instead alter the baseline distance of the binocular system, the accuracy is most sensitive in the range of 600 mm to 1000 mm. Thus, to increase the accuracy of the binocular system, it is more effective to increase the baseline distance than to change the measurement distance. However, the experimental results indicate that beyond the accuracy peak, i.e. 1000 mm in this experiment, it is unproductive to pursue further improvement by choosing a larger baseline distance.
The stereo error evaluation shows that the measurement results obtained in different directions have different valid ranges. Among the x , y , and z direction results, on average, the z direction errors are smaller than those in the other two directions when the z direction agrees with the baseline direction.
In summary, it is clear that the methods for enhancing the accuracy of a binocular measurement system depend greatly on specific geometrical information such as the test-object dimensions, the measurement distance, and the baseline distance and direction. Hence, it is crucial to select a reasonable system dimension and structure for binocular measurement so as to improve the precision of 3D feature measurement by binocular vision.
Acknowledgements
This work is supported by the China Postdoctoral Science Foundation under Grant No. 20100471254, the Ph.D. Programs Foundation of the Ministry of Education of China under Grant No. 20100061120067, and the Jilin Province Science Foundation for Youths under Grant No. 20100167.
References
Kwon G. I , Choi Y. H 2010 Image-processing based panoramic camera employing single fisheye lens J. Opt. Soc. Korea 14 245 - 259
Tay C. J , Quan C , Huang Y. H , Fu Y 2005 Digital image correlation for whole field out-of-plane displacement measurement using a single camera Opt. Comm. 251 23 - 36
Choi H. J , Park J. H , Hong J. S , Lee B 2004 An improved stereovision scheme using single camera and a composite lens array J. Opt. Soc. Korea 8 72 - 78
Kwon K. C , Choi J. K , Choi Y. S 2002 Automatic control of horizontal-moving stereoscopic camera by disparity compensation J. Opt. Soc. Korea 8 150 - 155
Beiderman Y , Teicher M , Garci J , Mico V , Zalevsky Z 2010 Optical technique for classification recognition and identification of obscured objects Opt. Comm. 283 4274 - 4282
Park Y. C , Park C. G , Kang M. H , Ahn S. J 2009 A high-speed digital laser grating projection system for the measurement of 3-dimensional shapes J. Opt. Soc. Korea 13 251 - 255
Shin D. H , Kim E. S 2008 Computational integral imaging reconstruction of 3D object using a depth conversion technique J. Opt. Soc. Korea 12 131 - 135
Abdel-Aziz Y. I , Karara H. M “Direct linear transformation into object space coordinates in close-range photogrammetry” in Proc. the Symposium on Close-range Photogrammetry (Falls Church VA USA Jan. 1971) 1 - 18
Hatze H 1988 High-precision three-dimensional photogrammetric calibration and object space reconstruction using a modified DLT-approach J. Biomech. 21 533 - 538
Gennery D. B “Stereo-camera calibration” in Proc. the 10th Image Understanding Workshop (Menlo Park CA USA Nov. 1979) 101 - 108
Isaguirre A , Pu P , Summers J “A new development in camera calibration: calibrating a pair of mobile cameras” in Proc. International Conference on Robotics and Automation (St. Louis MO USA Mar. 1985) 74 - 79
Ganapathy S “Decomposition of transformation matrices for robot vision” in Proc. IEEE International Conference on Robotics and Automation (Atlanta GA USA Mar. 1984) 130 - 139
Faugeras O. D , Toscani G “The calibration problem for stereo” in Proc. International Conference of Computer Vision and Pattern Recognition (Miami Beach FL USA Jun. 1986) 15 - 20
Lenz R. K , Tsai R. Y “Techniques for calibration of the scale factor and image center for high accuracy 3D machine vision metrology” in Proc. IEEE International Conference on Robotics and Automation (Raleigh NC USA Mar. 1987) 68 - 75
Tsai R. Y “An efficient and accurate camera calibration technique for 3D machine vision” in Proc. IEEE Conference on Computer Vision and Pattern Recognition (Miami Beach FL USA Jun. 1986) 364 - 374
Tsai R. Y 1987 An efficient and accurate camera calibration technique for 3D machine vision IEEE Trans. Robot 3 364 - 374
Svoboda T , Martinec D , Pajdla T 2005 A convenient multicamera self-calibration for virtual environments Presence: Teleoperators and Virtual Environments 14 407 - 422
Zhang Z “Flexible camera calibration by viewing a plane from unknown orientations” in Proc. International Conference on Computer Vision (Kerkyra Corfu Greece Sep. 1999)
Zhang Z 2000 A flexible new technique for camera calibration IEEE Trans. Pattern Anal. Mach. Intell. 22 1330 - 1334
Kato H , Billinghurst M “Marker tracking and HMD Calibration for a video-based augmented reality conferencing system” in Proc. the Second IEEE and ACM International Workshop on Augmented Reality (San Francisco CA USA Oct. 1999) 85 - 95
Kwon K. C , Lim Y. T , Kim N , Song Y. J , Choi Y. S 2009 Vergence control of binocular stereoscopic camera using disparity information J. Opt. Soc. Korea 13 379 - 385
Zhang K , Xu B , Tang L , Shi H 2006 Modeling of binocular vision system for 3D reconstruction with improved genetic algorithms Int. J. Adv. Manuf. Technol. 29 722 - 728
Sun W , Cooperstock J. R 2006 An empirical evaluation of factors influencing camera calibration accuracy using three publicly available techniques Mach. Vision Appl. 17 51 - 67
Cui H. H , Liao W. H , Cheng X. S , Dai N , Yuan T. R 2010 A three-step system calibration procedure with error compensation for 3D shape measurement Chinese Opt. Lett. 8 33 - 37
Ricolfe-Viala C , Sanchez-Salmeron A. J 2010 Robust metric calibration of non-linear camera lens distortion Pattern Recognit. 43 1688 - 1699
Li Z. W , Shi Y. S , Wang C. J , Qin D. H , Huang K 2009 Complex object 3D measurement based on phase-shifting and a neural network Opt. Comm. 282 2699 - 2706
Guo Y. B , Yao Y , Di X. G “Research on structural parameter optimization of binocular vision measuring system for parallel mechanism” in Proc. IEEE International Conference on Mechatronics and Automation (Luoyang China Jun. 2006) 1131 - 1135