Combined Static and Dynamic Platform Calibration for an Aerial Multi-Camera System

  • Cui, Hong-Xia (College of Information Science and Technology, Bohai University) ;
  • Liu, Jia-Qi (College of Information Science and Technology, Bohai University) ;
  • Su, Guo-Zhong (Chinese Academy of Surveying and Mapping)
  • Received : 2015.12.10
  • Accepted : 2016.03.30
  • Published : 2016.06.30

Abstract

Multi-camera systems which integrate two or more low-cost digital cameras are adopted to achieve larger ground coverage and a better base-height ratio in low-altitude remote sensing. To guarantee accurate multi-camera integration, the geometric relationship among the cameras must be determined through platform calibration techniques. This paper proposes a combined two-step platform calibration method. In the first step, static platform calibration is conducted based on the stable relative orientation constraint and convergent conditions among cameras in a static environment. In the second step, a dynamic platform self-calibration approach based on not only tie points but also straight lines is applied in order to correct the small changes of the relative relationship among cameras during dynamic flight. Experiments with the proposed two-step platform calibration method were carried out with terrestrial and aerial images from a multi-camera system that combines four consumer-grade digital cameras onboard an unmanned aerial vehicle. The experimental results show that the proposed platform calibration approach is able to compensate for the varied relative relationship during flight, achieving a mosaicking accuracy of the virtual images better than 0.5 pixel. The proposed approach can be extended to calibrate other low-cost multi-camera systems without a rigorously stable mechanical structure.

Keywords

1. Introduction

Multi-camera systems are attractive alternatives to single frame camera systems in many applications. The multiple cameras can be integrated with or without overlapping views. Non-overlapping multi-camera systems can be found in vision-based applications [1][2]. In the photogrammetric industry, multi-camera systems have been developed with overlapping views. Professional aerial multi-camera systems have been commercially available for almost twenty years [3][4][5][6][7]. In addition, multi-camera systems combining small- and medium-format digital cameras have also been developed and are being used in 3D city reconstruction and mapping projects from both manned and unmanned aerial vehicles [8-14][17]. In order to accurately generate virtual images from a multi-camera system, it is crucial to calibrate each camera and to perform platform geometric calibration. The task of platform geometric calibration is to estimate a set of relative orientation parameters (ROPs) with respect to a body frame or a reference camera. Previous works on platform calibration for stereo or multi-head camera systems can be classified into two strategies. One strategy is an integrated or two-step static calibration process based on a calibration field in a laboratory or on ground control points (GCPs), assuming that the physical relative orientation is fixed and static. In contrast, the in-flight dynamic platform calibration strategy is adopted considering that no multi-head camera system remains perfectly constant during flight. The rigorous in-flight platform calibration strategy requires direct measurement of the optical center coordinates of each camera and indirect estimation of the mounting angles with tie points extracted from the sub-images acquired during flight using a bundle block adjustment. The limitation of this method is that it requires specialized laboratory facilities that are not easily accessible for low-cost multi-camera systems. Moreover, the calibration requires enough tie points and cannot be determined accurately when the overlapping image areas have low texture.

For a multi-camera system with a low-cost and low-weight structure onboard a UAV, the change of ROPs will be even larger than that of a professional multi-camera system and may vary at different instants during dynamic flight in low-altitude environments. To cope with the aforementioned challenges in platform calibration, the current study proposes a novel two-step calibration approach well suited to the low-cost multi-camera system. In the first step, in order to avoid direct measurement with specialized laboratory facilities, a static platform calibration process was developed to determine reliable relative orientation parameters among cameras based on a calibration field in a static environment. In this step, the conventional one-step approach for two-camera systems was extended to multiple cameras and improved by the constrained relative orientation conditions and the inherent convergent conditions of the multi-camera system.

In the second step, unlike the commonly used in-flight platform calibration for professional multi-camera systems, a new dynamic platform self-calibration approach was proposed based on not only tie points but also straight lines, in order to acquire accurate results even when not enough corresponding points are available.

The rest of the paper is organized as follows. Section 2 discusses the related work. Section 3 gives the underlying mathematical model of camera calibration. Section 4 describes the new calibration approach based on static and dynamic calibration algorithms. In Section 5, we present experimental evaluation and finally in Section 6 we draw conclusions.

 

2. Related Work

During the past decades, multi-camera systems have been used more and more in computer vision, photogrammetry, and other vision-based applications. Jahan F., et al. [1] described a non-overlapping multi-camera intelligent surveillance system and proposed techniques for detecting, recognizing, and tracking certain objects in image sequences. Ragab M., et al. [2] proposed a non-overlapping multi-camera system for robot pose estimation. In contrast, in photogrammetric applications, the aim of multi-camera systems is to extend the image coverage. The fields of view of the cameras need to overlap so that the images from different cameras can be mosaicked together based on the geometric relationship among the cameras, as if photographed by a virtual wide-angle camera. Due to the payload and size restrictions of the Unmanned Aerial Vehicle (UAV), the multi-camera systems onboard UAVs are often integrated with small- and medium-format low-cost consumer-grade cameras [8-14][17]. Ritchie G., et al. [11] estimated crop reflectance with multispectral images from two combined low-cost digital cameras in vertical viewing. Tommaselli A., et al. [8][17] introduced a two-camera system on a UAV, including two digital cameras, and described the steps of platform calibration, image rectification, registration and fusion. Holtkamp D. and Goshtasby A. [12] adopted a system mounted on a UAV, consisting of an array of six vertical cameras to acquire images, and proposed an approach for registration and mosaicking of multiple images. Lin Z., et al. [13] proposed a low-altitude multi-camera system (LAC-04) onboard a UAV for mapping projects. Our previous studies on the techniques for LAC-04 system calibration and metric mapping applications can also be found in [14][15]. A careful platform calibration must be carried out in order to guarantee accurate multi-camera integration. In this case, not only the interior orientation parameters (IOPs) but also the relative orientation parameters (ROPs) of each camera relative to a body frame or a reference camera are of interest [16].

Assuming that the geometric relative orientation relationships among cameras are invariant during surveying projects, the platform calibration procedure is defined as static platform calibration in this paper. In static platform calibration for metric applications, the ROPs among cameras can be determined using either a two-step or a single-step approach. The two-step procedure first determines the cameras' exterior orientation parameters (EOPs) through a conventional indirect calibration procedure and then derives the ROPs by comparing the EOPs of different cameras. Although this procedure is easy to implement, the accuracy of the EOPs may vary with the imaging configuration and with the number and distribution of control points. The one-step procedure incorporates the relative orientation constraints between the slave camera and the body frame/reference camera into the bundle adjustment. Several previous papers on multi-camera system calibration considered the use of relative orientation constraints [8][9][17][18]. Tommaselli A., et al. [8][17] presented an approach for stereo-camera calibration by introducing relative orientation constraints into the bundle adjustment. Lee Y., et al. [9] adopted the single-step procedure for in-flight platform calibration, assuming that the ROPs were constant during flight. Habib A., et al. [18] adopted a similar single-step procedure to determine the ROPs among multiple cameras in the bundle adjustment and presented an accuracy and stability analysis of the calibration parameters. The complexity of implementing the one-step calibration approach increases with the number of cameras in the system and the number of observation epochs. On the other hand, the constraint conditions used in the adjustment also reduce possible high correlations between the ROPs of the cameras. The drawback of previous static platform calibration techniques, with or without constraint conditions, is the assumption that the physical relative relationships among cameras remain invariant during flight.

Another strategy, the rigorous determination of the ROPs aiming at virtual image generation from multiple images, has been used for professional aerial multi-camera systems. Spiller R. and Hinz A. [5] estimated the three rotation angles of each of the four panchromatic cameras of Z/I Imaging's DMC (Digital Modular Camera) using a bundle adjustment with tie points in the small overlapping area between images acquired simultaneously; in this case, the small positional displacements can be neglected. In a similar fashion, Gruber M. and Walcher W. [6] estimated specific parameters for each CCD position in the focal plane of each camera. This combined measurement process for professional multi-camera systems avoids correlations among the unknowns and is reliable, but its limitation is the direct measurement of the perspective center coordinates of each camera, which relies on specialized laboratory facilities that are not easily accessible for low-cost multi-camera systems. Another drawback is that the reliability and accuracy of the determined ROPs depend highly on the number and distribution of tie points in the small overlapping images. Thus, previous works on the rigorous platform calibration of multi-camera systems required enough tie points in the overlapping sub-images.

The platform calibration of a multi-camera system mounted on a UAV is more complex and difficult than that of a professional multi-camera system onboard a large airplane with sufficient size and payload capacity. On the one hand, due to wind gusts and the aerodynamic characteristics of the small-size, limited-payload UAV platform, it is difficult to keep the UAV relatively steady in low-altitude environments, and vibrations of high and medium frequency often exist [19]. On the other hand, due to the limitations of size and payload, the multi-camera systems onboard UAVs are designed with low-weight structures. These two problems are the major differences between a low-weight, low-cost multi-camera system onboard a UAV in low-altitude environments and professional multi-camera systems on other flight platforms. Therefore, the relative movements among the cameras of a low-cost multi-camera system onboard a UAV may be larger than those of a professional multi-camera system, and the varying ROPs cannot be neglected during a data collection campaign for mapping or remote sensing projects in low-altitude environments.

In order to calibrate the low-cost multi-camera system rigorously, a novel two-step platform calibration approach was proposed in the current study. Firstly, a static platform calibration step was performed using an extended bundle adjustment. The aim of this step is to reliably determine all positional and angular parameters with an ordinary terrestrial calibration field, avoiding specialized direct measurements. Secondly, a new dynamic platform self-calibration approach was proposed based on both tie points and straight lines. The main advantage of this step is that the varied angular parameters among cameras can be determined even when not enough corresponding points are available, avoiding failure of the dynamic platform calibration.

 

3. Camera Calibration Model

3.1 Conventional geometry of photographs

The collinearity equations represent the geometry of photographs, modeling the camera perspective center, an object point, and its photo image as lying on a single straight line. The model can be extended by a set of Additional Parameters (APs) (see equation (1)).
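A standard form of the extended collinearity equations, consistent with the symbols described below, is:

x_{i,k} = x_{i0} - f_i \dfrac{r_{11}(X_k - X_{Si}) + r_{12}(Y_k - Y_{Si}) + r_{13}(Z_k - Z_{Si})}{r_{31}(X_k - X_{Si}) + r_{32}(Y_k - Y_{Si}) + r_{33}(Z_k - Z_{Si})} + \Delta x_{i,k}

y_{i,k} = y_{i0} - f_i \dfrac{r_{21}(X_k - X_{Si}) + r_{22}(Y_k - Y_{Si}) + r_{23}(Z_k - Z_{Si})}{r_{31}(X_k - X_{Si}) + r_{32}(Y_k - Y_{Si}) + r_{33}(Z_k - Z_{Si})} + \Delta y_{i,k}    (1)

where r_{11}, ..., r_{33} denote the elements of R_i^{T}, the rotation from object space to the image space of camera "i".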

Where xi,k, yi,k are the image coordinates of the k-th (k = 1, 2, 3, ...) point in an image from camera "i" (i = 1, 2, 3, 4, ..., n); Xk, Yk, Zk the coordinates of the same point in object space; Ri the rotation matrix; XSi, YSi, ZSi the coordinates of the camera perspective center; xi0, yi0 the principal point coordinates; fi the camera focal length; and Δxi,k, Δyi,k the systematic error compensation caused by distortions. The APs defined in Brown D. [20] consist of 29 parameters, but only 10 of them are used for digital cameras [21][22][23], which can be represented as follows.
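The distortion terms presumably take the commonly used physical form [20][21], consistent with the coefficients described below:

\Delta x_{i,k} = \bar{x}\,(k_{i,1} r_{i,k}^{2} + k_{i,2} r_{i,k}^{4} + k_{i,3} r_{i,k}^{6}) + p_{i,1}(r_{i,k}^{2} + 2\bar{x}^{2}) + 2 p_{i,2}\,\bar{x}\bar{y} + b_{i,1}\bar{x} + b_{i,2}\bar{y}

\Delta y_{i,k} = \bar{y}\,(k_{i,1} r_{i,k}^{2} + k_{i,2} r_{i,k}^{4} + k_{i,3} r_{i,k}^{6}) + p_{i,2}(r_{i,k}^{2} + 2\bar{y}^{2}) + 2 p_{i,1}\,\bar{x}\bar{y}    (2)

with \bar{x} = x_{i,k} - x_{i0}, \bar{y} = y_{i,k} - y_{i0} and r_{i,k}^{2} = \bar{x}^{2} + \bar{y}^{2}.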

Where the terms ki,1, ki,2, ki,3 represent the coefficients of radial distortion and ri,k the radial distance; pi,1, pi,2 are the coefficients of decentring distortion. The scale parameter bi,1 models non-square pixel size and bi,2 compensates for non-orthogonality in the pixel array. The exterior orientation parameters (EOPs), interior orientation parameters (IOPs) and object coordinates of the photogrammetric points are simultaneously estimated from image point observations through bundle adjustment [20] based on the linearization of equation (1).

3.2 Modified geometry of photographs

Without loss of generality, let us assume that there is a two-head camera system which consists of camera "1" and camera "i" (i = 2, 3, 4, ..., n). Camera "1" is used as the master camera, which is the reference for estimating the platform calibration parameters, while camera "i" is used as a slave camera. The relative orientation parameters between the two cameras are defined relative to the master camera "1" and consist of three positional/baseline parameters and three angular/rotational parameters. The EOPs of the slave camera relative to the object space coordinate system can be derived from the ROPs between the two cameras and the EOPs of the master camera using equations (3) and (4).
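A reconstruction of this composition, consistent with the symbol description below (with R1 and Ri taken as the image-to-object rotation matrices and bix, biy, biz as the baseline components of camera "i" in the master camera frame), is:

R_i = R_1\,\Delta R_i    (3)

\begin{bmatrix} X_{Si} \\ Y_{Si} \\ Z_{Si} \end{bmatrix} = \begin{bmatrix} X_{S1} \\ Y_{S1} \\ Z_{S1} \end{bmatrix} + R_1 \begin{bmatrix} b_{ix} \\ b_{iy} \\ b_{iz} \end{bmatrix}    (4)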

Where (XS1, YS1, ZS1) and (XSi, YSi, ZSi) are the object space coordinate vectors of the exposure stations of camera "1" and camera "i"; R1 and Ri are the rotation matrices of the two cameras; and ΔRi is the relative rotation matrix between the two cameras. Then, the collinearity equations for the k-th (k = 1, 2, 3, ...) point in a photograph acquired from the slave camera "i" can be expressed in terms of the EOPs of the photograph acquired simultaneously from the master camera as follows:
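One plausible form, obtained by substituting (3) and (4) into the collinearity equations, is:

\begin{bmatrix} u \\ v \\ w \end{bmatrix} = (R_1\,\Delta R_i)^{T}\left(\begin{bmatrix} X_k \\ Y_k \\ Z_k \end{bmatrix} - \begin{bmatrix} X_{S1} \\ Y_{S1} \\ Z_{S1} \end{bmatrix} - R_1 \begin{bmatrix} b_{ix} \\ b_{iy} \\ b_{iz} \end{bmatrix}\right),\qquad x_{i,k} = x_{i0} - f_i\,\frac{u}{w} + \Delta x_{i,k},\quad y_{i,k} = y_{i0} - f_i\,\frac{v}{w} + \Delta y_{i,k}    (5)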

The modified bundle adjustment model can be derived by the linearization of equation (5) to estimate the camera geometric platform calibration parameters. However, it should be noted that these constraints only hold in a stationary environment, owing to the unintended variations caused by the structural instability of a low-cost multi-head camera system. Thus, we include these constraints, together with further constraints related to the arrangement of the multiple cameras, in the bundle adjustment of the following static system calibration step.

 

4. Two-Step Platform Calibration for a Multi-Camera System

When the system consists of several cameras, correct platform calibration is essential for generating precise synthetic images. This paper introduced a multi-camera system developed for low-altitude UAVs. Moreover, this paper proposed a novel two-step platform calibration approach well suited to the multi-head camera system. First of all, the static platform calibration was conducted to determine the initial ROPs between cameras based on an extended bundle adjustment approach. To further detect small deformations of the ROPs during dynamic flight, a new dynamic platform self-calibration approach was proposed based on both tie points and straight lines in the small overlapping sub-images.

4.1 System structure of LAC-04 Camera System

The LAC-04 camera system introduced in this paper combines four Canon 5D Mark cameras positioned in a convergent way, as shown in Fig. 1. All four cameras have the same specification, and the technical data are given in Table 1. To solve the problem of synchronous exposure of the single cameras, a shutter control system was developed to release the four cameras with a synchronization precision better than 0.8 ms. Generally, the UAV flies at low altitude with a velocity of less than 60 km/h, which ensures that all four images can be assumed to have been taken at the same time. As a low-weight mechanical structure may deform during flight, the camera mount of LAC-04 was designed to allow for angular deformations.

Fig. 1. Geometry structure of the LAC-04 system

Table 1. Technical data of a single camera

As shown in Fig. 1, the four cameras 1, 2, 3 and 4 are arranged in convergent directions. All platform calibration parameters are represented in the coordinate system of camera "1". The fields of view of every two adjacent cameras overlap. The purpose of this geometric structure design is to enable self-calibration. Based on the overlap, by means of tie points (Fig. 2), the individual images can be merged into a homogeneous virtual image. The virtual image has 11750 pixels across track and 5504 pixels along track. The field of view of the combined camera system is 124° × 90°, compared with 72° × 52° for a single camera.

Fig. 2. Configuration of the individual images

4.2 Static platform calibration

In our framework, we extend the one-step platform calibration approach between two cameras to four cameras. Based on the overlapping relationship among the cameras, camera "1" is adopted as the master camera and camera "j" (j = 2, 3, 4) as a slave camera. The basic mathematical models for the one-step platform calibration are the collinearity equations (1) for the master camera and the modified collinearity equations (5) for a slave camera. Therefore, all observation equations can be expressed in the following matrix form through the linearization of equations (1) and (5).
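One plausible arrangement of the linearized observation equations, using the correction vectors and design matrices described below, is:

V_1 = A_1 X_p + B_1 X_r + C_1 X_c - L_1    (6)

V_j = A_j X_p + B_j X_r + C_j X_c - L_j    (7)

(in this arrangement the block B_1 of the master images vanishes, since their observations do not depend on the ROPs).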

Where Xp is the correction vector for the initial approximations of the EOPs of an image taken from the master camera "1", and Xr contains the corrections for the initial approximations of the set of ROPs of the slave cameras relative to the master camera.

Xc is the correction vector for the initial values of the object space coordinates of the points; A1, Aj, B1, Bj, C1 and Cj are the design matrices of the corresponding correction vectors; V1 and Vj are the correction vectors of the observations, and L1 and Lj the constant terms. Besides, the following constraint conditions can be derived from the convergent angular design of LAC-04.

Then, the constraint observation equations can be derived after the linearization of equation (8) in the following matrix form:
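Consistent with the symbols described below, the linearized constraint observations read:

V_5 = D\,X_r - L_5    (9)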

Where D is the design matrix of the correction vector of the ROPs, V5 the correction vector of the corresponding observations, and L5 the constant term.

Therefore, the extended bundle block adjustment model with constraints is developed from equations (6), (7), (8) and (9). If the number of images acquired from each camera is k, the total number of photographs from all cameras is 4k; if the number of unknown object points is p, the total number of unknowns to be estimated is 6×3 + (k×6) + (p×3), which is smaller than the 6×4k + (p×3) of the traditional bundle adjustment (for instance, with k = 16 and p = 100, 414 unknowns instead of 684). Then, the solution of the extended bundle adjustment can be obtained by solving the normal equations as follows:
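Collecting all observation and constraint equations (6), (7) and (9), each group i with design matrix A_i (with respect to the full unknown vector X), weight matrix P_i and constant term L_i, the least-squares solution takes the usual normal-equation form:

X = \left(\sum_i A_i^{T} P_i A_i\right)^{-1} \left(\sum_i A_i^{T} P_i L_i\right)    (10)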

Where X = [Xp Xc Xr]T is the vector of corrections to the unknown parameters, Li the constant terms calculated from the approximate values of the unknowns, and Pi the weight matrices of the observation equations. Thus, the ROPs (18 relative orientation parameters, 6 for each slave camera) of all slave cameras relative to the master camera, the exterior orientation parameters of all images from the master camera, and the coordinates of all object points are estimated simultaneously in the combined bundle adjustment model through an iterative process.
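As a minimal numerical sketch of this constrained solve (NumPy; the function name and the tuple structure of the pre-built design matrices, weights and constant terms are illustrative assumptions, not the authors' implementation):

import numpy as np

def solve_extended_bundle(blocks):
    """Solve the extended bundle adjustment normal equations (10).

    Each element of `blocks` is a tuple (A, P, L): the design matrix,
    weight matrix and constant term of one group of linearized equations
    (master-image observations, slave-image observations, and the ROP
    constraint equations (9)). Returns the correction vector X = [Xp, Xc, Xr].
    """
    n = blocks[0][0].shape[1]          # total number of unknowns
    N = np.zeros((n, n))               # accumulated normal matrix
    t = np.zeros(n)                    # accumulated right-hand side
    for A, P, L in blocks:
        N += A.T @ P @ A
        t += A.T @ P @ L
    return np.linalg.solve(N, t)

# In practice this solve is iterated: after each step the approximate values
# of the EOPs, ROPs and object coordinates are updated and re-linearized.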

4.3 Dynamic platform calibration

As described in Section 1, a central point of this paper is the in-flight platform calibration of the low-cost multi-head camera system LAC-04. As seen in Fig. 3, we assume a virtual space coordinate system S-XYZ whose origin is at the optical center of the virtual camera. The X and Y axes point to the right and to the top relative to the camera, respectively, and the Z axis is perpendicular to the object plane. In this section, the 3D coordinates are defined in this virtual space coordinate system. The focal length of the virtual camera is defined as the average value of those of the real cameras.

Fig. 3. Tie points in the real image projected to the virtual image

The coordinates of an image point k (k = 1, 2, ..., n) in the real image from camera "i" (i = 1, 2, 3, 4) can be projected to the virtual image as follows:
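A plausible form of this projection, consistent with the quantities described below, first rotates the image ray into the S-XYZ frame, scales it to the ground, and reprojects it through the virtual camera (fv denotes the virtual focal length):

\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = \begin{bmatrix} X_{Si} \\ Y_{Si} \\ Z_{Si} \end{bmatrix} + \lambda_{i,k}\, R_i \begin{bmatrix} x_{i,k} \\ y_{i,k} \\ -f_i \end{bmatrix},\qquad x_{v,k} = -f_v\,\frac{X}{Z},\quad y_{v,k} = -f_v\,\frac{Y}{Z}    (11)

where λi,k scales the ray so that Z = −H, i.e. the ray intersects the ground plane.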

Where (xi,k, yi,k) and (xv,k, yv,k) are the distortion-free positions of the real image point and of the corresponding tie point in the virtual image, respectively; λi,k is the scale factor and fi the principal distance of the real camera; H is the flying height relative to the ground; (XSi, YSi, ZSi) are the coordinates of the projection center of camera "i" in the virtual space coordinate system; and Ri is the rotation matrix, whose initial values have been determined by the static platform calibration approach. Moreover, the EOPs of a slave camera relative to the S-XYZ coordinate system can be expressed as follows.
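Consistent with the six ROPs described below, a reconstruction of this relation is:

R_i = R_1\,\Delta R_i(\varphi_i, \omega_i, \kappa_i),\qquad \begin{bmatrix} X_{Si} \\ Y_{Si} \\ Z_{Si} \end{bmatrix} = \begin{bmatrix} X_{S1} \\ Y_{S1} \\ Z_{S1} \end{bmatrix} + R_1 \begin{bmatrix} b_{ix} \\ b_{iy} \\ b_{iz} \end{bmatrix}    (12)

where R_1 is the rotation of the master camera relative to S-XYZ.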

Where the three positional parameters and three rotation angles are the six ROPs (relative orientation parameters) from the slave image frame to the master image frame. Further, provided that both the IOPs and the ROPs are calibrated successfully and remain unchanged during flight, the x-coordinates and y-coordinates in the virtual image space projected from conjugate points in the real image frames of the four cameras at the same epoch would be equal (see equation (13)). In fact, significant changes occur due to variations of the ROPs of the multi-camera system during flight.
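With superscripts (i) and (j) denoting the camera from which a conjugate point is projected via equation (11), these conditions and the corresponding discrepancies can be written as:

x_{v,k}^{(i)} = x_{v,k}^{(j)},\qquad y_{v,k}^{(i)} = y_{v,k}^{(j)}    (13)

\Delta x_k = x_{v,k}^{(i)} - x_{v,k}^{(j)},\qquad \Delta y_k = y_{v,k}^{(i)} - y_{v,k}^{(j)}    (14)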

Where i and j are the indices of two adjacent cameras with overlap (e.g. 1 and 3); Δxk is the change of the parallax in the x-direction and Δyk in the y-direction. In the current system the heads are tightly attached to an external platform, there are only tiny exposure lags among the cameras, and the UAV moves at low velocity. Thus, it can be assumed that the change of the relative spatial position of each camera is usually tiny (sub-millimeter level) and can be ignored. However, the tiny angular movements cannot be neglected and need to be determined at each exposure epoch. In our framework, the error equations for the dynamic platform calibration can be built based on the linearization of equation (14).

Where the coefficient matrices contain two sets of partial derivatives, one with respect to the angular parameters of camera "i" and one with respect to those of camera "j".

Where Vx,k and Vy,k are the correction vectors of the observations, and the unknown vector contains the corrections to the angular exterior parameters of camera "i" and camera "j". Based on the tie points in the overlapping images from every two cameras (Fig. 3), the unknown angular orientation parameters can be obtained by solving equations (15) and (16).

However, investigations have shown that it is difficult to acquire accurate calibration results from a small image patch in the overlapping area without enough well-distributed points, according to equations (14) and (15) or other similarly modified bundle adjustments. In the current study, linear features are therefore employed and incorporated into the mathematical adjustment model, so that the redundancy is increased and the geometric stability is enhanced.

As shown in Fig. 4, two corresponding image lines p1p2 and p1'p2' are projected on the image pair. Let πi denote the plane defined by the rays Sip1 and Sip2, and πj the plane defined by the rays Sjp1' and Sjp2'; the direction vectors of these rays can be formed from the image coordinates of the line endpoints. Planes πi and πj intersect at the object straight line L. Assuming that the normal vectors of πi and πj are represented as (αi, βi, γi)T and (αj, βj, γj)T respectively, the two normal vectors can be derived according to the following two equations.

Fig. 4. Corresponding image lines on an image pair
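Using the ray vectors rotated into the S-XYZ frame, the two normal vectors can be obtained as cross products (a reconstruction consistent with the notation of equations (19)-(21)):

\begin{bmatrix} \alpha_i \\ \beta_i \\ \gamma_i \end{bmatrix} = \left(R_i \begin{bmatrix} x_{1} \\ y_{1} \\ -f_i \end{bmatrix}\right) \times \left(R_i \begin{bmatrix} x_{2} \\ y_{2} \\ -f_i \end{bmatrix}\right)    (17)

\begin{bmatrix} \alpha_j \\ \beta_j \\ \gamma_j \end{bmatrix} = \left(R_j \begin{bmatrix} x_{1}' \\ y_{1}' \\ -f_j \end{bmatrix}\right) \times \left(R_j \begin{bmatrix} x_{2}' \\ y_{2}' \\ -f_j \end{bmatrix}\right)    (18)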

Where (x1, y1) and (x2, y2) are the distortion-free image coordinates of p1 and p2 from camera "i", and (x1', y1') and (x2', y2') those of p1' and p2' from camera "j".

Moreover, the direction vector (βiγj − βjγi, αjγi − αiγj, αiβj − αjβi)T of the object line L can be computed from the intersection of the two planes. If line L is a horizontal line, the following equation holds:
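Since the Z axis of S-XYZ is perpendicular to the object plane, a horizontal object line has a zero Z-component in its direction vector; in the notation above this presumably reads:

F_1 = \alpha_i \beta_j - \alpha_j \beta_i = 0    (19)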

If line L is a vertical line, the following equations hold:
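Similarly, a vertical object line has no horizontal direction components, so that:

F_2 = \beta_i \gamma_j - \beta_j \gamma_i = 0    (20)

F_3 = \alpha_j \gamma_i - \alpha_i \gamma_j = 0    (21)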

Equations (19)-(21) can be linearized by a Taylor series expansion, so that three equations of the following form are obtained:
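A generic reconstruction of these error equations in the angular corrections of the two cameras (the explicit partial derivatives are omitted here) is:

v_m = \frac{\partial F_m}{\partial \varphi_i}\delta\varphi_i + \frac{\partial F_m}{\partial \omega_i}\delta\omega_i + \frac{\partial F_m}{\partial \kappa_i}\delta\kappa_i + \frac{\partial F_m}{\partial \varphi_j}\delta\varphi_j + \frac{\partial F_m}{\partial \omega_j}\delta\omega_j + \frac{\partial F_m}{\partial \kappa_j}\delta\kappa_j - F_m^{0},\qquad m = 1, 2, 3    (22)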

Where the coefficients are the partial derivatives of Fm with respect to the angular orientation parameters of camera "i"; the partial derivatives with respect to the parameters of camera "j" are derived in the same fashion.

Where Fm0 is the approximate value of Fm, which can be calculated from equations (19) to (21) using the approximate values of the angular orientation parameters. Therefore, for each pair of overlapping images, the combined adjustment model of the platform self-calibration can be expressed in the following matrix form:
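Consistent with the symbols defined below, this combined system can be written as:

V_{x,y} = A_1 X - L_{x,y}\ \ (\text{weight } P_{x,y}),\qquad V_E = A_2 X - L_E\ \ (\text{weight } P_l)    (24)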

Where Vx,y and VE are the correction vectors of the observations based on corresponding points and lines, respectively; X the correction vector of the angular parameters with respect to the virtual image; Lx,y and LE the constant terms calculated from the approximate values of the unknowns; A1 and A2 the design matrices of the correction vector; and Px,y and Pl the weight matrices of the observation equations. The observations should be weighted carefully, since the approximate weights of the observations may influence the calibration results. Then, the solution of the adjustment can be obtained by solving the normal equations as follows:
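In normal-equation form, the solution reads:

X = \left(A_1^{T} P_{x,y} A_1 + A_2^{T} P_l A_2\right)^{-1}\left(A_1^{T} P_{x,y} L_{x,y} + A_2^{T} P_l L_E\right)    (25)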

To evaluate the performance of the dynamic platform calibration, the root mean square (RMS) residuals of tie points whose image coordinates are known but not used in the adjustment were examined. The RMS residuals can be calculated from equation (26).
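With dxk and dyk denoting the coordinate discrepancies of the n check points, the RMS values are computed in the usual way:

RMS_x = \sqrt{\frac{1}{n}\sum_{k=1}^{n} (dx_k)^{2}},\qquad RMS_y = \sqrt{\frac{1}{n}\sum_{k=1}^{n} (dy_k)^{2}}    (26)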

Where dxk and dyk are the discrepancies between the x-coordinates and the y-coordinates of the check tie points in the virtual image as generated from image i and from image j.
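For illustration, the check computation of equation (26) can be sketched as follows (NumPy; the function name is an assumption, and the projected virtual-image coordinates of the check tie points are assumed to be given as n×2 arrays):

import numpy as np

def mosaicking_rms(xy_from_i, xy_from_j):
    """RMS discrepancies (equation (26)) between the virtual-image coordinates
    of check tie points projected from sub-image i and from sub-image j.
    Returns (RMS_x, RMS_y) in pixels."""
    d = np.asarray(xy_from_i, dtype=float) - np.asarray(xy_from_j, dtype=float)
    return tuple(np.sqrt(np.mean(d ** 2, axis=0)))

# Example with two hypothetical check points (pixel coordinates):
# rms_x, rms_y = mosaicking_rms([[101.2, 55.4], [87.6, 60.1]],
#                               [[101.6, 55.2], [87.3, 60.5]])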

 

5. Experiments and Discussions

The multi-camera system LAC-04 was pre-calibrated with the static platform calibration approach using the outdoor calibration field before the image collection tasks. The outdoor calibration field and the distribution of control points are shown in Fig. 5 and Fig. 6. A total of 64 images (16 images from each camera) were captured. The images acquired at the same instant are shown in Fig. 7. More than 30 control points were used in each image pair. In all experiments, camera "1" is taken as the master camera (i.e., the ROPs of each slave camera refer to the position and orientation parameters with respect to camera "1"). The results of the static platform calibration are shown in Table 2. To evaluate the calibration accuracy, image pairs were rectified using the IOPs and ROPs estimated with the terrestrial static calibration, and the RMSE of tie points located in the overlap area was then computed. The average RMSE for check points was 0.85 pixels.

Fig. 5. Outdoor calibration field

Fig. 6. Distribution of control points

Fig. 7. Images of the calibration field acquired at the same instant

Table 2. ROPs for LAC-04 from static platform calibration

Moreover, the correctness and performance of the proposed dynamic platform self-calibration approach were tested with 14 groups of digital low-altitude images (four images in each group) from the LAC-04 camera system. The test data, containing 60 corresponding points (40 taken as check points and 20 as control points), four corresponding horizontal lines and two corresponding vertical lines, were obtained from four images taken at the same instant. The distribution of the corresponding points and straight lines in the four images taken at the same instant is shown in Fig. 8.

Fig. 8. Distribution of tie points and straight lines in the sub-images

The virtual image was then generated from the four sub-images based on the two-step platform calibration results (Fig. 9). The experimental results in Table 3 show that, compared to the platform calibration approach using 40 corresponding points only, the proposed two-step approach using four corresponding horizontal lines, two corresponding vertical lines and 40 tie points not only ensures valid calibration results, but also slightly improves the calibration accuracy. Table 3 also shows that the proposed approach based on both corresponding straight lines and corresponding points can obtain reasonable relative orientation results even when not enough corresponding points are available and point-based platform self-calibration cannot be carried out.

Fig. 9. Virtual image from the four sub-images

Table 3. Platform calibration results from the second dataset

Moreover, comparisons of the calibrated ROPs (relative orientation parameters) from the 14 groups of images are shown in Fig. 10, Fig. 11 and Fig. 12, respectively. Considering the calibration elements based on 40 corresponding points to be reliable and accurate, let the first set of discrepancies (i = 1, 2, 3, 4) represent the differences between those accurate elements and the elements based on 20 corresponding points, four horizontal lines and two vertical lines, and the second set (i = 2, 3, 4) the differences between the accurate elements and the elements based on only 20 corresponding points. It is clear that the discrepancies of the point-only solution are larger than those of the combined point-and-line solution. The geometric relationship between cameras may be seriously distorted when the dynamic platform calibration relies on too few corresponding points. However, the discrepancies of the combined solution are very small, which shows that the proposed platform calibration approach based on points and straight lines can acquire accurate platform calibration results even when not enough corresponding points are available and the conventional point-based calibration method cannot be applied reliably.

Fig. 10. Deviation from accurate angular element Phi

Fig. 11. Deviation from accurate angular element Omega

Fig. 12. Deviation from accurate angular element Kappa

Finally, from the 14 groups of images, two sets of RMSE for check points were computed: one using only the static platform calibration and one using the two-step platform calibration. It can be seen from Fig. 13 that the RMSEs decrease with the two-step platform calibration. This decrease is probably due to the varied relative orientation parameters being corrected by combining the static calibration and the dynamic calibration. It is therefore feasible to acquire satisfactory mosaicked images with sub-pixel accuracy based on the two-step platform calibration approach, even though the geometric relationship among cameras may be unstable for a low-cost, low-weight multi-camera system onboard a UAV. As the above experiments show, in order to acquire virtual images of high accuracy for mapping projects, a static platform calibration step similar to the platform calibration methods in references [8][17][18] should be combined with a dynamic platform calibration step. In contrast to the dynamic platform calibration approaches in references [3][4][5][6], the static ROPs are reliably determined with an ordinary terrestrial calibration field, avoiding specialized direct measurements, and the varied angular parameters among cameras can be determined based on tie points and lines.

Fig. 13. Root mean square of the residuals of check points computed from the two sets of ROPs

 

6. Conclusion

The use of a multi-camera system is an effective alternative for extending ground coverage and the base-height ratio in low-altitude remote sensing. Such systems may suffer from varying relative movement among cameras during dynamic flight. This paper gave a brief overview of previous multi-camera systems and platform calibration methods. Moreover, the multi-camera system LAC-04, which combines four consumer-grade digital cameras, was introduced, and a new two-step approach combining static and dynamic platform calibration was proposed. Firstly, an extended bundle adjustment aided by stable relative orientation constraints and convergent angular conditions among cameras was adopted, assuming that the relative relationship between cameras remains stable in the static terrestrial calibration environment. The aim of this step is to reliably determine all positional and angular parameters with an ordinary terrestrial calibration field, avoiding specialized direct measurements. In the second step, a new approach based on points and straight lines was proposed to determine the varied relative orientation parameters during flight. The aim of this step is to compensate for the small changes of relative orientation among cameras, avoiding failure when not enough corresponding points are available. Finally, experiments with the proposed combined platform calibration method were carried out with the LAC-04 multi-camera system onboard a UAV. The experimental results demonstrated satisfactory geometric platform calibration accuracy. Extending the approach to the platform calibration of other multi-camera systems is left as future work.

References

  1. Jahan F., Islam M. K. and Baek J., "Person Detection, Re-identification and Tracking Using Spatio-Color-based Model for Non-Overlapping Multi-Camera Surveillance Systems," Smart Computing Review, vol. 2, no. 1, pp. 42-59, 2012.
  2. Ragab M. and Elkabbany G., "A Parallel Implementation of Multiple Non-overlapping Cameras for Robot Pose Estimation," KSII Transactions on Internet and Information Systems, vol. 8, no. 11, pp. 4103-4117, 2014. https://doi.org/10.3837/tiis.2014.11.025
  3. Alamús R., Kornus W. and Talaya J., "Studies on DMC geometry," ISPRS Journal of Photogrammetry and Remote Sensing, vol. 60, no. 6, pp. 375-386, 2006. https://doi.org/10.1016/j.isprsjprs.2006.05.006
  4. Dhillon V. S., Marsh T. R., Stevenson M. J., et al., "ULTRACAM: An ultrafast, triple-beam CCD camera for high-speed astrophysics," Monthly Notices of the Royal Astronomical Society, vol. 378, pp. 825-840, 2007. https://doi.org/10.1111/j.1365-2966.2007.11881.x
  5. Spiller R. and Hinz A., "Z/I Imaging digital aerial camera system," in Proc. of International Symposium on Optical Science and Technology, pp. 190-199, 2000.
  6. Gruber M. and Walcher W., "Calibrating the new Ultracam Osprey oblique aerial sensor," in Proc. of International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, pp. 47-52, 2014.
  7. Rupnik E., Nex F., Toschi I., et al., "Aerial multi-camera systems: Accuracy and block triangulation issues," ISPRS Journal of Photogrammetry and Remote Sensing, vol. 101, pp. 233-246, 2015. https://doi.org/10.1016/j.isprsjprs.2014.12.020
  8. Tommaselli A., Galo M., Moraes M., Marcato J., et al., "Generating Virtual Images from Oblique Frames," Remote Sensing, vol. 5, no. 4, pp. 1875-1893, 2013. https://doi.org/10.3390/rs5041875
  9. Lee Y., Yilmaz A. and Mendoza-Schrock O., "In-flight camera platform geometric calibration of the aerial multi-head camera system," in Proc. of the IEEE National Aerospace and Electronics Conference (NAECON), pp. 136-139, 2010.
  10. Heng L., Lee G. and Pollefeys M., "Self-Calibration and Visual SLAM with a Multi-Camera System on a Micro Aerial Vehicle," Autonomous Robots, vol. 39, no. 3, pp. 259-277, 2014. https://doi.org/10.1007/s10514-015-9466-8
  11. Ritchie G., Sullivan D., Perry C., et al., "Preparation of a low-cost digital camera system for remote sensing," Applied Engineering in Agriculture, vol. 24, no. 6, pp. 885-896, 2008. https://doi.org/10.13031/2013.25359
  12. Holtkamp D. and Goshtasby A., "Precision registration and mosaicking of multicamera images," IEEE Transactions on Geoscience and Remote Sensing, vol. 47, pp. 3446-3455, 2009. https://doi.org/10.1109/TGRS.2009.2023114
  13. Lin Z., "UAV borne low altitude photogrammetry system," in Proc. of International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. XXXIX-B1, pp. 415-423, 2012.
  14. Cui H., Lin Z., Meng W., et al., "Relative Self-calibration of Large Frame Digital Camera," Optical Electrical Engineering, vol. 36, no. 6, pp. 81-85, 2009.
  15. Cui H., Lin Z., et al., "Multiview Photogrammetry Using Low Altitude Digital Images From Unmanned Airship," Optical Electrical Engineering, vol. 35, no. 7, pp. 73-78, 2008.
  16. Detchev I., Mazaheri M., Rondeel S. and Habib A., "Calibration of multi-camera photogrammetric systems," in Proc. of International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. XL-1, pp. 101-108, 2014.
  17. Tommaselli A., Galo M., Bazan W., et al., "Using Relative Orientation Constraints to Produce Virtual Images from Oblique Frames," in Proc. of International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. XXXIX-B1, pp. 61-66, 2012.
  18. Ayman H., Ivan D. and Eunju K., "Stability Analysis for a Multi-Camera Photogrammetric System," Sensors, vol. 14, no. 8, pp. 15084-15112, 2013. https://doi.org/10.3390/s140815084
  19. Taha Z., Tang Y. R. and Yap K. C., "Development of an onboard system for flight data collection of a small-scale UAV helicopter," Mechatronics, vol. 21, no. 1, pp. 132-144, 2011. https://doi.org/10.1016/j.mechatronics.2010.09.008
  20. Brown D., "The bundle adjustment - Progress and prospects," in Proc. of International Archives of Photogrammetry, vol. 21, pp. 29-33, 1976.
  21. Fraser C., "Digital camera self-calibration," ISPRS Journal of Photogrammetry and Remote Sensing, vol. 52, no. 4, pp. 149-159, 1997. https://doi.org/10.1016/S0924-2716(97)00005-1
  22. Habib A. and Morgan M., "Automatic calibration of low-cost digital cameras," Optical Engineering, vol. 42, pp. 948-95, 2003. https://doi.org/10.1117/1.1555732
  23. Jeong Y., Nistér D., Steedly W., et al., "Pushing the Envelope of Modern Methods for Bundle Adjustment," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 34, no. 8, pp. 1605-1617, 2012. https://doi.org/10.1109/TPAMI.2011.256