1. Introduction
The Global Positioning System (GPS) has been the dominant technology for car navigation systems, but it does not operate properly in urban canyons where its signals are blocked by buildings. In such environments the accuracy of vehicle positioning is significantly degraded (Skog and Handel, 2009), or positioning becomes impossible altogether. Over the last two decades, GPS has therefore been integrated with the Inertial Navigation System (INS) to improve the accuracy and continuity of GPS-based positioning (Godha and Cannon, 2007; Zhou et al., 2010; Leung et al., 2011). Even with INS/GPS integration, however, if GPS signals are blocked for a long time, the accuracy and reliability of positioning degrade as the INS navigation solution diverges. Therefore, additional sensors attached to the vehicle, including cameras, have been considered to provide supplementary position and attitude information.
Vehicle sensors such as the Steering Angle Sensor (SAS) and the odometer provide additional position or attitude information that can correct the INS navigation solution, and a large body of research has developed vehicle positioning based on integrating INS, GPS, and vehicle sensors (Yang and Farrell, 2003; Gao et al., 2008; Georgy et al., 2011; Jo et al., 2012). However, positioning based on integrating vehicle sensors with INS/GPS still shows errors of up to several tens of meters because these sensors provide data that are too inaccurate and too sparse to correct the INS navigation solution.
Image-based positioning techniques, such as visual odometry, vision systems, and Single Photo Resection (SPR), can determine position and attitude more accurately than the other vehicle sensors. As image processing techniques have matured, image-based positioning has increasingly been applied to vehicle positioning; the video black box is a widely used application that has followed these developments. Nister et al. (2004) developed visual odometry, which estimates the relative position and attitude between two consecutive frames using conjugate points, and applied it to vehicle positioning; however, its positioning error grows as the number of image frames and the driving distance increase. Vu et al. (2012) calculated the relative distance between the car and a known point using a vision system and corrected the error in the INS solution. Park (2013) estimated the relative attitude of the car from the displacement of the vanishing point detected in the image, but since vanishing-point-based positioning only provides relative attitude, it is not sufficient to correct the errors of the INS solution. Han et al. (2014) developed a vehicle positioning algorithm based on a loosely coupled integration of GPS, INS, and SPR, which determines the position and attitude of the camera from the coordinates of Ground Control Points (GCPs) and image points, and verified its performance with respect to the geometric constellation of GPS satellites and GCPs on the image. Their analysis showed that the positioning accuracy of the INS/GPS/SPR integration was better than 1 m and insensitive to the GPS signal environment as long as the SPR worked properly; however, the various factors affecting SPR performance were not considered.
In this paper, we implement the INS/GPS/SPR integration algorithm, in which the INS is combined with the GPS in a loosely coupled approach and with the SPR in a tightly coupled one, and we analyze the performance of the INS/SPR integration with consideration of the factors affecting the SPR under GPS signal blockage. The target positioning accuracy of the INS/SPR integration is set to 2.5 m. Section 2 introduces the INS/GPS/SPR integration algorithm based on the Extended Kalman Filter (EKF). Section 3 describes the results of the performance analysis of the INS/SPR integration under GPS blockage, considering the magnitude of the GCP errors, the distance between GCPs in the driving direction, and the SPR processing interval. Section 4 summarizes the results and proposes future research.
2. INS/GPS/SPR Integration Algorithm
The INS/GPS/SPR integration algorithm can estimate stable and reliable positions and attitudes because it reduces the errors of the INS navigation solution using the GPS or the SPR. Fig. 1 shows a diagram of the INS/GPS/SPR integration algorithm. To estimate the position and attitude, we first solve the navigation equation with the acceleration and angular velocity obtained from the inertial sensors. The position and attitude are then updated by the EKF using the 3D position determined by GPS if more than four satellites are available, or using the SPR solution if enough GCPs are available to perform the SPR.
Fig. 1. Diagram of the INS/GPS/SPR integration algorithm
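For orientation, the per-epoch flow of Fig. 1 can be sketched as follows. This is a simplified illustration under our own naming; the helper callables (mechanize, update_with_gps, update_with_spr) are hypothetical placeholders, not the authors' implementation.

```python
def integration_step(state, imu_sample, gps_fix, gcps,
                     mechanize, update_with_gps, update_with_spr):
    """One epoch of the INS/GPS/SPR integration (simplified flow of Fig. 1).

    mechanize, update_with_gps, and update_with_spr are caller-supplied
    callables standing in for the INS mechanization and the two EKF
    measurement updates described in Sections 2.1 and 2.2.
    """
    state = mechanize(state, imu_sample)              # solve the navigation equation
    if gps_fix is not None and gps_fix["num_sats"] >= 4:
        state = update_with_gps(state, gps_fix)       # loosely coupled GPS position update
    elif gcps is not None and len(gcps) >= 3:         # enough GCPs for SPR (4 per image here)
        state = update_with_spr(state, gcps)          # tightly coupled SPR update
    return state
```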
An eighteen-state EKF is designed for the INS/GPS/SPR integration algorithm. The state vector is composed of the attitude errors, the velocity errors in the North-East-Down (NED) coordinate system, the position errors in the World Geodetic System 1984 (WGS84), the gyroscope and accelerometer biases, and the gyroscope scale factors. The process white noise consists of the gyroscope and accelerometer random walk.
where x_i is the navigation error state vector (x_i = [δα δβ δγ δV_n δV_e δV_d δφ δλ δh]^T), consisting of the attitude, velocity, and position errors; x_f is the sensor error state vector (x_f = [δψ_n δψ_e δψ_d δf_n δf_e δf_d δs_n δs_e δs_d]^T), consisting of the gyroscope biases, accelerometer biases, and gyroscope scale factors; F_11 and F_12 are the dynamic matrices of the navigation and system errors, respectively; and ω_i is the white noise vector (ω_i = [W_g W_g W_g W_a W_a W_a]^T).
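Eq. (1), referenced by this paragraph, is not reproduced in the text. A plausible reconstruction, assuming the standard continuous-time error-state form implied by the symbol definitions above (with the sensor errors modeled as random constants), would be:

```latex
\begin{bmatrix} \dot{x}_i \\ \dot{x}_f \end{bmatrix}
=
\begin{bmatrix} F_{11} & F_{12} \\ 0 & 0 \end{bmatrix}
\begin{bmatrix} x_i \\ x_f \end{bmatrix}
+
\begin{bmatrix} \omega_i \\ 0 \end{bmatrix}
```

In this assumed form the upper block propagates the navigation errors through F_11 and couples in the sensor errors through F_12, while the lower block holds the sensor errors constant between measurement updates.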
2.1 Compensating errors of INS navigation solution using GPS
Since the GPS-based navigation system mounted in the car does not generally provide raw data, a loosely coupled approach is applied for the INS/GPS integration. If GPS signals can be received from more than four satellites, the state vector is updated using the 3D position determined from GPS. Eq. (2) is the observation equation used to update the state vector with the GPS solution.
The GPS observation vector (z_GPS) consists of the differences between the latitude, longitude, and ellipsoidal height estimated by the INS and those determined by GPS in WGS84, as given in Eq. (3).
Eq. (4) shows the design matrix (H_GPS), which describes the relationship between the GPS observation vector and the state vector, and Eq. (5) is the corresponding variance-covariance matrix.
where 0 is the zero matrix, I is the identity matrix, and the diagonal entries of the variance-covariance matrix are the variances of the estimated latitude, longitude, and ellipsoidal height.
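Purely as an illustration of the loosely coupled update in Eqs. (2)-(5) (our own sketch, not the authors' code), and assuming the position-error states occupy the seventh to ninth elements of the 18-element error state:

```python
import numpy as np

def gps_update(x, P, z_gps, sigma):
    """EKF measurement update with a GPS position observation (Eqs. (2)-(5) style).

    x     : (18,) error state [attitude(3), velocity(3), position(3),
            gyro bias(3), accel bias(3), gyro scale factor(3)]
    P     : (18, 18) state covariance
    z_gps : (3,) observation vector built from the INS-GPS position differences
    sigma : (3,) standard deviations of the estimated latitude, longitude, height
    """
    H = np.zeros((3, 18))
    H[:, 6:9] = np.eye(3)                 # observation maps onto the position-error states
    R = np.diag(np.asarray(sigma) ** 2)   # variance-covariance matrix of the GPS position

    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ (z_gps - H @ x)           # updated error state
    P = (np.eye(18) - K @ H) @ P          # updated covariance
    return x, P
```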
2.2 Correcting the error of INS navigation solution using SPR
The integration of INS and SPR is developed with a tightly coupled approach, which is easier and more efficient in this case than a loosely coupled approach that generates a solution from each sensor independently and needs another filter to integrate them. The SPR is a photogrammetric technique that determines the position and attitude of the camera from the collinearity equation using the coordinates of object points and image points, the interior orientation parameters, and initial exterior orientation parameters. The collinearity equation is based on the condition that the perspective center, the image point, and the object point must lie on a straight line, and it is given in Eq. (6).
where (x_a, y_a) are the image coordinates of point A, (x_p, y_p) are the calibrated principal point coordinates, c is the focal length of the camera, r_11, r_12, r_13, ..., r_33 are the elements of the rotation matrix, (X_A, Y_A, Z_A) are the object-space coordinates of point A, and (X_0, Y_0, Z_0) are the coordinates of the perspective center.
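Eq. (6) itself does not appear in this text. With the notation just defined, and assuming the common convention in which the rotation matrix maps object-space into image-space coordinates, the collinearity equations take the standard form:

```latex
x_a = x_p
  - c\,\frac{r_{11}(X_A - X_0) + r_{12}(Y_A - Y_0) + r_{13}(Z_A - Z_0)}
            {r_{31}(X_A - X_0) + r_{32}(Y_A - Y_0) + r_{33}(Z_A - Z_0)},
\qquad
y_a = y_p
  - c\,\frac{r_{21}(X_A - X_0) + r_{22}(Y_A - Y_0) + r_{23}(Z_A - Z_0)}
            {r_{31}(X_A - X_0) + r_{32}(Y_A - Y_0) + r_{33}(Z_A - Z_0)}
```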
The observation equation using the image coordinates produced by the SPR processing is given in Eq. (7), and it is used to update the state vector.
The SPR observation vector (z_SPR) consists of the differences between the measured and the newly predicted image coordinates, as given in Eq. (8). If the coordinates of an object point and of the corresponding image point are known, the image coordinates can be predicted from the collinearity equation using the initial exterior orientation parameters provided by the INS.
where the predicted image coordinates are obtained from the collinearity equation and the measured image coordinates are obtained directly from the image.
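As an illustrative sketch only (the function and variable names are ours, and the rotation convention follows the collinearity form shown above), the predicted image coordinates and the SPR observation vector could be formed as:

```python
import numpy as np

def project_point(X_obj, X0, R, c, xp, yp):
    """Predict the image coordinates of one object point with the collinearity equation.

    X_obj : (3,) object-space coordinates of the GCP
    X0    : (3,) perspective-centre coordinates from the INS-predicted exterior orientation
    R     : (3, 3) rotation matrix (object space -> image space)
    c     : focal length; xp, yp : calibrated principal point coordinates
    """
    d = R @ (np.asarray(X_obj) - np.asarray(X0))   # point expressed in the camera frame
    return np.array([xp - c * d[0] / d[2],
                     yp - c * d[1] / d[2]])

def spr_innovation(gcps, img_meas, X0, R, c, xp, yp):
    """Stack (measured - predicted) image coordinates over all visible GCPs (z_SPR)."""
    z = [np.asarray(xy) - project_point(X, X0, R, c, xp, yp)
         for X, xy in zip(gcps, img_meas)]
    return np.concatenate(z)                       # length 2 x (number of GCPs)
```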
The design matrix (H_SPR), which describes the relationship between the SPR observation vector and the state vector, is given in Eq. (9), and Eq. (10) shows the variance-covariance matrix.
The matrices A_att and A_pos in Eq. (9) are the partial-derivative matrices of the SPR design matrix with respect to the attitude and position, respectively, derived from the linearization of the collinearity equation as given in Eqs. (12) and (13). The rotation matrix from the ECEF (Earth-Centered-Earth-Fixed) to the NED coordinate system and the partial-derivative matrix with respect to the ECEF coordinates are given in Eqs. (14) and (15). For the detailed form of Eq. (15), refer to Kim et al. (2006).
where α, β, and γ are the roll, pitch, and yaw angles of the body frame with respect to the navigation frame, respectively, and φ, λ, and h are the latitude, longitude, and ellipsoidal height estimated by the INS.
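The analytic partial derivatives of Eqs. (12)-(15) are not reproduced here. As a hedged alternative for readers who want to verify an implementation, the design-matrix columns for the exterior-orientation parameters can be approximated numerically by finite differences of the projection sketch above (illustrative only; the paper uses the analytic linearization):

```python
import numpy as np

def numerical_design_matrix(project, eo_params, eps=1e-6):
    """Finite-difference approximation of the SPR design-matrix columns for the
    exterior-orientation parameters (3 attitude + 3 position).

    project   : callable mapping a (6,) exterior-orientation vector to the
                stacked predicted image coordinates of all GCPs
    eo_params : (6,) exterior orientation taken from the INS solution
    """
    eo_params = np.asarray(eo_params, dtype=float)
    z0 = project(eo_params)
    H = np.zeros((z0.size, eo_params.size))
    for j in range(eo_params.size):
        perturbed = eo_params.copy()
        perturbed[j] += eps
        H[:, j] = (project(perturbed) - z0) / eps  # one column per parameter
    return H
```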
2.3 Correcting the error of INS navigation solution using both GPS and SPR
If GPS signals and GCPs are available at the same time, the state vector is updated using both the 3D position determined by GPS and the image coordinates produced by the SPR processing. The observation equation for updating the state vector is given in Eq. (16). The observation vector (z_GPS/SPR), the design matrix (H_GPS/SPR), and the variance-covariance matrix (R_GPS/SPR) are given in Eqs. (17), (18), and (19), respectively.
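A minimal sketch of how the combined observation in Eqs. (16)-(19) might be assembled, assuming the GPS and SPR measurement errors are uncorrelated so that the combined variance-covariance matrix is block diagonal (again our own illustration):

```python
import numpy as np

def stack_gps_spr(z_gps, H_gps, R_gps, z_spr, H_spr, R_spr):
    """Stack GPS and SPR observations for a single combined EKF update."""
    z = np.concatenate([z_gps, z_spr])        # combined observation vector (Eq. (17) style)
    H = np.vstack([H_gps, H_spr])             # combined design matrix (Eq. (18) style)
    R = np.block([                            # block-diagonal variance-covariance (Eq. (19) style)
        [R_gps, np.zeros((R_gps.shape[0], R_spr.shape[1]))],
        [np.zeros((R_spr.shape[0], R_gps.shape[1])), R_spr],
    ])
    return z, H, R
```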
3. Simulation Tests
3.1 Overview
When GPS signals are lost, the accuracy of the INS/SPR integration depends on the SPR because the INS navigation solution is updated only with the SPR results. Therefore, we assumed that the GPS signal is completely blocked and analyzed the performance of the INS/SPR integration with consideration of the factors affecting the SPR. The target positioning accuracy is set to 2.5 m, which is the typical accuracy of vehicle positioning based on the GPS L1 C/A code. The simulated trajectory runs in the longitudinal direction for 200 seconds at a constant speed of 10 m/s. It is also assumed that an image containing 4 GCPs is obtained every second and that the camera axis points in the longitudinal (driving) direction. The Xsens MTi-G 700 INS/GPS MEMS-IMU was selected considering the size, price, and accuracy required to meet the goal accuracy for vehicle positioning. The camera model for the simulation is the GEViCAM GD-155000C, which is easy to integrate with the INS and has specifications similar to those of a video black box camera. Tables 1 and 2 show the detailed specifications of the sensors.
Table 1. MEMS-IMU specification (Model: Xsens MTi-G 700 INS/GPS)
Table 2. Camera specification (Model: GEViCAM GD-155000C)
The SPR performance is affected by the accuracy of the input data, such as the initial exterior orientation parameters, the interior orientation parameters, the image coordinates, the GCP coordinates, and the geometry of the GCPs. The interior orientation parameters can be precisely determined by camera calibration, and the accuracy of the image coordinates does not have a crucial effect on the SPR performance (Kang et al., 2014). Therefore, for the GCPs and the initial exterior orientation parameters, we consider the GCP coordinate errors, the distance between GCPs in the driving direction, the constellation of GCPs on the image, and the SPR processing interval.
The test parameters for the simulation are as follows: 1) the GCP coordinate errors vary from 0.05 m to 0.4 m in 0.05 m steps, 2) the distance between GCPs in the driving direction is set to 5 m, 3 m, or 1 m, 3) the SPR processing interval varies from 1 to 5 seconds, and 4) two constellations of GCPs on the image are considered, i.e., the 4 GCPs are either widespread over the image or crowded at the center of the image (Fig. 2). The simulation tests are carried out for the three factors 1) to 3), and each test is repeated for both constellation types in 4). Table 3 summarizes the test parameters.
Fig. 2. Constellations of GCPs on the image
Table 3. Summary of the test parameters
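For reference, the parameter sweep of Table 3, together with the fixed simulation settings of Section 3.1, can be written out as a simple configuration; the variable names below are ours and the listing is illustrative only:

```python
# Fixed simulation settings (Section 3.1)
DRIVE_DURATION_S = 200        # continuous driving time, seconds
DRIVE_SPEED_MPS  = 10         # longitudinal speed, m/s
GCPS_PER_IMAGE   = 4          # one image containing 4 GCPs every second

# Test factors (Table 3): each factor is varied in turn while the others are
# held at their baseline values, and every case is run for both geometry types.
GCP_COORD_ERRORS_M = [round(0.05 * k, 2) for k in range(1, 9)]   # 0.05 m to 0.40 m
GCP_SPACINGS_M     = [5, 3, 1]          # distance between GCPs in the driving direction, m
SPR_INTERVALS_S    = [1, 2, 3, 4, 5]    # SPR processing interval, s
GEOMETRY_TYPES     = ["type 1 (widespread)", "type 2 (centered)"]
```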
3.2 Results
3.2.1 Impact of GCP accuracy
The GCP coordinates can be surveyed directly or extracted from a digital map, so the maximum GCP coordinate error is set to 0.4 m, corresponding to the accuracy of 1:1,000 digital maps. The GCP coordinate error and the constellation type are varied while the other factors are fixed: 1) the distance between GCPs in the driving direction is 5 m, and 2) the SPR processing interval is 1 second. The test conditions are summarized in Table 4.
Table 4. Simulation conditions for the GCP coordinate error
As shown in Fig. 3, as the GCP coordinate errors increase, the Root Mean Square Error (RMSE) of position also increases for both geometry types 1 and 2 because the INS error is corrected less accurately by the SPR. When geometry type 1 is used, the RMSE in the longitudinal direction is 1.53 m even when the GCP coordinate error reaches 0.4 m, which is still better than the goal accuracy. On the other hand, with geometry type 2 and a GCP coordinate error of 0.15 m, the RMSE in the longitudinal direction is 4.92 m, and when the GCP coordinate error exceeds 0.2 m the solution diverges. The RMSE in the longitudinal direction is larger than in the other directions because all GCPs are located along the driving direction, which makes it hard to compensate the error in that direction.
Fig. 3. Comparison of the position RMSE depending on the GCP coordinate error for geometry types 1 and 2
3.2.2 Impact of the distance between GCPs in the driving direction
If the distance between GCPs in the driving direction is changed, their geometry in 3D space also changes even though they are projected onto the same locations in the image. We therefore test how the geometry of the GCPs in 3D space affects the SPR performance. Every image contains 4 GCPs grouped into 2 sets; each set is at the same distance from the car in the driving direction, and the distance between the 2 sets is varied as 5 m, 3 m, and 1 m. Fig. 4 illustrates the distance between GCPs in the driving direction. The test is conducted for geometry types 1 and 2, as in Section 3.2.1, while the other factors are fixed: 1) the GCP coordinate error is 0.1 m, and 2) the SPR processing interval is 1 second. The test conditions are summarized in Table 5.
Fig. 4. The distance between GCPs in the driving direction
Table 5. Simulation conditions for the distance between GCPs in the driving direction
As a result, the RMSE of position increases as the distance between GCPs decreases from 5 m to 1 m (Fig. 5). As the distance between GCPs gets shorter, the GCP geometry in 3D space becomes unstable and the accuracy of the SPR degrades. Geometry type 1 yields stable results: even with a GCP distance of 1 m, the largest RMSE in any direction is 0.64 m. On the other hand, with geometry type 2 the positioning RMSE exceeds the goal accuracy when the distance between GCPs becomes shorter than 3 m.
Fig. 5. Comparison of the position RMSE depending on the distance between GCPs in the driving direction for geometry types 1 and 2
3.2.3 Impact of SPR processing interval
If the SPR processing interval becomes longer, the error of the INS navigation solution accumulates continuously, which in turn affects the SPR because the INS navigation solution is used as the initial exterior orientation parameters. Therefore, we test the SPR processing interval to find the longest interval that still satisfies the target accuracy. In this test, the SPR processing interval varies from 1 to 5 seconds, again for both geometry types, while the other factors are fixed: 1) the GCP coordinate error is 0.05 m, and 2) the distance between GCPs in the driving direction is 5 m. Table 6 summarizes the test conditions.
Table 6. Simulation conditions for the SPR processing interval
As the SPR processing interval increases, the position RMSE of the INS/SPR integration increases for both geometry types 1 and 2, as shown in Fig. 6. With geometry type 1 and an interval of 5 seconds, the RMSE of position is 0.68 m. With geometry type 2, the car position satisfies the goal accuracy for intervals of up to 2 seconds, but the solution diverges when the interval reaches 3 seconds. This is because the position updated by the SPR lies at a point the car has already passed relative to the GCPs, so no GCPs are available for the next SPR processing (Fig. 7), and the position diverges at the 3-second interval.
Fig. 6. Comparison of the position RMSE depending on the SPR processing interval for geometry types 1 and 2
Fig. 7. The car position estimated at a point already past the GCPs
4. Conclusion
The INS/SPR integration, which corrects the errors of the INS navigation solution with the results of the SPR processing, was implemented as a tightly coupled EKF. Its performance was evaluated through simulation tests under the assumption that only the MEMS-IMU and the camera, and not GPS, provide observations. Since the accuracy of the INS/SPR integration depends on the SPR, we tested the parameters that determine the SPR accuracy, namely the GCP coordinate errors, the distances between GCPs, and the SPR processing intervals. In addition, the simulation tests considered the geometric constellation of the GCPs on the image, with a goal accuracy of 2.5 m. The results show that a positioning accuracy better than 1 m can be achieved through the INS/SPR integration under the following conditions: 1) the GCPs should be distributed over the image as in geometry type 1, 2) the SPR should be performed at least every 5 seconds, 3) the GCP coordinate error should be less than 0.05 m, and 4) the distance between GCPs in the driving direction should be longer than 1 m.
In conclusion, the interval, density, and constellation of GCPs should be considered for accurate positioning with the SPR. If the suggested conditions are applied, the GCPs for the INS/SPR integration can be collected both efficiently and accurately. In the future, we will conduct additional performance analyses of the INS/GPS/SPR integration algorithm in environments similar to real roads. Furthermore, a driving test will be performed on a test bed constructed based on this additional performance analysis.
References
- Gao, J., Petovello, M., and Cannon, M.E. (2008), Integration of steering angle sensor with global positioning system and micro-electro-mechanical systems inertial measurement unit for vehicular positioning, Journal of Intelligent Transportation Systems, Vol. 12, No. 4, pp. 159-167. https://doi.org/10.1080/15472450802448138
- Georgy, J., Karamat, T., Iqbal, U., and Noureldin, A. (2011), Enhanced MEMS-IMU/odometer/GPS integration using mixture particle filter, GPS Solutions, Vol. 15, No. 3, pp. 239-252. https://doi.org/10.1007/s10291-010-0186-4
- Godha, S. and Cannon, M.E. (2007), GPS/MEMS INS integrated system for navigation in urban areas, GPS Solutions, Vol. 11, No. 3, pp. 193-203. https://doi.org/10.1007/s10291-006-0050-8
- Han, J., Kang, B.Y., and Kwon, J.H. (2014), Development of GPS/IMU/SPR integration algorithm and performance analysis for determination of precise car positioning, Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography, Vol. 32, No. 2, pp. 163-171. (in Korean with English abstract) https://doi.org/10.7848/ksgpc.2014.32.2.163
- Jo, K., Chu, K., and Sunwoo, M. (2012), Interacting multiple model filter-based sensor fusion of GPS with in-vehicle sensors for real-time vehicle positioning, IEEE Transactions on Intelligent Transportation Systems, Vol. 13, No. 1, pp. 329-343.
- Kang, B.Y., Han, J., and Kwon, J.H. (2014), Positioning accuracy analysis of GPS/INS/SPR integration with measurement error of ground control points and image points, Proceedings of the Korean Association of Geographic Information Studies Spring Conference 2014, KAGIS, 24-25 April, Seoul, Korea, pp. 84-85. (in Korean)
- Kim, K., Park, C.G., Yu, M.J., and Park, Y.B. (2006), A performance comparison of extended and unscented Kalman filters for INS/GPS tightly coupled approach, Journal of Control, Automation, and Systems Engineering, Vol. 12, No. 8, pp. 780-788. (in Korean with English abstract) https://doi.org/10.5302/J.ICROS.2006.12.8.780
- Leung, K.T., Whidborne, J.F., Purdy, D., and Barber, P. (2011), Road vehicle state estimation using low-cost GPS/INS, Mechanical Systems and Signal Processing, Vol. 25, No. 6, pp. 1988-2004. https://doi.org/10.1016/j.ymssp.2010.08.003
- Nister, D., Naroditsky, O., and Bergen, J. (2004), Visual odometry, Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, IEEE, 27 June-2 July, Washington D.C., USA, Vol. 1, pp. 652-659.
- Park, J.H. (2013), Estimation for Displacement of Vehicle Based on GPS and Monocular Vision Sensor, Master's thesis, Korea Aerospace University, Goyang-si, Gyeonggi-do, Korea, 101p. (in Korean with English abstract)
- Skog, I. and Handel, P. (2009), In-car positioning and navigation technologies-a survey, IEEE Transactions on Intelligent Transportation Systems, Vol. 10, No. 1, pp. 4-21. https://doi.org/10.1109/TITS.2008.2011712
- Vu, A., Ramanandan, A., Chen, A., Farrell, J.A., and Barth, M. (2012), Real-time computer vision/DGPS-aided inertial navigation system for lane-level vehicle navigation, IEEE Transactions on Intelligent Transportation Systems, Vol. 13, No. 2, pp. 899-913. https://doi.org/10.1109/TITS.2012.2187641
- Yang, Y. and Farrell, J.A. (2003), Magnetometer and differential carrier phase GPS-aided INS for advanced vehicle control, IEEE Transactions on Robotics and Automation, Vol. 19, No. 2, pp. 269-282. https://doi.org/10.1109/TRA.2003.809591
- Zhou, J., Edwan, E., Knedlik, S., and Loffeld, O. (2010), Low-cost INS/GPS with nonlinear filtering methods, Proceedings of the 2010 13th Conference on Information Fusion (FUSION), IEEE, 26-29 July, Edinburgh, Scotland, pp. 1-8.