Spacecraft Pose Estimation Based on Different Camera Models

Spacecraft pose estimation is an important technology for maintaining or changing spacecraft orientation in space. When two spacecraft are relatively distant, the depth variation of the target's feature points is small compared with the measuring distance, so the camera can be modeled as a weak perspective projection. In this paper, a spacecraft pose estimation algorithm based on four symmetric points on the spacecraft outline is proposed. An analytical solution for the spacecraft pose is obtained by solving the weak perspective projection model, which satisfies the measurement requirements when the measuring distance is long. The solution is then refined from the weak perspective projection model to the full perspective projection model, which meets the measurement requirements when the measuring distance is short. Simulation results show that the proposed algorithm obtains good results even when the image noise is large.


Introduction
With the development of space technology, an increasing number of space missions involve the relative position measurement of two spacecraft [1][2][3][4], such as space assembly, space satellite repair, fuel injection, satellite capture and tracking, and space interception. The measurement of the spacecraft's relative position is very important to maintain or change the spacecraft orientation in space to complete a space mission.
For relative position measurement, vision has some advantages in terms of weight, volume, power consumption, and equipment cost [5][6][7][8]. In the smart-OLEV mission [9,10], the SMART-1 platform uses stereo cameras and lighting equipment to provide better measurement data within 5 m, but only pointing data at long distances. The Argon vision system is divided into long-distance and short-range vision sensors [11,12], which select different field-of-view sensors for different distances. The natural feature image recognition system developed by the Johnson Space Center generates a 3D model of the target under test to calculate the relative pose [13,14]; its measurement error is proportional to the relative distance. The measurement system is required to provide the relative pose of the two spacecraft at different distances for the control system or other systems, and the relative distance between the spacecraft varies significantly, by a factor of more than 20. Therefore, spacecraft pose estimation has the following characteristics.
(1) When the two spacecraft are relatively distant, the depth variation of the feature points on the target spacecraft and the distances between the feature points are small compared with the relative distance between the two spacecraft.
(2) Because the focal length of the camera is fixed, the accuracy of feature point extraction decreases as the relative distance between the two spacecraft increases. For these reasons, the pose measurement accuracy is reduced when the two spacecraft are far apart. At present, two main classes of algorithms are used for pose estimation.
(1) Cooperative space measurement. The first class comprises analytical algorithms based on the perspective projection camera model, such as perspective-n-point [15][16][17] and direct linear transformation [18][19][20]. With these algorithms, the spacecraft pose can be solved directly; however, the accuracy of the resulting analytical solution is often unsatisfactory. The second class comprises optimization algorithms based on nonlinear camera models, such as the Gauss-Newton, Levenberg-Marquardt, and orthogonal iteration algorithms [21][22][23]. These algorithms require a good initial solution for the optimization. Therefore, we aim to obtain high-precision analytical solutions using analytical algorithms. (2) Noncooperative space measurement, in which the pose transformation is calculated using pattern matching and 3D point cloud techniques [24][25][26][27][28].
In this study, a spacecraft pose estimation algorithm based on the geometric constraints of the target spacecraft outline is proposed. To reduce the influence of distance on measurement accuracy, this study simplifies the camera measurement model. The simulation results show that the proposed algorithm maintains good accuracy with image feature errors of 0.1 pixel to 1 pixel over relative distances from 1 m to 20 m.

Pose Estimation Algorithm
Spacecraft pose estimation is based on the relationship between the target spacecraft points and the corresponding image points. The relative pose of the target spacecraft coordinate system and the camera coordinate system is calculated from the multipoint correspondence relationship (Figure 1).
The mapping relations between the target spacecraft point and the image point can be described by two mathematical mappings: 1) rigid transformation and 2) camera model. In the former, the space points in the space coordinate system and camera coordinate system follow a rigid body transformation, namely, rotation and translation transformations. Because the camera is installed on the tracker spacecraft, the relative pose relationship between the tracker spacecraft and the target spacecraft can be described by the pose relationship between the target spacecraft and the camera coordinate system. In the latter, the relationship between 3D space points in the camera coordinates and the projection 2D image points on the camera image plane is considered.
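The rigid-transformation half of this mapping can be sketched as follows. The Euler-angle convention and the sample pose and point values are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

def rotation_zyx(yaw, pitch, roll):
    """Rotation matrix from Z-Y-X Euler angles in radians (convention is an assumption)."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def to_camera_frame(P_s, R, T):
    """Map target-frame points (N, 3) into the camera frame: P_c = R @ P_s + T."""
    return P_s @ R.T + T

# Hypothetical relative pose and target-frame points.
R = rotation_zyx(0.1, -0.05, 0.2)
T = np.array([0.0, 0.0, 10.0])          # target 10 m ahead of the camera
P_s = np.array([[0.5, 0.3, 0.0], [-0.5, 0.3, 0.0]])
P_c = to_camera_frame(P_s, R, T)
```

Because R is orthonormal, the transformation is invertible: P_s = R^T (P_c - T).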

Algorithm Model
To construct the spacecraft pose estimation algorithm model, four coplanar symmetric points are used to calculate the spacecraft pose. The target spacecraft coordinate system is O_s-xyz, and the four points are P_i^s, i = 1, ..., 4. Their coordinates in the target spacecraft frame are parameterized by the known values a, b, c, d. The relationship between P_i^s and the corresponding point P_i^c in the camera coordinate system is the rigid-body transformation

P_i^c = R P_i^s + T,

where R is the 3×3 rotation matrix, T is the translation vector, and I, J, and K denote the first, second, and third rows of R, respectively.

Camera Model
The fixed-focus-lens camera model can be simplified to a single-lens model. According to the optical principle, a space point P_i^c = (x_i, y_i, z_i), its image point p_i, and the camera origin O_c lie on the same line. This model is therefore called the pinhole camera model, also known as the perspective projection model:

u_i = f x_i / z_i, v_i = f y_i / z_i,    (1)

where f is the focal length of the camera.
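The perspective projection can be sketched in a few lines; the sample point is an illustrative assumption, and the focal length matches the 12 mm value used in the experiments later:

```python
import numpy as np

def project_perspective(P_c, f):
    """Perspective projection: (u, v) = (f*X/Z, f*Y/Z) for camera-frame points (N, 3)."""
    P_c = np.asarray(P_c, dtype=float)
    Z = P_c[:, 2]
    u = f * P_c[:, 0] / Z
    v = f * P_c[:, 1] / Z
    return np.stack([u, v], axis=1)

# Example: a point 10 m ahead and 1 m to the right, f = 12 mm.
uv = project_perspective(np.array([[1.0, 0.0, 10.0]]), f=0.012)
# u = 0.012 * 1 / 10 = 0.0012 m = 1.2 mm on the image plane
```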

Simplified Model
When the two spacecraft are relatively distant, the accuracy of image feature extraction is low, and the depth of the feature points relative to the origin of the target spacecraft can be ignored. The camera model can then be approximated by a simplified (weak) perspective projection model [29][30][31]:

u_i = f x_i / t_z, v_i = f y_i / t_z,

where t_z is the depth of the target spacecraft origin in the camera frame. Simplified perspective projection is the projection onto the plane through the origin of the target spacecraft parallel to the imaging plane; it therefore ignores the depth of each target point relative to that origin. When the two spacecraft are relatively distant, the error introduced by this neglect is insignificant. From Eq. (8), relations in the coefficients k_i are obtained, where the k_i can be calculated from the image points. Equation (7) contains nine variables but only six equations, so it cannot be solved directly. The rotation matrix R supplies the additional constraints

r11^2 + r12^2 + r13^2 = 1,
r21^2 + r22^2 + r23^2 = 1,
r31^2 + r32^2 + r33^2 = 1,
r11 r21 + r12 r22 + r13 r23 = 0,
r31 r21 + r32 r22 + r33 r23 = 0,
r11 r31 + r12 r32 + r13 r33 = 0.    (12)

From the first, third, and sixth equations of Eq. (12), Eq. (13) is obtained, and combining Eqs. (8) and (13) yields Eq. (14), a quartic equation with four roots. Two negative roots are discarded according to the relationship between roots and coefficients, and the two positive roots are distinguished by two conditions. Condition 2 can be satisfied only when the rotation angle is greater than 60°; therefore, the root selected by Condition 1 is used. The rotation matrix R can then be parameterized by the four quaternion components (q_0, q_1, q_2, q_3), from which the remaining entries of R are recovered, and the translation component t_z follows from the image coordinates v_1 and v_2 with depth-correction factors (1 + ε_1) and (1 + ε_2).
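The weak perspective approximation replaces each point's own depth Z_i with the common depth t_z of the target origin. A quick numerical check with an illustrative point (0.1 m of depth relief) shows the approximation error shrinking as the relative distance grows:

```python
import numpy as np

def project_perspective(P_c, f):
    """Full perspective: each point is divided by its own depth Z_i."""
    return f * P_c[:, :2] / P_c[:, 2:3]

def project_weak(P_c, f, tz):
    """Weak perspective: every point is divided by the common depth tz."""
    return f * P_c[:, :2] / tz

f = 0.012                             # 12 mm focal length, as in the experiments
P_body = np.array([[0.5, 0.2, 0.1]])  # hypothetical point with 0.1 m of depth relief

errs = {}
for tz in (1.0, 10.0, 20.0):
    P_c = P_body + np.array([0.0, 0.0, tz])   # place the target at depth tz
    exact = project_perspective(P_c, f)
    approx = project_weak(P_c, f, tz)
    errs[tz] = np.abs(approx - exact).max() / np.abs(exact).max()
# errs[tz] is roughly (depth relief)/tz: large at 1 m, negligible at 20 m
```

This is the quantitative reason the analytical weak perspective solution is least accurate at close range, motivating the optimization step that follows.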

Optimization Algorithm
The accuracy of spacecraft pose estimation based on simplified perspective projection alone is poor. Therefore, an iterative optimization algorithm is constructed to improve the solution accuracy. The optimization algorithm for improving the accuracy of spacecraft pose estimation is shown in Figure 2.
In the algorithm, R_j = [I_j J_j K_j]^T denotes the rotation matrix estimate at the j-th iteration, and T_j the corresponding translation estimate.
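The paper's specific update equations (Figure 2) are not reproduced here. As a generic illustration of the refine-toward-full-perspective step, the sketch below runs a Gauss-Newton iteration on the full-perspective reprojection error with a numeric Jacobian, starting from a coarse initial pose; the point layout and pose values are hypothetical:

```python
import numpy as np

def rodrigues(w):
    """Rotation matrix from an axis-angle vector (Rodrigues' formula)."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    k = w / theta
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def reproject(pose, P_s, f):
    """Project target-frame points under pose = (rotation vector, translation)."""
    R = rodrigues(pose[:3])
    P_c = P_s @ R.T + pose[3:]
    return (f * P_c[:, :2] / P_c[:, 2:3]).ravel()

def gauss_newton_refine(pose0, P_s, uv_obs, f, iters=10):
    """Generic Gauss-Newton refinement of a 6-DoF pose from image observations."""
    pose = pose0.copy()
    for _ in range(iters):
        r = reproject(pose, P_s, f) - uv_obs
        # numeric Jacobian of the residual w.r.t. the 6 pose parameters
        J = np.zeros((r.size, 6))
        eps = 1e-7
        for j in range(6):
            d = np.zeros(6)
            d[j] = eps
            J[:, j] = (reproject(pose + d, P_s, f) - reproject(pose, P_s, f)) / eps
        pose -= np.linalg.lstsq(J, r, rcond=None)[0]
    return pose

# Synthetic check: four coplanar symmetric points (hypothetical geometry).
P_s = np.array([[0.5, 0.3, 0.0], [-0.5, 0.3, 0.0],
                [-0.5, -0.3, 0.0], [0.5, -0.3, 0.0]])
f = 0.012
true_pose = np.array([0.05, -0.03, 0.02, 0.1, -0.2, 10.0])
uv_obs = reproject(true_pose, P_s, f)
init = true_pose + np.array([0.02, 0.02, -0.02, 0.05, 0.05, 0.3])  # coarse initial pose
est = gauss_newton_refine(init, P_s, uv_obs, f)
```

As in the paper's algorithm, the refinement only works when the initial pose is already close, which is exactly what the weak perspective analytical solution provides.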

Experimental Section
The simulation experiment parameters were set as follows. The focal distance of the camera was 12 mm. The pixel size was 7. The simulation experiments verified the proposed algorithm in three aspects: 1) the optimization algorithm was analyzed without noise; 2) the relationship between estimation accuracy and distance was analyzed under Gaussian noise with zero mean and a standard deviation of 0.1 pixel; and 3) the relationship between estimation accuracy and distance was analyzed under Gaussian noise with zero mean and a standard deviation of 1 pixel.
The simulation results are shown in Figures 3 and 4. The spacecraft pose estimation error based on the simplified perspective projection alone is large, and the optimization algorithm based on the perspective camera model effectively reduces the measurement error. After 10 iterations, the attitude errors are less than 0.42° and the position errors are less than 4 mm. Figures 5 and 6 show the estimation accuracy under noise with zero mean and a standard deviation of 0.1 pixel. When t_z is 10 m, the attitude error is less than 0.36° and the position error is less than 19.5 mm. When t_z is 20 m, the attitude error is less than 0.65° and the position error is less than 117 mm. The maximum pose error occurs when t_z = 1 m, mainly because the initial relative position accuracy based on the simplified perspective projection model is low at close range. Figures 7 and 8 show the estimation accuracy under noise with zero mean and a standard deviation of 1 pixel. When t_z is 10 m, the attitude errors are less than 3° and the position errors are less than 0.35 m. When t_z is 20 m, the attitude errors are less than 7.5° and the position errors are less than 1 m.
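The noise model used in these experiments (zero-mean Gaussian on the image features, with a standard deviation of 0.1 or 1 pixel) can be reproduced as follows. Converting metric image coordinates to pixels requires the pixel pitch; the 7 μm value below is an assumption about the unit of the pixel size given above:

```python
import numpy as np

PIXEL_PITCH = 7e-6   # assumed 7 um pixel size (the unit is an assumption)
rng = np.random.default_rng(42)

def add_pixel_noise(uv_metric, sigma_px):
    """Add zero-mean Gaussian noise of sigma_px pixels to metric image points."""
    uv_px = uv_metric / PIXEL_PITCH
    noisy_px = uv_px + rng.normal(0.0, sigma_px, uv_px.shape)
    return noisy_px * PIXEL_PITCH

# Hypothetical image points (metres on the image plane).
uv = np.array([[0.0012, -0.0006], [0.0003, 0.0009]])
noisy_01 = add_pixel_noise(uv, 0.1)   # sigma = 0.1 pixel case
noisy_1 = add_pixel_noise(uv, 1.0)    # sigma = 1 pixel case
```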

Conclusions
To meet the requirements of pose estimation accuracy as the relative distance between spacecraft changes from far to near, we propose an algorithm based on two different camera models. In this algorithm, the initial value of the spacecraft pose is calculated by a simplified perspective projection model, and the result is further optimized under the full perspective projection model. Simulation results show that the proposed algorithm obtains good accuracy even in the presence of large image noise.