Three-dimensional reconstruction for space targets with multistatic inverse synthetic aperture radar systems
EURASIP Journal on Advances in Signal Processing volume 2019, Article number: 40 (2019)
Abstract
In this paper, a three-dimensional (3D) reconstruction algorithm is proposed for space targets with multistatic inverse synthetic aperture radar (ISAR) systems. In the proposed algorithm, the target 3D geometry is obtained by solving the projection equations between the target 3D geometry and the ISAR images. In particular, no cross-range scaling is needed. To obtain the projection equations, the algorithm consists of two steps: establishing the projection matrix and associating scattering centers. First, the observation angles of each sensor are estimated by kinematic formulas and coordinate system transformations. Using the azimuth and elevation angles of the sensor relative to the target, the projection matrix from the target 3D geometry to the ISAR images is established. Second, an association cost function based on projective transform and epipolar geometry is developed. As the cost function is a 0–1 linear assignment problem, the Jonker-Volgenant algorithm is used to build a one-to-one correspondence between two scattering center sets. Numerical results show the efficiency of the proposed algorithm.
Introduction
For space targets, it is important to identify components of interest, such as the cabin and solar panels, in the event of their aberrancy. Since inverse synthetic aperture radar (ISAR) provides high-quality two-dimensional (2D) images of space targets [1], it becomes possible to analyze the characteristics of individual components of interest. In space security-related reconnaissance and surveillance, there are a number of issues. For example, the image projection plane (IPP) is not known a priori, and a 2D image is usually sensitive to the relative target motion. One solution to these problems is to form a three-dimensional (3D) ISAR image. Compared with a 2D image, a 3D image contains the geometric structure, which provides a robust characterization of vital components of interest, so it is of great military significance to reconstruct 3D images of space targets. In recent years, several techniques have been developed to form 3D ISAR images; they can be roughly categorized into three types. The first class of methods is a direct extension of the 2D ISAR concept [2, 3]. Here, a transmitter illuminates the target through a 2D angular aperture in both azimuth and elevation while emitting wideband waveforms, and a 3D Fourier transform (3DFT) is applied to obtain a 3D ISAR image. However, such a 3DFT-based method requires densely sampled radar measurements in three dimensions, which significantly increases the computation and storage cost. The second type of approach is based on interferometric ISAR (InISAR) [4–7], which uses the spatial information provided by multiple antennas. Such an approach has the advantage of not requiring prior knowledge of the target motion. For such systems, the height information can be estimated from the phase differences between the corresponding ISAR images.
However, the InISAR technique requires specific antenna arrays, which makes it difficult to apply to space targets. The third category reconstructs the target 3D geometry via factorization analysis [8–11]. The factorization method builds a measurement matrix from the ranges and cross ranges of the scattering centers tracked in multiple images. The target 3D geometry is then reconstructed from a singular value decomposition of the measurement matrix. However, this method needs cross-range scaling to obtain the cross ranges of the scattering centers; otherwise, the reconstructed 3D geometry cannot provide the true size information. In this paper, a new algorithm is proposed for space target 3D reconstruction with multistatic ISAR systems. First, the observation angles of each sensor are estimated using kinematic formulas and coordinate system transformations, so the projection matrix from the target 3D geometry to the trajectories of the scattering centers can be built. Second, the ranges and Doppler frequencies of the scattering centers are extracted to build the trajectory matrix. Finally, the projection equations from the target 3D geometry to the trajectory matrix are solved to reconstruct the target 3D geometry. Compared with the factorization method, the proposed algorithm only uses the ranges and Doppler frequencies of the scattering centers without a cross-range scaling constraint, which is a clear advantage. However, because space targets are viewed by multistatic ISAR systems, the positions of the scattering centers may vary widely across ISAR images. To tackle this problem, a new method based on projective transform and epipolar geometry [12] is proposed to associate scattering centers between different images. In this method, an optimization cost function is developed by minimizing the assignment cost of two scattering center sets.
To solve this 0–1 linear assignment problem, the Jonker-Volgenant algorithm is used to build a one-to-one correspondence between two scattering center sets with the lowest cost.
The remainder of this paper is organized as follows. In Section 2, the imaging geometry and signal model are introduced. The proposed 3D reconstruction algorithm is presented in Section 3. In Section 4, numerical examples illustrating the efficiency of the proposed method are given. Finally, conclusions are drawn in Section 5.
Geometry and signal model
In this paper, the space target operates under a three-axis stabilized attitude control system, which is the most advanced operating mode. In this mode, the space target moves steadily relative to the radar, and the orbital elements are used to calculate the IPP under this steady-motion model. Since a space target is often engaged in complicated maneuvers that combine translational and rotational motions, we use the motion compensation method in [13] to eliminate any relative translational motion of the target. The motion compensation consists of two steps. First, an adaptive joint time-frequency technique is used to parametrize the signal using basis functions. Second, reference points on the target are selected by a search-and-projection procedure in the time-frequency plane, and the corresponding motion parameters are obtained from the basis functions. Based on the motion parameters, the motion errors can be eliminated by multiplying by the phase correction vector, interpolating to a uniform azimuth scale, and polar-reformatting the originally collected data. The target can then be transformed into a turntable model [14], in which the target rotates around its center of gravity.
The multistatic ISAR systems are shown in Fig. 1. A body reference coordinate system is centered at the center of gravity of the target. We assume that the orange solid line represents the line of sight (LOS), which is determined by the azimuth and elevation angles β_{n} and φ_{n}, n=1,2,⋯,N, where N is the number of sensors. In the far-field region, each radar sensor transmits a linear frequency modulated signal
where A denotes the amplitude, f_{c} the carrier frequency, μ the chirp rate, T_{p} the chirp pulse duration, and \(\hat {t}\) the fast time. After dechirping and pulse compression, the returned signal of the kth scattering center can be written as
where t_{m} denotes slow time, c is the velocity of light, δ_{k} is the backscatter coefficient, r_{k} denotes the range between the kth scattering center and target rotational center, and K is the number of scattering centers. As shown in Fig. 1, the range r_{k, n} can be calculated by
where (x_{k},y_{k},z_{k}) is the 3D location of the kth scattering center. The time derivative of r_{k, n} is now given by
where f_{k, n} is the Doppler frequency of the kth scattering center, λ denotes the wavelength, and \(\left ({{{\dot \varphi }_{n}},{{\dot \beta }_{n}}} \right)\) is the time derivative of (φ_{n},β_{n}). Clearly, ISAR images formed by the range-Doppler method provide the range r_{k, n} and the Doppler frequency f_{k, n}. Therefore, (3) and (4) establish the projection from the target 3D geometry to the ISAR image without a cross-range scaling constraint.
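As a concrete illustration of (3) and (4), the sketch below maps a 3D scatterer onto a (range, Doppler) pair for one sensor. Since the exact form of (3) is not reproduced here, it assumes the standard spherical LOS parameterization r = x·cosφ·cosβ + y·cosφ·sinβ + z·sinφ; the function name and argument list are illustrative.

```python
import numpy as np

def project_scatterer(p, phi, beta, phi_dot, beta_dot, wavelength):
    """Map a 3D scatterer p = (x, y, z) to (range, Doppler) for one sensor,
    assuming the spherical LOS parameterization of eq. (3)."""
    x, y, z = p
    # Range along the LOS (eq. (3) form).
    r = (x * np.cos(phi) * np.cos(beta)
         + y * np.cos(phi) * np.sin(beta)
         + z * np.sin(phi))
    # Time derivative of the range (eq. (4) form), driven by the angle rates.
    dr_dt = (-x * (phi_dot * np.sin(phi) * np.cos(beta) + beta_dot * np.cos(phi) * np.sin(beta))
             - y * (phi_dot * np.sin(phi) * np.sin(beta) - beta_dot * np.cos(phi) * np.cos(beta))
             + z * phi_dot * np.cos(phi))
    f = 2.0 * dr_dt / wavelength  # Doppler frequency
    return r, f
```

A scatterer on the Y axis viewed at φ = β = 0 has zero range along the LOS but a nonzero Doppler driven entirely by the azimuth rate β̇.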
3D reconstruction using 2D ISAR images
Based on the signal model in Section 2, a 3D reconstruction algorithm is proposed for space targets with multistatic ISAR systems. The algorithm exploits the projection equations between the target 3D geometry and the ISAR images provided by multiple, properly spaced sensors. To describe the projection from the target 3D geometry to the ISAR images mathematically, the transformation is expressed in matrix form as
where p_{k}(=[x_{k},y_{k},z_{k}]^{T}) is the 3Dreconstructed position of the kth scattering center and D_{n}(=[R_{n},F_{n}]^{T}) is the projection matrix of the nth sensor. Under multistatic ISAR systems, a set of equations can be expressed as follows:
The overdetermined equations are then solved in a least-squares sense.
In (7), the reconstructed position p_{k} of a scattering center is calculated from the composite projection matrix D and the trajectory matrix I. Clearly, the proposed algorithm needs no cross-range scaling, and the trajectory matrix of the scattering centers is calculated directly from the ISAR images. The process for obtaining the composite projection matrix D and the trajectory matrix I is outlined in Fig. 2. First, the observation angles of each sensor relative to the space target are estimated using kinematic formulas and coordinate system transformations. The composite projection matrix D is then established from these observation angles (azimuth and elevation). Second, we assume the scattering centers are sufficiently separated that each peak in the ISAR image corresponds to a single scattering center. To extract scattering centers from an ISAR image, a watershed algorithm is adopted to segment the image into high-energy regions [15]; the range and Doppler frequency of each scattering center are then extracted from the maximum of its region. Since the projected positions of scattering centers in different images may vary widely under multistatic ISAR systems, scattering centers must be associated across ISAR images. In this paper, an association cost function based on projective transform and epipolar geometry is developed. As the cost function is a 0–1 linear assignment problem, the Jonker-Volgenant algorithm is used to build a one-to-one correspondence between two scattering center sets. Once scattering centers from different images are associated, the ranges and Doppler frequencies of the same scattering centers in different images are used to build the matrix I. The projection matrix D_{n} is discussed in Section 3.1 and scattering center association in Section 3.2.
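The least-squares step in (7) amounts to stacking the per-sensor projection matrices and measurements and solving the resulting overdetermined linear system. A minimal sketch (function and variable names are illustrative):

```python
import numpy as np

def reconstruct_point(D_list, obs_list):
    """Least-squares solution of the stacked projection equations for one
    scattering center: D_list holds the per-sensor 2x3 projection matrices,
    obs_list the corresponding (range, Doppler) measurements."""
    D = np.vstack(D_list)             # composite projection matrix, (2N, 3)
    I = np.concatenate(obs_list)      # stacked trajectory measurements, (2N,)
    p, *_ = np.linalg.lstsq(D, I, rcond=None)
    return p                          # reconstructed 3D position
```

With two or more sensors the system is overdetermined and the solution minimizes the residual in the mean-squared sense.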
Projection matrix
For a space target on a steady trajectory, the orbital elements can be used to establish the projection matrix. As shown in Fig. 3, a satellite travels in its regular orbit around the Earth, and we assume that sensors on the Earth's surface receive the returned signal during the observation time. The processing steps are as follows:
(1) Transform the location of the sensor from the Earth-centered, Earth-fixed (ECEF) reference system to the Earth-centered inertial (ECI) reference system, as shown in Fig. 3. By coordinate system transformation, the location of the sensor is given by
where \({\mathbf {R}_{{Z_{I}}}}\) is the rotation matrix about the Z_{I} axis [16], α_{G} is the Greenwich hour angle, and \({\bar {\mathbf {r}}_{ECEF}}\) is the location of the sensor in the ECEF reference system, defined by longitude and latitude.
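The ECEF-to-ECI step is a single rotation about the Z axis by the Greenwich hour angle. A sketch of (8), assuming the sign convention in which ECI = R_Z(α_G)·ECEF; conventions differ between references, so the sign should be checked against [16]:

```python
import numpy as np

def ecef_to_eci(r_ecef, alpha_g):
    """Rotate a sensor position from ECEF to ECI about the Z axis by the
    Greenwich hour angle alpha_g (radians). Sign convention is an assumption."""
    Rz = np.array([[np.cos(alpha_g), -np.sin(alpha_g), 0.0],
                   [np.sin(alpha_g),  np.cos(alpha_g), 0.0],
                   [0.0,              0.0,             1.0]])
    return Rz @ np.asarray(r_ecef, dtype=float)
```

Being a pure rotation, the transform preserves the distance of the sensor from the Earth's center.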
(2) Transform the location of the sensor from the ECI reference system to the orbit-plane (O^{′},X_{a},Y_{a},Z_{a}) reference system, which is used to describe the motion of the satellite. The X_{a} axis points in the flight direction of the satellite, the Z_{a} axis points at the subsatellite point, and the Y_{a} axis is normal to the orbit plane. The transformation consists of two steps: a rotation and a translation. First, a temporary coordinate system, centered at the Earth's core and parallel to the orbit reference system, is built by rotating the ECI reference system
where \({\mathbf {R}_{{Z_{a}}}}\) is the rotation matrix about the Z_{a} axis, \({\mathbf {R}_{{X_{a}}}}\) is the rotation matrix about the X_{a} axis, μ is the argument of perigee, i is the orbit inclination, and Ω is the right ascension of the ascending node. Second, in the temporary coordinate system, the location of the satellite is given by Kepler's laws and the two-body kinematics equations
where ρ denotes the semi-latus rectum, e the eccentricity, and γ the true anomaly [17]. The location of the sensor in the orbit (O^{′},X_{a},Y_{a},Z_{a}) reference system is then obtained by translating the origin of the temporary coordinate system to the center of the satellite
where \(\mathbf {B} = \left[ \begin{array}{ccc} 0 & 1 & 0 \\ 0 & 0 & -1 \\ -1 & 0 & 0 \end{array} \right]\) is used to adjust the coordinate axes.
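The two-body step above reduces to the conic equation r = ρ/(1 + e·cos γ) evaluated in the orbit plane. A sketch under that standard form (the in-plane axis convention is an assumption):

```python
import numpy as np

def satellite_position_orbit_plane(rho, e, gamma):
    """Satellite position in the temporary (orbit-plane) frame from the conic
    equation of two-body motion: rho is the semi-latus rectum, e the
    eccentricity, and gamma the true anomaly."""
    r = rho / (1.0 + e * np.cos(gamma))   # orbital radius at true anomaly gamma
    return np.array([r * np.cos(gamma), r * np.sin(gamma), 0.0])
```

For a circular orbit (e = 0) the radius equals ρ at every anomaly; for an eccentric orbit the radius at perigee (γ = 0) is ρ/(1 + e).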
(3) Transform the location of the sensor from the orbit (O^{′},X_{a},Y_{a},Z_{a}) reference system to the body (O^{′},X_{n},Y_{n},Z_{n}) reference system. In the body reference system, the location of the sensor is given by
where \({\bar {\mathbf {r}}_{n}}\left ({ = {{\left [ {{x_{n}}, {y_{n}}, {z_{n}}} \right ]}^{\mathrm {T}}}} \right)\) is the coordinate value of the sensor in the body reference system; \({\mathbf {R}_{{X_{n}}}}\), \({\mathbf {R}_{{Y_{n}}}}\), and \({\mathbf {R}_{{Z_{n}}}}\) are the rotation matrices about the X_{n}, Y_{n}, and Z_{n} axes; and (ψ_{s},γ_{s},ϕ_{s}) are the roll, pitch, and yaw angles of the satellite. Clearly, when ψ_{s}=γ_{s}=ϕ_{s}=0, the body reference system coincides with the orbit reference system. We use the method in [18] to estimate the roll, pitch, and yaw angles. The basic idea is to exploit the phase history of the strongest scatterers in different images. First, the brightest spots in the different images are associated as the same scatterer of the target using a simple nearest-neighbor method. Second, since the rotation vector is embedded in the Doppler frequency of the scatterer, a Doppler-matching-based estimation scheme is used to recover the Doppler frequency of the scatterer. The yaw, pitch, and roll rotation motions are then estimated by matching the Doppler frequency. Finally, the elevation angle φ_{n} and azimuth angle β_{n} of the nth sensor relative to the satellite are obtained by
Generally, ISAR targets are non-cooperative with unknown motion. Specific space targets, however, usually move steadily, so the orbital elements can be used to estimate the observation angles of the sensor. From the elevation and azimuth angles, the projection matrix can be constructed.
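Once the sensor position is expressed in the body frame, the observation angles follow from its spherical coordinates. A sketch of this last step, assuming elevation is measured from the X_n-Y_n plane and azimuth from the X_n axis (the exact angle conventions of the paper are not reproduced here):

```python
import numpy as np

def observation_angles(r_body):
    """Elevation and azimuth of the sensor LOS from its body-frame coordinates
    (x_n, y_n, z_n); the angle conventions are assumptions."""
    x, y, z = r_body
    phi = np.arcsin(z / np.linalg.norm(r_body))  # elevation from the X-Y plane
    beta = np.arctan2(y, x)                      # azimuth from the X axis
    return phi, beta
```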
Scattering center association
As space targets are viewed by multistatic ISAR systems, their scattering centers are observed from different orientations, so the radar data are in general not associated. In different ISAR images, the projected positions of a scattering center may vary widely. To tackle the problem of associating scattering centers between different images, we propose a new association method based on projective transform and epipolar geometry. In fact, scattering center association can be considered an assignment procedure that assigns each unassociated scattering center to its associated one. To arrive at the minimum assignment cost, we establish a cost function between the mth and nth images as follows:
where P is the number of scattering centers in the mth image and Q is the number of scattering centers in the nth image. We assume the mth image is obtained by the mth sensor and the nth image by the nth sensor. The control variable k_{m, n}(p, q) is the association matrix between the mth and nth images: when the pth scattering center in the mth image is associated with the qth scattering center in the nth image, k_{m, n}(p, q) is 1; otherwise, it is 0. The variables g_{m, n}(p, q) and e_{m, n}(p, q) are the geometry coefficient and error coefficient, which are analyzed below.
Geometry coefficient g_{m, n}(p, q)
Epipolar geometry describes the mapping relationship between two images: the 3D structure corresponding to a scattering center in one image must project onto a line in the other image. Therefore, the possible location of the corresponding scattering center in the other imaging plane is restricted. The geometry coefficient g_{m, n}(p, q) indicates the Euclidean distance between the probable projected position of the pth scattering center in the nth image and the position of the qth scattering center. The mathematical expression of g_{m, n}(p, q) is derived as follows.
First, the position of the 3D structure corresponding to the pth scattering center can be obtained by
where \(\left [ {{\tilde {x}_{p}}\left (\tilde {t} \right),{\tilde {y}_{p}}\left (\tilde {t} \right),{\tilde {z}_{p}}\left (\tilde {t} \right)} \right ]\) is the probable position of the 3D structure, (r_{p, m},f_{p, m}) are the range and Doppler frequency of the pth scattering center in the mth image, and “×” denotes the cross product. The extent of the 3D structure on the coordinate axes should be limited to a certain range, so \(\tilde {t}\) ranges from \(\tilde {t}_{a}\) to \(\tilde {t}_{b}\), which are user-defined values. The projection of this 3D structure on the nth image is the epipolar line segment \(\vec l_{n}^{p,m}\)
where \(\left ({{\tilde {r}_{p,n}}\left (\tilde {t} \right),{\tilde {f}_{p,n}}\left (\tilde {t} \right)} \right)\) are the projected range and Doppler frequency of the pth scattering center in the nth image. If the pth scattering center is associated with the qth scattering center, the qth scattering center should lie on the line segment \(\vec l_{n}^{p,m}\). Of course, owing to deviations caused by noise and other factors, the qth scattering center may not lie exactly on \(\vec l_{n}^{p,m}\), but near it. Finally, the geometry coefficient g_{m, n}(p, q) is given by
where min(∙) denotes minimization and \(\frac {{\lambda {T_{p}}}}{2}\) is used to keep the range and Doppler frequency in the same units. In (17), the geometry coefficient g_{m, n}(p, q) indicates the difference between the pth and qth scattering centers in the same projection plane; a smaller value of g_{m, n}(p, q) represents a higher probability that the two scattering centers correspond. However, when the target is complex, there may be more than one scattering center near \(\vec l_{n}^{p,m}\). The error coefficient e_{m, n}(p, q) is therefore introduced to evaluate the association probability of these scattering centers.
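In practice the epipolar segment can be discretized by sampling t̃ over [t̃_a, t̃_b], after which g_{m,n}(p,q) is the minimum point-to-sample distance in the (range, scaled-Doppler) plane. A minimal sketch, assuming the sampling and the λT_p/2 scaling have already been applied:

```python
import numpy as np

def geometry_coefficient(epipolar_samples, q_position):
    """Minimum Euclidean distance between the qth scattering center and a
    discretized epipolar segment; both are in common (range, scaled-Doppler)
    units."""
    diffs = np.asarray(epipolar_samples, dtype=float) - np.asarray(q_position, dtype=float)
    return np.linalg.norm(diffs, axis=1).min()
```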
Error coefficient e_{m, n}(p, q)
In (7), the 3D position of a scattering center is reconstructed by the least-squares solution with minimum mean squared error. When the trajectory matrix I contains a large error, for example when the scattering center association is inaccurate, there may be a large reconstruction error between the 3D-reconstructed position and the true 3D position. Here, the error coefficient e_{m, n}(p, q) takes this reconstruction error as the assignment cost. The mathematical expression of e_{m, n}(p, q) is derived as follows.
First, based on the projective transform, the 3D-reconstructed position can be obtained by associating the pth scattering center with the qth one
where D_{m} and D_{n} denote the projection matrices of the mth and nth sensors, and r_{p, m} and f_{p, m} are the range and Doppler frequency of the pth scattering center in the mth image. Similarly, r_{q, n} and f_{q, n} are the range and Doppler frequency of the qth scattering center in the nth image. The projected ranges and Doppler frequencies of \({\tilde {\mathbf {p}}_{p,q}}\) on the two images are then expressed by
According to (19), the error coefficient e_{m, n}(p, q) is given by
In (20), based on the projective transform, the error coefficient e_{m, n}(p, q) indicates the difference between the 3D-reconstructed position and the true one in the image domain. A smaller e_{m, n}(p, q) indicates a higher association probability.
The next step is to optimize the cost function. The cost function is a 0–1 linear assignment problem, which is extremely complex to optimize; enumeration methods can solve it but require too much computation. In this paper, the linear assignment problem is solved efficiently by the Jonker-Volgenant algorithm [19], a joint optimization process that decreases the computational cost.
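As a concrete sketch, SciPy's `linear_sum_assignment` implements a shortest-augmenting-path solver of the same family as the Jonker-Volgenant algorithm, and can be applied directly to a cost matrix c(p, q) = g_{m,n}(p, q) + e_{m,n}(p, q); the numeric costs below are illustrative:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Illustrative cost matrix: cost[p, q] = g_{m,n}(p, q) + e_{m,n}(p, q).
cost = np.array([[4.0, 1.0, 3.0],
                 [2.0, 0.0, 5.0],
                 [3.0, 2.0, 2.0]])

# One-to-one association minimizing the total assignment cost.
rows, cols = linear_sum_assignment(cost)
total = cost[rows, cols].sum()
```

Each row index p is paired with exactly one column index q, which is precisely the 0-1 constraint on k_{m,n}(p, q).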
As analyzed above, the proposed association method simplifies the association problem between two images to one between a line and an image. However, the method requires a high range and Doppler frequency resolution. For better performance, an interpolation method [20] can be used to enhance the image resolution.
Results and discussion
In this section, numerical simulation examples are presented to evaluate the performance of the proposed algorithm. First, the performance of the proposed association method is analyzed in comparison with other association methods. Then, the proposed algorithm is tested with simulation data. The conditions of the simulation are shown in Table 1.
Under the conditions shown in Table 1, the range resolution of the image is Δr = c/(2B) = 0.075 m, and the Doppler frequency resolution of the image is Δf_{d} = f_{p}/H = 0.39 Hz.
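These resolutions follow directly from the quoted formulas; the sketch below checks the range resolution, assuming a bandwidth B = 2 GHz consistent with Δr = 0.075 m (Table 1 is not reproduced here, so B is an assumption):

```python
c = 3e8                  # speed of light, m/s
B = 2e9                  # assumed signal bandwidth, Hz
delta_r = c / (2 * B)    # range resolution: 0.075 m
```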
Experiment 1
Here, an experiment is presented to compare the performance of the proposed association method, referred to as the epipolar geometry projective (EGP) method, with the nearest neighbor (NN) method, the robust point matching (RPM) method [21], and the coherent point drift (CPD) method [22]. The NN method is the most common association method; its main characteristics are that it is simple and fast. The RPM and CPD methods are association methods for optical images, which recover the transformation and assign correspondences between two images. There are 100 scattering centers in every image, and each scattering center is placed at a different separation from the others.
We evaluate the robustness of the proposed association method with respect to the observation angle (θ,ϕ). To eliminate interactions between the elevation angle θ and the azimuth angle ϕ, the experiment is divided into two parts: the location of the sensors varies only in the azimuth angle, and the location of the sensors varies only in the elevation angle. We use the correct association rate to evaluate the performance of these association methods. The correct association rate ω_{c} is calculated by
where N_{total} is the total number of scattering centers in the two images and N_{c} is the number of scattering centers that are correctly associated. The correct association rates of the proposed method, the NN method, the RPM method, and the CPD method are shown in Fig. 4. The association performance of the NN method is the worst, because the positions of the scattering centers in different images may vary widely. It is also seen that the proposed method is more robust to the rotation angle than the RPM and CPD methods. The reason is that the proposed method uses epipolar geometry to build the projection relationship between a scattering center and its epipolar line, so the association problem between two images is simplified to one between a line and an image.
Experiment 2
We first consider a point-scatterer target consisting of a platform and a solar panel, shown in Fig. 5. Eighty-five scattering centers (red dots) form the target, each with unit reflectivity amplitude. Table 2 shows the positions of the sensors in the ECEF reference system. The reconstruction result, along with the real positions of the scattering centers, is shown in Fig. 5. All the strong scattering centers are reconstructed correctly, agreeing with the real positions.
The reconstruction performance is measured by the coordinate errors Δx_{k}, Δy_{k}, and Δz_{k} which are given by
where (x_{k},y_{k},z_{k}) is the true position of the kth scattering center in the body reference system and \(\left ({{\hat {x}_{k}},{\hat {y}_{k}},{\hat {z}_{k}}} \right)\) is its estimated position. As shown in Fig. 6, the absolute position errors are smaller than the range resolution, which confirms the efficiency of the proposed algorithm.
To evaluate the performance of the proposed algorithm in the presence of noise, different levels of complex Gaussian noise have been added to the ISAR data to produce different signaltonoise ratios (SNRs) ranging from 0 to 20 dB. The average errors e_{x}, e_{y}, and e_{z} are given by
where N_{m} is the number of Monte Carlo trials; in this experiment, 500 trials are used. Figure 7 depicts the relationship between the SNR and the average errors for various data sizes. As the SNR increases, the reconstruction accuracy improves. When the SNR exceeds 10 dB, the average errors remain stable and are less than the range resolution, because the accuracy of association is then mainly limited by the range and Doppler frequency resolutions.
Next, the estimation accuracy of the 3D-reconstructed position of a scattering center (x_{k},y_{k},z_{k}) is specified by the Cramér–Rao lower bound (CRLB). First, the CRLBs for the target range r_{k} and Doppler frequency f_{k} are given as follows [9]
where η is the signal-to-noise power ratio, t_{pri} is the pulse repetition interval, and M is the total number of hits used to form the ISAR images. In the following discussion, the vector-form representation of the parameters \({\bar {g}_{n}}\) and \({\bar {p}_{k}}\) is introduced as follows
Then the error sensitivity matrices of (x_{k},y_{k},z_{k}) with respect to the range and Doppler frequency observed directly from the ISAR image are calculated from (5) as follows
The CRLB for (x_{k},y_{k},z_{k}) can be expressed using the CRLB for r_{k, n} and f_{k, n} as follows:
where \(\text {CRLB}\left \{ {{{\bar {g}}_{n}}} \right \}\) is a diagonal matrix whose diagonal elements are the CRLBs given by (25).
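Expression (27) is the standard propagation of the per-sensor CRLBs through the stacked sensitivity matrices. A sketch of that step, with hypothetical names: `G_list` holds the per-sensor 2x3 sensitivity matrices and `crlb_g_list` the per-sensor CRLBs of range and Doppler from (25).

```python
import numpy as np

def crlb_position(G_list, crlb_g_list):
    """Propagate per-sensor CRLBs on (range, Doppler) to the 3D position:
    CRLB{p} = (sum_n G_n^T CRLB{g_n}^{-1} G_n)^{-1}, eq. (27) form."""
    J = sum(G.T @ np.linalg.inv(np.diag(c)) @ G    # Fisher-information sum
            for G, c in zip(G_list, crlb_g_list))
    return np.linalg.inv(J)
```

At least two sensors are needed for the stacked information matrix to be invertible, matching the overdetermined system in (7).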
An experiment on the estimation accuracy of the 3D-reconstructed position of a scattering center is provided, assuming the system parameters in Table 1. The CRLBs given by (27) are shown in Fig. 8, with the SNR on the horizontal axis and the CRLB on the vertical axis. When the SNR exceeds 20 dB, the estimation accuracy is close to the CRLBs. When the SNR is low, the estimation performance is also influenced by the overdetermined projection equations.
Experiment 3
In this experiment, we use geometrical theory of diffraction (GTD) data to evaluate the efficiency of the proposed algorithm. Figure 9 shows the 3D Hubble Space Telescope (HST) model. Generally, the strong scattering centers of a target are generated by edges, corners, and tips; therefore, these strong scattering centers can represent the physical structure and components of interest. As shown in Fig. 9, the strong scattering centers are mainly distributed along the edges of the solar panels. As the solar panels are components of interest on a satellite, the experiment estimates the physical structure of the solar panels.
Figure 10 shows the 3D reconstruction results along with the real positions of the scattering centers. The reconstructed scattering centers are grouped into four straight lines around the edges of the solar panels. The estimated width of the solar panels, calculated from the average distance between the four straight lines, is 3.65 m (actual width 3.8 m). The estimated length, calculated from the average length of the four straight lines, is 16.2 m (actual length 16.6 m). In addition, several reconstructed scattering centers lie away from the solar panels because their association is inaccurate.
Conclusions
In this paper, a 3D reconstruction algorithm is proposed for space targets with multistatic ISAR systems. The algorithm builds the projection equations between the target 3D geometry and the ISAR images without a cross-range scaling constraint. The projection equations contain the projection matrix and the trajectory matrix of the scattering centers: the first is obtained by kinematic formulas and coordinate system transformations, and the second from the ranges and Doppler frequencies extracted from the ISAR images. Because the positions of the scattering centers in different images may vary widely, we propose a new method based on projective transform and epipolar geometry to associate the scattering centers. Numerical results with simulation data validate its performance in scattering center association and 3D reconstruction.
Future work consists of two parts. First, when a space target is under another control system, its intrinsic orbit characteristics may lead to an unsteady trajectory relative to the radar; we will therefore study the 3D reconstruction of space targets with an unsteady trajectory relative to the radar. Second, when the radar illuminates the target, some structures of the target are occluded, which adds to the difficulty of scattering center association. We will therefore improve the association method to handle occlusion.
Availability of data and materials
Not available online. Please contact the author for data requests.
Abbreviations
2D: two-dimensional
3D: three-dimensional
3DFT: 3D Fourier transform
CPD: coherent point drift
CRLB: Cramér–Rao lower bound
ECEF: Earth-centered, Earth-fixed
ECI: Earth-centered inertial
EGP: epipolar geometry projective
HST: Hubble Space Telescope
InISAR: interferometric ISAR
IPP: image projection plane
ISAR: inverse synthetic aperture radar
LOS: line of sight
NN: nearest neighbor
RPM: robust point matching
SNR: signal-to-noise ratio
References
1. Z. Ji, E. Hu, Y. Zhang, H. Jin, Research on micro-Doppler feature of spatial target. EURASIP J. Adv. Sig. Proc. 2017(1), 117 (2017).
2. C. Özdemir, Inverse Synthetic Aperture Radar Imaging with MATLAB Algorithms (Wiley, Hoboken, NJ, 2012).
3. W. Qiu, M. Martorella, J. Zhou, H. Zhao, Q. Fu, Three-dimensional inverse synthetic aperture radar imaging based on compressive sensing. IET Radar Sonar Navig. 9(4), 411–420 (2015).
4. Y. Wang, G. Qian, Novel approach for InSAR sensors imaging via gradient-based algorithm for the sparse signal reconstruction. IEEE Sens. J. 18(6), 2385–2394 (2018).
5. S. Kumar, U. G. Khati, S. Chandola, S. Agrawal, S. P. S. Kushwaha, Polarimetric SAR interferometry based modeling for tree height and aboveground biomass retrieval in a tropical deciduous forest. Adv. Space Res. 60(3), 571–586 (2017).
6. B. Tian, Y. Liu, S. Xu, Z. Chen, Interferometric inverse synthetic aperture radar imaging for space targets based on wideband direct sampling using two antennas. J. Appl. Remote Sens. 8(1), 1–9 (2014).
7. J. T. Mayhan, M. Burrows, K. M. Cuomo, J. E. Piou, High resolution 3D "Snapshot" ISAR imaging and feature extraction. IEEE Trans. Aerosp. Electron. Syst. 37(2), 630–642 (2001).
8. Y. Bi, S. Wei, J. Wang, 3D reconstruction of high-speed moving targets based on HRR measurements. IET Radar Sonar Navig. 11(5), 778–787 (2017).
9. K. Suwa, K. Yamamoto, M. Iwamoto, Three-dimensional target geometry and target motion estimation method using multistatic ISAR movies and its performance. IEEE Trans. Geosci. Remote Sens. 49(6), 2361–2373 (2011).
10. C. Tomasi, T. Kanade, Shape and motion from image streams under orthography: a factorization method. Int. J. Comput. Vis. 9(2), 137–154 (1992).
11. L. Liu, F. Zhou, X. Bai, M. Tao, Z. Zhang, Joint cross-range scaling and 3D geometry reconstruction of ISAR targets based on factorization method. IEEE Trans. Image Process. 25(4), 1740–1750 (2016).
12. Y. Wu, C. Tang, M. Hor, Automatic image interpolation using homography. EURASIP J. Adv. Sig. Proc. 2010(1), 307546 (2010).
13. Y. Wang, H. Ling, V. C. Chen, ISAR motion compensation via adaptive joint time-frequency techniques. IEEE Trans. Aerosp. Electron. Syst. 34(2), 670–677 (1998).
14. A. F. Garcia-Fernandez, O. A. Yeste-Ojeda, J. Grajal, Facet model of moving targets for ISAR imaging and radar backscattering simulation. IEEE Trans. Aerosp. Electron. Syst. 46(3), 1455–1467 (2010).
15. L. Zhao, X. Zhou, G. Kuang, Building detection from urban SAR image using building characteristics and contextual information. EURASIP J. Adv. Sig. Proc. 2013(1), 56 (2013).
16. R. L. White, M. B. Adams, E. G. Geisler, F. D. Grant, Attitude and orbit estimation using stars and landmarks. IEEE Trans. Aerosp. Electron. Syst. AES-11(2), 195–203 (1975).
17. C. Xie, G. Zhang, Y. Zhang, H. Li, Optimal two-impulse rendezvous with terminal tangent burn considering the trajectory constraints. Adv. Space Res. 54(4), 734–743 (2014).
18. F. Santi, D. Pastina, M. Bucciarelli, Estimation of ship dynamics with a multiplatform radar imaging system. IEEE Trans. Aerosp. Electron. Syst. 53(6), 2769–2788 (2017).
19. R. Jonker, A. Volgenant, A shortest augmenting path algorithm for dense and sparse linear assignment problems. Computing 38(4), 325–340 (1987).
20. A. Sanchez-Beato, G. Pajares, Non-iterative interpolation-based super-resolution minimizing aliasing in the reconstructed image. IEEE Trans. Image Process. 17(10), 1817–1826 (2008).
21. H. Chui, A. Rangarajan, A new algorithm for non-rigid point matching. Proc. IEEE Conf. Comput. Vis. Pattern Recog. 2, 44–51 (2000).
22. A. Myronenko, X. Song, Point set registration: coherent point drift. IEEE Trans. Pattern Anal. Mach. Intell. 32(12), 2262–2275 (2010).
Acknowledgements
The authors would like to thank the anonymous reviewers for their valuable comments and suggestions that helped improve the quality of this manuscript.
Funding
This work was supported by the National Science Foundation for Distinguished Young Scholars under Grant 61525105; by the National Natural Science Foundation of China under Grants 61671351 and 61771372; and by the Program for New Century Excellent Talents in University under Grant NCET-09-0630.
Author information
Contributions
All authors have contributed equally. All authors have read and approved the final manuscript.
Corresponding author
Correspondence to Lei Zhang.
Ethics declarations
Competing interests
The authors declare that they have no competing interests.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
About this article
Cite this article
Zhao, Y., Zhang, L., Jiu, B. et al. Three-dimensional reconstruction for space targets with multistatic inverse synthetic aperture radar systems. EURASIP J. Adv. Signal Process. 2019, 40 (2019). https://doi.org/10.1186/s13634-019-0630-8
Keywords
 3D reconstruction
 Space targets
 Multistatic ISAR systems
 Projection matrix
 Scattering centers