Article

High Sensitive Night-time Light Imaging Camera Design and In-orbit Test of Luojia1-01 Satellite

1 Chang Guang Satellite Technology Co., Ltd., Changchun 130102, China
2 Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130033, China
3 State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, China
* Author to whom correspondence should be addressed.
Sensors 2019, 19(4), 797; https://doi.org/10.3390/s19040797
Submission received: 28 December 2018 / Revised: 26 January 2019 / Accepted: 1 February 2019 / Published: 15 February 2019
(This article belongs to the Special Issue The Design, Data Processing and Applications of Luojia 1-01 Satellite)

Abstract

The Luojia1-01 satellite is the first scientific experimental satellite dedicated to night-time light remote sensing data acquisition. Its payload is an optical camera with high sensitivity, high radiometric accuracy and stable interior orientation elements. A special-shaped hood is also designed, which significantly improves the camera's ability to suppress stray light. The camera electronics integrate the focal plane and image processing in a single unit, which greatly reduces the volume and weight of the system. In this paper, the design of the optical camera is summarized and the results of the in-orbit imaging performance tests are analyzed. The results show that the dynamic modulation transfer function (MTF) of the camera is better than 0.17, and the SNR is better than 35 dB under the condition of 10 lx illuminance and 0.3 reflectivity; all indicators meet the design requirements. The data obtained have been widely applied in many fields such as urbanization monitoring, light pollution analysis, marine fishery detection and military applications.

1. Introduction

A night-time light remote sensing image is an image of the earth at night acquired by an optical remote sensing device. Currently, the world's two major night-time light sensors are the Operational Linescan System (OLS) carried by the United States Defense Meteorological Satellite Program (DMSP) satellites and the Visible Infrared Imaging Radiometer Suite (VIIRS) carried by the Suomi National Polar-orbiting Partnership (S-NPP) satellite [1,2,3,4]. Their data have been used to study gross domestic product, population, carbon emissions, armed conflicts, power consumption and urban extent mapping [5,6,7,8,9,10,11]. The quality of the acquired remote sensing data is mainly determined by spatial resolution, radiometric accuracy, dynamic range and sensitivity. OLS and VIIRS have spatial resolutions of 2.7 km and 740 m and dynamic ranges of 6 bit and 14 bit, respectively. The Jilin-1 video satellite developed by Chang Guang Satellite Technology Co., Ltd. (Changchun, China) can provide 1 m resolution night-time light color remote sensing images and is by far the highest-spatial-resolution satellite used in this field. Considering the combined effect of spatial resolution, swath width and radiometric accuracy on remote sensing data, Wuhan University developed the Luojia1-01 satellite, which was successfully launched on 2 June 2018 [12,13,14,15]. Its main payload is a camera that acquires night-time light images of the earth. The camera innovatively combines a large relative aperture optical system with a high-sensitivity detector to improve the signal to noise ratio (SNR) of dark scenes; the spatial resolution is 130 m with a 12 bit dynamic range. In addition, we designed a special hood to effectively avoid the influence of solar radiation on camera imaging. In this paper, the design and in-orbit testing of the night-time light camera are introduced. The in-orbit test results show that all the specifications of the camera meet the requirements and that the camera performs well.

2. Mission Analysis

2.1. Features of Night Light

The visible-band radiation observed at night by optical remote sensing devices mainly comes from light produced by human activity: light on land is mainly urban lighting, while light on the sea surface mainly comes from fishing boats and drilling platforms [12]. In accordance with Chinese urban road lighting requirements, the main and trunk roads of a city generally use high-pressure sodium lamps as the light source [13], whose spectral distribution is concentrated in the range of 500 nm to 900 nm.
Night-time light sources have low illuminance. According to Chinese urban road lighting standards, the maintained illuminance of expressways and main roads is 20 lx to 30 lx, that of secondary roads is 10 lx to 15 lx, and that of branch roads is only 8 lx to 10 lx, all of which are far lower than ground illuminance during the day [16]. When the moon acts as the light source, the maximum illuminance is about 0.2 lx [17].
When imaging such low-illuminance targets, the signal-to-noise ratio (SNR) of the space camera becomes the main factor restricting imaging quality. To improve the imaging SNR, a photodetector with high sensitivity and low noise and an optical imaging system with a large relative aperture are required.

2.2. Identification of Imaging Methods

The main imaging methods of space cameras include line-array push-broom imaging, plane-array staring imaging and frame push-broom imaging. Line-array push-broom imaging places strict requirements on the satellite's attitude control, stability and pointing accuracy; image shift is easily produced and imaging quality is reduced. Because there is no image-shift mismatch in the plane-array staring mode, images with high SNR can be obtained by long integration times, but the imaging width is limited. Frame push-broom imaging can cover a larger width by changing the frame rate of the camera, and the overlap between adjacent frames can be controlled, which is of great significance for multi-temporal image matching, dynamic target monitoring, etc. Meanwhile, the frame push-broom mode places relatively low demands on satellite attitude control, so it was chosen as the main imaging mode of the Luojia1-01 night-time light camera [18,19].

2.3. Requirements for Payload

The main evaluation indexes of a space camera are the SNR and MTF (modulation transfer function) of its images. For a visible-light camera, the SNR reflects the radiometric resolution, is visible as the cleanliness of the image and is of great significance for target classification. The MTF reflects the frequency characteristics of the information transmitted by the photoelectric imaging system, which depend on the diffraction and aberrations of the optical system and on the characteristics of the photoelectric sensor; it reflects the sharpness of the image and allows a comprehensive evaluation of image quality. Because the luminous intensity of ground targets is very low, it is essential to obtain image data with high SNR. According to the radiometric quality requirements, the SNR is required to reach 20 dB under the condition of 10 lx ground illuminance and 0.3 reflectance, and the static MTF is required to be better than 0.2 at the Nyquist frequency of the camera.

3. Optical and Mechanical Design

3.1. Decomposition for System Indicators

The imaging performance of a space camera is mainly determined by the optical imaging system and the photodetector; the photoelectric conversion capability and noise level of the photodetector strongly influence the camera's performance.
According to the requirements of the task, a CMOS (complementary metal oxide semiconductor) detector with 2048 × 2048 pixels manufactured by GPIXEL Inc. was selected. The pixel size of the detector is 11 μm, and the main parameters of the photodetector are listed in Table 1.
Spectral response is an important parameter of the detector, which determines the photoelectric conversion capability of the system. The spectral response curve of the detector is shown in Figure 1.
The input parameters for optical system design mainly include optical system focal length, field of view and relative aperture.
The ground sample distance (GSD) of a space camera at nadir is determined by the satellite orbit height H, the pixel size a and the focal length f of the optical system as GSD = aH/f.
With an orbit height of 645 km, a focal length of 55 mm is therefore required to achieve a spatial resolution of 130 m at the nadir point.
The corresponding imaging width of the camera is 264 km, which meets the requirement that the swath be larger than 200 km.
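As a quick numerical check of the relation GSD = aH/f, the short Python sketch below (our own illustration, with helper names of our choosing) reproduces the 130 m class GSD and the 264 km swath from the values quoted above.

```python
# Quick check of GSD = a * H / f and the resulting swath (values from the text).

def gsd_nadir(pixel_size_m: float, orbit_height_m: float, focal_length_m: float) -> float:
    """Nadir ground sample distance in metres."""
    return pixel_size_m * orbit_height_m / focal_length_m

a = 11e-6      # pixel size: 11 um
H = 645e3      # orbit height: 645 km
f = 55e-3      # focal length: 55 mm
n_pixels = 2048

gsd = gsd_nadir(a, H, f)          # about 129 m, i.e. the 130 m class resolution
swath_km = n_pixels * gsd / 1e3   # about 264 km
print(f"GSD = {gsd:.0f} m, swath = {swath_km:.0f} km")
```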
A large relative aperture gives the optical system high energy collection capability and reduces the effect of diffraction, so it benefits both the imaging SNR and the MTF. However, the volume and weight of the system increase rapidly and the aberrations become harder to correct. The influence of the F number (the reciprocal of the relative aperture) on system performance is analyzed below.
Since the camera's static MTF is targeted to be greater than 0.2, the optical system's MTF after alignment is required to be greater than 0.5 once the detector's MTF is taken into account. The relationship between the diffraction-limited MTF at the Nyquist frequency (46 lp/mm) and the F number is calculated and shown in Figure 2.
In addition, the number of signal electrons produced by the camera is inversely proportional to the square of the F number of the optical system. The imaging SNR of the system is estimated under the conditions required by the task: atmospheric transmittance 0.6, optical system transmittance 0.7 and target reflectivity 0.3 at 10 lx illuminance, with the target assumed to be a Lambertian body. The relative aperture of the optical system is therefore an important parameter determining the SNR; the influence of the F number on the SNR of the photoelectric imaging system is shown in Figure 3.
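A rough version of such an SNR budget can be sketched as follows. The scene conditions come from the text, but the luminous efficacy, quantum efficiency and integration time are illustrative assumptions of ours, so the absolute values are only indicative of how the signal and SNR scale with F number.

```python
# Rough signal/SNR budget versus F number. Scene values (10 lx, rho = 0.3,
# tau_atm = 0.6, tau_opt = 0.7) are from the text; the luminous efficacy K,
# quantum efficiency QE and integration time t_int are illustrative assumptions,
# so only the trend with F number (signal ~ 1/F^2) should be read from the output.
import numpy as np

h, c = 6.626e-34, 3.0e8          # Planck constant (J s), speed of light (m/s)
E_scene = 10.0                   # scene illuminance, lx
rho, tau_a, tau_o = 0.3, 0.6, 0.7
K = 200.0                        # assumed luminous efficacy of the source, lm/W
wavelength = 700e-9              # assumed mean wavelength, m
QE = 0.6                         # assumed quantum efficiency
pixel = 11e-6                    # pixel pitch, m (Table 1)
t_int = 10e-3                    # assumed integration time, s
read_noise = 1.47                # e-, from Table 1
dark_rate = 32.0                 # e-/s, from Table 1

for F in (1.4, 2.0, 2.8, 4.0, 5.6):
    E_fp_lux = rho * E_scene * tau_a * tau_o / (4.0 * F**2)       # focal-plane illuminance
    E_fp_w = E_fp_lux / K                                         # converted to W/m^2
    photons = E_fp_w * pixel**2 * t_int / (h * c / wavelength)    # photons per pixel
    signal = QE * photons                                         # signal electrons
    noise = np.sqrt(signal + dark_rate * t_int + read_noise**2)   # shot + dark + read
    print(f"F/{F}: signal = {signal:.0f} e-, SNR = {20 * np.log10(signal / noise):.1f} dB")
```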
Considering the combined influence on MTF, SNR and aberration correction, the F number of the optical system is set to 2.8, which satisfies the system requirements for both SNR and MTF.

3.2. Optical System Design

According to the decomposition of the system indexes above, the system is designed as a refractive optical system with a medium field of view and a large relative aperture. Based on the field of view and relative aperture requirements, a double-Gauss optical configuration is selected and optimized. The distortion of the system is controlled during the design process, and the image quality requirements are satisfied. The design results are shown in Figure 4.
The designed system meets the image quality requirements, and the modulation transfer function curves of the system are shown in Figure 5.
The optical system's MTF at the Nyquist frequency is higher than 0.5, which ensures that the camera's static MTF is better than 0.2.
Given that the camera works in a vacuum in orbit while assembly and testing are completed at normal pressure, a compensation lens is designed to compensate for the influence of air pressure on the imaging quality of the optical system. Its function is to ensure that the back focal distance of the optical system with the compensation lens at normal pressure is the same as that of the system without the compensation lens in vacuum. The compensation lens consists of two transmissive elements made of H-K9L glass. The system structure with the compensation lens is shown in Figure 6.
After assembly of the camera, the MTF was tested by the slanted-edge method; the results show that the MTF of the camera is 0.247 at the Nyquist frequency [20].
In addition, we tested the spectral response of the system and obtained the normalized spectral response curve of the Luojia1-01 camera by monochromator scanning, as shown in Figure 7.

3.3. Mechanical Structure Design

The Luojia1-01 camera is mainly composed of the camera lens, the focal plane electronics box and the hood. The lens elements are mounted in the lens barrel by edge rolling and assembled according to the tolerance requirements of the optical design. The lens is connected to the focal plane electronics box by screws, and the focal plane position is adjusted with a gasket to ensure that all regions of the detector achieve good performance. The hood is located at the front of the lens. The system profile is shown in Figure 8.
To ensure the stability of the optical system, titanium alloy is selected as the structural material; its thermal expansion coefficient is close to that of the selected optical glass. The structural design mainly takes into account the assembly tolerance requirements of the optical system and the mechanical and thermal characteristics of the system [21].

4. Design and Analysis of Stray Light Elimination

4.1. The Effect of Stray Light on Imaging

In an imaging optical system, some non-imaging light can reach the focal plane and become stray light. Stray light reduces the imaging SNR and MTF; in particular, when imaging low-illuminance targets, stray light may drown out the signal and render the system useless. It is therefore necessary to analyze and suppress stray light [22,23,24].
Because atmospheric scattering is weak at night, its influence on imaging can be ignored. However, when the satellite observes the shadowed region of the earth, especially near the terminator, solar radiation may be scattered into the optical system by the camera hood or other parts of the satellite. Since the solar radiation energy is much higher than the emitted or reflected energy of ground targets, this would be disastrous for imaging. The response of the sensor caused by ground targets is related to the relative aperture of the optical system, while the response caused by solar radiation is governed by the point source transmittance (PST) of the system.
The irradiance of the focal plane generated by the ground target can be calculated according to the following equation:
$$E_1 = \frac{\rho E \tau_a \tau_o}{4F^2} \qquad (1)$$
The parameters in Equation (1) are defined in Table 2.
The irradiance of the focal plane generated by the solar radiation can be calculated according to the following formula:
$$E_2 = E_{sun} \times PST(\theta) \qquad (2)$$
where E_sun denotes the in-band solar irradiance at the camera, calculated according to Planck's law to be 608 W/m² within the working spectral range of the optical system, and PST(θ) denotes the PST value at incidence angle θ.
From the discussion above, for a target with 1 lx illuminance and 0.3 reflectivity and an atmospheric transmittance of 0.6, the irradiance on the focal plane caused by the target and that caused by the sun are on the same level when the PST is about 10⁻⁸. We therefore require the system PST to be less than 10⁻¹⁰ to protect the system from solar radiation.
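The comparison behind this requirement can be reproduced numerically; the sketch below simply evaluates Equations (1) and (2) for the stated conditions, with an assumed luminous efficacy of 200 lm/W used to convert the 1 lx scene illuminance into an irradiance.

```python
# Comparison of focal-plane irradiance from a faint ground target (Eq. (1)) with
# that from scattered sunlight (Eq. (2)). A luminous efficacy of 200 lm/W is
# assumed for converting the 1 lx scene illuminance into an irradiance.

rho, tau_a, tau_o, F = 0.3, 0.6, 0.7, 2.8
E_target_lux = 1.0      # faint target illuminance, lx
K = 200.0               # assumed luminous efficacy, lm/W
E_sun = 608.0           # in-band solar irradiance at the camera, W/m^2 (from the text)

E1 = rho * (E_target_lux / K) * tau_a * tau_o / (4.0 * F**2)   # Eq. (1), W/m^2
pst_equal = E1 / E_sun   # PST at which solar stray light equals the target signal
print(f"E1 = {E1:.2e} W/m^2, PST for equal focal-plane irradiance ~ {pst_equal:.1e}")
# Output is on the order of 1e-8, hence the requirement PST < ~1e-10.
```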

4.2. Analysis of the Relationship Between Solar Vector and Camera Attitude

The relationship between the sun vector and the camera optical axis can be represented by the azimuth and altitude angles of the sun vector in the camera coordinate system; the meaning of the different angles is shown in Figure 9. A simulation carried out with STK software [25] shows that the angle β between the sun vector and the orbital plane ranges from 17° to 27° over one year, as shown in Figure 10.
The angle is smallest on 3 June of each year; therefore, the relative position of the satellite and the sun is analyzed on this day to obtain the minimum angle between the sun vector and the camera axis. As the satellite can observe the earth with a roll angle of ±30°, three conditions are analyzed: no roll, rolling right 30° and rolling left 30°; the results are shown in Figure 11. The minimum angle between the solar vector and the camera optical axis is 52°.

4.3. Special Shaped Hood Design

Considering the influence of solar radiation on camera imaging, a traditional hood cannot meet the imaging requirements, so a special-shaped hood is designed. The objective of the special-shaped hood is to prevent solar radiation from being scattered by the structural parts into the optical system. The designed hood is shown in Figure 12.
Starting from a traditional hood and taking into account the envelope of the satellite, the special-shaped hood is cut at an azimuth angle of 22° and at 52° to the optical axis. When the angle between the solar vector and the camera axis is greater than 52°, sunlight cannot enter the optical system through the hood, and the camera is not affected by solar radiation.
Stray light analysis on the satellite model is necessary because other components of the satellite can also reflect or scatter solar radiation into the camera. The model is analyzed in the optical analysis software TracePro and the PST of the camera is calculated [26]; the results are shown in Figure 13.
As can be seen from the figure, when the angle between the incident solar vector and the camera optical axis is greater than 52°, the PST of the camera decreases rapidly and the influence of solar radiation on the camera can be ignored.

4.4. In-orbit Verification of the Hood

To verify the performance of the designed hood, push-broom imaging of the Moscow area was carried out at 3:09 on 21 June 2018; the images are shown in Figure 14. The angle between the optical axis and the solar vector changed throughout the pass, and when it decreased below 52°, solar radiation was scattered into the optical imaging system and became stray light. As can be seen from Figure 14a, the images are not affected at the beginning, when the angle between the solar vector and the camera axis is larger than 52°; when the angle decreases below 52°, stray light appears and the images are seriously affected, as shown in Figure 14b. Both images are uncropped whole frames with a swath of about 260 km. The in-orbit imaging data show that the special-shaped hood has an obvious effect in restraining solar radiation and can effectively improve the operational efficiency of the satellite.

5. HDR Mode Design

5.1. Imaging Electronics Design

The Luojia1-01 night-time light camera integrates imaging, compression, storage and other functions by combining the imaging board and the data-processing board. This design effectively reduces the size and weight of the camera, which is of great significance for application on a micro-satellite platform.
The control center of the camera is an FPGA (field-programmable gate array). Under FPGA control, image data reception and forwarding, configuration and data-flow management of the ADV212 compression chip, solid-state storage management and external communication of the whole system are realized.

5.2. High Dynamic Range Image Construction

The radiance of different targets at night differs greatly, which requires the camera to have a high dynamic range in order to capture more information about the scene. To improve the dynamic range of the camera, we designed an HDR mode in which every pixel outputs gray values at two different gains in a single exposure, that is, a high-gain image and a low-gain image of the scene are obtained simultaneously. The high-gain and low-gain images output by the camera are then combined to reconstruct an image with high dynamic range. According to the characteristics of the camera, when the output value of the high-gain image is below the threshold T_H, the pixel is considered to be in the linear region of the sensor and the high-gain output is used. When the DN value of the high-gain image is greater than T_H, the high-gain data are invalid and the corresponding low-gain output is converted and used instead. The specific implementation is as follows:
$$Y_{HDR} = \begin{cases} Y_H, & Y_H \le T_H \\ aY_L + b, & Y_H > T_H \end{cases} \qquad (3)$$
The parameters and their meanings are listed in Table 3.
The threshold T_H was set to 3800 DN according to the response of the high-gain image. Figure 15 shows the original images and the reconstructed high dynamic range image of Shanghai, China.
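A minimal sketch of the per-pixel combination in Equation (3) is given below, assuming the gain ratio a and offset b have already been determined from radiometric calibration; the function and array names are ours, not the flight software.

```python
# Per-pixel HDR combination of Eq. (3): keep the high-gain value while it is in
# the linear range, otherwise rescale the simultaneously acquired low-gain value.
# The gain ratio `a` and offset `b` are assumed to come from radiometric calibration.
import numpy as np

def build_hdr(y_high: np.ndarray, y_low: np.ndarray,
              a: float, b: float, t_h: float = 3800.0) -> np.ndarray:
    """Combine co-registered high-gain and low-gain frames into one HDR frame."""
    y_high = y_high.astype(np.float64)
    y_low = y_low.astype(np.float64)
    return np.where(y_high <= t_h, y_high, a * y_low + b)

# Illustrative example (not calibration values from the paper):
rng = np.random.default_rng(0)
y_h = rng.integers(0, 4096, size=(4, 4))   # high-gain DN values
y_l = y_h / 8.0                            # pretend the low gain is 1/8 of the high gain
print(build_hdr(y_h, y_l, a=8.0, b=0.0))
```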

6. In-orbit Test Results

6.1. Dynamic MTF Evaluation in Orbit

The main methods for measuring the in-orbit MTF of remote sensing images are the point-pulse method, the line-pulse method and the slanted-edge method [27,28,29,30]. According to the imaging characteristics of the Luojia1-01 night-time light camera, the MTF is calculated from the line spread function of line targets.
A long, straight bridge in the image, shown in Figure 16, is selected as the line target. The image is interpolated, the line spread function (LSF) is extracted across the bridge, and the MTF is then obtained from the normalized modulus of the Fourier transform of the LSF.
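The sketch below illustrates this general line-target procedure (take a profile across the line, treat it as the LSF, Fourier transform and normalize); it is a generic illustration on synthetic data, not the authors' processing chain.

```python
# Generic line-target MTF estimate: a profile across the line is taken as the
# line spread function (LSF); the normalized magnitude of its Fourier transform
# gives the MTF. Synthetic data only; not the authors' processing chain.
import numpy as np

def mtf_from_lsf(profile: np.ndarray, pixel_pitch_mm: float = 0.011):
    """Return spatial frequencies (lp/mm) and MTF from a 1-D line profile."""
    lsf = profile - profile.min()        # remove the background level
    lsf = lsf / lsf.sum()                # normalize the LSF area to 1
    mtf = np.abs(np.fft.rfft(lsf))
    mtf = mtf / mtf[0]                   # force MTF(0) = 1
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)
    return freqs, mtf

# Synthetic example: a narrow line blurred to a roughly Gaussian profile.
x = np.arange(-16, 17)
profile = np.exp(-0.5 * (x / 0.8) ** 2) + 0.02   # line plus a faint background
freqs, mtf = mtf_from_lsf(profile)
nyquist = 1.0 / (2 * 0.011)                      # ~45.5 lp/mm for 11 um pixels
print(f"MTF near Nyquist ~ {np.interp(nyquist, freqs, mtf):.2f}")
```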
According to Figure 17, the camera’s dynamic MTF at Nyquist frequency (46 lp/mm) is 0.17, which meets the requirement that dynamic MTF should be better than 0.15.

6.2. SNR Evaluation of the Camera in Orbit

A traditional image-based signal-to-noise ratio (SNR) test requires a uniform region [31], which is difficult to find in night-time images. We therefore propose a new SNR calculation method using time-series images, as follows (a code sketch of the procedure is given after step (3)):
(1) Adjust the frame rate of the camera so that continuous multi-frame images of the same ground area are obtained in one push-broom pass.
(2) Register the images to obtain multiple exposures of the same ground point. The signal and noise are then calculated along the time series for each point, and the SNR in dB is obtained from Equation (4):
$$SNR = 20 \times \log_{10}\frac{Signal}{Noise} \qquad (4)$$
(3) According to the absolute radiometric calibration data, the relation between SNR and entrance-pupil radiance is obtained. The relationship between the radiance at the camera's entrance pupil and the illuminance on the objects can then be derived, assuming the observed objects are Lambertian with reflectivity ρ and atmospheric transmittance τ:
$$L = \frac{E \times \rho \times \tau}{\pi \times K_\lambda} \qquad (5)$$
Here, K_λ represents the luminous efficacy (visual function), which depends on the spectral distribution of the light source and is evaluated here for the spectrum of the high-pressure sodium lamp. The test results are shown in Figure 18.
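A minimal sketch of steps (1)–(3) on a registered image stack is given below; the synthetic data and function names are ours and serve only to illustrate the bookkeeping.

```python
# Time-series SNR estimate: after registration, each ground point is observed in
# several consecutive frames; the temporal mean is the signal, the temporal
# standard deviation is the noise, converted to dB as in Eq. (4). Synthetic data.
import numpy as np

def snr_db_from_stack(frames: np.ndarray) -> np.ndarray:
    """frames: (n_frames, rows, cols) registered DN values -> per-pixel SNR in dB."""
    signal = frames.mean(axis=0)
    noise = frames.std(axis=0, ddof=1)
    return 20.0 * np.log10(np.maximum(signal, 1e-9) / np.maximum(noise, 1e-9))

rng = np.random.default_rng(1)
scene = rng.uniform(50.0, 500.0, size=(64, 64))                      # "true" scene in DN
stack = scene + rng.normal(0.0, np.sqrt(scene), size=(8, 64, 64))    # 8 noisy registered frames
print(f"median SNR = {np.median(snr_db_from_stack(stack)):.1f} dB")
```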
It can be seen that under 10 lx illuminance, with an object reflectivity of 0.3 and an atmospheric transmittance of 0.6, the image SNR reaches 35.5 dB, which fully meets the design requirements and effectively ensures the radiometric accuracy of the system.

7. Conclusions

Through the design and development of the Luojia1-01 satellite payload, a high-sensitivity optical remote sensing camera for a micro/nano satellite platform has been explored. The in-orbit test results show that the performance of the camera meets all the design requirements.
The innovations of the Luojia1-01 night-time light camera are as follows:
(1) The combination of a large relative aperture optical system and a large-pixel, high-sensitivity CMOS detector, together with the frame push-broom imaging mode, ensures the radiometric sensitivity and geometric stability of the camera.
(2) Fully considering the in-orbit imaging environment of the satellite, and in view of the traditional hood's inability to suppress solar radiation, we designed a special-shaped hood. The in-orbit test results show that the hood effectively reduces the influence of solar radiation on imaging, consistent with the theoretical analysis. This special-shaped hood design is of great significance for micro-satellite platforms.
(3) The HDR imaging mode greatly improves the dynamic range of imaging and enables accurate measurement of targets with different radiance in the same scene. Meanwhile, the integrated electronics design is of great significance for optical micro-satellite platforms.
As a dedicated night-time light remote sensing satellite, Luojia1-01 has acquired a large amount of remote sensing data and completed a global map. The data obtained have been widely used in many fields.
However, current high-sensitivity night-time light remote sensing cameras can only obtain grayscale images, while the spectral information of targets is also quite important. We therefore believe that night-time light cameras providing multi-spectral information with high SNR, together with other methods such as polarization remote sensing, will be the future direction of the night-time light remote sensing field.

Author Contributions

Z.Q.S., Y.J.L., X.Z. and G.Z. carried out the overall design and optical design; X.J.H., Q.W. and Z.X.W. carried out the camera electronics design; C.L.H. carried out the mechanical design; all authors edited the paper.

Funding

This research was funded by the Youth Innovation Promotion Association of the Chinese Academy of Sciences (integrated optical micro-satellite key technology based on the concept of the "intelligent instrument") and by the National Science Foundation for Young Scholars of China, grant number 61505203.

Acknowledgments

We thank the research teams of Chang Guang Satellite Technology Co., Ltd. and Wuhan University for the Luojia1-01 design. The authors would also like to thank the reviewers for their helpful comments.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Kyba, C.C.M.; Kuester, T.; De Miguel, A.S.; Baugh, K.; Jechow, A.; Hölker, F.; Bennie, J.; Elvidge, C.D.; Gaston, K.J.; Guanter, L. Artificially lit surface of Earth at night increasing in radiance and extent. Sci. Adv. 2017, 3, e1701528. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  2. Elvidge, C.D.; Baugh, K.; Zhizhin, M.; Hsu, F.C.; Ghosh, T. Viirs nighttime lights. Int. J. Remote Sens. 2017, 38, 5860–5879. [Google Scholar] [CrossRef]
  3. Li, X.; Xu, H.M.; Chen, X.L.; Li, C. Potential of NPP-VIIRS nighttime light imagery for modeling the regional economy of China. Remote Sens. 2013, 5, 3057–3081. [Google Scholar] [CrossRef]
  4. Cao, C.; Shao, X.; Uprety, S. Detecting light outages after severe storms using the S-NPP/VIIRS day/night band radiances. IEEE Geosci. Remote Sens. Lett. 2013, 10, 1582–1586. [Google Scholar] [CrossRef]
  5. Shi, K.F.; Yu, B.L.; Huang, Y.X.; Hu, Y.J.; Yin, B.; Chen, Z.Q.; Chen, L.J.; Wu, J.P. Evaluating the ability of NPP-VIIRS nighttime light data to estimate the gross domestic product and the electric power consumption of china at multiple scales: A comparison with DMSP-OLS data. Remote Sens. 2014, 6, 1705–1724. [Google Scholar] [CrossRef]
  6. Schroeder, W.; Oliva, P.; Giglio, L.; Csiszar, I.A. The New VIIRS 375 m active fire detection data product: Algorithm description and initial assessment. Remote Sens. Environ. 2014, 143, 85–96. [Google Scholar] [CrossRef]
  7. Elvidge, C.D.; Baugh, K.E.; Zhizhin, M.; Hsu, F.C. Why VIIRS data are superior to DMSP for mapping nighttime lights. Proc. Asia-Pac. Adv. Netw. 2013, 35, 62–69. [Google Scholar] [CrossRef]
  8. Yu, B.; Shi, K.; Hu, Y.; Huang, C.; Chen, Z.; Wu, J. Poverty evaluation using NPP-VIIRS nighttime light composite data at the county level in China. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2015, 8, 1217–1229. [Google Scholar] [CrossRef]
  9. Zhou, Y.; Smith, S.J.; Zhao, K.; Imhoff, M.; Thomson, A.; Bond-Lamberty, B.; Asrar, G.R.; Zhang, X.; He, C.; Elvidge, C.D. A global map of urban extent from nightlights. Environ. Res. Lett. 2015, 10, 0554011. [Google Scholar] [CrossRef]
  10. Jiang, W.; He, G.J.; Long, T.F.; Wang, C.; Ni, Y.; Ma, R.Q. Assessing light pollution in China based on nighttime light imagery. Remote Sens. 2017, 9, 135. [Google Scholar] [CrossRef]
  11. Li, D.R.; Li, X. Use of night-time light remote sensing in humanitarian disaster evaluation. Chin. J. Nat. 2018, 40, 169–176. [Google Scholar]
  12. Zhang, G.; Li, L.; Jiang, Y.; Shen, X.; Li, D. On-Orbit Relative Radiometric Calibration of the Night-Time Sensor of the LuoJia1-01 Satellite. Sensors 2018, 18, 4225. [Google Scholar] [CrossRef] [PubMed]
  13. Jiang, W.; He, G.; Long, T.; Guo, H.; Yin, R.; Leng, W.; Liu, H.; Wang, G. Potentiality of Using Luojia 1-01 Nighttime Light Imagery to Investigate Artificial Light Pollution. Sensors 2018, 18, 2900. [Google Scholar] [CrossRef] [PubMed]
  14. Li, X.; Zhao, L.; Li, D.; Xu, H. Mapping Urban Extent Using Luojia1-01 Nighttime Light Imagery. Sensors 2018, 18, 3665. [Google Scholar] [CrossRef] [PubMed]
  15. Li, X.; Li, D.R.; He, X.J.; Michael, J. A preliminary investigation of Luojia-1 nighttime imagery. Remote Sensing Letters. 2019, 10, 526–535. [Google Scholar] [CrossRef]
  16. Standard for Lighting Design of Urban Road. Available online: http://www.mohurd.gov.cn/wjfb/201512/t20151216_225961.html (accessed on 15 February 2019).
  17. Hänel, A.; Posch, T.; Ribas, S.J.; Aubé, M.; Duriscoe, D.; Jechow, A.; Kollath, Z.; Lolkema, D.E.; Moore, C.; Schmidt, N.; et al. Measuring night sky brightness: Methods and challenges. J. Quant. Spectrosc. Radiat. Transf. 2018, 205, 278–290. [Google Scholar] [CrossRef]
  18. Zhao, J.X.; Zhang, T.; Zhang, J.J.; Yuan, G.Q. Study of the effects on frame aerial photography direct-georeferencing accuracy caused by image motion. Infrared Laser Eng. 2015, 44, 632–638. [Google Scholar]
  19. Li, D.R.; Wang, S.G.; Zhou, Y.Q. An Introduction to Photogrammetry and Remote Sensing; Surveying and Mapping Press: Beijing, China, 2008; pp. 164–169. (In Chinese) [Google Scholar]
  20. Estribeau, M.; Magnan, P. Fast MTF measurement of CMOS imagers at the chip level using ISO 12233 slanted-edge methodology. Proc. SPIE 2004, 5570, 557–567. [Google Scholar] [Green Version]
  21. Weijun, C.; Bin, F.; Eengqin, Z.; Qinglin, L.; Xin, W. High Opto-mechanical Stability Design of Multi-spectral Camera. Spacecr. Recovery Remote Sens. 2012, 33, 85–92. [Google Scholar]
  22. Cipelletti, L.; Weitz, D.A. Ultralow-angle dynamic light scattering with a charge coupled device camera based multispeckle, multitau correlator. Rev. Sci. Instrum. 1999, 70, 3214–3221. [Google Scholar] [CrossRef]
  23. Park, J.; Jang, W.K.; Kim, S.; Jang, H.; Lee, S. Stray Light Analysis of High Resolution Camera for a Low-Earth-Orbit Satellite. J. Opt. Soc. Korea 2011, 15, 52–55. [Google Scholar] [CrossRef] [Green Version]
  24. Zhong, X.; Jia, J.Q. Stray light removing design and simulation of spaceborne camera. Opt. Precision Eng. 2009, 19, 621–625. [Google Scholar]
  25. Ji, F.M.; Liu, R.H.; Ni, Y.D. Simulation analysis of STKX on BDS positioning performance. J. Navig. Position. 2018, 6, 64–68. [Google Scholar]
  26. Sun, C.M.; Zhao, F.; Zhang, Z. Modeling and Simulation of Space Object Optical Scattering Characteristics Using TracePro. Acta Photonica Sinica 2014, 43, 1–5. [Google Scholar]
  27. Peter, D.B. Slanted–Edge MTF for Digital Camera and Scanner Analysis. In Is and Ts Pics Conference; Society for Imaging Science and Technology: Springfield, VA, USA, 2000; pp. 135–138. [Google Scholar]
  28. Estribeau, M.; Magnan, P. Fast MTF Measurement of CMOS Imagers Using ISO 12233 Slanted-Edge Methodology. In Detectors and Associated Signal Processing; International Society for Optics and Photonics: Bellingham, WA, USA, 2004. [Google Scholar]
  29. Sampo, M.B.; Anssi, J.M. Random target method for fast MTF inspection. Opt. Express 2004, 12, 2610–2615. [Google Scholar]
  30. Viallefont, F.; Leger, D. Improvement of the edge method for on-orbit MTF measurement. Opt. Express 2010, 18, 3531–3545. [Google Scholar] [CrossRef] [PubMed]
  31. Zhang, H.Y.C.; He, X.J.; Su, Z.Q. SNR Model Building of CMOS Imaging System of Rolling Digital Domain TDI Technology. J. Chang. Univ. Sci. Technol. (Nat. Sci. Ed.). 2018, 41, 68–72. [Google Scholar]
Figure 1. Spectral response curve for the detector.
Figure 2. The relationship between MTF (modulation transfer function) limited by diffraction and F number.
Figure 3. Relationship between SNR (signal to noise ratio) and F number of the optical system.
Figure 4. The structure of the optical system.
Figure 5. Modulation transfer function curves of the optical system.
Figure 6. Optical system's structure with compensation lens.
Figure 7. Normalized spectral response curve of the camera.
Figure 8. Mechanical structure of the camera.
Figure 9. The relationship between the solar vector and the camera coordinate system.
Figure 10. The change of the angle between the solar vector and the orbital plane over one year.
Figure 11. The angle between the solar vector and the camera axis.
Figure 12. Profile of the camera's special-shaped hood.
Figure 13. PST curves of the camera.
Figure 14. Different frames of night-time light images of the Moscow area in one orbit. (a) Normal image with the angle between the axis and the solar vector greater than 52°; (b) image affected by stray light with the angle between the axis and the solar vector less than 52°.
Figure 15. (a) Original images of the low-gain and high-gain modes; (b) HDR image constructed from the low-gain and high-gain images.
Figure 16. Line target in an image.
Figure 17. Dynamic modulation transfer function curves of the Luojia1-01 satellite.
Figure 18. SNR (signal to noise ratio) test results of the images. (a) The relationship between SNR and radiance at the optical entrance; (b) the relationship between SNR and illuminance of targets with reflectivity of 0.3 and atmospheric transmittance of 0.6.
Table 1. Parameters of the CMOS detector.
Active area: 22.5 mm (H) × 22.5 mm (V)
Pixel size: 11 μm × 11 μm
Number of active pixels: 2048 × 2048
Full well capacity: 91 ke-
Readout noise: 1.47 e-
Dark current at 25 °C: 32 e-/s/pixel
Dynamic range (standard mode): >70 dB
Dynamic range (high dynamic range mode): >96 dB
Working temperature: −55 °C to +80 °C
Power consumption: <600 mW
ADC resolution: 12 bit
Table 2. Physical meanings of the parameters in Equation (1).
ρ: reflectivity of the ground object
E: illuminance of the ground object
τ_a: transmittance of the atmosphere
τ_o: transmittance of the optical system
F: reciprocal of the relative aperture (F number)
Table 3. Parameters and physical meanings in Equation (3).
a: gain ratio, a = K_H/K_L
Y_H, Y_L (DN): pixel outputs of the high-gain and low-gain images
K_H, K_L (DN/e-): conversion factors of the high-gain and low-gain images
O_H, O_L (DN): black level offsets of the high-gain and low-gain images
T_H (DN): threshold for switching between the high-gain and low-gain images
