Abstract
Detecting the unexpected is essential in Space Situational Awareness (SSA) systems. A robust SSA system must be able to detect and correctly classify uncatalogued debris, spacecraft, and orbital anomalies in space. However, optical and atmospheric artifacts can readily lead to misclassifications. This is because, much like astronomy, SSA lacks strong priors on novel observations. Neuromorphic event-based sensors excel at detecting fast-moving objects and unexpected changes in luminance, such as glints. Unlike conventional cameras, event-based sensors do not capture frames but detect luminance changes through independent pixels, allowing for an extremely high dynamic range and temporal resolution. These properties enable unique opportunities for space imaging, Resident Space Object (RSO) detection and tracking, high-speed adaptive optics, satellite identification, and real-time in-frame astrometry. With recent advances in sensor sensitivity, quality, and resolution, the neuromorphic vision sensor has become a serious contender in SSA applications.

While event-based space situational awareness provides significant advantages, the event-based sensing paradigm also introduces novel challenges that are not present in conventional frame-based SSA. Fast and faint point sources can be hard to identify amongst spurious change detections generated by other sources, particularly detections arising from insects, bats, and planes. The lack of absolute luminance information in a neuromorphic sensor makes it more challenging to differentiate between RSOs and atmospheric objects when their trajectories are similar from the observer's perspective. Spurious detections are not limited to atmospheric artifacts but may also be due to sensor noise. While recent neuromorphic sensors have dramatically improved noise characteristics compared to older models, it is still desirable to operate them as close to the noise floor as possible to detect ever-fainter objects.
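The sensing paradigm described above can be made concrete with a minimal sketch. Events are commonly represented as (timestamp, x, y, polarity) tuples, and a standard way to suppress shot noise near the sensor's noise floor is a spatio-temporal correlation ("background activity") filter: genuine point sources trigger nearby pixels close together in time, while isolated noise events do not. The representation and parameters below are illustrative assumptions, not the specific pipeline used in this work.

```python
from collections import namedtuple

# Illustrative event representation: each pixel independently reports a
# timestamped luminance change (ON/OFF polarity); there are no frames
# and no absolute brightness values.
Event = namedtuple("Event", ["t", "x", "y", "polarity"])  # t in microseconds

def background_activity_filter(events, width, height, dt_us=10_000):
    """Keep only events with recent activity in an adjacent pixel.

    A classic spatio-temporal correlation filter: a genuine point source
    (e.g. an RSO streak) activates neighbouring pixels close together in
    time, whereas sensor noise near the noise floor is isolated.
    """
    last_ts = [[-10**12] * width for _ in range(height)]
    kept = []
    for ev in events:
        supported = False
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                if dx == 0 and dy == 0:
                    continue
                nx, ny = ev.x + dx, ev.y + dy
                if 0 <= nx < width and 0 <= ny < height:
                    if ev.t - last_ts[ny][nx] <= dt_us:
                        supported = True
        last_ts[ev.y][ev.x] = ev.t
        if supported:
            kept.append(ev)
    return kept
```

The time window `dt_us` trades sensitivity for noise rejection: a tighter window discards more noise but risks dropping the sparse events produced by very faint objects.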
To address these issues, a binocular telescope can be used in conjunction with neuromorphic sensors. By using two cameras and two telescopes attached to the same mount about 1 m apart, we are able to capture "stereo event streams", allowing us to detect the presence of parallax and identify atmospheric objects. Additionally, combining neuromorphic cameras with traditional cameras can provide absolute luminance information, which can improve the accuracy of object detection and classification. Our current angular resolution, about one arcsecond per pixel, allows detection of parallax of objects within the atmosphere. Correlating the sensors' outputs will allow us to filter such objects as well as spurious detections while preserving isolated changes caused by faint satellites. We test several correlation algorithms which leverage the inherent high temporal resolution and spatial sparsity of neuromorphic sensors. The presented algorithmic and observational methodologies allow us to distinguish atmospheric objects and sensor noise from genuine detections of uncatalogued objects, significantly improving on the current state of the art.
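A small-angle calculation shows why the quoted baseline and resolution suffice to separate atmospheric objects from RSOs. With a ~1 m baseline and ~1 arcsecond per pixel (the figures given above, used here as illustrative round numbers), the parallax of an object at range r is roughly baseline/r radians:

```python
import math

# Illustrative numbers from the setup described above: a ~1 m baseline
# between the two telescopes and ~1 arcsecond per pixel of angular
# resolution. These are rounded assumptions, not calibrated values.
BASELINE_M = 1.0
ARCSEC_PER_PIXEL = 1.0
ARCSEC_TO_RAD = math.pi / (180 * 3600)

def parallax_pixels(range_m):
    """Apparent shift (in pixels) of an object between the two sensors."""
    parallax_rad = BASELINE_M / range_m  # small-angle approximation
    return parallax_rad / (ARCSEC_PER_PIXEL * ARCSEC_TO_RAD)
```

An aircraft at 10 km shifts by roughly 20 pixels between the two event streams, whereas a LEO satellite at 500 km shifts by well under one pixel, so any detection exhibiting measurable parallax can be rejected as atmospheric.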
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 24th Advanced Maui Optical and Space Surveillance Technologies Conference (AMOS 2023), Maui, Hawaii, USA, 19-22 September 2023 |
| Publisher | Maui Economic Development Board |
| Number of pages | 15 |
| Publication status | Published - 2023 |
| Event | Advanced Maui Optical and Space Surveillance Technologies Conference (24th) - Maui, United States. Duration: 19 Sept 2023 → 22 Sept 2023 |