
Expanded Dimensionality for Image Spectroscopy via Machine Learning
  • Fangcao Xu, Pennsylvania State University Main Campus (Corresponding Author: [email protected])
  • Guido Cervone, Pennsylvania State University Main Campus
  • Mark Salvador, Zi Inc

Abstract

The generalized solution for the radiance equation is expanded by exploiting multiple hyperspectral image scans acquired by aerial platforms at different viewing angles. A machine learning solution based on convolutional neural networks is used to learn the relationships between the total radiance observed at the sensor and the different atmospheric components of the radiance equation. The goal is to characterize the atmosphere precisely enough to solve the radiance equation properly, since these atmospheric components are important inputs to that equation. Traditionally, they are estimated from pixel averages or assigned from heuristic tables. Compared to traditional image spectroscopy, the expanded radiance equation and machine learning solution integrates quantitative mathematical modeling, multiple hyperspectral scans, and artificial intelligence. The solution is able to model and predict the transmittance, downwelling, and upwelling components of the radiance equation with increased spatial and temporal dimensionality. Different combinations of the multiple scans show promise for parameterizing the radiance equation and improving target detection under varying atmospheric conditions, where current solutions based on a single hyperspectral image normally fail. This work presents initial results of the expanded mathematical solution, along with results from the convolutional networks. Synthetic data were generated using the MODTRAN atmospheric software to simulate different vantage points, atmospheric models, and times of day and year for an array of specific targets with varying reflectances. More specifically, MODTRAN was used to simulate longwave infrared radiance between 7.5 and 12 microns at a 17.5 nanometer spectral resolution, matching the range and resolution of the Blue Heron Longwave Hyperspectral sensor. Results from the convolutional neural network indicate that our machine learning solution is computationally faster than a traditional radiative transfer (RT) model and is able to characterize the impact of varying atmospheric conditions on the at-sensor radiance components.
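For context, the transmittance, downwelling, and upwelling components discussed above enter the standard single-view LWIR at-sensor radiance equation, commonly written as below. This familiar form is shown only for reference; the expanded multi-angle formulation developed in the paper generalizes it.

```latex
% Standard single-view LWIR at-sensor radiance model (reference form only):
% tau = atmospheric transmittance, eps = surface emissivity,
% B = Planck blackbody radiance at surface temperature T_s,
% L_d = downwelling radiance, L_u = upwelling (path) radiance.
L_{s}(\lambda) = \tau(\lambda)\,\Big[\varepsilon(\lambda)\,B(\lambda, T_{s})
  + \big(1 - \varepsilon(\lambda)\big)\,L_{d}(\lambda)\Big] + L_{u}(\lambda)
```

As an illustration of the kind of mapping the convolutional network learns, the sketch below regresses per-band transmittance, downwelling, and upwelling values from a per-pixel total-radiance spectrum. This is a minimal sketch in PyTorch, not the authors' architecture; the band count of 257 is an illustrative value derived from the 7.5 to 12 micron range sampled at 17.5 nm.

```python
# Minimal sketch (assumed architecture, not the paper's): a 1D CNN mapping
# an at-sensor radiance spectrum to three per-band atmospheric components
# (transmittance, downwelling, upwelling).
import torch
import torch.nn as nn

N_BANDS = 257  # ~7.5-12 um sampled at 17.5 nm spacing; illustrative value


class AtmosphereCNN(nn.Module):
    def __init__(self, n_components=3):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(64, n_components, kernel_size=1),  # per-band outputs
        )

    def forward(self, radiance):
        # radiance: (batch, n_bands) total at-sensor radiance spectra
        x = radiance.unsqueeze(1)   # -> (batch, 1, n_bands)
        return self.encoder(x)      # -> (batch, n_components, n_bands)


# Example: predict atmospheric components for a batch of 8 simulated pixels
model = AtmosphereCNN()
spectra = torch.rand(8, N_BANDS)
components = model(spectra)
print(components.shape)  # torch.Size([8, 3, 257])
```

In such a setup, the network would be trained against MODTRAN-simulated component spectra, and inference amounts to a single forward pass per pixel, which is consistent with the reported speed advantage over running the radiative transfer model directly.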