Hyperspectral Imaging from Low-Cost Sensors

In contrast to most artificial and biological optical sensors, hyperspectral imaging systems allow us to accurately recover the full spectral profile of each point in the observed scene. This rich spectral information has been explored and exploited in many fields, including astronomy, biology, chemistry, agriculture, geology, and environmental studies, as well as in a wide variety of other academic and commercial applications. Here at the ICVL we are using cutting-edge equipment to explore new avenues of research in the field of hyperspectral imagery. The project described on this page explores hyperspectral acquisition from very low-cost multispectral (and in particular, RGB) sensors.

BGU ICVL Hyperspectral Database

The Interdisciplinary Computational Vision Laboratory maintains a constantly growing collection of natural hyperspectral images. The database currently contains 201 images at 1392×1300 spatial resolution over 519 spectral bands.
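
For convenience, a minimal Python loading sketch is shown below. It assumes the images are distributed as MATLAB .mat files holding a 'rad' radiance cube and a 'bands' wavelength vector; the filename and key names are illustrative and should be adjusted to the actual database files.

    # Minimal loading sketch (assumptions: .mat files with 'rad' and 'bands'
    # keys; files saved in MATLAB v7.3/HDF5 format would need h5py instead).
    import numpy as np
    from scipy.io import loadmat

    data = loadmat("example_scene.mat")               # hypothetical filename
    cube = np.asarray(data["rad"], dtype=np.float64)  # (height, width, bands)
    bands = np.asarray(data["bands"]).ravel()         # band centers in nm

    print(f"{cube.shape[2]} bands from {bands.min():.0f} to {bands.max():.0f} nm")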



Hyperspectral from RGB

RGB imaging systems capture a surprising amount of spectral information. By leveraging sparse overcomplete dictionaries, it is possible to recover hyperspectral information from RGB images.
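
The sketch below illustrates the recovery pipeline under simplifying assumptions: the hyperspectral dictionary and camera response matrix are random placeholders (in practice the dictionary would be learned, e.g. with K-SVD, from hyperspectral training data, and the response matrix taken from the camera's spectral sensitivity curves), and each RGB pixel is sparse-coded over the RGB-projected dictionary before being reconstructed in the hyperspectral domain. It is an illustration of the general technique, not the implementation from our paper.

    # Sketch of dictionary-based hyperspectral recovery from RGB.
    # D_hs and R below are random placeholders, not learned/measured values.
    import numpy as np
    from sklearn.linear_model import orthogonal_mp

    rng = np.random.default_rng(0)
    n_bands, n_atoms = 31, 300

    D_hs = np.abs(rng.standard_normal((n_bands, n_atoms)))  # hyperspectral dictionary
    R = np.abs(rng.standard_normal((3, n_bands)))           # camera response matrix
    D_rgb = R @ D_hs                                        # its RGB projection

    def recover_spectrum(rgb_pixel):
        # Sparse-code the RGB pixel over D_rgb (three measurements limit
        # this toy example to three active atoms), then map the sparse
        # coefficients back through the hyperspectral dictionary.
        w = orthogonal_mp(D_rgb, rgb_pixel, n_nonzero_coefs=3)
        return D_hs @ w

    true_spectrum = D_hs[:, :3] @ rng.random(3)   # synthetic test spectrum
    estimate = recover_spectrum(R @ true_spectrum)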

Filter Optimization for Hyperspectral Estimation

While many methods attempt to recover hyperspectral information from RGB images, not all RGB cameras are created equal. Selecting an optimal RGB camera can increase performance by over 20%.

Replacing RGB filters with customized filter sets can improve performance even further, but finding an optimal filter set within large collections is no simple task.

Evolutionary optimization allows us to efficiently explore large filter collections, quickly finding optimal or near-optimal filter sets for hyperspectral estimation.
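
As a rough illustration, the sketch below runs a small mutation-only evolutionary loop that searches a random filter bank for a well-conditioned three-filter set. The filter bank, fitness proxy, and search parameters are all illustrative placeholders; in the actual application, fitness would measure the hyperspectral-estimation accuracy achieved with each candidate filter set.

    # Toy evolutionary search for a filter set. All quantities are
    # illustrative assumptions, not the values used in our work.
    import numpy as np

    rng = np.random.default_rng(1)
    n_filters, set_size = 200, 3           # choose 3 filters out of a bank of 200
    bank = np.abs(rng.standard_normal((n_filters, 31)))  # hypothetical responses

    def fitness(individual):
        # Cheap stand-in objective: reward filter sets whose response matrix
        # is well conditioned (a proxy for stable spectral estimation).
        F = bank[list(individual)]          # set_size x 31 response matrix
        s = np.linalg.svd(F, compute_uv=False)
        return s[-1] / s[0]                 # inverse condition number

    def mutate(individual):
        # Swap one chosen filter for a random unused one.
        ind = set(individual)
        ind.remove(rng.choice(sorted(ind)))
        ind.add(rng.choice([i for i in range(n_filters) if i not in ind]))
        return frozenset(ind)

    # Simple (mu + lambda) evolutionary loop with elitist survivor selection.
    population = [frozenset(rng.choice(n_filters, set_size, replace=False))
                  for _ in range(30)]
    for _ in range(50):
        offspring = [mutate(p) for p in population]
        population = sorted(set(population + offspring), key=fitness)[-30:]

    best = max(population, key=fitness)
    print("best filter set:", sorted(int(i) for i in best),
          "fitness: %.3f" % fitness(best))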

Software

Sample code demonstrating the hyperspectral recovery methodology described in “Sparse Recovery of Hyperspectral Signal from Natural RGB Images” can be found in the project's GitHub repository:

Papers

Video

Presentation at the 2nd EMVA Forum at AIT.

Acknowledgements

This research was supported in part by the Israel Science Foundation (ISF FIRST/BIKURA Grant 281/15) and the European Commission (Horizon 2020 grant SWEEPER GA no. 644313). We also thank the Frankel Fund and the Helmsley Charitable Trust through the ABC Robotics Initiative, both at Ben-Gurion University of the Negev.