Superpixel based recursive least-squares method for lossless compression of hyperspectral images

KARACA A. C., Gullu M. K.

MULTIDIMENSIONAL SYSTEMS AND SIGNAL PROCESSING, vol.30, no.2, pp.903-919, 2019 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 30 Issue: 2
  • Publication Date: 2019
  • Doi Number: 10.1007/s11045-018-0590-4
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp.903-919
  • Keywords: Parallel compression, Superpixel-based compression, Hyperspectral image compression, Recursive least squares, Implementation
  • Yıldız Technical University Affiliated: No


Filtering-based compression methods have become a popular research topic in lossless compression of hyperspectral images. Among these methods, recursive least squares (RLS) based prediction provides the best decorrelation performance. In this paper, two superpixel-segmentation-based RLS methods, namely SuperRLS and B-SuperRLS, are investigated for lossless compression of hyperspectral images. The proposed methods present a novel parallelization approach for RLS-based prediction. In the first step of SuperRLS, superpixel segmentation is applied to the hyperspectral image. The image is then partitioned into multiple small regions according to the superpixel boundaries. Each region is predicted with the RLS method in parallel, and the prediction residuals are encoded with an arithmetic encoder. In addition, the superpixel-based prediction approach provides region-of-interest compression capability. B-SuperRLS, the bimodal version of SuperRLS, exploits both spectral and spatio-spectral correlations for prediction. The performance of the proposed methods is analysed exhaustively in terms of the number of superpixels, the input vector length, and the number of parallel nodes used in prediction. Experimental results show that the proposed parallel architecture dramatically reduces computation time and achieves lower bit rates than state-of-the-art methods.
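The core of the prediction step can be illustrated with a standard RLS filter: for each pixel of a region, the current band value is predicted from a causal spectral context vector, the integer residual is kept for entropy coding, and the filter weights are updated recursively. The sketch below is a minimal, generic RLS predictor for one band of one region; the function name, parameter defaults (`lam`, `delta`), and the shape of the context vector are illustrative assumptions, not the authors' exact formulation (which also covers the bimodal spatio-spectral context of B-SuperRLS and per-superpixel parallel execution).

```python
import numpy as np

def rls_predict_band(context, target, lam=0.999, delta=0.01):
    """Generic RLS prediction of one band over one region (illustrative sketch).

    context : (num_pixels, N) causal input vectors (e.g. previous N bands)
    target  : (num_pixels,) band values to predict losslessly
    lam     : forgetting factor; delta : inverse of the P-matrix init scale
    Returns integer prediction residuals suitable for arithmetic coding.
    """
    n_pix, N = context.shape
    w = np.zeros(N)               # adaptive filter weights
    P = np.eye(N) / delta         # inverse correlation matrix estimate
    residuals = np.empty(n_pix, dtype=np.int64)
    for i in range(n_pix):
        u = context[i]
        pred = w @ u
        # Lossless coding keeps the rounded prediction error.
        residuals[i] = int(round(target[i] - pred))
        # Standard RLS update: gain vector, weights, inverse correlation.
        Pu = P @ u
        k = Pu / (lam + u @ Pu)
        w = w + k * (target[i] - pred)
        P = (P - np.outer(k, Pu)) / lam
    return residuals
```

In a superpixel-based scheme, a function like this would be invoked once per superpixel region (e.g. via a process pool), which is what makes the prediction stage embarrassingly parallel: each region carries its own independent filter state.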