
Unsupervised Pansharpening Based on Self-Attention Mechanism

Abstract: Pansharpening is to fuse a multispectral image (MSI) of low spatial resolution (LR) but rich spectral characteristics with a panchromatic image (PAN) of high spatial resolution (HR) but poor spectral characteristics. Traditional methods usually inject the extracted high-frequency details from the PAN into the up-sampled MSI. Recent deep learning endeavors are mostly supervised, assuming the HR MSI is available, which is unrealistic, especially for satellite images. Nonetheless, these methods could not fully exploit the rich spectral characteristics in the MSI. Due to the wide existence of mixed pixels in satellite images, where each pixel tends to cover more than one constituent material, pansharpening at the subpixel level becomes essential. In this paper, we propose an unsupervised pansharpening (UP) method in a deep-learning framework to address the above challenges based on the self-attention mechanism (SAM), referred to as UP-SAM. The contribution of this paper is three-fold. First, the self-attention mechanism is proposed, where the spatially varying detail extraction and injection functions are estimated according to the attention representations indicating the spectral characteristics of the MSI with sub-pixel accuracy. Second, such attention representations are derived from mixed pixels with the proposed stacked attention network powered with a stick-breaking structure to meet the physical constraints of mixed-pixel formulations. Third, the detail extraction and injection functions are spatially varying based on the attention representations, which largely improves the reconstruction accuracy. Extensive experimental results demonstrate that the proposed approach is able to reconstruct sharper MSI of different types, with more details and less spectral distortion compared to the state-of-the-art.
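
The abstract mentions two technical ideas that a small sketch can make concrete: a stick-breaking construction that turns unconstrained scores into per-pixel abundances satisfying the physical mixed-pixel constraints (non-negativity and sum-to-one), and detail injection with a spatially varying gain. The NumPy sketch below is illustrative only and is not the authors' implementation: the array shapes, the box-filter detail extractor, and the toy rule mapping abundances to injection gains are all assumptions; in the paper these quantities are predicted by the stacked attention network.

```python
# Minimal NumPy sketch (not the authors' implementation) of (1) a stick-breaking
# transform producing abundance maps that are non-negative and sum to one, and
# (2) detail injection with a per-pixel, per-band gain. All shapes, the box
# filter, and the gain rule are illustrative assumptions.
import numpy as np


def stick_breaking(logits):
    """Map raw scores of shape (H, W, K-1) to abundances of shape (H, W, K).

    Each pixel's K abundances are non-negative and sum to one, matching the
    physical constraints of mixed-pixel (linear unmixing) models.
    """
    v = 1.0 / (1.0 + np.exp(-logits))          # break fractions in (0, 1)
    remaining = np.cumprod(1.0 - v, axis=-1)   # stick left after each break
    first = v[..., :1]
    middle = v[..., 1:] * remaining[..., :-1]
    last = remaining[..., -1:]
    return np.concatenate([first, middle, last], axis=-1)


def inject_details(msi_up, pan, gain, k=5):
    """Detail injection with a spatially varying gain.

    msi_up : (H, W, B) up-sampled MSI
    pan    : (H, W)    panchromatic image
    gain   : (H, W, B) per-pixel, per-band injection gain (assumed here to be
             derived from the attention/abundance maps)
    """
    # Box-filter low-pass as a stand-in for a learned detail extractor.
    pad = k // 2
    padded = np.pad(pan, pad, mode="reflect")
    shifts = [padded[i:i + pan.shape[0], j:j + pan.shape[1]]
              for i in range(k) for j in range(k)]
    low = np.mean(np.stack(shifts, axis=0), axis=0)
    detail = pan - low                          # high-frequency PAN detail
    return msi_up + gain * detail[..., None]    # inject with per-pixel gain


if __name__ == "__main__":
    H, W, B, K = 64, 64, 4, 3
    rng = np.random.default_rng(0)
    abundances = stick_breaking(rng.normal(size=(H, W, K - 1)))
    assert np.allclose(abundances.sum(axis=-1), 1.0)
    # Toy gain: a fixed linear map from abundances to per-band gains.
    gain = abundances @ rng.uniform(0.5, 1.5, size=(K, B))
    sharpened = inject_details(rng.random((H, W, B)), rng.random((H, W)), gain)
    print(sharpened.shape)  # (64, 64, 4)
```

Because the gain depends on the abundance (attention) maps, pixels dominated by different materials receive different amounts of injected PAN detail, which is the intuition behind the spatially varying detail extraction and injection functions described above.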
