The substantial volume of continuously gathered remote sensing
data can serve as a valuable source of information for mitigating
the effects of natural disasters. This involves identifying
changes in the time series of observations. Considering that
the precise location of the changes may not be available in
real-world scenarios, we propose an unsupervised method for
detecting extreme events in multi-temporal satellite images.
Specifically, we learn a basis matrix for each dimension of the
feature space of the images using tensor decomposition learning.
Each new image is then represented in this feature space as a
multilinear combination of the learned decomposition factors. The
predicted changes are obtained by thresholding the distance
between the corresponding features extracted from the images
before and after the event. Experimental results on real
Sentinel-2 multi-temporal images demonstrate that the proposed
method can effectively detect the effects of fires and floods
with low computational complexity.
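
To make the pipeline concrete, the following minimal sketch assumes a
Tucker decomposition (computed with the tensorly library) as the tensor
decomposition step; the image sizes, ranks, distance measure, and
threshold value are illustrative placeholders rather than the exact
choices made in the paper.

import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker

# Reference stack: T images of size H x W x B (rows, columns, bands).
# Random data stands in for actual Sentinel-2 acquisitions in this sketch.
T, H, W, B = 10, 128, 128, 12
reference_stack = tl.tensor(np.random.rand(T, H, W, B))

# Learn one basis (factor) matrix per dimension via a Tucker decomposition;
# the ranks below are illustrative assumptions.
core, factors = tucker(reference_stack, rank=[T, 32, 32, 6])
basis_matrices = factors[1:]   # keep the spatial and spectral bases

def extract_features(image):
    """Represent an H x W x B image in the learned feature space, i.e. as a
    multilinear combination of the basis matrices (projection onto them)."""
    return tl.tenalg.multi_mode_dot(
        tl.tensor(image), basis_matrices, modes=[0, 1, 2], transpose=True)

pre_event = np.random.rand(H, W, B)    # image acquired before the event
post_event = np.random.rand(H, W, B)   # image acquired after the event

# Distance between the extracted features, thresholded to flag a change;
# the threshold is a placeholder and would be tuned or estimated in practice.
distance = tl.norm(extract_features(pre_event) - extract_features(post_event))
THRESHOLD = 1.0
print("change detected" if distance > THRESHOLD else "no change")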