The paper addresses a sequential changepoint detection problem in which the duration of the change may be finite and unknown. This problem is important in many applications, e.g., in signal and image processing, where signals appear and disappear at unknown points in time or space. In contrast to the conventional optimality criterion in quickest change detection, which requires minimizing the expected delay to detection for a given average run length to a false alarm, we focus on a reliable maximin change detection criterion: maximizing the minimal probability of detection in a given time (or space) window subject to a given maximal local probability of false alarm in a prescribed window. We show that the optimal detection procedure is a modified CUSUM procedure. We then compare the operating characteristics of this optimal procedure with those of the Finite Moving Average (FMA) detection algorithm, popular in engineering, and the ordinary CUSUM procedure using Monte Carlo simulations, which show that the latter algorithms typically have almost the same performance as the optimal one. At the same time, the FMA procedure has a substantial advantage: it does not depend on the intensity of the signal, which is usually unknown. Finally, the FMA algorithm is applied to detecting faint streaks of satellites in optical images.
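To illustrate the contrast drawn above, the following is a minimal sketch of an FMA detector and an ordinary CUSUM detector for a Gaussian mean-shift model. This is not the paper's exact formulation: the model, parameter names (`theta`, `sigma`, window length `m`), and threshold values are illustrative assumptions. Note that the FMA statistic is formed from the raw observations only, so it does not require the signal intensity, whereas the CUSUM log-likelihood ratio does.

```python
import numpy as np

def fma_alarm(x, m, threshold):
    """Finite Moving Average detector: alarm at the first time the
    moving sum of the last m observations crosses the threshold.
    The statistic uses only the raw data, so it does not depend on
    the (usually unknown) signal intensity."""
    s = np.convolve(x, np.ones(m), mode="valid")  # sums over sliding windows
    hits = np.where(s >= threshold)[0]
    return (hits[0] + m - 1) if hits.size else None  # 0-based alarm time

def cusum_alarm(x, theta, sigma, h):
    """Ordinary CUSUM for a mean shift from 0 to theta in N(0, sigma^2)
    noise; forming the log-likelihood ratio increments requires the
    putative intensity theta."""
    w = 0.0
    for i, xi in enumerate(x):
        llr = theta * (xi - theta / 2.0) / sigma**2  # LLR increment
        w = max(0.0, w + llr)                        # CUSUM recursion
        if w >= h:
            return i
    return None

# Toy example: a signal of finite duration m appears at time 200,
# then disappears (thresholds chosen ad hoc for the demo).
rng = np.random.default_rng(0)
sigma, theta, m = 1.0, 1.0, 20
x = rng.normal(0.0, sigma, 500)
x[200:200 + m] += theta
print(fma_alarm(x, m, threshold=12.0))
print(cusum_alarm(x, theta, sigma, h=8.0))
```

The sketch also shows why a finite, unknown change duration matters: once the signal disappears, both statistics drift back down, so a detector must react within the window or miss the change entirely.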