Motivated by the alternating minimization nature of Sinkhorn's algorithm and the theoretically faster convergence of accelerated gradient methods, in this paper we propose a way to combine alternating minimization with Nesterov-type momentum acceleration. We propose a generic accelerated alternating minimization method, together with its primal-dual modification for problems with linear constraints, that enjoys a $1/k^2$ convergence rate, where $k$ is the iteration counter. Moreover, in practice our algorithm converges faster than gradient-type methods, as it requires no step-size choice and adapts to the local smoothness of the problem. We show how this generic method can be applied to the Optimal Transport (OT) problem: we introduce an accelerated Sinkhorn algorithm and estimate its theoretical complexity for the OT problem. We also demonstrate how the same generic method can be applied to the Wasserstein Barycenter problem. As our numerical experiments demonstrate, the new method is more stable and converges faster in practice than Sinkhorn's algorithm, especially in the high-accuracy regime.