Inter-scanner and inter-protocol discrepancies in MRI datasets are known to cause significant quantification variability. Hence, image-to-image or scanner-to-scanner translation is a crucial frontier in medical image analysis, with many potential applications. Nonetheless, most existing algorithms cannot explicitly exploit and preserve texture details from target scanners, and instead offer isolated, task-specific architectures. In this paper, we design a multi-scale texture transfer scheme to enrich reconstructed images with more detail. Specifically, after computing textural similarity, our module adaptively transfers texture information from target or reference images to the restored images. Unlike the pixel-wise matching used by previous algorithms, we match texture features at multiple scales in neural feature space. This matching mechanism enables multi-scale neural transfer, encouraging the model to capture semantically relevant and lesion-related priors from the target or reference images. We evaluate our multi-scale texture GAN on three different tasks without any task-specific modification: cross-protocol super-resolution of diffusion MRI, T1-to-FLAIR translation, and FLAIR-to-T2 translation. Our multi-scale texture GAN recovers more high-resolution structure (e.g., edges and anatomy), texture (e.g., contrast and pixel intensities), and lesion information (e.g., tumors). Extensive quantitative and qualitative experiments demonstrate that our method achieves superior results in inter-protocol and inter-scanner translation compared with state-of-the-art methods.
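As a rough illustration of the texture matching described above, the sketch below replaces each feature patch of a restored image with its most similar reference patch under cosine similarity, applied independently at several feature scales. The PyTorch implementation, the patch size, and the toy random "features" are our own illustrative assumptions, not the paper's exact module; in practice the inputs would be activations from a pretrained encoder (e.g., VGG-style features).

```python
# A minimal sketch of neural-space texture matching, assuming patch-wise
# cosine similarity between restored and reference feature maps. All names
# and hyperparameters here are illustrative, not the authors' exact design.
import torch
import torch.nn.functional as F

def swap_texture(feat_restored, feat_ref, patch=3, stride=1):
    """Replace each patch of the restored feature map with its most
    similar patch from the reference feature map (cosine similarity)."""
    # Unfold reference features into a bank of patches: (N_patches, C*k*k).
    ref_patches = F.unfold(feat_ref, patch, stride=stride)   # (1, C*k*k, N)
    ref_patches = ref_patches.squeeze(0).t()                 # (N, C*k*k)
    kernels = ref_patches.view(-1, feat_ref.size(1), patch, patch)

    # Correlation with L2-normalized kernels = cosine similarity map.
    norm = kernels.flatten(1).norm(dim=1).clamp(min=1e-6)
    corr = F.conv2d(feat_restored, kernels / norm.view(-1, 1, 1, 1),
                    stride=stride)                           # (1, N, H', W')

    # For each spatial location, pick the best-matching reference patch.
    best = corr.argmax(dim=1)                                # (1, H', W')
    chosen = ref_patches[best.flatten()].t().unsqueeze(0)    # (1, C*k*k, H'*W')

    # Fold the winning patches back into a feature map, averaging overlaps.
    out_size = feat_restored.shape[-2:]
    swapped = F.fold(chosen, out_size, patch, stride=stride)
    ones = torch.ones_like(feat_restored)
    overlap = F.fold(F.unfold(ones, patch, stride=stride),
                     out_size, patch, stride=stride)
    return swapped / overlap

if __name__ == "__main__":
    # Toy multi-scale demo with random tensors standing in for encoder
    # features at two resolutions (real use: activations of a shared encoder
    # applied to the restored image and the target/reference image).
    for channels, size in ((64, 32), (128, 16)):
        fr = torch.randn(1, channels, size, size)  # restored-image features
        fg = torch.randn(1, channels, size, size)  # reference-image features
        print(swap_texture(fr, fg).shape)          # same shape as fr
```

In such a scheme, the swapped multi-scale feature maps would be fused back into the generator so that the synthesized image inherits the reference texture statistics while keeping the restored image's content layout.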