We consider the exact channel synthesis problem, which concerns the minimum amount of information required to create exact correlation remotely when the two terminals share randomness at a certain rate. This problem generalizes an existing approximate version, in which the generated joint distribution is only required to be close to a target distribution under the total variation (TV) distance measure, rather than exactly equal to it. We provide single-letter inner and outer bounds on the admissible region of shared randomness rate and communication rate for the exact channel synthesis problem. These two bounds coincide for doubly symmetric binary sources. We observe that for such sources, the admissible rate region for exact channel synthesis is strictly contained in that for the TV-approximate version. We also extend the exact and TV-approximate channel synthesis problems to sources with countably infinite alphabets and to continuous sources; the latter include Gaussian sources. As by-products, lemmas concerning soft-covering under R\'enyi divergence measures are derived.