This paper first develops an alternative approach for proving an extremal entropy inequality (EEI), originally presented in . The proposed approach does not rely on the channel enhancement technique and has the advantage of yielding an explicit description of the optimal solution, as opposed to the implicit approach of . Compared with the proofs in , the proposed alternative proof is simpler, more direct, and more information-theoretic, and it offers a new perspective for establishing both novel and known challenging results, such as the capacity of the vector Gaussian broadcast channel, the lower bound on the achievable rate for distributed source coding with a single quadratic distortion constraint, and the secrecy capacity of the Gaussian wire-tap channel. The second part of this paper is devoted to novel applications of the proposed mathematical results. The proposed mathematical techniques are further exploited to obtain a simpler proof of the EEI that does not use the entropy power inequality (EPI), to construct the optimal solution for a special class of broadcast channels with private messages, and to derive a mutual information-based performance bound on the mean square error of a linear Bayesian estimator of a Gaussian source observed through an additive noise channel.
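As a minimal numerical sketch of the last contribution, consider the scalar Gaussian case, where the mutual information-based bound on the mean square error holds with equality. The variances `var_x` and `var_n` below are illustrative assumptions, not values from the paper:

```python
import math

# Gaussian source X ~ N(0, var_x) observed through additive Gaussian noise:
# Y = X + N with N ~ N(0, var_n). (Illustrative values, not from the paper.)
var_x, var_n = 4.0, 1.0

# Linear Bayesian (MMSE) estimator: E[X|Y] = a*Y with a = var_x/(var_x+var_n),
# whose mean square error is:
mmse = var_x * var_n / (var_x + var_n)

# Mutual information I(X; Y) = (1/2) ln(1 + var_x/var_n), in nats.
I = 0.5 * math.log(1.0 + var_x / var_n)

# Mutual information-based bound mmse >= var_x * exp(-2*I); for a Gaussian
# source in Gaussian noise the bound is tight:
bound = var_x * math.exp(-2.0 * I)

print(mmse, bound)  # both equal 0.8 for these variances
```

In the non-Gaussian case the same exponential of the mutual information still lower-bounds the achievable mean square error, which is what makes the bound useful as a performance benchmark.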