We introduce a novel generalisation of entropy and conditional entropy from which most definitions in the literature can be derived as particular cases. Within this general framework, we investigate the problem of designing countermeasures against information leakage. In particular, we seek metric-invariant solutions, i.e., solutions that are robust against the choice of entropy used to quantify the leakage. The problem can be modelled as an information channel from the system to an adversary, and the countermeasures can be seen as modifications of this channel that minimise the amount of information the outputs reveal about the inputs. Our main result fully solves the problem under the highly symmetric design constraint that the number of inputs that can produce the same output is capped. Our proof is constructive, and both the optimal channels and the minimum leakage are derived in closed form.