Memristive crossbars can efficiently implement Binarized Neural Networks (BNNs), wherein the weights are stored in the high-resistance states (HRS) and low-resistance states (LRS) of the synapses. We propose SwitchX, a scheme for mapping weights onto crossbars that minimizes both the power consumed by the crossbars and the impact of crossbar non-idealities, which degrade computational accuracy. Essentially, SwitchX maps the binary weights in such a manner that the crossbar comprises more HRS than LRS synapses. A higher proportion of HRS devices in a crossbar decreases the overall output dot-product current and thus yields power savings. Interestingly, BNNs mapped onto crossbars with SwitchX also exhibit better robustness against adversarial attacks than both the corresponding software BNN baseline and standard crossbar-mapped BNNs. Finally, we combine SwitchX with state-aware training (which further increases the occurrence of HRS states during weight mapping) to boost the robustness and energy efficiency of BNNs on hardware. We find that this approach yields a stronger defense against adversarial attacks than adversarial training, a state-of-the-art software defense. We perform experiments on benchmark datasets (CIFAR-100 and CIFAR-10) and show that SwitchX combined with state-aware training can yield up to ~35% improvement in clean accuracy and ~6-16% improvement in adversarial accuracy over conventionally mapped BNNs on a 32x32 crossbar, while achieving ~22% savings in overall crossbar power consumption.
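The core idea above (choosing a mapping so that HRS devices dominate each crossbar) can be illustrated with a minimal NumPy sketch. This is not the paper's actual algorithm, only a hypothetical per-crossbar polarity flip under an assumed convention (+1 maps to LRS, -1 to HRS); in real hardware the peripheral circuitry would have to negate the column outputs of a flipped crossbar to preserve the dot-product sign.

```python
import numpy as np

def switchx_map_sketch(weights):
    """Illustrative sketch (not the paper's exact method): flip the
    mapping polarity of a binary weight tile whenever that makes the
    majority of devices land in the high-resistance state (HRS).

    Assumed convention: +1 -> LRS synapse, -1 -> HRS synapse.
    Returns the mapped tile, whether polarity was flipped, and the
    resulting HRS fraction.
    """
    w = np.sign(np.asarray(weights))          # binarized weights in {-1, +1}
    n_lrs = np.count_nonzero(w == 1)          # devices that would be LRS
    n_hrs = w.size - n_lrs                    # devices that would be HRS
    flip = n_lrs > n_hrs                      # flip if LRS would dominate
    mapped = -w if flip else w                # peripherals must negate output if flipped
    hrs_fraction = np.count_nonzero(mapped == -1) / mapped.size
    return mapped, flip, hrs_fraction

# Example: a tile dominated by +1 weights gets flipped, so most devices
# end up in HRS and the tile draws less dot-product current.
tile = np.array([[1, 1, -1],
                 [1, 1,  1]])
mapped, flipped, frac = switchx_map_sketch(tile)
print(flipped, frac)
```

Under this toy convention, the flipped tile stores five of its six weights in HRS, reducing the summed column currents relative to the unflipped mapping.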