Traditionally, the delay margin of a closed-loop system is computed by considering controller and system representations that evolve in the same domain (either continuous- or discrete-time). In practice, however, the system is continuous while the controller is usually embedded in a computer, so the closed-loop controller/system model is hybrid. As a consequence, the delay margin computed in a purely continuous-time (or purely discrete-time) setting may differ from the actual one. This paper proposes a novel approach to compute the exact delay margin of hybrid systems, more specifically when a discrete-time controller is looped with a continuous-time system. The main interest is to provide practitioners with a way to select the discretization technique that maximizes the delay margin, and to evaluate the delay margin exactly before implementation on the target hardware. The main idea is to approximate the discrete-time controller with an equivalent continuous-time one (often of higher order) and to exploit classical continuous-time frequency-based analysis strategies.
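As background for the frequency-based analysis the abstract refers to, the following is a minimal sketch (not the paper's algorithm) of the classical continuous-time delay-margin computation: the delay margin is the phase margin divided by the gain-crossover frequency. The open loop L(s) = 1/(s^2 + s) and the grid-based crossover search are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy import signal

def delay_margin(num, den):
    """Classical delay margin: tau_m = phase_margin / gain_crossover_frequency.

    Illustrative sketch for a SISO continuous-time open loop num/den whose
    gain crosses 0 dB exactly once; a frequency grid stands in for an exact
    crossover solve.
    """
    w = np.logspace(-3, 3, 100000)
    _, mag_db, phase_deg = signal.bode(signal.TransferFunction(num, den), w=w)
    i = np.argmin(np.abs(mag_db))            # gain crossover: |L(jw)| = 0 dB
    w_c = w[i]                               # crossover frequency (rad/s)
    phase_margin = np.deg2rad(180.0 + phase_deg[i])
    return phase_margin / w_c                # largest tolerable loop delay (s)

# Hypothetical open loop L(s) = 1 / (s^2 + s), i.e. plant 1/(s(s+1))
# under unit proportional feedback.
tau_m = delay_margin([1.0], [1.0, 1.0, 0.0])
print(round(tau_m, 2))  # about 1.15 s for this loop
```

In the hybrid setting the paper addresses, this computation cannot be applied directly to the discrete-time controller; replacing it with an equivalent continuous-time approximation is precisely what makes this kind of analysis usable again.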