Robust Sensor Design Against Multiple Attackers with Misaligned Control Objectives

Muhammed O. Sayin, Tamer Basar

We introduce a robust sensor design framework to provide defense against attackers that can bypass or hijack existing defense mechanisms. For effective control, such attackers would still need access to the state of the system because of the presence of plant noise. We design "affine" sensor outputs to control their perception of the system so that their adversarial intentions are not fulfilled, or may even inadvertently end up having a positive impact. The specific model we adopt is a Gauss-Markov process driven by a controller with a "private" malicious/benign quadratic control objective. We seek to defend robustly against the worst possible distribution over the controllers' objectives. Under the solution concept of game-theoretic hierarchical equilibrium, we obtain a semi-definite programming problem equivalent to the problem faced by the sensor against a controller with an arbitrary, but known, control objective, even when the sensor has noisy measurements. Based on this equivalence relationship, we provide an algorithm to compute the optimal affine sensor outputs. Finally, we analyze the ensuing performance numerically for various scenarios.
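To make the setting concrete, the following is a minimal sketch of the underlying model: a Gauss-Markov process driven by a controller that only observes an affine sensor output. All numerical values here (the dynamics matrices `A`, `B`, the affine sensor map `L`, `b`, and the controller gain) are illustrative placeholders, not the optimized design computed by the paper's semi-definite program.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 2, 50  # state dimension and horizon (illustrative values)

# Linear Gauss-Markov dynamics: x_{k+1} = A x_k + B u_k + w_k
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.1]])

# Affine sensor output s_k = L x_k + b, chosen by the sensor designer
# to shape the controller's perception of the state.
L = np.array([[1.0, 0.0]])
b = np.array([0.0])

x = np.zeros(n)
outputs = []
for k in range(T):
    s = L @ x + b              # signal the (possibly malicious) controller sees
    u = -0.5 * s               # placeholder control law acting on the perception
    w = rng.normal(scale=0.05, size=n)  # plant noise
    x = A @ x + (B @ u).ravel() + w
    outputs.append(s.copy())

print(len(outputs))  # 50
```

In the paper's framework, `L` and `b` would be selected (per time step) by solving the equivalent semi-definite program, rather than fixed in advance as done here.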
