Rotation Equivariant Operators for Machine Learning on Scalar and Vector Fields

Paul Shen, Michael Herbst, Venkat Viswanathan

We develop theory and software for rotation equivariant operators on scalar and vector fields, with diverse applications in simulation, optimization, and machine learning. Rotation equivariance (covariance) means all fields in the system rotate together, implying spatially invariant dynamics that preserve symmetry. Extending the convolution theorems of linear time-invariant systems, we theorize that linear equivariant operators are characterized by tensor field convolutions using an appropriate product between the input field and a radially symmetric kernel field. Most Green's functions and differential operators are in fact equivariant operators; by parameterizing the radial function, such operators can also fit unknown symmetry-preserving dynamics. We implement the Julia package EquivariantOperators.jl, which provides fully differentiable finite-difference equivariant operators on scalar, vector, and higher-order tensor fields in 2D and 3D. It can be run forwards for simulation or image processing, or backpropagated for computer vision, inverse problems, and optimal control. Code: https://aced-differentiate.github.io/EquivariantOperators.jl/
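To make the equivariance claim concrete, here is a minimal sketch in plain Julia (it does not use the EquivariantOperators.jl API, whose interface is not reproduced here): the central-difference gradient, a prototypical linear equivariant operator, commutes with a 90° rotation of the grid. Rotating the scalar input and then differentiating matches differentiating and then rotating the output vector field, where both the sample grid and the vector components (gx, gy) -> (-gy, gx) are rotated. The grid convention, boundary handling, and test field below are illustrative assumptions.

```julia
# Illustrative sketch only; not the EquivariantOperators.jl API.
# Grid convention: A[i, j] = f(x_i, y_j) on a symmetric square grid, so Base.rotl90
# acts as a 90° counter-clockwise rotation of the sampled field.

# Central-difference gradient with grid spacing h and wrap-around (periodic) boundaries.
function grad(A::AbstractMatrix, h)
    gx = (circshift(A, (-1, 0)) .- circshift(A, (1, 0))) ./ (2h)  # ∂f/∂x
    gy = (circshift(A, (0, -1)) .- circshift(A, (0, 1))) ./ (2h)  # ∂f/∂y
    return gx, gy
end

# Test field: an off-center Gaussian bump on an n×n grid (arbitrary illustrative choice).
n, h = 64, 1.0
xs = ((1:n) .- (n + 1) / 2) .* h
f = [exp(-((x - 5)^2 + (y + 3)^2) / 50) for x in xs, y in xs]

gx, gy = grad(f, h)

# Equivariance check: rotate the input and differentiate, then compare with
# differentiating first and rotating the output vector field, whose components
# transform under a 90° CCW rotation as (gx, gy) -> (-gy, gx).
gx_rot, gy_rot = grad(rotl90(f), h)
@assert maximum(abs.(gx_rot .+ rotl90(gy))) < 1e-12   # ∂x(Rf) == -R(∂y f)
@assert maximum(abs.(gy_rot .- rotl90(gx))) < 1e-12   # ∂y(Rf) ==  R(∂x f)
println("gradient commutes with 90° rotation")
```

Periodic central differences make the discrete commutation identity exact up to floating point; the same test carries over to convolutions with radially symmetric kernels, which the paper characterizes as the general form of linear equivariant operators.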
