Department of Mechanical Engineering
Winning contribution to the ReLU Function Challenge

FHERMA ReLU Function Challenge Won!

Visualization of our ReLU approximation solution © RCS/DT
Janis, Dieter, Philipp, and Moritz authored the winning submission to the “ReLU Function Challenge”.

The Rectified Linear Unit (ReLU) is widely used in machine learning as an activation function in neural networks. The challenge, organized by FHERMA, called for an efficient and accurate evaluation of ReLU under state-of-the-art homomorphic encryption (HE). A major constraint when working with HE is that polynomials can be evaluated efficiently, while non-polynomial functions such as ReLU are not natively supported.
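To illustrate this constraint, the following minimal sketch (plain NumPy, not the challenge code) fits a low-degree polynomial to ReLU on a fixed interval; under HE, only the resulting polynomial, not the exact branch max(0, x), could be evaluated on encrypted data. The degree, interval, and use of ordinary least squares here are illustrative choices, not the method of the winning submission.

```python
import numpy as np

# ReLU itself: a piecewise-linear branch that HE cannot evaluate directly.
def relu(x):
    return np.maximum(0.0, x)

# Illustrative setup: approximate ReLU on [-1, 1] with a degree-7 polynomial
# via ordinary least squares (a simple baseline, NOT the mixed-integer
# formulation from the winning submission).
xs = np.linspace(-1.0, 1.0, 2001)
coeffs = np.polynomial.polynomial.polyfit(xs, relu(xs), deg=7)

# Under HE, only this polynomial would be evaluated on encrypted inputs.
approx = np.polynomial.polynomial.polyval(xs, coeffs)
print("max abs error on [-1, 1]:", np.max(np.abs(approx - relu(xs))))
```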

A key idea in our solution was to design a polynomial approximation of ReLU via a tailored formulation of a regression task, which results in a mixed-integer optimization problem. Tightly coupled to this formulation was our scheme for evaluating the designed polynomial, which allowed us to include monomials of higher order than the maximum commonly reached by other approaches to evaluating polynomials with fixed-depth circuits.
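As a rough illustration of the depth argument (not the submission's actual evaluation scheme), the sketch below evaluates a polynomial whose monomial exponents are powers of two: repeated squaring reaches x^(2^k) with only k sequential multiplications, so very high-order monomials can fit into a shallow, fixed-depth circuit. The coefficients and input are hypothetical.

```python
def eval_power_of_two_poly(x, coeffs_by_level):
    """Evaluate sum_k c_k * x**(2**k) using repeated squaring.

    coeffs_by_level[k] is the coefficient of x**(2**k); reaching level k
    costs exactly k squarings, so the multiplicative depth is
    len(coeffs_by_level) - 1 rather than the 2**(len(coeffs_by_level)-1) - 1
    sequential multiplications of a naive power chain.
    """
    power = x                            # x**(2**0)
    result = coeffs_by_level[0] * power
    for c in coeffs_by_level[1:]:
        power = power * power            # one squaring per level -> depth +1
        result = result + c * power
    return result

# Hypothetical coefficients for x, x^2, x^4, ..., x^64: depth 6, order 64.
coeffs = [0.5, 0.3, -0.1, 0.05, -0.02, 0.01, -0.005]
x = 0.7
print(eval_power_of_two_poly(x, coeffs))
# Cross-check against direct evaluation.
print(sum(c * x ** (2 ** k) for k, c in enumerate(coeffs)))
```

In an HE setting, the relevant cost is the number of sequential ciphertext multiplications, which is why logarithmic-depth power chains like this one make higher-order monomials reachable within a fixed circuit depth.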

A more in-depth discussion of the implemented method and the achieved results can be found in our article on the FHERMA website.