What is ReLU in TensorFlow?

ReLU (Rectified Linear Unit) is a transformation that adds non-linearity, which improves the representational power of a neural network. ReLU replaces negative values with zero and leaves positive values unchanged: relu(x) = max(0, x). Note that this is not the same as the absolute value, which would flip negative values to positive rather than zero them out.
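A minimal sketch using TensorFlow's built-in tf.nn.relu (the tensor values here are just illustrative):

import tensorflow as tf

# Example tensor with negative, zero, and positive values
x = tf.constant([-3.0, -1.0, 0.0, 2.0, 5.0])

# tf.nn.relu zeroes out the negatives and keeps the positives unchanged
y = tf.nn.relu(x)

print(y.numpy())  # [0. 0. 0. 2. 5.]

In a Keras model, the same non-linearity is usually applied by passing activation='relu' to a layer, for example tf.keras.layers.Dense(64, activation='relu').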
