ReLU ("Rectified Linear Unit") is an activation function widely used in neural networks, defined as f(x) = max(0, x): it passes positive inputs through unchanged and maps negative inputs to zero.
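A minimal sketch in NumPy (the function name relu and the demo inputs are illustrative, not from the original note):

```python
import numpy as np

def relu(x):
    # Element-wise ReLU: f(x) = max(0, x)
    return np.maximum(0, x)

# Negative inputs map to 0; positive inputs pass through unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```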