In artificial neural networks, the **activation function** of a node defines the output of that node given an input or set of inputs. A standard computer chip circuit can be seen as a digital network of **activation functions** that can be "ON" (1) or "OFF" (0), depending on the input.

What is ReLU in a CNN?

**ReLU**: The rectifier is an activation function f(x) = max(0, x) that can be used by neurons just like any other activation function; a node using the rectifier activation function is called a **ReLU** node.
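The rectifier above is a one-liner in NumPy; this is a minimal sketch (the function name `relu` is just illustrative):

```python
import numpy as np

def relu(x):
    """Rectifier activation: element-wise max(0, x)."""
    return np.maximum(0, x)

# Negative inputs are clamped to 0; positive inputs pass through unchanged.
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # [0.  0.  0.  1.5 3. ]
```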

What is leaky ReLU?

**Leaky ReLU**. **Leaky** ReLUs are one attempt to fix the "dying **ReLU**" problem. Instead of the function being zero when x < 0, a **leaky ReLU** instead has a small slope (of 0.01, or so) for negative inputs. Some people report success with this form of activation function, but the results are not always consistent.
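The difference from plain ReLU is easy to see in code; a minimal sketch, assuming the common slope of 0.01 (the names `leaky_relu` and `alpha` are illustrative):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Leaky ReLU: x where x > 0, alpha * x otherwise."""
    return np.where(x > 0, x, alpha * x)

x = np.array([-3.0, -1.0, 0.0, 2.0])
# Plain ReLU outputs exactly 0 for every negative entry (zero gradient,
# hence the "dying ReLU" problem); leaky ReLU keeps a small signal instead.
print(leaky_relu(x))  # [-0.03 -0.01  0.    2.  ]
```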

What is the purpose of the activation function?

Activation functions are really important for an artificial neural network to learn and make sense of something really complicated: the **non**-linear, complex functional mappings between the inputs and the response variable. They introduce **non**-linear properties to the network. Their main purpose is to convert the input signal of a node into an output signal.
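Why the non-linearity matters can be shown in a few lines: without an activation function, stacked linear layers collapse into a single linear map. A minimal NumPy sketch (the weight names `W1`, `W2` are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))  # first layer weights
W2 = rng.normal(size=(2, 4))  # second layer weights
x = rng.normal(size=3)        # input vector

# Two linear layers with no activation are equivalent to ONE linear layer:
# W2 @ (W1 @ x) == (W2 @ W1) @ x for every x.
assert np.allclose(W2 @ (W1 @ x), (W2 @ W1) @ x)

# Inserting a ReLU between the layers breaks this collapse, which is what
# lets the network represent non-linear input-output mappings.
h = np.maximum(0, W1 @ x)
y = W2 @ h
print(y.shape)  # (2,)
```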