# Exercise - Activation Functions

{% file src="https://content.gitbook.com/content/Z1AtAIaq9iuBBWNOucSy/blobs/E6FUax7UHMNddWEJ4lFD/Activation%20Functions.ipynb" %}

Download the notebook and implement the missing parts. You will implement four of the most common activation functions: Sigmoid, TanH, ReLU, and LeakyReLU. When plotted, your implementations should look like the figures below; after each figure there is a minimal reference sketch you can compare your solution against.

### Sigmoid

<figure><img src="https://content.gitbook.com/content/Z1AtAIaq9iuBBWNOucSy/blobs/dEVMJ2LpVPqNwgezdQir/sigmoid.png" alt=""><figcaption></figcaption></figure>
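A minimal sketch of the sigmoid, sigmoid(x) = 1 / (1 + e^(-x)), assuming the notebook is NumPy-based; the function name and signature are placeholders, so match whatever the notebook skeleton expects:

```python
import numpy as np

def sigmoid(x):
    # sigmoid(x) = 1 / (1 + exp(-x)); squashes any real input into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))
```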

### TanH

<figure><img src="https://content.gitbook.com/content/Z1AtAIaq9iuBBWNOucSy/blobs/5oCGh0WW9TIPYPmc77kh/tanh.png" alt=""><figcaption></figcaption></figure>
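A sketch for TanH under the same assumptions. The explicit formula is (e^x - e^(-x)) / (e^x + e^(-x)); this sketch delegates to np.tanh, which computes the same thing with better numerical stability, though the notebook may expect you to build it from exponentials yourself:

```python
import numpy as np

def tanh(x):
    # tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x)); outputs lie in (-1, 1)
    # np.tanh avoids the overflow the explicit exponentials can hit for large |x|
    return np.tanh(x)
```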

### ReLU

<figure><img src="https://content.gitbook.com/content/Z1AtAIaq9iuBBWNOucSy/blobs/Cbn5kssYIZHzaMpPZxMC/relu.png" alt=""><figcaption></figcaption></figure>
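A sketch for ReLU, relu(x) = max(0, x), again assuming NumPy and a placeholder signature:

```python
import numpy as np

def relu(x):
    # relu(x) = max(0, x); zero for negative inputs, identity for positive ones
    return np.maximum(0.0, x)
```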

### LeakyReLU

<figure><img src="https://content.gitbook.com/content/Z1AtAIaq9iuBBWNOucSy/blobs/cECvgpHvmrY0BPDAAx4b/leaky_relu.png" alt=""><figcaption></figcaption></figure>
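A sketch for LeakyReLU; the negative-side slope alpha = 0.01 is a common default, but it is an assumption here, so use whatever value the notebook asks for:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # leaky_relu(x) = x if x > 0 else alpha * x; the small negative-side slope
    # keeps gradients flowing for negative inputs instead of zeroing them out
    return np.where(x > 0, x, alpha * x)

# quick sanity check
x = np.array([-2.0, 0.0, 2.0])
print(leaky_relu(x))  # [-0.02  0.    2.  ]
```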
