Author Archive | Zhe Ming Chng


Loss Functions in TensorFlow

The loss metric is very important for neural networks. As all machine learning models amount to one optimization problem or another, the loss is the objective function to minimize. In neural networks, the optimization is done with gradient descent and backpropagation. But what are loss functions, and how do they affect your neural networks? In this […]

Continue Reading
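As a companion to the excerpt above, here is a minimal sketch (not code from the post; it assumes TensorFlow 2.x with the Keras API) of a loss evaluated on its own and then used as the objective that the optimizer minimizes:

```python
# Illustrative only: a loss computed directly, then used as the training
# objective. Assumes TensorFlow 2.x with the Keras API.
import tensorflow as tf

y_true = tf.constant([[0.0], [1.0], [1.0]])  # target values
y_pred = tf.constant([[0.1], [0.8], [0.6]])  # model predictions

# Mean squared error evaluated as a standalone function
mse = tf.keras.losses.MeanSquaredError()
print(float(mse(y_true, y_pred)))  # average of the squared differences

# The same loss passed to compile() is what gradient descent
# (here plain SGD) minimizes during model.fit()
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(1,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="sgd", loss="mean_squared_error")
```

The full post covers the built-in loss functions and when to choose each; the sketch only shows where a loss fits in the training loop.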
Using Activation Functions in TensorFlow. Photo by Victor Freitas (https://unsplash.com/photos/EAJoIzfAibI). Some rights reserved.

Using Activation Functions in Neural Networks

Activation functions play an integral role in neural networks by introducing nonlinearity. This nonlinearity allows neural networks to learn complex representations and functions of the inputs that would not be possible with a simple linear regression model. Many different nonlinear activation functions have been proposed throughout the history of neural networks. In this post, […]

Continue Reading
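Similarly, the following is a minimal sketch (again an illustration, not code from the post; it assumes TensorFlow 2.x) of how a few common nonlinear activations transform the same inputs:

```python
# Illustrative only: applying a few built-in nonlinear activations to the
# same inputs. Assumes TensorFlow 2.x.
import tensorflow as tf

x = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])

print(tf.keras.activations.relu(x).numpy())     # negatives clipped to zero
print(tf.keras.activations.sigmoid(x).numpy())  # squashed into (0, 1)
print(tf.keras.activations.tanh(x).numpy())     # squashed into (-1, 1)

# In a layer, the activation is applied after the linear transform; without
# it, stacked Dense layers would collapse into a single linear mapping.
hidden = tf.keras.layers.Dense(8, activation="relu")
```

The post itself compares these and other activations in more detail, including where each is typically used in a network.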