Take some time to understand the standard machine learning terms you will come across…
Machine Learning is a broad field with various sub-fields such as computer vision, natural language processing, speech recognition, and plenty more.
Despite the many branching sub-fields of machine learning, some key terms are common across all of them.
This article presents some common machine learning terms, along with a description and explanation of each.
From reading this article, you will either learn previously unseen machine learning terms or get a refresher on terminology you may have come across in your machine learning practice.
An activation function is a mathematical operation that transforms the result or signals of neurons into a normalized output.
The purpose of an activation function as a component of a neural network is to introduce non-linearity within the network. The inclusion of an activation function enables the neural network to have greater representational power and solve complex functions.
The activation function is also referred to as a ‘squashing function’.
Examples of common activation functions are as follows:
- Sigmoid function
- Softmax function
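To make these two examples concrete, here is a minimal NumPy sketch (not from the article itself) showing how the sigmoid function squashes a single value into the range (0, 1), and how the softmax function normalizes a vector of scores into probabilities that sum to 1:

```python
import numpy as np

def sigmoid(x):
    # Squashes any real-valued input into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Normalizes a vector of scores into a probability distribution.
    shifted = x - np.max(x)  # subtract the max for numerical stability
    exps = np.exp(shifted)
    return exps / np.sum(exps)

# Illustrative scores; the values here are arbitrary.
scores = np.array([2.0, 1.0, 0.1])

print(sigmoid(0.0))      # 0.5 — the midpoint of the sigmoid
print(softmax(scores))   # probabilities that sum to 1
```

Note the max-subtraction trick in `softmax`: it changes nothing mathematically but prevents overflow when the input scores are large.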