Activation

Activation is the last operation in a perceptron (also called a neuron or neural unit): it is applied to the weighted sum of the inputs plus the bias. It is typically a non-linear scalar-to-scalar function, such as ReLU, which maps negative values to zero and keeps non-negative values unchanged: ReLU(x) = max(0, x).
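
As a concrete illustration, here is a minimal sketch of a perceptron with ReLU as its final step. The variable names and the use of NumPy are illustrative assumptions, not something prescribed by this entry:

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """ReLU: maps negative values to zero, keeps non-negative values unchanged."""
    return np.maximum(x, 0.0)

# Hypothetical perceptron: weighted sum of inputs plus bias, then activation.
inputs = np.array([0.5, -1.2, 3.0])
weights = np.array([0.4, 0.1, -0.7])
bias = 0.2

pre_activation = np.dot(weights, inputs) + bias  # scalar weighted sum
output = relu(pre_activation)                    # activation is the last operation
print(pre_activation, output)                    # here: -1.82 -> 0.0
```

Without the non-linear activation, stacking such units would collapse into a single linear map, which is why activations are typically non-linear.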
Related concepts:
Perceptron