Hyperparameter Optimization

Hyperparameter optimization is the process of optimizing the "high-level" parameters that control a machine learning model's architecture and training process -- most prominently in neural networks. This is in contrast to the model's ordinary parameters, which are optimized by the learning algorithm itself -- e.g. stochastic gradient descent with backpropagation in the case of neural networks. For example, the number of hidden units in a given layer of a neural network is a hyperparameter of that network.
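A minimal sketch of the distinction, using scikit-learn (the library choice is an assumption; the text names none): `hidden_layer_sizes` and `learning_rate_init` are hyperparameters fixed before training, while the weights in `coefs_` are ordinary parameters fitted by the learning algorithm.

```python
# Sketch: hyperparameters vs. learned parameters (scikit-learn assumed).
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

clf = MLPClassifier(
    hidden_layer_sizes=(32,),  # hyperparameter: number of hidden units
    learning_rate_init=0.01,   # hyperparameter: optimizer step size
    max_iter=500,
    random_state=0,
)
clf.fit(X, y)  # the learning algorithm fits the ordinary parameters

# The learned parameters themselves: one weight matrix per layer.
print([w.shape for w in clf.coefs_])
```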
Related concepts:
- Grid Search
- Random Search
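
A hedged sketch of both strategies, again using scikit-learn's `GridSearchCV` and `RandomizedSearchCV` as stand-ins (an assumption; the text does not prescribe an implementation). Grid search exhaustively evaluates every combination in the grid; random search samples a fixed number of combinations at random.

```python
# Sketch: grid search vs. random search over two hyperparameters.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

param_grid = {
    "hidden_layer_sizes": [(16,), (32,), (64,)],
    "learning_rate_init": [0.001, 0.01, 0.1],
}

# Grid search: tries all 3 x 3 = 9 combinations.
grid = GridSearchCV(
    MLPClassifier(max_iter=500, random_state=0), param_grid, cv=3
)
grid.fit(X, y)
print("grid search best:", grid.best_params_)

# Random search: samples only n_iter combinations from the same space.
rand = RandomizedSearchCV(
    MLPClassifier(max_iter=500, random_state=0),
    param_grid, n_iter=5, cv=3, random_state=0,
)
rand.fit(X, y)
print("random search best:", rand.best_params_)
```

Random search is often preferred when only a few hyperparameters matter, since it explores more distinct values per dimension for the same evaluation budget.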