3-1-5. Dropout:
Dropout [15] is a regularization method used in deep learning to keep a model from overfitting the training data by randomly deactivating (dropping out) neurons during training. Because a different subset of neurons is active at each training step, the network cannot rely on any single neuron and is pushed to learn more generalized features rather than memorizing the training data. The dropout rate, i.e., the fraction of neurons dropped, is a hyperparameter that can be tuned to strike a balance between overfitting and underfitting; at inference time all neurons are kept, with activations scaled so that their expected magnitude matches training. Dropout is often combined with other regularization techniques to improve model performance in computer vision and natural language processing applications.
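The mechanism described above can be sketched as a small NumPy function implementing the common "inverted dropout" variant, where surviving activations are rescaled during training so that evaluation needs no adjustment. The function name and signature here are illustrative, not taken from [15]:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each activation with probability p during
    training and rescale survivors by 1/(1-p), so the expected activation
    matches evaluation mode. A no-op when training=False."""
    if not training or p == 0.0:
        return x
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)

# During training a random subset of activations is zeroed; at inference
# the input passes through unchanged.
x = np.ones((4, 8))
train_out = dropout(x, p=0.5, training=True)
eval_out = dropout(x, p=0.5, training=False)
```

Deep learning frameworks apply the same idea per layer (e.g. a dedicated dropout layer between dense layers), with the rate `p` exposed as the hyperparameter mentioned above.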