"It is a formal concept based on fuzzy topology that removes geometric anomalies on fuzzy regions." (Markus Schneider, "Fuzzy Spatial Data Types for Spatial Uncertainty Management in Databases", 2008)
"It is any method of preventing overfitting of data by a model and it is used for solving ill-conditioned parameter-estimation problems." (Cecilio Angulo & Luis Gonzalez-Abril, "Support Vector Machines", 2009)
"Optimization of both complexity and performance of a neural network following a linear aggregation or a multi-objective algorithm." (M P Cuéllar et al, "Multi-Objective Training of Neural Networks", 2009)
"Including a term in the error function such that the training process favours networks of moderate size and complexity, that is, networks with small weights and few hidden units. The goal is to avoid overfitting and support generalization." (Frank Padberg, "Counting the Hidden Defects in Software Documents", 2010)
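The mechanism Padberg describes, adding a penalty on weight magnitude to the error function, is what neural-network practice calls weight decay. A minimal sketch (my own illustration, not from the quoted text): with penalty lam * ||w||^2, the gradient-descent update gains an extra term that pulls every weight toward zero.

```python
import numpy as np

def step(w, data_grad, eta=0.1, lam=0.5):
    # Error with penalty: E(w) + lam * ||w||^2.
    # Its gradient is data_grad + 2 * lam * w, so the update becomes
    # plain gradient descent plus a decay pull toward zero.
    return w - eta * (data_grad + 2 * lam * w)

w = np.array([2.0, -3.0])
# With the data gradient held at zero, only the penalty acts:
# each step multiplies w by (1 - 2 * eta * lam) = 0.9, shrinking it.
for _ in range(100):
    w = step(w, data_grad=np.zeros_like(w))
print(np.abs(w).max())  # far below the starting magnitudes
```

The hyperparameters `eta` and `lam` here are arbitrary illustrative values; in practice `lam` is tuned so the decay tempers, rather than dominates, the data gradient.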
"It refers to the procedure of bringing in additional knowledge to solve an ill-posed problem or to avoid overfitting. This information appears habitually as a penalty term for complexity, such as constraints for smoothness or bounds on the norm." (Vania V Estrela et al, "Total Variation Applications in Computer Vision", 2016)
"This is a general method to avoid overfitting by applying additional constraints to the model that is learned. A common approach is to make sure the model weights are, on average, small in magnitude." (Rayid Ghani & Malte Schierholz, "Machine Learning", 2017)
"Regularization is a method of penalizing complex models to reduce their variance. Specifically, a penalty term is added to the loss function we are trying to minimize [...]" (Chris Albon, "Machine Learning with Python Cookbook", 2018)
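Albon's penalty-term formulation is easy to see in ridge (L2) regression, where the penalized loss has a closed-form minimizer. A hedged sketch with made-up data (the lambda value is arbitrary): the same fitting routine with and without the penalty, showing that the penalized weights come out smaller in norm.

```python
import numpy as np

# Synthetic data: only the first of five features actually matters.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
y = X @ np.array([1.0, 0.0, 0.0, 0.0, 0.0]) + rng.normal(scale=0.5, size=20)

def fit(X, y, lam):
    # Minimizes ||y - Xw||^2 + lam * ||w||^2 in closed form:
    # w = (X^T X + lam * I)^{-1} X^T y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

w_ols = fit(X, y, lam=0.0)     # no penalty: ordinary least squares
w_ridge = fit(X, y, lam=10.0)  # penalized: weights shrunk toward zero

print(np.linalg.norm(w_ridge) < np.linalg.norm(w_ols))  # True
```

Shrinking the weights raises the training error slightly but reduces the model's variance, which is the trade-off the quote describes.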
"Regularization, generally speaking, is a wide range of ML techniques aimed at reducing overfitting of the models while maintaining theoretical expressive power." (Jonas Teuwen & Nikita Moriakov, "Convolutional neural networks", 2020)