25 May 2018

🔬Data Science: Multilayer Perceptron (Definitions)

"A neural net composed of three or more slabs (and therefore two or more layers of weighted connection paths); such nets are capable of solving more difficult problems than are single layer nets. They are often trained by backpropagation." (Laurene V Fausett, "Fundamentals of Neural Networks: Architectures, Algorithms, and Applications", 1994)

"A fully connected feedforward NN with at least one hidden layer that is trained using back-propagation algorithmic techniques." (Ioannis Papaioannou et al, "A Survey on Neural Networks in Automated Negotiations", Encyclopedia of Artificial Intelligence, 2009)

"A kind of feed-forward neural network which has at least one hidden layer of neurons." (Fernando Mateo et al, "A 2D Positioning Application in PET Using ANNs", Encyclopedia of Artificial Intelligence, 2009)

"A neural network that has one or more hidden layers, each of which has a linear combination function and executes a nonlinear activation function on the input to that layer." (Robert Nisbet et al, "Handbook of statistical analysis and data mining applications", 2009)

"It has a layered architecture consisting of input, hidden and output layers. Each layer consists of a number of perceptrons." (Siddhartha Bhattacharjee et al, "Quantum Backpropagation Neural Network Approach for Modeling of Phenol Adsorption from Aqueous Solution by Orange Peel Ash", 2013)

"A type of neural network. The MLP is the most common, and arguably the simplest, neural network used for classification." (Meta S Brown, "Data Mining For Dummies", 2014)

"An artificial neural network model with feed forward architecture that maps sets of input data onto a set of desired outputs iteratively, through the process of learning. A MLP consists of an input layer of neurons, one or more hidden layers of neurons and an output layer of neurons, where each layer is fully connected to the next layer." (Eitan Gross, "Stochastic Neural Network Classifiers", 2015) 

"an important class of ANN that typically consists of the input layer, one or more hidden layers of computation nodes, and an output layer. The input signal propagates through the network in a forward direction, on a layer-by-layer basis." (Pablo Escandell-Montero et al,"Artificial Neural Networks in Physical Therapy", 2015)

"Arguably the most popular artificial neural network model. It is usually composed by three or four layers of units. Each unit is fully connected to the units of the previous layer. Learning is customarily performed via the backpropagation rule." (D T Pham & M Castellani, "The Bees Algorithm as a Biologically Inspired Optimisation Method", 2015)

"Is an ANN type that requires a reference to learn patterns. It is trained using (error) back propagation algorithm." (Kandarpa K Sarma, "Learning Aided Digital Image Compression Technique for Medical Application", 2016)

"MLP is a feed forward neural network with one or more layers between input and output layer and are used to solve non-linearly separable problems. MLPs are trained using the back propagation algorithm. MLPs are widely used in pattern classification, recognition, prediction, etc." (Mridusmita Sharma & Kandarpa K Sarma, "Soft-Computational Techniques and Spectro-Temporal Features for Telephonic Speech Recognition: An Overview and Review of Current State of the Art", 2016)
