A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2019; you can also visit the original URL.
The file type is application/pdf.
Constrained Extreme Learning Machines: A Study on Classification Cases
[article]
2015
arXiv
pre-print
Extreme learning machine (ELM) is an extremely fast learning method whose strong performance on pattern recognition tasks has been demonstrated by numerous researchers and engineers. ...
In this paper, we propose new methods, named "constrained extreme learning machines" (CELMs), that randomly select hidden neurons based on the sample distribution. ...
Random Sum Extreme Learning Machine: The Random Sum Extreme Learning Machine (RSELM) utilizes sum vectors of random sample vectors, regardless of class, to construct the weights from the input layer to ...
arXiv:1501.06115v2
fatcat:4e72pju7ivg77fa2o2fvlwzs2q
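The RSELM weight construction mentioned in the snippet above can be sketched as follows. This is a minimal, illustrative reading, not the paper's exact formulas: sample pairs are drawn uniformly regardless of class, their sums are normalized into input-weight columns, and the plain random bias draw is an assumption made here.

```python
import numpy as np

def rselm_weights(X, n_hidden, rng=None):
    """Build data-dependent ELM input weights from sums of random sample pairs."""
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    i = rng.integers(0, n, size=n_hidden)          # random sample indices,
    j = rng.integers(0, n, size=n_hidden)          # drawn regardless of class
    S = X[i] + X[j]                                # sum vectors of random sample pairs
    S = S / np.maximum(np.linalg.norm(S, axis=1, keepdims=True), 1e-12)
    W = S.T                                        # shape (n_features, n_hidden)
    b = rng.uniform(-1.0, 1.0, size=n_hidden)      # biases: illustrative random draw
    return W, b

# Example: weights for a 50-sample, 3-feature dataset.
X = np.random.default_rng(1).normal(size=(50, 3))
W, b = rselm_weights(X, n_hidden=10, rng=0)
```

The resulting `W` and `b` plug into the standard ELM pipeline in place of purely random input weights; the output weights are still solved by least squares.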
Recent Trends in ELM and MLELM: A review
2017
Advances in Science, Technology and Engineering Systems
Extreme Learning Machine (ELM) is a highly effective learning algorithm for single-hidden-layer feedforward neural networks. ...
Multilayer extreme learning machine is a learning algorithm for Artificial Neural Networks (ANNs) that combines the advantages of deep learning and the extreme learning machine. ...
Input weights and biases of the hidden layer are random in case of ELM, but they are orthogonal in ELM-AE. ...
doi:10.25046/aj020108
fatcat:xqt6x6epojeqhiipctxwgohcem
Simulation of discharge coefficient of side weirs placed on convergent canals using modern self-adaptive extreme learning machine
2020
Applied Water Science
In this study, a new artificial intelligence model entitled "self-adaptive extreme learning machine" (SAELM) is developed for simulating the discharge coefficient of side weirs located upon rectangular ...
Side weirs are broadly used in irrigation channels, drainage systems and sewage disposal canals for controlling and adjusting the flow in main channels. ...
In the ELM, weights and biases between hidden and output layer neurons are allocated randomly. ...
doi:10.1007/s13201-019-1136-0
fatcat:f5mhj3myzzhn7aqwyxjnzyzx2q
A hybrid Deep Boltzmann Functional Link Network for classification problems
2016
2016 IEEE Symposium Series on Computational Intelligence (SSCI)
In a DBFLN, the features generated at the two hidden layers of the DBM act as the input features and the enhancement layer responses of the FLN. ...
The performance of the DBFLN in classifying the image quality is compared with those of Support Vector Machines, Extreme Learning Machines, Random Vector Functional Link Network, and Deep Belief Network ...
ACKNOWLEDGMENT The authors wish to extend their thanks to the ATMRI:2014-R8, Singapore, for providing financial support to conduct this study. ...
doi:10.1109/ssci.2016.7850114
dblp:conf/ssci/SavithaCSLS16
fatcat:ctgf3urzpjfndh6arydabtftxu
A Hybrid Method Based on Extreme Learning Machine and Self Organizing Map for Pattern Classification
2020
Computational Intelligence and Neuroscience
However, an improper number of hidden neurons and random parameters have a great effect on the performance of the extreme learning machine. ...
Extreme learning machine is a fast learning algorithm for single hidden layer feedforward neural network. ...
Error minimized extreme learning machine (EM-ELM) [17] randomly adds neurons to the hidden layer one by one or group by group and updates output weights recursively. ...
doi:10.1155/2020/2918276
pmid:32908471
pmcid:PMC7468594
fatcat:pa43c6jjqbdjvpzomnauj3jwfu
Some Tricks in Parameter Selection for Extreme Learning Machine
2017
IOP Conference Series: Materials Science and Engineering
Extreme learning machine (ELM) is a widely used neural network with random weights (NNRW), which has made great contributions to many fields. ...
between the input layer and hidden layer, the randomization range of the threshold of hidden nodes, and the type of activation functions. ...
Extreme learning machine: Extreme learning machine (ELM) is a special single-hidden-layer feed-forward neural network (SLFN), proposed by Huang's group in 2004 [9]. ...
doi:10.1088/1757-899x/261/1/012002
fatcat:bmprpgjx4rfrtcwa4yesmf4k3a
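The basic ELM training procedure these snippets describe, input weights and thresholds drawn at random and output weights solved in closed form, can be sketched as follows (a minimal sketch assuming a sigmoid activation and the Moore-Penrose pseudoinverse; all names are illustrative):

```python
import numpy as np

def elm_train(X, T, n_hidden, rng=None):
    """Single-hidden-layer ELM: random input weights, analytic output weights."""
    rng = np.random.default_rng(rng)
    # Input weights and thresholds drawn once from [-1, 1] and never updated.
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))     # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T               # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy usage: fit XOR with one-hot targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[1, 0], [0, 1], [0, 1], [1, 0]], dtype=float)
W, b, beta = elm_train(X, T, n_hidden=20, rng=0)
scores = elm_predict(X, W, b, beta)
```

With more hidden nodes than training samples, the pseudoinverse solution interpolates the training targets exactly, which is why choices such as the randomization range matter mainly for generalization rather than training error.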
Extreme Learning Machine Weights Optimization Using Genetic Algorithm In Electrical Load Forecasting
2018
Journal of Information Technology and Computer Science
The Extreme Learning Machine method uses random input weights within the range -1 to 1. Before the electric load prediction process runs, a genetic algorithm first optimizes the input weights. ...
Thus this study implies that the Extreme Learning Machine (ELM) method with weight optimization using a Genetic Algorithm (GA) can be used for the electrical load forecasting problem and gives better prediction results ...
Extreme Learning Machine: According to Fig. 1, ELM has a 3-layer structure, namely an input layer, a hidden layer, and an output layer. ...
doi:10.25126/jitecs.20183154
fatcat:orwgrnmpujdybma3sfcuxdcwqy
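A heavily simplified sketch of the idea in this entry, searching for better ELM input weights before training, is below. An elitist, mutation-only loop stands in for the paper's full genetic algorithm (no crossover or population bookkeeping); fitness is the training sum-of-squares error, and all names and hyperparameters are illustrative:

```python
import numpy as np

def fitness(X, T, W, b):
    """Training SSE of an ELM with the given input weights and biases."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    beta = np.linalg.pinv(H) @ T               # output weights, as in plain ELM
    return np.sum((H @ beta - T) ** 2)

def evolve_elm_weights(X, T, n_hidden, generations=20, pop=10, rng=None):
    rng = np.random.default_rng(rng)
    # Initial input weights in [-1, 1], as in the baseline ELM.
    best_W = rng.uniform(-1, 1, size=(X.shape[1], n_hidden))
    best_b = rng.uniform(-1, 1, size=n_hidden)
    best_f = fitness(X, T, best_W, best_b)
    for _ in range(generations):
        for _ in range(pop):
            # Mutate the incumbent and clip back into [-1, 1].
            W = np.clip(best_W + rng.normal(scale=0.1, size=best_W.shape), -1, 1)
            b = np.clip(best_b + rng.normal(scale=0.1, size=best_b.shape), -1, 1)
            f = fitness(X, T, W, b)
            if f < best_f:                     # elitist selection
                best_W, best_b, best_f = W, b, f
    return best_W, best_b, best_f

# Toy usage: a deliberately small ELM on XOR, so there is room to improve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[1, 0], [0, 1], [0, 1], [1, 0]], dtype=float)
f0 = evolve_elm_weights(X, T, n_hidden=2, generations=0, rng=0)[2]
f1 = evolve_elm_weights(X, T, n_hidden=2, generations=20, rng=0)[2]
```

Because the loop only ever accepts strict improvements, the evolved weights can never do worse on the training set than the random initialization they started from.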
Car-following model based on IPSO-ELM
2020
jecet
The Improved Particle Swarm Optimization (IPSO) algorithm is used to search for the globally optimal weights, thresholds, and number of hidden-layer nodes, and a car-following model based ...
In order to simulate these driving behaviors more realistically, this paper adopts the Extreme Learning Machine (ELM) algorithm with the advantages of high learning efficiency, strong generalization ability ...
The methods of machine learning generally include neural networks, support vector machines, extreme learning machines, decision trees, and random forests. ...
doi:10.24214/jecet.c.9.1.01320
fatcat:aiaf2e7zsrfxviqeypzsqj3i5e
Modification of Hidden Layer Weight in Extreme Learning Machine Using Gain Ratio
2016
MATEC Web of Conferences
Extreme Learning Machine (ELM) is a method for quickly training feedforward neural networks, with fairly good accuracy. ...
This method is devoted to a feed forward neural network with one hidden layer where the parameters (i.e. weight and bias) are adjusted one time randomly at the beginning of the learning process. ...
In this research, we add feature weights obtained with the gain ratio method as a multiplier of the random hidden-layer weights in the Extreme Learning Machine, in the hope that this modification of ELM weighting ...
doi:10.1051/matecconf/20165803010
fatcat:2pcisdcxvnbtvhdnwamiptidge
Comparing error minimized extreme learning machines and support vector sequential feed-forward neural networks
2012
Neural Networks
Recently, error minimized extreme learning machines (EM-ELMs) have been proposed as a simple and efficient approach to build single-hidden-layer feedforward networks (SLFNs) sequentially. ...
They add random hidden nodes one by one (or group by group) and update the output weights incrementally to minimize the sum-of-squares error in the training set. ...
The second conclusion of the study is that, independently of the strategy used (input or random), the number of candidates for the hidden-layer weights is a parameter that controls the trade-off between ...
doi:10.1016/j.neunet.2011.08.005
pmid:21959130
fatcat:fm7abu3sjrbe5llhxyylymle5i
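The growth loop these snippets describe might look like the following. For clarity this sketch recomputes the least-squares output weights from scratch at each step; the actual EM-ELM contribution is a recursive update of the output weights that avoids refactorizing the hidden-layer matrix. Names and the stopping criterion are illustrative:

```python
import numpy as np

def em_elm_grow(X, T, max_hidden, target_sse, rng=None):
    """Add random hidden nodes one by one until the training SSE target is met."""
    rng = np.random.default_rng(rng)
    W = np.empty((X.shape[1], 0))
    b = np.empty(0)
    beta = np.empty((0, T.shape[1]))
    for _ in range(max_hidden):
        # Append one random hidden node (one new column of input weights).
        W = np.hstack([W, rng.uniform(-1, 1, size=(X.shape[1], 1))])
        b = np.append(b, rng.uniform(-1, 1))
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
        beta = np.linalg.pinv(H) @ T           # recomputed here; recursive in EM-ELM
        if np.sum((H @ beta - T) ** 2) <= target_sse:
            break
    return W, b, beta

# Toy usage: grow hidden nodes until XOR is fit.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[1, 0], [0, 1], [0, 1], [1, 0]], dtype=float)
W, b, beta = em_elm_grow(X, T, max_hidden=30, target_sse=1e-6, rng=0)
sse = np.sum((1.0 / (1.0 + np.exp(-(X @ W + b))) @ beta - T) ** 2)
```

The trade-off the comparison paper discusses, candidate pools versus purely random nodes, enters at the line that draws the new column of `W`.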
This special issue puts together 15 carefully selected papers from 93 submissions to the international symposium on extreme learning machines (ELM2011, http://www.extreme-learning-machines.org/ELM2011 ...
This symposium provides a forum for academics, researchers and engineers to share and exchange research and development experience on both theoretical studies and practical applications of ELM techniques ...
The paper ''A study on random weights between input and hidden layers in extreme learning machine'' authored by Ran Wang, Sam Kwong, and Xizhao Wang, investigates the impact of random weights during the ...
doi:10.1007/s00500-012-0832-6
fatcat:gwo2e7mlfrbkxj57ntajyq3wty
Local Receptive Fields Based Extreme Learning Machine
2015
IEEE Computational Intelligence Magazine
Extreme learning machine (ELM), which was originally proposed for "generalized" single-hidden layer feedforward neural networks (SLFNs), provides efficient unified learning solutions for the applications ...
In fact, all the parameters of hidden nodes can be independent of training samples and randomly generated according to any continuous probability ...
Philip Chen from University of Macau, Macau for the constructive and fruitful discussions. ...
doi:10.1109/mci.2015.2405316
fatcat:o5n3ivzuizgrrfxyxc53intk5e
Multi-Layer Learning Machines and Smart Sensor Applications
2021
Advances in Artificial Intelligence and Machine Learning
In the ELM, the input weights and hidden-layer biases are set by randomized generation, and the output weights are obtained by applying a regularized least-squares method, as shown in ...
learning, one of the widely used methods being neural networks; in this work, we show a comparison between our proposed method and the machine learning method. ...
doi:10.54364/aaiml.2021.1103
fatcat:kzz2i4f74bgldm7kmev5ek5l34
Exploring the Extreme Learning Machine for Classification of Brain MRIs
2019
International Journal of Engineering and Advanced Technology
Alzheimer, Glioma and Multiple Sclerosis are considered for this work. The Two Hidden layer Extreme learning Machine (TELM) is used for classification of samples into normal or pathological. ...
Accuracy, Recall, Sensitivity and F-score are considered as the classification performance measures in this paper ...
Two-hidden-layer Extreme Learning Machine: Due to the randomly generated hidden-layer parameters, i.e., the weights between the input layer and hidden layer and the biases in the hidden layer, the accuracy of the ...
doi:10.35940/ijeat.a1909.129219
fatcat:wqzr6nz46jhrje4t3ynlk5frhi
Machine Learning Techniques and Extreme Learning Machine for Early Breast Cancer Prediction
2020
VOLUME-8 ISSUE-10, AUGUST 2019, REGULAR ISSUE
The problem can be resolved by diagnosing it in an early span of time and by providing more accurate results. In this paper, different machine learning and neural network algorithms have been studied ...
The results reveal that the extreme learning machine turns out to be the better algorithm. ...
Extreme Learning Machine (ELM): It is a technique used as a single-hidden-layer feedforward neural network that chooses hidden nodes randomly and determines the output weights [23], as in Fig. 6. ...
doi:10.35940/ijitee.d1411.029420
fatcat:jzkuyc4er5aq3hxoyxpklp52nm