101,848 Hits in 3.4 sec

Adaptive Mixtures of Local Experts

Robert A. Jacobs, Michael I. Jordan, Steven J. Nowlan, Geoffrey E. Hinton
1991 Neural Computation  
We demonstrate that the learning procedure divides up a vowel discrimination task into appropriate subtasks, each of which can be solved by a very simple expert network.  ...  We present a new supervised learning procedure for systems composed of many separate networks, each of which learns to handle a subset of the complete set of training cases.  ...  Hinton is a fellow of the Canadian Institute for Advanced Research.  ... 
doi:10.1162/neco.1991.3.1.79 fatcat:4p7qojbaujbrbmwv2iv74mfimi
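The snippet above describes the core recipe of Jacobs et al. (1991): many separate expert networks plus a gating network that learns which expert should handle each training case. The following is a minimal sketch of that idea, assuming linear experts, a linear softmax gate, and Gaussian-likelihood responsibilities; the sizes, learning rate, and exact update rules are illustrative choices of this sketch, not taken from the paper.

```python
# Minimal sketch of an adaptive mixture of local experts (in the spirit of Jacobs et al., 1991).
# Assumptions: linear experts, a linear softmax gating network, gradient updates.
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class MixtureOfLocalExperts:
    def __init__(self, n_in, n_out, n_experts, lr=0.05):
        self.W_exp = rng.normal(scale=0.1, size=(n_experts, n_out, n_in))  # one linear expert each
        self.W_gate = rng.normal(scale=0.1, size=(n_experts, n_in))        # linear gating network
        self.lr = lr

    def forward(self, x):
        expert_out = np.einsum('koi,i->ko', self.W_exp, x)   # (n_experts, n_out)
        gate = softmax(self.W_gate @ x)                      # (n_experts,)
        return expert_out, gate

    def step(self, x, y, sigma=1.0):
        expert_out, gate = self.forward(x)
        err = y - expert_out                                 # per-expert residuals
        # Responsibilities: gate weights re-scaled by how well each expert fits the target.
        lik = gate * np.exp(-0.5 * (err ** 2).sum(axis=1) / sigma ** 2)
        resp = lik / (lik.sum() + 1e-12)
        # Each expert is pulled toward the target in proportion to its responsibility,
        # so experts gradually specialise on the cases they already fit best.
        self.W_exp += self.lr * np.einsum('k,ko,i->koi', resp, err, x)
        # The gate is nudged so its output matches the responsibilities (softmax gradient).
        self.W_gate += self.lr * np.outer(resp - gate, x)
        return expert_out, gate

# Toy usage: two experts specialise on the two halves of a piecewise-linear target.
moe = MixtureOfLocalExperts(n_in=2, n_out=1, n_experts=2)
for _ in range(2000):
    x = np.append(rng.uniform(-1, 1), 1.0)            # 1-D input plus a bias term
    y = np.array([2.0 * x[0] if x[0] > 0 else -0.5 * x[0]])
    moe.step(x, y)
```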

Self-organized segmentation of time series: separating growth hormone secretion in acromegaly from normal controls

K. Prank, M. Kloppstech, S.J. Nowlan, T.J. Sejnowski, G. Brabant
1996 Biophysical Journal  
acting in parallel (adaptive mixtures of local experts).  ...  By performing a self-organized segmentation of the alternating phases of secretory bursts and quiescence of GH, we significantly improved the performance of the multiple network system over that of the  ...  adaptive mixtures of local experts used for time series prediction.  ... 
doi:10.1016/s0006-3495(96)79825-9 pmid:8744293 pmcid:PMC1225235 fatcat:yaap53vaunbvjhmacrht7ttmha

Interpolating Earth-science Data using RBF Networks and Mixtures of Experts

Ernest Wan, Don Bone
1996 Neural Information Processing Systems  
We present a mixture of experts (ME) approach to interpolate sparse, spatially correlated earth-science data.  ...  correlated regions and learn the local covariation model of the data in each region.  ...  MIXTURE OF GRBF EXPERTS Mixture of experts (Jacobs et al., 1991) is a modular neural network architecture in which a number of expert networks augmented by a gating network compete to learn the data.  ... 
dblp:conf/nips/WanB96 fatcat:ygg75pxxfrhs3ltrfgae6m4myy
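The abstract above has expert networks competing, through a gating network, to model spatially correlated regions. Below is a rough sketch of that division of labour for 2-D interpolation, assuming fixed Gaussian gating kernels at hand-picked region centres and local planar-trend experts fitted by weighted least squares; the paper's actual GRBF experts and training procedure are not reproduced.

```python
# Illustrative sketch: a gate built from normalized Gaussian kernels splits the spatial
# domain into soft regions, and each region's expert is a local planar trend fitted to
# the points it is responsible for.
import numpy as np

def fit_me_interpolator(coords, values, centers, width=1.0):
    """coords: (N, 2) easting/northing, values: (N,), centers: (K, 2) region centres."""
    d2 = ((coords[:, None, :] - centers[None, :, :]) ** 2).sum(-1)     # (N, K)
    gate = np.exp(-0.5 * d2 / width ** 2)
    gate /= gate.sum(axis=1, keepdims=True)                            # soft region membership
    X = np.hstack([coords, np.ones((len(coords), 1))])                 # local planar-trend basis
    experts = []
    for k in range(centers.shape[0]):
        w = gate[:, k]
        # Weighted least squares: each expert mostly sees the data in its own soft region.
        A = X * w[:, None]
        beta, *_ = np.linalg.lstsq(A.T @ X, A.T @ values, rcond=None)
        experts.append(beta)
    return np.array(experts)                                           # (K, 3) plane per expert

def predict(coords, centers, experts, width=1.0):
    d2 = ((coords[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    gate = np.exp(-0.5 * d2 / width ** 2)
    gate /= gate.sum(axis=1, keepdims=True)
    X = np.hstack([coords, np.ones((len(coords), 1))])
    return (gate * (X @ experts.T)).sum(axis=1)                        # gate-weighted blend
```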

Multi-modal Gated Mixture of Local-to-Global Experts for Dynamic Image Fusion [article]

Yiming Sun, Bing Cao, Pengfei Zhu, Qinghua Hu
2023 arXiv   pre-print
Our model consists of a Mixture of Local Experts (MoLE) and a Mixture of Global Experts (MoGE) guided by a multi-modal gate.  ...  To fill this gap, we propose a dynamic image fusion framework with a multi-modal gated mixture of local-to-global experts, termed MoE-Fusion, to dynamically extract effective and comprehensive information  ...  Specifically, we propose a dynamic image fusion framework with a multi-modal gated mixture of local-to-global experts, termed MoE-Fusion, which consists of a Mixture of Local Experts (MoLE) and a Mixture  ... 
arXiv:2302.01392v2 fatcat:mzmeppya5zdsln5lpzqarknfwa

A divide-and-conquer learning architecture for predicting unknown motion

Patrice Wira, Jean-Philippe Urban, Julien Gresser
2001 The European Symposium on Artificial Neural Networks  
In this paper, we propose a modular prediction scheme consisting of a mixture-of-experts architecture.  ...  Several Kalman filters are forced to adapt their dynamics and parameters to different parts of the whole dynamics of the system.  ...  This is done by the divide-and-conquer principle of the mixture of experts.  ... 
dblp:conf/esann/WiraUG01 fatcat:i46swdyzbjeavnva5k4sv6ooda
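The entry above gates several Kalman filters, each adapted to a different part of the system's dynamics. A hedged sketch of that arrangement follows, assuming 1-D constant-velocity filters that differ only in their process-noise level and a gate formed from innovation likelihoods; the paper's actual gating and motion models are not specified here.

```python
# Sketch: a bank of Kalman-filter experts with a soft gate over their one-step predictions.
import numpy as np

class KalmanExpert:
    """1-D constant-velocity filter; q controls how agile this expert assumes the motion is."""
    def __init__(self, q, r=0.5, dt=1.0):
        self.F = np.array([[1.0, dt], [0.0, 1.0]])
        self.H = np.array([[1.0, 0.0]])
        self.Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
        self.R = np.array([[r]])
        self.x = np.zeros(2)
        self.P = np.eye(2)

    def step(self, z):
        # Predict, then measure how plausible the observation was under this expert.
        x_pred = self.F @ self.x
        P_pred = self.F @ self.P @ self.F.T + self.Q
        innov = z - self.H @ x_pred
        S = self.H @ P_pred @ self.H.T + self.R
        lik = np.exp(-0.5 * innov @ np.linalg.solve(S, innov)) / np.sqrt(2 * np.pi * S[0, 0])
        K = P_pred @ self.H.T @ np.linalg.inv(S)
        self.x = x_pred + K @ innov
        self.P = (np.eye(2) - K @ self.H) @ P_pred
        return (self.F @ self.x)[0], float(lik)   # next-step position prediction, likelihood

experts = [KalmanExpert(q) for q in (1e-3, 1e-1, 1e1)]   # slow / medium / agile motion experts
gate = np.ones(len(experts)) / len(experts)
for z in np.sin(0.3 * np.arange(30)) + 0.1 * np.random.default_rng(1).standard_normal(30):
    preds, liks = zip(*(e.step(np.array([z])) for e in experts))
    gate = 0.8 * gate + 0.2 * np.array(liks) / (sum(liks) + 1e-12)    # smoothed soft gating
    fused_prediction = float(np.dot(gate / gate.sum(), preds))
```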

Terrain Segmentation with On-Line Mixtures of Experts for Autonomous Robot Navigation [chapter]

Michael J. Procopio, W. Philip Kegelmeyer, Greg Grudic, Jane Mulligan
2009 Lecture Notes in Computer Science  
We describe an on-line machine learning ensemble technique, based on an adaptation of the mixture of experts (ME) model, for predicting terrain in autonomous outdoor robot navigation.  ...  We use the distribution of training data as the source of the a priori pointwise mixture coefficients that form the soft gating network in the ME model.  ...  The authors gratefully acknowledge the contribution of Sandia National Laboratories, the National Science Foundation, the DARPA LAGR program, and the anonymous referees' insightful comments.  ... 
doi:10.1007/978-3-642-02326-2_39 fatcat:skl3kijkunbeplewrmoe42sgg4
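The snippet above takes the mixture coefficients from the distribution of each expert's training data rather than from a trained gating network. A sketch of that kind of a priori, pointwise soft gate is given below, assuming a single Gaussian density per expert and leaving the experts themselves abstract; it is not the authors' on-line ensemble implementation.

```python
# Sketch: pointwise mixture coefficients derived from per-expert training-data densities.
import numpy as np

def fit_gaussian(X):
    mu = X.mean(axis=0)
    cov = np.cov(X.T) + 1e-6 * np.eye(X.shape[1])
    return mu, cov

def log_gauss(X, mu, cov):
    d = X - mu
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (np.einsum('ni,ij,nj->n', d, inv, d) + logdet + X.shape[1] * np.log(2 * np.pi))

def soft_gate(X, densities):
    """A priori pointwise mixture coefficients: experts speak mostly where their data lived."""
    logp = np.stack([log_gauss(X, mu, cov) for mu, cov in densities], axis=1)
    logp -= logp.max(axis=1, keepdims=True)
    w = np.exp(logp)
    return w / w.sum(axis=1, keepdims=True)              # (N, K) soft gating weights

# Usage idea: with densities = [fit_gaussian(X_k) for each expert's training features X_k],
# the ensemble output is sum_k soft_gate(X, densities)[:, k] * expert_k(X).
```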

Page 213 of Neural Computation Vol. 6, Issue 2 [page]

1994 Neural Computation  
Adaptive mixtures of local experts. Neural Comp. 3, 79-87. Jordan, M. I., and Jacobs, R. A. 1992. Hierarchies of adaptive experts. In Ad- vances in Neural Information Processing Systems 4, J.  ...  Convergence Properties of the EM Approach to Learning in Mixture-of-Experts Architectures. Computational Cognitive Science Tech. Rep. 9301, MIT, Cambridge, MA. Little, R. J. A., and Rubin, D.  ... 

Page 372 of Neural Computation Vol. 8, Issue 2 [page]

1996 Neural Computation  
Adaptive mixtures of local experts. Neural Comp. 3(1), 79-87. Jordan, M. I., and Jacobs, R. A. 1992. Hierarchies of adaptive experts. In NIPS 4, J. Moody, S. Hansen, and R. Lippman, eds.  ...  Hierarchical mixtures of experts and the EM algorithm. Neural Comp. 6(1), 181-214. Kadirkamanathan, V., and Niranjan, M. 1992.  ... 

A universal image coding approach using sparse steered Mixture-of-Experts regression

Ruben Verhack, Thomas Sikora, Lieven Lange, Glenn Van Wallendael, Peter Lambert
2016 2016 IEEE International Conference on Image Processing (ICIP)  
To this end, we introduce a sparse Mixture-of-Experts regression approach for coding images in the pixel domain.  ...  As such, each component in the mixture of experts steers along the direction of highest correlation. The conditional density then serves as the regression function.  ...  In this paper, we introduce a sparse Steered Mixture-of-Experts (SMoE) representation for images that provides local adaptability with global support.  ... 
doi:10.1109/icip.2016.7532737 dblp:conf/icip/VerhackSLWL16 fatcat:b2fxpw5tnfdeddrql5jdoghz7u
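The SMoE idea above treats image coding as regression: kernels placed in the joint space of pixel position and intensity steer along the local correlation, and the conditional density gives the regression function. The sketch below assumes a plain EM-fitted Gaussian mixture over (x, y, intensity), via scikit-learn's GaussianMixture, and reconstructs the image as the gate-weighted blend of per-component conditional means; the paper's sparsification and coding-oriented training are not reproduced.

```python
# Sketch: steered mixture-of-experts style regression as a GMM conditional expectation.
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_smoe(image, n_kernels=32, seed=0):
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    data = np.column_stack([xs.ravel(), ys.ravel(), image.ravel().astype(float)])
    return GaussianMixture(n_components=n_kernels, covariance_type='full',
                           random_state=seed).fit(data)

def reconstruct(gmm, h, w):
    ys, xs = np.mgrid[0:h, 0:w]
    pos = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)     # (N, 2) pixel positions
    weights = np.zeros((len(pos), gmm.n_components))
    cond_mean = np.zeros((len(pos), gmm.n_components))
    for k in range(gmm.n_components):
        mu, cov, pi = gmm.means_[k], gmm.covariances_[k], gmm.weights_[k]
        mu_p, mu_c = mu[:2], mu[2]
        S_pp, S_cp = cov[:2, :2], cov[2, :2]
        d = pos - mu_p
        inv = np.linalg.inv(S_pp)
        # Gating: each kernel's (unnormalised) spatial density at the pixel position.
        weights[:, k] = pi * np.exp(-0.5 * np.einsum('ni,ij,nj->n', d, inv, d)) \
                        / np.sqrt(np.linalg.det(2 * np.pi * S_pp))
        # Expert: the kernel's conditional mean intensity, a plane steered by its covariance.
        cond_mean[:, k] = mu_c + d @ inv @ S_cp
    weights /= weights.sum(axis=1, keepdims=True) + 1e-12
    return (weights * cond_mean).sum(axis=1).reshape(h, w)
```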

Learning to combine multi-sensor information for context dependent state estimation

Alexandre Ravet, Simon Lacroix, Gautier Hattenberger, Bertrand Vandeportaele
2013 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems  
This knowledge is later used as a decision rule in the fusion task in order to dynamically select the most appropriate subset of sensors. For this purpose we use the Mixture of Experts framework.  ...  In our application, each expert is a Kalman filter fed by a subset of sensors, and a gating network serves as a mediator between individual filters, basing its decision on sensor inputs and contextual  ...  Using a localized gating network to encode decision rules Besides adapting the perception modalities, we also aim at switching smoothly between experts.  ... 
doi:10.1109/iros.2013.6697111 dblp:conf/iros/RavetLHV13 fatcat:rhc2t36njne2xn7qwyj6cqsk5e

Normalized Gaussian Network Based on Variational Bayes Inference and Hierarchical Model Selection

Junichiro YOSHIMOTO, Shin ISHII, Masa-aki SATO
2003 Transactions of the Society of Instrument and Control Engineers  
We introduce a hierarchical prior distribution of the model parameters and the NGnet is trained with variational Bayes (VB) inference.  ...  The performance of our method is evaluated by using function approximation and nonlinear dynamical system identification problems. Our method achieved better performance than existing methods.  ...  Hinton: Adaptive mixtures of local experts, Neural Computation, 3, 79-87 (1991) 17) S.R. Waterhouse: Classification and regression using mixtures of experts, Ph.D.  ... 
doi:10.9746/sicetr1965.39.503 fatcat:bwsnt6at3vdclij6xktdugvtoy
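The entry above trains a normalized Gaussian network (NGnet) with variational Bayes. As background, here is a minimal sketch of the NGnet function form that such training operates on: normalized Gaussian units softly partition the input space and each contributes a local linear model. The random parameters below are stand-ins for what the VB procedure would actually infer and are assumptions of this sketch.

```python
# Sketch of an NGnet forward pass: normalized Gaussian gating over local linear units.
import numpy as np

def ngnet_predict(X, centers, widths, W, b):
    """X: (N, d); centers, widths: (M, d); W: (M, o, d); b: (M, o)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) / widths[None, :, :]) ** 2
    G = np.exp(-0.5 * d2.sum(-1))                       # (N, M) Gaussian activations
    N_act = G / (G.sum(axis=1, keepdims=True) + 1e-12)  # normalized (softmax-like) gating
    local = np.einsum('mod,nd->nmo', W, X) + b[None]    # each unit's local linear output
    return np.einsum('nm,nmo->no', N_act, local)        # gate-weighted sum

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(5, 2))
y = ngnet_predict(X, centers=rng.uniform(-1, 1, (4, 2)), widths=np.full((4, 2), 0.5),
                  W=rng.normal(size=(4, 1, 2)), b=np.zeros((4, 1)))
```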

Mixture of Experts and Local-Global Neural Networks

Mayte Suárez-Fariñas, Carlos Eduardo Pedreira
2003 The European Symposium on Artificial Neural Networks  
In this paper we investigate mixture of experts problems in the context of Local-Global Neural Networks.  ...  Because of its local characteristics, this type of approach brings the advantage of improving interpretability.  ...  Local-Global Neural Networks and Mixture of Experts The idea of using a mixture of experts for achieving a complex mapping function, based on a "divide and conquer" strategy, was proposed by [4] .  ... 
dblp:conf/esann/Suarez-FarinasP03 fatcat:iqqv4ve6bzfhnipmz3xc4ict5a

Adaptive mixture-of-experts models for data glove interface with multiple users

Jong-Won Yoon, Sung-Ihk Yang, Sung-Bae Cho
2012 Expert systems with applications  
In order to solve these problems and implement multi-user data glove interface successfully, we propose an adaptive mixture-of-experts model for data-glove based hand gesture recognition models which can  ...  The mixture-of-experts model is trained with an expectation-maximization (EM) algorithm and an on-line learning rule.  ...  Acknowledgements This research was supported by the Original Technology Research Program for Brain Science through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science  ... 
doi:10.1016/j.eswa.2011.10.030 fatcat:tu42pjhimjf3pbef6lxqwbrzty
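The abstract above trains the mixture-of-experts model with an EM algorithm plus an on-line rule. Below is a sketch of batch EM for a mixture of linear experts; to keep the M-step closed-form it uses a generative gate (per-expert Gaussians over the input) instead of the paper's gating network, and both that choice and the linear experts are assumptions of this sketch.

```python
# Sketch: EM for a mixture of linear experts with a generative (Gaussian) gate.
import numpy as np

def em_mixture_of_experts(X, y, K=3, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    N, d = X.shape
    Xb = np.hstack([X, np.ones((N, 1))])
    beta = rng.normal(scale=0.1, size=(K, d + 1))        # linear expert weights
    mu = X[rng.choice(N, K, replace=False)].copy()       # gate Gaussians over the input
    var = np.ones(K)
    pi = np.full(K, 1.0 / K)
    sigma2 = np.ones(K)
    for _ in range(n_iter):
        # E-step: responsibility of each expert for each case.
        gate = pi * np.exp(-0.5 * ((X[:, None, :] - mu) ** 2).sum(-1) / var) / var ** (d / 2)
        resid = y[:, None] - Xb @ beta.T
        lik = np.exp(-0.5 * resid ** 2 / sigma2) / np.sqrt(sigma2)
        r = gate * lik
        r /= r.sum(axis=1, keepdims=True) + 1e-12
        # M-step: each expert refit by weighted least squares; gate statistics re-estimated.
        for k in range(K):
            w = r[:, k]
            A = Xb * w[:, None]
            beta[k] = np.linalg.solve(A.T @ Xb + 1e-6 * np.eye(d + 1), A.T @ y)
            sigma2[k] = max((w * (y - Xb @ beta[k]) ** 2).sum() / (w.sum() + 1e-12), 1e-6)
            mu[k] = (w[:, None] * X).sum(0) / (w.sum() + 1e-12)
            var[k] = max((w * ((X - mu[k]) ** 2).sum(1)).sum() / (w.sum() + 1e-12) / d, 1e-6)
            pi[k] = w.mean()
    return beta, (mu, var, pi), sigma2
```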

Team Deep Mixture of Experts for Distributed Power Control [article]

Matteo Zecchin, David Gesbert, Marios Kountouris
2020 arXiv   pre-print
With this goal in mind, we propose an architecture inspired by the well-known Mixture of Experts (MoE) model, which was previously used for non-linear regression and classification tasks in various contexts  ...  In particular, it was established that DNNs can be used to derive policies that are robust with respect to the information noise statistic affecting the local information (e.g.  ...  Deep Mixture of Experts Dating back to 1991, the Mixture of Experts (MoE) model was proposed by Jacobs et al. [13] as an ensemble method based on the "divide et impera" principle.  ... 
arXiv:2007.14147v1 fatcat:34i4vsztvrei5izhldw2xevxrm
Showing results 1–15 out of 101,848 results