A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2017; the original URL can also be visited. The file type is application/pdf.
Stepwise Induction of Model Trees
[chapter]
2001
Lecture Notes in Computer Science
Its main characteristic is the induction of trees with two types of nodes: regression nodes, which perform only straight-line regression, and splitting nodes, which partition the sample space. ...
In this paper a method for the data-driven construction of model trees is presented, namely the Stepwise Model Tree Induction (SMOTI) method. ...
The authors thank Lynn Rudd for her help in reading the paper and Marcello Lucente for his collaboration in conducting the experiments. ...
doi:10.1007/3-540-45411-x_3
fatcat:nxbcflntjzcfdfn3lorgi3gowa
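The two node types described in the SMOTI abstract above can be pictured with a minimal sketch. This is an illustration of the general idea only, not the paper's algorithm: splitting nodes partition the sample space, while regression nodes each contribute a straight-line (single-variable) fit that accumulates along the path, so the `SplitNode`/`RegressionNode` names and the additive `predict` are assumptions for illustration.

```python
# Sketch of a model tree mixing splitting nodes and regression nodes.
# Not the SMOTI implementation; node names and semantics are illustrative.
from dataclasses import dataclass
from typing import Optional, Union

@dataclass
class SplitNode:
    feature: int          # index of the feature used to partition
    threshold: float      # samples with x[feature] <= threshold go left
    left: "Node"
    right: "Node"

@dataclass
class RegressionNode:
    feature: int          # variable used for the straight-line regression
    slope: float
    intercept: float
    child: Optional["Node"] = None  # acts as a leaf when None

Node = Union[SplitNode, RegressionNode]

def predict(node: Optional[Node], x: list) -> float:
    """Route a sample down the tree, accumulating straight-line fits."""
    total = 0.0
    while node is not None:
        if isinstance(node, SplitNode):
            node = node.left if x[node.feature] <= node.threshold else node.right
        else:
            total += node.slope * x[node.feature] + node.intercept
            node = node.child
    return total
```

A tree with one splitting node over two regression leaves already shows the piecewise-linear behaviour the abstract refers to.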
Adapting Peepholing to Regression Trees
[chapter]
2005
Lecture Notes in Computer Science
... should be considered when searching for the best cut point split. ...
This paper presents an adaptation of the peepholing method to regression trees. ...
The work of Catlett addresses classification trees grown using information gain [6] as the criterion for selecting the best split of a node. ...
doi:10.1007/11595014_30
fatcat:3zwzz336qrblriufr5p453tr2a
Trading-Off Local versus Global Effects of Regression Nodes in Model Trees
[chapter]
2002
Lecture Notes in Computer Science
Its main characteristic is the induction of trees with two types of nodes: regression nodes, which perform only straight-line regression, and split nodes, which partition the sample space. ...
In this paper a method for the top-down induction of model trees is presented, namely the Stepwise Model Tree Induction (SMOTI) method. ...
The authors thank Valentina Tamma and Domenico Pallotta for their collaboration and Tom Mitchell for his valuable comments on a preliminary version of this paper. ...
doi:10.1007/3-540-48050-1_43
fatcat:e6mizs6ak5hbviprd2fgj3acz4
Top-down induction of model trees with regression and splitting nodes
2004
IEEE Transactions on Pattern Analysis and Machine Intelligence
Its main characteristic is the induction of trees with two types of nodes: regression nodes, which perform only straight-line regression, and splitting nodes, which partition the feature space. ...
Model trees are an extension of regression trees that associate leaves with multiple regression models. ...
... regression nodes and splitting nodes are compared for selection. ...
doi:10.1109/tpami.2004.1273937
pmid:15460282
fatcat:5i3gr7b5bbh6zi4tdknan5yehq
JSRT: James-Stein Regression Tree
[article]
2020
arXiv
pre-print
Given target data for prediction, a regression tree is first constructed from a training dataset before a prediction is made for each leaf node. ...
To address this issue, we propose a novel regression tree, named James-Stein Regression Tree (JSRT) by considering global information from different nodes. ...
... Given a training dataset {t_1, ..., t_d}, a regression tree chooses the best splitting feature and value pair to split it into two subsets (child nodes). ...
arXiv:2010.09022v2
fatcat:mwnetpa6knga7bzcwj2dwhnizi
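The "best splitting feature and value pair" search quoted in this entry is the standard CART-style exhaustive search. The sketch below is a generic illustration of that search, not code from the JSRT paper: it scores each candidate split by the summed squared error around the two child means.

```python
# Generic best-split search by squared-error reduction (illustrative,
# not the JSRT algorithm): try every (feature, value) pair and keep
# the split with the lowest total child SSE.
def sse(ys):
    """Sum of squared deviations from the mean of ys."""
    if not ys:
        return 0.0
    mean = sum(ys) / len(ys)
    return sum((y - mean) ** 2 for y in ys)

def best_split(X, y):
    """Return (feature, threshold, score) of the lowest-SSE binary split."""
    best = (None, None, float("inf"))
    for f in range(len(X[0])):
        for threshold in sorted({row[f] for row in X}):
            left = [yi for row, yi in zip(X, y) if row[f] <= threshold]
            right = [yi for row, yi in zip(X, y) if row[f] > threshold]
            if not left or not right:
                continue  # skip degenerate splits
            score = sse(left) + sse(right)
            if score < best[2]:
                best = (f, threshold, score)
    return best
```

JSRT's contribution, per the abstract, is what happens after this purely local search: leaf estimates are shrunk using global information from other nodes.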
Speeding-Up Hoeffding-Based Regression Trees With Options
2011
International Conference on Machine Learning
Option trees build upon regular trees by adding splitting options in the internal nodes. As such, they are known to improve accuracy and stability and to reduce ambiguity. ...
In this paper, we present on-line option trees for faster learning on numerical data streams. ...
The main motivation is that introducing option nodes removes the need for selecting the best splitting attribute. ...
dblp:conf/icml/IkonomovskaGZD11
fatcat:g7fmcedn7faz7d5cs2tpptfxiy
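The option-node idea quoted in this entry can be sketched in a few lines. This is a generic illustration, not the paper's on-line Hoeffding-tree variant: instead of committing to a single best splitting attribute, an option node keeps several candidate subtrees and aggregates their predictions (here by averaging, which is one common choice).

```python
# Minimal option-tree predictor (illustrative; node encoding is assumed).
# Nodes are tuples: ("leaf", value), ("split", feature, threshold, left, right),
# or ("option", [subtrees]).
def predict_option(node, x):
    kind = node[0]
    if kind == "leaf":
        return node[1]
    if kind == "split":
        _, feature, threshold, left, right = node
        child = left if x[feature] <= threshold else right
        return predict_option(child, x)
    # "option": average the predictions of every candidate subtree
    _, subtrees = node
    preds = [predict_option(t, x) for t in subtrees]
    return sum(preds) / len(preds)
```

An option node holding two alternative splits behaves like a small local ensemble, which is why the abstract notes improved accuracy and stability.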
A regression tree approach using mathematical programming
2017
Expert systems with applications
This work introduces a novel methodology of node partitioning which, in a single optimisation model, simultaneously performs the two tasks of identifying the break-point of a binary split and assignment of multivariate functions to either leaf, thus leading to an efficient regression tree model. ...
Acknowledgement Funding from the UK Engineering and Physical Sciences Research Council (to LY, SL and LGP through the EPSRC Centre for Innovative Manufacturing in Emergent Macromolecular Therapies, EP/ ...
doi:10.1016/j.eswa.2017.02.013
fatcat:yhsima3k6vaw7asf2ysb5qk7ou
Decision Stream: Cultivating Deep Decision Trees
[article]
2017
arXiv
pre-print
Tree node splitting based on relevant feature selection is a key step of decision tree learning, at the same time being their major shortcoming: the recursive partitioning of nodes leads to geometric reduction ...
Our experimental results reveal that the proposed approach significantly outperforms the standard decision tree learning methods on both regression and classification tasks, yielding a prediction error ...
The parameters of DT were tuned for each dataset, including best criterion selection and tree pre-pruning. ...
arXiv:1704.07657v3
fatcat:c2bwttudtjaq3ij3rnmdjgx3ju
Visualizing the decision-making process in deep neural decision forest
[article]
2019
arXiv
pre-print
We then apply NDF on a multi-task coordinate regression problem and demonstrate the distribution of routing probabilities, which is vital for interpreting NDF yet not shown for regression problems. ...
Deep neural decision forest (NDF) achieved remarkable performance on various vision tasks via combining decision tree and deep representation learning. ...
We gratefully acknowledge the support of NVIDIA Corporation with the donation of the Titan Xp GPU used for this research. ...
arXiv:1904.09201v1
fatcat:4zm5va2z2ng5vluuduw2nnj7ni
An Introduction to Tree-Structured Modeling With Application to Quality of Life Data
2011
Nursing Research
Discussion: As illustrated by the QOL analysis example, tree methods generate interesting and easily understood findings that cannot be uncovered via traditional linear regression analysis. ...
... (BCEI) study, and (c) implications for their potential use in nursing research are discussed. ...
The greedy search scheme for best split achieves only local optimality for each node, but the resultant tree models are not necessarily globally optimal. ...
doi:10.1097/nnr.0b013e318221f9bc
pmid:21720217
pmcid:PMC3136208
fatcat:4gliayuscbafzfuqg4ksvlharm
Classification and regression tree analysis vs. multivariable linear and logistic regression methods as statistical tools for studying haemophilia
2015
Haemophilia
Classification trees (CTs) are used to analyse categorical outcomes and regression trees (RTs) to analyse continuous ones. ...
Aims: The present paper sought to didactically explain how, why, and when to use classification and regression tree (CART) analysis for haemophilia research. ...
The program found no other explanatory variable for the patients in the child node containing the 29 patients aged ≤45 years that best split this node into two new child nodes. ...
doi:10.1111/hae.12778
pmid:26248714
fatcat:u6q6bezshzd7nptndeowu3nkuu
PLANET
2009
Proceedings of the VLDB Endowment
Runs the basic tree-growing algorithm on the records and outputs the best splits for each node in the subtree. Ensembles: bagging constructs multiple trees in parallel, each on a sample of the ...
... launches an InMemory MapReduce job to grow the entire subtree; for larger nodes, launches a MapReduce job to find candidate best splits; collects results from the MapReduce jobs and chooses the best split. ...
doi:10.14778/1687553.1687569
fatcat:3wwopwox5beldadom7grqo6rwe
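The "MapReduce job to find candidate best splits" mentioned in this entry rests on a standard trick: split quality can be scored from sufficient statistics (count, sum, sum of squares) merged across data shards, since SSE = sum_sq - sum^2 / count. The sketch below illustrates that idea in plain Python; it is an assumption-laden toy, not Google's PLANET implementation, and `map_shard`/`reduce_and_score` are hypothetical names.

```python
# Toy map/reduce split search over sufficient statistics (illustrative).
from collections import defaultdict

def map_shard(shard, candidates):
    """Emit {(feature, threshold, side): [n, sum, sum_sq]} for one shard."""
    stats = defaultdict(lambda: [0, 0.0, 0.0])
    for x, y in shard:
        for f, t in candidates:
            side = "L" if x[f] <= t else "R"
            st = stats[(f, t, side)]
            st[0] += 1
            st[1] += y
            st[2] += y * y
    return stats

def reduce_and_score(shard_outputs):
    """Merge per-shard statistics and return the lowest-SSE candidate."""
    merged = defaultdict(lambda: [0, 0.0, 0.0])
    for out in shard_outputs:
        for key, (n, s, ss) in out.items():
            m = merged[key]
            m[0] += n; m[1] += s; m[2] += ss
    def node_sse(n, s, ss):
        return ss - s * s / n if n else 0.0
    best, best_score = None, float("inf")
    for f, t in {(f, t) for (f, t, _) in merged}:
        score = node_sse(*merged[(f, t, "L")]) + node_sse(*merged[(f, t, "R")])
        if score < best_score:
            best, best_score = (f, t), score
    return best, best_score
```

Because only three numbers per (split, side) cross the network, the reducer never needs the raw records, which is what makes the approach scale.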
SETAR-Tree: A Novel and Accurate Tree Algorithm for Global Time Series Forecasting
[article]
2022
arXiv
pre-print
The depth of the tree is controlled by conducting a statistical linearity test commonly employed in TAR models, as well as measuring the error reduction percentage at each node split. ...
global Pooled Regression (PR) models in the leaves allowing the models to learn cross-series information and also uses some time-series-specific splitting and stopping procedures. ...
Regression trees use every possible binary split on every input attribute to determine the node splits. ...
arXiv:2211.08661v1
fatcat:gu5g347im5dd5lcb6gmhwsjqva
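One of the two depth controls quoted in this entry, the "error reduction percentage at each node split", can be sketched as a simple stopping rule. This is an assumed reading for illustration, not the SETAR-Tree code, and the 5% default threshold is an arbitrary placeholder.

```python
# Illustrative stopping rule: split only if the best split reduces the
# node's SSE by at least min_reduction_pct percent (threshold assumed).
def should_split(parent_sse, child_sse_sum, min_reduction_pct=5.0):
    """Return True if the split cuts SSE by at least min_reduction_pct %."""
    if parent_sse == 0:
        return False  # node already fits perfectly; nothing to gain
    reduction_pct = 100.0 * (parent_sse - child_sse_sum) / parent_sse
    return reduction_pct >= min_reduction_pct
```

The paper pairs this with a statistical linearity test from the TAR literature, so a node must fail linearity and clear the error-reduction bar before it is split.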
Clustering Mixed Data Using Non-Normal Regression Tree For Process Monitoring
2012
Zenodo
In the paper, we propose a regression tree whose split algorithm is based on the Pearson distribution, to handle non-normal distributions in a parametric method. ...
The regression tree finds similar properties of the data across different variables. Experiments using real semiconductor manufacturing process data show improved fault-detection ability. ...
The decision tree performs an iterative binary partitioning that splits a decision node into two sub-division nodes. There are two types of the decision tree, classification tree and regression tree. ...
doi:10.5281/zenodo.1329427
fatcat:m37im46o3bgq3hrxqqaky4rm3u
Using Turning Point Detection to Obtain Better Regression Trees
[chapter]
2013
Lecture Notes in Computer Science
The issue of detecting optimal split points for linear regression trees is examined. ...
A novel approach called Turning Point Regression Tree Induction (TPRTI) is proposed which uses turning points to identify the best split points. ...
... that best reduces the variance in the nodes is chosen for the split, and its mean value is selected as the split value. ...
doi:10.1007/978-3-642-39712-7_25
fatcat:aymjv5qly5g37owq2oyrcidit4
Showing results 1 — 15 out of 61,912 results