The isotonic extension of isotonic mappings
Authors
Abstract
Similar resources
Bayesian isotonic density regression.
Density regression models allow the conditional distribution of the response given predictors to change flexibly over the predictor space. Such models are much more flexible than nonparametric mean regression models with nonparametric residual distributions, and are well supported in many applications. A rich variety of Bayesian methods have been proposed for density regression, but it is not c...
Optimal Reduced Isotonic Regression
Isotonic regression is a shape-constrained nonparametric regression in which the ordinate is a nondecreasing function of the abscissa. The regression outcome is an increasing step function. For an initial set of n points, the number of steps in the isotonic regression, m, may be as large as n. As a result, the full isotonic regression has been criticized as overfitting the data or making the re...
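The abstract above defines isotonic regression as fitting a nondecreasing step function to ordered data. A minimal sketch of the standard L2 fit via the Pool Adjacent Violators Algorithm (PAVA), with illustrative names (not code from the paper, which concerns the *reduced* variant with fewer than m steps):

```python
def pava(y):
    """L2 isotonic regression: the closest nondecreasing sequence to y.

    Maintains a list of pooled blocks [sum, count]; whenever a new point
    makes the last block's mean drop below the previous block's mean,
    the two blocks are merged (their points share one fitted value).
    """
    blocks = []  # each entry is [sum of values, number of values]
    for v in y:
        blocks.append([v, 1])
        # Pool while monotonicity is violated: mean(prev) > mean(last).
        # Compare via cross-multiplication to avoid division.
        while len(blocks) > 1 and blocks[-2][0] * blocks[-1][1] > blocks[-1][0] * blocks[-2][1]:
            s, w = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += w
    # Expand each block back to one fitted value per original point.
    fit = []
    for s, w in blocks:
        fit.extend([s / w] * w)
    return fit

print(pava([1, 3, 2, 4, 3, 5]))  # [1.0, 2.5, 2.5, 3.5, 3.5, 5.0]
```

The violating pairs (3, 2) and (4, 3) are each pooled to their mean, yielding a 4-step nondecreasing fit; a reduced isotonic regression would further constrain the number of such steps.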
Isotonic Classification Trees
We propose a new algorithm for learning isotonic classification trees. It relabels non-monotone leaf nodes by performing the isotonic regression on the collection of leaf nodes. In case two leaf nodes with a common parent have the same class after relabeling, the tree is pruned in the parent node. Since we consider problems with ordered class labels, all results are evaluated on the basis of L1...
Online Isotonic Regression
We consider the online version of the isotonic regression problem. Given a set of linearly ordered points (e.g., on the real line), the learner must predict labels sequentially at adversarially chosen positions and is evaluated by her total squared loss compared against the best isotonic (nondecreasing) function in hindsight. We survey several standard online learning algorithms and show that n...
Isotonic Hawkes Processes
$\int_0 g^*(w^* \cdot x_t)\,dt = \sum_{j \in S_i} a_{ij}\, g^*(w^* \cdot x_j)$. Set $y^*_i = g^*(w^* \cdot x_i)$ to be the expected value of each $y_i$, and let $\bar{N}_i$ be the expected value of $N_i$; then $\bar{N}_i = \sum_{j \in S_i} a_{ij}\, y^*_j$. Clearly we do not have access to $\bar{N}_i$. However, consider a hypothetical call to the algorithm with input $\{(x_i, \bar{N}_i)\}_{i=1}$ and suppose it returns $\bar{g}_k$. In this case, we define $\bar{y}^k_i = \bar{g}_k(\bar{w}_k \cdot x_i)$. Next we begin the proof and int...
Journal
Journal title: Časopis pro pěstování matematiky
Year: 1967
ISSN: 0528-2195
DOI: 10.21136/cpm.1967.108392