Search results for: normalization constant

Number of results: 243334

2010
Marios Poulos Vassilios S. Belesiotis Nikos Alexandris

This paper focuses on solving the problems of preparing and normalizing data that are captured from classroom observation and are linked with significant relevant properties. We adapt these data using a Bayesian model that creates normalization conditions for a well-fitted artificial neural network. We separate the method into two stages: first implementing the data variable in a functional mul...

Journal: :Statistical applications in genetics and molecular biology 2006
Yifan Huang Jason C Hsu Mario Peruggia Abigail A Scott

Maintenance genes can be used for normalization in the comparison of gene expressions. Even though the absolute expression levels of maintenance genes may vary considerably among different tissues or cells, a set of maintenance genes may provide suitable normalization if their expression levels are relatively constant in the specific tissues or cells of interest. A statistical procedure is prop...
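
As a minimal sketch of the general idea (not the paper's proposed statistical procedure), one could scale every sample by the geometric mean of an assumed set of maintenance genes, so that those genes become roughly constant across samples; the gene indices and data below are entirely hypothetical:

    import numpy as np

    # Hypothetical expression matrix: rows = genes, columns = samples/tissues.
    rng = np.random.default_rng(0)
    expr = rng.lognormal(mean=3.0, sigma=1.0, size=(100, 6))

    # Assumed indices of maintenance (housekeeping) genes -- illustrative only.
    maintenance = [0, 1, 2, 3]

    # Per-sample scale factor: geometric mean of the maintenance-gene expressions.
    scale = np.exp(np.log(expr[maintenance, :]).mean(axis=0))

    # After division, the maintenance genes are (on average) constant across samples.
    expr_norm = expr / scale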

2003
Jian Huang Hsun-Chih Kuo Irina Koroleva Cun-Hui Zhang Marcelo Bento Soares

Motivation: Microarray analysis is a technology for monitoring gene expression levels on a large scale and has been widely used in functional genomics. A challenging issue in the analysis of microarray data is normalization. A proper normalization procedure ensures that the intensity ratios provide meaningful measures of relative expression levels. There are two important questions concerning n...

Journal: :Bioinformatics 2003
Sue C. Geller Jeff P. Gregg Paul Hagerman David M. Rocke

MOTIVATION Most methods of analyzing microarray data or doing power calculations have an underlying assumption of constant variance across all levels of gene expression. The most common transformation, the logarithm, results in data that have constant variance at high levels but not at low levels. Rocke and Durbin showed that data from spotted arrays fit a two-component model and Durbin, Hardin...
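
For context, a variance-stabilizing alternative to the plain logarithm that is often paired with such two-component error models is a generalized-log transform; the sketch below (with an arbitrary offset c, which in practice would be estimated from the data) only illustrates its shape and is not the authors' fitted model:

    import numpy as np

    def glog(y, c=1.0):
        # Generalized log: ~log(y) for large y, but finite and smooth near zero,
        # which keeps the variance roughly constant at low expression levels too.
        # The offset c is an assumed constant here; in practice it is estimated.
        return np.log((y + np.sqrt(y ** 2 + c)) / 2.0)

    y = np.array([0.0, 0.5, 5.0, 500.0])
    print(glog(y))      # stays finite at zero intensity
    print(np.log(y))    # plain log is -inf at zero (note the warning)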

2015
Daniel Fehr Jonathan Scarlett Alfonso Martinez

This paper proposes a new method to reduce the error rate of channel codes over an AWGN channel by renormalizing the codewords to a constant energy before transmission and decoding with the original codebook. Evaluation of the random-coding error exponent reveals that this normalization technique approaches the constant-composition error exponent for certain pairs of rate and signal-to-noise ratio.
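
A toy numerical sketch of the transmission-side idea (all parameters are made up; nothing here reproduces the paper's exponent analysis): rescale each codeword to a fixed energy before the AWGN channel and decode by nearest neighbour against the original codebook:

    import numpy as np

    rng = np.random.default_rng(1)
    n, M = 64, 16                        # assumed block length and codebook size
    codebook = rng.normal(size=(M, n))   # i.i.d. Gaussian random codebook

    def transmit(msg, snr_db=2.0):
        x = codebook[msg]
        x = x * np.sqrt(n / np.dot(x, x))              # renormalize to energy n
        noise = rng.normal(scale=10 ** (-snr_db / 20), size=n)
        return x + noise

    def decode(y):
        # Nearest-neighbour decoding with the ORIGINAL (unnormalized) codebook.
        return int(np.argmin(((codebook - y) ** 2).sum(axis=1)))

    msgs = rng.integers(0, M, size=200)
    errors = sum(decode(transmit(m)) != m for m in msgs)
    print("empirical block error rate:", errors / len(msgs))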

Journal: :Journal of Machine Learning Research 2017
Takashi Takenouchi Takafumi Kanamori

In this paper, we focus on parameters estimation of probabilistic models in discrete space. A naive calculation of the normalization constant of the probabilistic model on discrete space is often infeasible and statistical inference based on such probabilistic models has difficulty. In this paper, we propose a novel estimator for probabilistic models on discrete space, which is derived from an ...
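
To make the stated difficulty concrete: for an unnormalized model p(x) proportional to exp(f(x)) on {0,1}^d, the normalization constant is a sum over 2^d configurations, which is only tractable for very small d. A brute-force sketch with an assumed toy score function (this is the naive calculation the paper avoids, not its proposed estimator):

    import itertools
    import numpy as np

    d = 12                                    # tractable only because 2**d is small
    rng = np.random.default_rng(2)
    W = rng.normal(scale=0.3, size=(d, d))    # toy pairwise interactions (assumption)
    W = (W + W.T) / 2

    def score(x):                             # f(x); any unnormalized log-probability
        x = np.asarray(x, dtype=float)
        return x @ W @ x

    # Exact normalization constant by exhaustive enumeration of all 2**d states.
    Z = sum(np.exp(score(x)) for x in itertools.product([0, 1], repeat=d))
    print("log Z =", np.log(Z))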

2003
R. Hoffmann F. Knechtli J. Rolf R. Sommer

We present a new normalization condition for the axial current, which is derived from the PCAC relation with non-vanishing mass. Using this condition reduces the O(r0 m) corrections to the axial current normalization constant ZA, allowing an easier chiral extrapolation in cases where simulations at zero quark mass are not possible. The method described here also serves as a preparation for a dete...
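
For orientation only, a schematic of the standard setup (not the specific condition derived in the paper): with renormalized currents (A_R)_mu^a = Z_A A_mu^a and (P_R)^a = Z_P P^a, imposing the PCAC relation at non-vanishing quark mass,

    \[
        \partial_\mu (A_R)_\mu^a(x) \;=\; 2\, m_R\, (P_R)^a(x),
        \qquad
        (A_R)_\mu^a = Z_A\, A_\mu^a,
        \quad
        (P_R)^a = Z_P\, P^a ,
    \]

fixes Z_A up to cutoff effects; the abstract's point is that a suitable choice of such a condition reduces the O(r0 m) part of those corrections.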

2018
Xiaoxia Wu Rachel Ward Léon Bottou

Adjusting the learning rate schedule in stochastic gradient methods is an important unresolved problem which requires tuning in practice. If certain parameters of the loss function such as smoothness or strong convexity constants are known, theoretical learning rate schedules can be applied. However, in practice, such parameters are not known, and the loss function of interest is not convex in ...
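
As a generic illustration of the "known constants" case mentioned above (not the paper's adaptive method): for an L-smooth, mu-strongly convex objective, classical theory motivates step sizes on the order of 1/L or 1/(mu t). A toy SGD sketch on a least-squares problem where L and mu happen to be computable:

    import numpy as np

    rng = np.random.default_rng(3)
    A = rng.normal(size=(50, 10))
    b = rng.normal(size=50)

    # f(x) = 0.5 * ||A x - b||^2; its smoothness L and strong convexity mu are the
    # extreme eigenvalues of A^T A (known here only because the problem is synthetic).
    evals = np.linalg.eigvalsh(A.T @ A)
    L, mu = evals[-1], evals[0]

    x = np.zeros(10)
    for t in range(1, 501):
        i = rng.integers(0, 50)                    # pick one row at random
        grad = 50 * (A[i] @ x - b[i]) * A[i]       # unbiased estimate of the gradient
        eta = min(1.0 / L, 2.0 / (mu * t))         # textbook-style schedule using L, mu
        x -= eta * grad

    print("final objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2)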

[Chart: number of search results per year]