Search results for: patient dropout
Number of results: 714,395
Recurrent neural networks (RNNs) stand at the forefront of many recent developments in deep learning. Yet a major difficulty with these models is their tendency to overfit, with dropout shown to fail when applied to recurrent layers. Recent results at the intersection of Bayesian modelling and deep learning offer a Bayesian interpretation of common deep learning techniques such as dropout. This...
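As a rough illustration of why the recurrent case is delicate, the sketch below (plain NumPy, with hypothetical weight matrices `W_x` and `W_h`) shows the timestep-shared "variational" variant often motivated by this Bayesian view: dropout masks are sampled once per sequence and reused at every step, rather than resampled per step as in naive dropout.

```python
import numpy as np

def variational_rnn_dropout_forward(x_seq, W_x, W_h, p=0.25, rng=None):
    """Sketch of timestep-shared ('variational') dropout for a vanilla RNN.

    x_seq: (T, input_dim); W_x: (hidden, input_dim); W_h: (hidden, hidden).
    The masks are drawn once per sequence and reused at every timestep.
    """
    rng = np.random.default_rng() if rng is None else rng
    hidden_dim = W_h.shape[0]
    h = np.zeros(hidden_dim)
    # One input mask and one recurrent mask per sequence, not per step.
    mask_x = (rng.random(x_seq.shape[1]) > p) / (1.0 - p)
    mask_h = (rng.random(hidden_dim) > p) / (1.0 - p)
    outputs = []
    for x_t in x_seq:
        h = np.tanh(W_x @ (x_t * mask_x) + W_h @ (h * mask_h))
        outputs.append(h)
    return np.stack(outputs)
```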
In clinical trials with multiple visits, dropouts often occur and the population of patients who dropped out may be different from the population of patients who completed the study. To assess treatment effects over the population of all randomized patients, which is called the intention-to-treat analysis and is required by regulatory agencies, last observation analysis (LOAN) focuses on the la...
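A minimal sketch of the last-observation idea, on hypothetical long-format trial data with `patient`, `arm`, `visit`, and `outcome` columns: every randomized patient contributes the value from their final available visit, whether or not they completed the study.

```python
import pandas as pd

# Hypothetical trial data: one row per patient visit; patient 2 dropped out.
df = pd.DataFrame({
    "patient": [1, 1, 1, 2, 2, 3, 3, 3],
    "arm":     ["drug"] * 3 + ["placebo"] * 2 + ["drug"] * 3,
    "visit":   [1, 2, 3, 1, 2, 1, 2, 3],
    "outcome": [5.1, 4.8, 4.2, 5.3, 5.0, 5.2, 4.9, 4.5],
})

# Last-observation analysis: each randomized patient contributes the
# outcome at their final available visit, completer or not.
last_obs = (df.sort_values("visit")
              .groupby(["patient", "arm"], as_index=False)
              .last())
print(last_obs.groupby("arm")["outcome"].mean())
```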
Longitudinal studies often gather joint information on time to some event (survival analysis, time to dropout) and serial outcome measures (repeated measures, growth curves). Depending on the purpose of the study, one may wish to estimate and compare serial trends over time while accounting for possibly non-ignorable dropout or one may wish to investigate any associations that may exist between...
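The sketch below is a toy shared-random-effects setup (all parameters made up): each subject's latent slope drives both the longitudinal trajectory and the hazard of dropping out, which is the kind of association a joint model is meant to capture.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_joint_data(n=200, visits=(0, 1, 2, 3, 4)):
    """Toy shared-random-effects scenario: the subject-level slope that
    shapes the repeated measures also scales the dropout hazard, so the
    missingness is non-ignorable by construction."""
    records = []
    for i in range(n):
        b0 = rng.normal(10.0, 2.0)    # subject-specific intercept
        b1 = rng.normal(-0.5, 0.3)    # subject-specific slope
        # Steeper decline (more negative b1) -> smaller scale -> earlier dropout.
        dropout_time = rng.exponential(scale=3.0 * np.exp(2.0 * b1))
        for t in visits:
            if t > dropout_time:
                break
            y = b0 + b1 * t + rng.normal(0.0, 0.5)
            records.append((i, t, y, dropout_time))
    return records

data = simulate_joint_data()
print(f"{len(data)} observed measurements from 200 subjects")
```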
A common problem encountered in statistical analysis is that of missing data, which occurs when some variables have missing values in some units. The present paper deals with the analysis of longitudinal continuous measurements with incomplete data due to non-ignorable dropout. In repeated measurements data, as one solution to such a problem, the selection model assumes a mechanism of outcome-d...
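A hedged sketch of the selection-model factorization f(y, r) = f(y) f(r | y), using a Diggle-Kenward-style dropout mechanism with made-up coefficients: the probability of dropping out at a visit depends on the previous and the current, possibly unobserved, response.

```python
import numpy as np

def dropout_probability(y_prev, y_curr, psi=(-3.0, 0.3, 0.6)):
    """Illustrative selection-model dropout mechanism:
    logit P(drop at visit t) = psi0 + psi1 * y_{t-1} + psi2 * y_t.
    Dependence on the current, possibly missing y_t makes the dropout
    non-ignorable (MNAR); setting psi2 = 0 recovers an MAR mechanism.
    """
    psi0, psi1, psi2 = psi
    eta = psi0 + psi1 * y_prev + psi2 * y_curr
    return 1.0 / (1.0 + np.exp(-eta))

# A patient whose response deteriorates (rising score) becomes steadily
# more likely to drop out under this mechanism.
for y_prev, y_curr in [(4.0, 4.2), (5.0, 5.5), (6.0, 7.0)]:
    print(y_curr, round(dropout_probability(y_prev, y_curr), 3))
```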
We propose a novel framework to adaptively adjust the dropout rates for the deep neural network based on a Rademacher complexity bound. The state-of-the-art deep learning algorithms impose dropout strategy to prevent feature co-adaptation. However, choosing the dropout rates remains an art of heuristics or relies on empirical grid-search over some hyperparameter space. In this work, we show the...
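Purely to illustrate the "adapt the rates to a capacity budget instead of grid-searching" idea (the paper's actual Rademacher-complexity bound is different; this proxy is an assumption of the sketch), one could adjust per-layer keep probabilities until a crude capacity term falls under a budget:

```python
import numpy as np

def complexity_proxy(weight_norms, keep_probs):
    # Crude stand-in for a Rademacher-type capacity term: each layer
    # contributes its weight norm scaled by its expected keep rate.
    return float(np.prod([q * w for q, w in zip(keep_probs, weight_norms)]))

def adapt_keep_probs(weight_norms, keep_probs, budget, step=0.01, max_iter=1000):
    """Lower the keep probability (i.e. raise the dropout rate) of the
    layer contributing most, until the proxy falls under the budget."""
    keep_probs = list(keep_probs)
    for _ in range(max_iter):
        if complexity_proxy(weight_norms, keep_probs) <= budget:
            break
        j = int(np.argmax([q * w for q, w in zip(keep_probs, weight_norms)]))
        keep_probs[j] = max(0.1, keep_probs[j] - step)
    return keep_probs

print(adapt_keep_probs(weight_norms=[4.0, 6.0, 3.0],
                       keep_probs=[0.9, 0.9, 0.9],
                       budget=40.0))
```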
Missing data and especially dropouts frequently arise in longitudinal data. Maximum likelihood estimates are consistent when data are missing at random (MAR) but, as this assumption is not checkable, pattern mixture models (PMM) have been developed to deal with informative dropout. More recently, latent class models (LCM) have been proposed as a way to relax PMM assumptions. The aim of this pap...
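A pattern-mixture sketch on a tiny made-up wide-format dataset: outcomes are summarized within each dropout pattern (here just averaged), and the marginal estimate is the pattern-probability-weighted mixture. Later-visit means in early-dropout patterns are unidentified without extra restrictions, which is exactly the kind of assumption the latent class approach tries to relax.

```python
import numpy as np
import pandas as pd

# Hypothetical data: outcomes at three visits plus each subject's dropout
# pattern (the last visit they attended).
df = pd.DataFrame({
    "pattern": [3, 3, 2, 2, 2, 1],
    "visit1":  [4.9, 5.1, 5.4, 5.2, 5.6, 6.0],
    "visit2":  [4.5, 4.7, 5.5, 5.3, 5.7, np.nan],
    "visit3":  [4.2, 4.4, np.nan, np.nan, np.nan, np.nan],
})

# Pattern-mixture idea: model the outcome within each dropout pattern,
# then mix the pattern-specific means with the observed pattern proportions.
within = df.groupby("pattern").mean()
weights = df["pattern"].value_counts(normalize=True)
marginal = within.mul(weights, axis=0).sum()
print(marginal)  # later-visit means still need identifying restrictions
```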
This study examines dropout incidence, moment of dropout, and switching behavior in organized exercise programs for seniors in the Netherlands, as determined in a prospective cohort study (with baseline measurements at the start of the exercise program and follow-up after 6 months; N = 1,725, response rate 73%). Participants were community-living individuals 50+ who participated in different fo...
The big breakthrough on the ImageNet challenge in 2012 was partially due to the ‘dropout’ technique used to avoid overfitting. Here, we introduce a new approach called ‘Spectral Dropout’ to improve the generalization ability of deep neural networks. We cast the proposed approach in the form of regular Convolutional Neural Network (CNN) weight layers using a decorrelation transform with fixed ba...
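A toy rendering of the spectral idea, assuming SciPy's DCT as the fixed decorrelating basis (the paper's choice of basis, coefficient selection, and placement inside the CNN layers follow the paper, not this sketch): activations are transformed, a fraction of the spectral coefficients is zeroed, and the result is transformed back.

```python
import numpy as np
from scipy.fft import dct, idct

def spectral_dropout(activations, p=0.5, rng=None):
    """Sketch of dropout applied in a fixed decorrelated basis:
    DCT the activations, randomly zero a fraction p of the spectral
    coefficients, then invert the transform."""
    rng = np.random.default_rng() if rng is None else rng
    coeffs = dct(activations, type=2, norm="ortho", axis=-1)
    mask = rng.random(coeffs.shape) > p
    return idct(coeffs * mask, type=2, norm="ortho", axis=-1)

# Example on a random batch of feature vectors.
x = np.random.default_rng(0).normal(size=(4, 16))
print(spectral_dropout(x).shape)
```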
Dropout training, originally designed for deep neural networks, has been successful on high-dimensional single-layer natural language tasks. This paper proposes a theoretical explanation for this phenomenon: we show that, under a generative Poisson topic model with long documents, dropout training improves the exponent in the generalization bound for empirical risk minimization. Dropout achieve...
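To make the single-layer setting concrete, here is a hedged sketch (hypothetical `train_dropout_logreg`, plain NumPy) of dropout training for a bag-of-words logistic classifier: each gradient pass zeroes word-count features with probability p and rescales the survivors, which is the regime the generalization analysis speaks to.

```python
import numpy as np

def train_dropout_logreg(X, y, p=0.5, lr=0.1, epochs=50, rng=None):
    """Dropout training of a single-layer logistic classifier on
    bag-of-words counts X (n_docs x vocab) with binary labels y."""
    rng = np.random.default_rng() if rng is None else rng
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        # Zero each feature with probability p, rescale the rest.
        mask = (rng.random(X.shape) > p) / (1.0 - p)
        Xd = X * mask
        preds = 1.0 / (1.0 + np.exp(-Xd @ w))
        w -= lr * Xd.T @ (preds - y) / len(y)
    return w
```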