Search results for: SGD
Number of results: 1169
Objective: Oxidative stress plays a key role in the pathophysiology of brain ischemia and neurodegenerative disorders. Previous studies indicated that Viola tricolor and Viola odorata are rich sources of antioxidants. This study aimed to determine whether these plants protect neurons against serum/glucose deprivation (SGD)-induced cell death in an in vitro model of ischemia and neurodegeneration...
Objective: Oxidative stress is associated with the pathogenesis of brain ischemia and other neurodegenerative disorders. Previous research has shown the antioxidant activity of Viola odorata L. In this project, we studied the neuroprotective and reactive oxygen species (ROS)-scavenging activities of the methanol (MeOH) extract and other fractions isolated from...
We propose a low-rank stochastic gradient descent (LR-SGD) method for solving a class of semidefinite programming (SDP) problems. LR-SGD has clear computational advantages over standard SGD variants, since its iterative projection step (an SDP problem) can be solved efficiently. Specifically, LR-SGD constructs a low-rank stochastic gradient and computes an optimal solution to the project...
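The snippet truncates before the projection details. As a rough sketch of the low-rank idea under stated assumptions (a trace-one PSD toy objective, with sampling noise mimicked by perturbing the cost matrix; none of this is from the paper), keeping the iterate in factored form X = V Vᵀ makes the PSD constraint automatic and reduces the projection to a cheap rescaling:

```python
# Minimal illustration (not the paper's algorithm) of low-rank
# stochastic gradients for an SDP-like objective:
#   min_X <C, X>  s.t.  X is PSD,  trace(X) = 1.
# The factorization X = V @ V.T keeps X PSD by construction, so the
# expensive full SDP projection is replaced by a rescaling of V.
import numpy as np

rng = np.random.default_rng(0)
n, r = 50, 5                      # matrix size and rank of the factor
C = rng.standard_normal((n, n))
C = (C + C.T) / 2                 # symmetric cost matrix
V = rng.standard_normal((n, r))
lr = 0.05

for step in range(200):
    # "Stochastic" gradient: perturb C to mimic sampling noise.
    C_noisy = C + 0.1 * rng.standard_normal((n, n))
    C_noisy = (C_noisy + C_noisy.T) / 2
    # Gradient of <C, V V^T> with respect to V is 2 C V.
    V -= lr * (2 * C_noisy @ V)
    # Cheap "projection": rescale so trace(V V^T) = ||V||_F^2 = 1.
    V /= np.linalg.norm(V)

X = V @ V.T
print("objective <C, X> =", np.tensordot(C, X))
```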
Stochastic Gradient Descent (SGD) is arguably the most popular machine learning method applied to training deep neural networks (DNNs) today. It has recently been demonstrated that SGD can be statistically biased, so that certain elements of the training set are learned more rapidly than others. In this article, we place SGD into a feedback loop whereby the probability of selection is pro...
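As a hedged sketch of such a feedback loop (the snippet is cut off, so the selection rule below, sampling proportional to an exponential moving average of per-example loss, is an assumption rather than the article's exact scheme):

```python
# Feedback-loop sketch: examples with larger recent loss receive a
# higher probability of being sampled for the next SGD step.
import numpy as np

rng = np.random.default_rng(0)
N, d = 1000, 10
X = rng.standard_normal((N, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.1 * rng.standard_normal(N)

w = np.zeros(d)
loss_ema = np.ones(N)             # per-example loss estimates
lr, beta = 0.01, 0.9

for step in range(5000):
    # Feedback: selection probability proportional to the loss estimate.
    p = loss_ema / loss_ema.sum()
    i = rng.choice(N, p=p)
    err = X[i] @ w - y[i]
    w -= lr * err * X[i]          # SGD step on the squared error
    loss_ema[i] = beta * loss_ema[i] + (1 - beta) * err**2

print("parameter error:", np.linalg.norm(w - w_true))
```

Note that sampling non-uniformly without reweighting the gradients biases the update, which is exactly the effect the abstract discusses.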
In this paper, we propose and analyze SQuARM-SGD, a communication-efficient algorithm for decentralized training of large-scale machine learning models over a network. Each node performs a fixed number of local SGD steps using Nesterov's momentum and then sends sparsified and quantized updates to its neighbors, regulated by a locally computable triggering criterion. We provide convergence guarantees for our general...
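A minimal single-node sketch of the ingredients the abstract names, local Nesterov-momentum steps followed by a sparsified, quantized update that is sent only when a trigger fires; the topology, compression operator, and trigger rule below are simplified assumptions, not the paper's specification:

```python
# One node's round: H local Nesterov-momentum SGD steps, then a
# compressed (top-k sparsified, sign-quantized) update sent only if a
# locally computable trigger fires.
import numpy as np

rng = np.random.default_rng(0)
d, H, k = 100, 5, 10              # dimension, local steps, top-k sparsity
w = rng.standard_normal(d)
v = np.zeros(d)                   # momentum buffer
lr, mu, trigger_tol = 0.05, 0.9, 1e-2

def stochastic_grad(w):
    # Toy quadratic objective ||w||^2 / 2 with additive noise.
    return w + 0.1 * rng.standard_normal(d)

def compress(u, k):
    # Top-k sparsification, then sign quantization that preserves the
    # average magnitude of the kept coordinates.
    out = np.zeros_like(u)
    idx = np.argsort(np.abs(u))[-k:]
    out[idx] = np.sign(u[idx]) * np.abs(u[idx]).mean()
    return out

w_last_sent = w.copy()
for rnd in range(20):
    for _ in range(H):            # H local Nesterov-momentum steps
        g = stochastic_grad(w + mu * v)   # lookahead gradient
        v = mu * v - lr * g
        w = w + v
    update = w - w_last_sent
    if np.linalg.norm(update) > trigger_tol:   # triggering criterion
        msg = compress(update, k)              # what a neighbor would receive
        w_last_sent = w_last_sent + msg

print("final ||w|| =", np.linalg.norm(w))
```

In practice, compressed-update schemes of this kind are usually paired with error feedback (accumulating what compression discards into the next message) to retain convergence guarantees; that refinement is omitted here.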
Stochastic gradient descent (SGD) is a popular stochastic optimization method in machine learning. Traditional parallel SGD algorithms, e.g., SimuParallel SGD [1], often require all nodes to have the same performance or to consume equal quantities of data. However, these requirements are difficult to satisfy when the parallel SGD algorithms run in a heterogeneous computing environment; low-perf...
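The snippet breaks off before the proposed algorithm, so the toy simulation below only illustrates the setting it motivates: workers of unequal speed applying gradients computed from stale parameter copies. The quadratic objective and staleness model are assumptions for illustration, not the paper's method.

```python
# Heterogeneous-worker sketch: a slow worker contributes gradients
# computed from a parameter copy that is several steps out of date.
import numpy as np

rng = np.random.default_rng(0)
d, lr = 20, 0.05
w = rng.standard_normal(d)
periods = {"fast": 1, "slow": 4}  # slow worker updates every 4th step
snapshots = {name: w.copy() for name in periods}

for t in range(1, 201):
    for name, period in periods.items():
        if t % period == 0:
            # Gradient of ||w||^2 / 2 evaluated at a possibly stale copy.
            g = snapshots[name] + 0.1 * rng.standard_normal(d)
            w -= lr * g
            snapshots[name] = w.copy()   # worker refreshes its copy

print("||w|| after heterogeneous updates:", np.linalg.norm(w))
```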
Stochastic gradient descent (SGD) and its variants have become more and more popular in machine learning due to their efficiency and effectiveness. To handle large-scale problems, researchers have recently proposed several parallel SGD methods for multicore systems. However, existing parallel SGD methods cannot achieve satisfactory performance in real applications. In this paper, we propose a f...
The convergence of Stochastic Gradient Descent (SGD) using convex loss functions has been widely studied. However, vanilla SGD methods using convex losses cannot perform well with noisy labels, which adversely affect the update of the primal variable in SGD methods. Unfortunately, noisy labels are ubiquitous in real-world applications such as crowdsourcing. To handle noisy labels, in this paper...
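The snippet ends before the proposed remedy, so the sketch below substitutes a generic robust-loss technique (a Huber-style clipped gradient, plainly not this paper's method) to show why a bounded loss blunts the effect of corrupted labels on SGD's parameter updates:

```python
# Compare vanilla SGD on the squared loss against SGD with a
# Huber-style clipped residual when 10% of labels are corrupted.
import numpy as np

rng = np.random.default_rng(0)
N, d = 2000, 10
X = rng.standard_normal((N, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.1 * rng.standard_normal(N)
flip = rng.random(N) < 0.1
y[flip] += 10 * rng.standard_normal(flip.sum())   # noisy labels

def run_sgd(robust, lr=0.01, steps=20000, delta=1.0):
    w = np.zeros(d)
    for _ in range(steps):
        i = rng.integers(N)
        err = X[i] @ w - y[i]
        if robust:
            err = np.clip(err, -delta, delta)   # Huber gradient clip
        w -= lr * err * X[i]
    return np.linalg.norm(w - w_true)

print("squared-loss error:", run_sgd(robust=False))
print("Huber-loss error:  ", run_sgd(robust=True))
```

Because the clipped residual bounds each example's influence, a single mislabeled point can no longer dominate an update the way it does under the plain squared loss.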