We propose a class of very simple modifications of gradient descent and stochastic gradient descent leveraging Laplacian smoothing. We show that, when applied to a large variety of machine learning problems, ranging from logistic regression to deep neural nets, the proposed surrogates can dramatically reduce the variance, allow a larger step size, and improve the generalization accuracy. The methods only involve multiplying the usual (sto...
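As a rough illustration of the kind of modification described above, the sketch below applies a Laplacian-smoothing operator to a gradient vector before the descent update. The specific choices here (a 1D discrete Laplacian with periodic boundary conditions, inverted via the FFT, and the names `laplacian_smooth` and `lsgd_step`) are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def laplacian_smooth(grad, sigma=1.0):
    """Apply (I - sigma * Delta)^{-1} to a flat gradient vector.

    Delta is the 1D discrete Laplacian with periodic boundary
    conditions, so I - sigma * Delta is circulant and the linear
    system can be solved in O(n log n) with the FFT.
    """
    n = grad.size
    k = np.arange(n)
    # Eigenvalues of I - sigma * Delta in the Fourier basis.
    denom = 1.0 + 2.0 * sigma - 2.0 * sigma * np.cos(2.0 * np.pi * k / n)
    return np.real(np.fft.ifft(np.fft.fft(grad) / denom))

def lsgd_step(w, grad_fn, lr=0.1, sigma=1.0):
    """One gradient-descent step using the smoothed gradient."""
    return w - lr * laplacian_smooth(grad_fn(w), sigma)
```

Two properties worth noting: with `sigma = 0` the smoothing reduces to the identity, and for any `sigma` the operator preserves the sum (mean) of the gradient, since the zero-frequency eigenvalue is 1; only the high-frequency components are damped, which is what reduces the variance of a stochastic gradient.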