Jeffrey's prior sampling of deep sigmoidal networks
Authors
Abstract
Neural networks have been shown to have a remarkable ability to uncover low-dimensional structure in data: the space of possible reconstructed images forms a reduced model manifold in image space. We explore this idea directly by analyzing the manifold learned by Deep Belief Networks and Stacked Denoising Autoencoders using Monte Carlo sampling. The model manifold forms an only slightly elongated hyperball, with the actual reconstructed data appearing predominantly on the boundary of the manifold. In connection with the results we present, we discuss the problems of sampling high-dimensional manifolds as well as recent work [M. Transtrum, G. Hart, and P. Qiu, Submitted (2014)] on the relation between high-dimensional geometry and model reduction.
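To make the sampling concrete, here is a minimal sketch (not the authors' implementation) of Jeffreys-prior Monte Carlo applied to a toy one-layer sigmoidal model: a random-walk Metropolis sampler targets a density proportional to sqrt(det(J^T J)), the Jeffreys prior built from the Jacobian J of the model's predictions, and each accepted parameter vector is mapped to prediction space so the sample cloud traces out the model manifold. The network size, probe inputs, step size, and iteration count are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))           # fixed inputs probing the model

def model(theta):
    # One sigmoidal layer: predictions y(theta) living in "data space".
    return (1.0 / (1.0 + np.exp(-X @ theta.reshape(3, 1)))).ravel()

def log_jeffreys(theta, eps=1e-6):
    # log sqrt(det J^T J), with J the Jacobian of the predictions w.r.t.
    # theta, estimated here by central finite differences.
    J = np.empty((X.shape[0], theta.size))
    for i in range(theta.size):
        d = np.zeros_like(theta)
        d[i] = eps
        J[:, i] = (model(theta + d) - model(theta - d)) / (2 * eps)
    sign, logdet = np.linalg.slogdet(J.T @ J)
    return 0.5 * logdet if sign > 0 else -np.inf

theta, lp = np.zeros(3), log_jeffreys(np.zeros(3))
points = []
for _ in range(5000):                      # random-walk Metropolis
    prop = theta + 0.1 * rng.standard_normal(3)
    lp_prop = log_jeffreys(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    points.append(model(theta))            # a point on the model manifold

points = np.asarray(points)                # cloud tracing out the manifold

In the paper's setting the toy layer would be replaced by a trained Deep Belief Network or Stacked Denoising Autoencoder, but the sampling loop is the same in spirit.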
Similar resources
Maximally informative stimuli and tuning curves for sigmoidal rate-coding neurons and populations.
A general method for deriving maximally informative sigmoidal tuning curves for neural systems with small normalized variability is presented. The optimal tuning curve is a nonlinear function of the cumulative distribution function of the stimulus and depends on the mean-variance relationship of the neural system. The derivation is based on a known relationship between Shannon's mutual informat...
Rectifier Nonlinearities Improve Neural Network Acoustic Models
Deep neural network acoustic models produce substantial gains in large vocabulary continuous speech recognition systems. Emerging work with rectified linear (ReL) hidden units demonstrates additional gains in final system performance relative to more commonly used sigmoidal nonlinearities. In this work, we explore the use of deep rectifier networks as acoustic models for the 300 hour Switchboar...
Continuous Sigmoidal Belief Networks Trained using Slice Sampling
Real-valued random hidden variables can be useful for modelling latent structure that explains correlations among observed variables. I propose a simple unit that adds zero-mean Gaussian noise to its input before passing it through a sigmoidal squashing function. Such units can produce a variety of useful behaviors, ranging from deterministic to binary stochastic to continuous stochastic. I sho...
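The unit sketched in that abstract is simple enough to write down; a minimal illustration, assuming a logistic squashing function and treating the noise scale sigma as a free illustrative parameter:

import numpy as np

def noisy_sigmoid_unit(x, sigma, rng):
    # Add zero-mean Gaussian noise to the input, then squash sigmoidally.
    return 1.0 / (1.0 + np.exp(-(x + rng.normal(0.0, sigma))))

rng = np.random.default_rng(0)
# Small sigma: nearly deterministic; large sigma: outputs pile up near 0/1
# (binary stochastic); intermediate sigma: continuous stochastic behavior.
print([noisy_sigmoid_unit(1.0, s, rng) for s in (0.01, 1.0, 10.0)])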
Resurrecting the sigmoid in deep learning through dynamical isometry: theory and practice
It is well known that the initialization of weights in deep neural networks can have a dramatic impact on learning speed. For example, ensuring the mean squared singular value of a network's input-output Jacobian is O(1) is essential for avoiding the exponential vanishing or explosion of gradients. The stronger condition that all singular values of the Jacobian concentrate near 1 is a property k...
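A rough way to see the Jacobian condition in code: the sketch below compares the singular values of the end-to-end Jacobian of a deep linear chain (a deliberate simplification of the sigmoidal networks in question) under scaled Gaussian versus orthogonal weight initialization. Depth and width are arbitrary illustrative choices.

import numpy as np

rng = np.random.default_rng(0)
depth, width = 50, 64

def chain_singular_values(init):
    # For a linear chain the input-output Jacobian is the weight product.
    J = np.eye(width)
    for _ in range(depth):
        if init == "orthogonal":
            W, _ = np.linalg.qr(rng.standard_normal((width, width)))
        else:                              # Gaussian scaled to variance 1/width
            W = rng.standard_normal((width, width)) / np.sqrt(width)
        J = W @ J
    return np.linalg.svd(J, compute_uv=False)

for init in ("gaussian", "orthogonal"):
    s = chain_singular_values(init)
    print(init, "min/max singular value:", s.min(), s.max())

With orthogonal factors the product is itself orthogonal, so every singular value stays exactly at 1 (dynamical isometry), whereas the Gaussian chain's spectrum spreads over many orders of magnitude.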
Saturation Probabilities of Continuous-Time Sigmoidal Networks
From genetic regulatory networks to nervous systems, the interactions between elements in biological networks often take a sigmoidal or S-shaped form. This paper develops a probabilistic characterization of the parameter space of continuous-time sigmoidal networks (CTSNs), a simple but dynamically-universal model of such interactions. We describe an efficient and accurate method for calculating...
Journal: CoRR
Volume: abs/1705.10589
Pages: -
Publication date: 2017