We study the problem of distributed training of neural networks (NNs) on devices with heterogeneous, limited, and time-varying availability of computational resources. We present an adaptive, resource-aware, on-device learning mechanism, DISTREAL, which is able to fully and efficiently utilize the available resources in a distributed manner, increasing convergence speed. This is achieved with a dropout mechanism that dynamically adjusts...
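As a rough illustration of the idea described above, the following minimal Python sketch shows how a per-iteration dropout rate could be derived from the compute currently available on a device and used to randomly select a subset of filters to train. The function names, the linear cost model, and the filter-masking scheme are assumptions for illustration only, not the paper's actual mechanism.

```python
import random

def choose_dropout_rate(available_compute, full_compute):
    """Hypothetical helper: pick the fraction of filters to drop so the
    cost of this training iteration roughly matches what the device can
    currently afford (linear cost model is an assumption)."""
    utilization = min(1.0, available_compute / full_compute)
    return 1.0 - utilization  # drop more filters when fewer resources are free

def sample_filter_mask(num_filters, dropout_rate):
    """Randomly keep a subset of convolutional filters for this iteration."""
    keep = max(1, round(num_filters * (1.0 - dropout_rate)))
    kept = set(random.sample(range(num_filters), keep))
    return [i in kept for i in range(num_filters)]

# Example: a device with half of its compute currently available
rate = choose_dropout_rate(available_compute=0.5, full_compute=1.0)
mask = sample_filter_mask(num_filters=64, dropout_rate=rate)
print(f"dropout rate {rate:.2f}, training {sum(mask)}/64 filters this iteration")
```

Because the rate is re-evaluated each iteration, such a scheme could in principle track time-varying resource availability rather than committing to a fixed reduced model.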