Distributed deep learning has emerged as an essential approach for training large-scale deep neural networks by utilising multiple computational nodes. This methodology partitions the workload either across the training data, so that each node processes a distinct shard of samples with a replica of the model (data parallelism), or across the model itself, so that each node holds and computes a different part of the network (model parallelism).
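As a rough illustration of the data-parallel case, the sketch below shards a dataset's sample indices across workers so that each worker sees a disjoint subset, re-drawn from a fresh global permutation every epoch. The helper name `shard_indices`, the seeding scheme, and the strided split are illustrative assumptions for this sketch, not the specific mechanism of any cited system.

```python
# Minimal data-parallel sharding sketch (plain Python, hypothetical helper).
# Each of `world_size` workers receives a disjoint, strided slice of a
# globally shuffled index permutation; the permutation is re-drawn per epoch.
import random


def shard_indices(num_samples: int, rank: int, world_size: int,
                  epoch: int, seed: int = 0) -> list[int]:
    """Return the sample indices assigned to `rank` for the given `epoch`."""
    rng = random.Random(seed + epoch)   # identical permutation on every worker
    order = list(range(num_samples))
    rng.shuffle(order)                  # global re-shuffle at epoch boundaries
    return order[rank::world_size]      # strided, disjoint shard for this worker


if __name__ == "__main__":
    # Example: 10 samples split across 2 workers; shards are disjoint and
    # together cover the whole dataset.
    for rank in range(2):
        print(rank, shard_indices(10, rank=rank, world_size=2, epoch=0))
```

Because every worker draws the same seeded permutation, the shards never overlap, and changing the epoch changes the permutation so each worker sees different samples across epochs.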
1. T.N. Truong, F. Trahay, J. Domke, A. Drozd, E. Vatai, J. Liao, M. Wahib, B. Gerofi, "Why Globally Re-shuffle? Revisiting Data Shuffling in Large Scale Deep Learning," ...