Profiling and Improving the PyTorch Dataloader for High-Latency Storage: A Technical Report

by Ivan Svogor, et al.

A growing number of machine learning frameworks have recently made deep learning accessible to a wider audience of engineers, scientists, and practitioners by allowing straightforward use of complex neural network architectures and algorithms. However, since deep learning is rapidly evolving, not only through theoretical advancements but also with respect to hardware and software engineering, ML frameworks often lose backward compatibility and accumulate technical debt that can lead to bottlenecks and sub-optimal resource utilization. Moreover, the focus is in most cases not on deep learning engineering, but rather on new models and theoretical advancements. In this work, however, we focus on engineering, more specifically on the data loading pipeline in the PyTorch framework. We designed a series of benchmarks that outline performance issues of certain steps in the data loading process. Our findings show that for classification tasks that involve loading many files, such as images, the training wall-time can be significantly improved. With our new, modified ConcurrentDataloader we achieve higher GPU utilization and significantly reduce batch loading time, by up to 12×. This allows the use of cloud-based, S3-like object storage for datasets while keeping training times comparable to those achieved with datasets stored on local drives.
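The ConcurrentDataloader itself is not shown in the abstract; as a rough illustration of the underlying idea, the sketch below (an assumption on my part, not the paper's implementation) overlaps the per-item fetches within a single batch using a thread pool, so that batch loading time from high-latency storage approaches the latency of one request rather than the sum of all requests. The `load_item` function is a hypothetical stand-in for, e.g., one S3 GET per image.

```python
import time
from concurrent.futures import ThreadPoolExecutor


def load_item(index):
    # Hypothetical stand-in for a high-latency fetch,
    # e.g. downloading one image from S3-like object storage.
    time.sleep(0.05)
    return index * 2


def load_batch_sequential(indices):
    # Baseline: items are fetched one after another, so batch time
    # grows linearly with batch size.
    return [load_item(i) for i in indices]


def load_batch_concurrent(indices, max_workers=16):
    # Overlap the per-item fetches; with enough workers the batch time
    # approaches the latency of a single request.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(load_item, indices))


if __name__ == "__main__":
    indices = list(range(16))

    t0 = time.perf_counter()
    seq = load_batch_sequential(indices)
    t_seq = time.perf_counter() - t0

    t0 = time.perf_counter()
    conc = load_batch_concurrent(indices)
    t_conc = time.perf_counter() - t0

    assert seq == conc  # same batch contents, different wall-time
    print(f"sequential: {t_seq:.2f}s, concurrent: {t_conc:.2f}s")
```

In PyTorch terms, the stock `DataLoader` parallelizes across batches via worker processes (`num_workers`), whereas the approach benchmarked here additionally parallelizes the item fetches inside each batch, which is where per-file latency dominates on object storage.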

