arxiv:2405.18047

2BP: 2-Stage Backpropagation

Published on May 28 · Submitted by akhaliq on May 29
#2 Paper of the day
Authors:

Abstract

As Deep Neural Networks (DNNs) grow in size and complexity, they often exceed the memory capacity of a single accelerator, necessitating the sharding of model parameters across multiple accelerators. Pipeline parallelism is a commonly used sharding strategy for training large DNNs. However, current implementations of pipeline parallelism are unintentionally bottlenecked by the automatic differentiation tools provided by ML frameworks. This paper introduces 2-stage backpropagation (2BP). By splitting the backward propagation step into two separate stages, we can reduce idle compute time. We tested 2BP on various model architectures and pipelining schedules, achieving increases in throughput in all cases. Using 2BP, we achieved a 1.70x increase in throughput compared to traditional methods when training a LLaMa-like transformer with 7 billion parameters across 4 GPUs.
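For intuition, the idle-time reduction comes from decoupling the two gradients that a framework's autodiff normally computes together in the backward pass: the gradient with respect to a layer's input (needed immediately by the upstream pipeline stage) and the gradient with respect to its weights (which can be deferred into otherwise-idle pipeline bubbles). The PyTorch sketch below illustrates this split for a single linear layer; it is not the authors' implementation, and names such as `TwoStageLinear`, `deferred_weight_grads`, and `run_deferred_weight_grads` are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): split a linear layer's backward pass
# into backward-p (gradient w.r.t. the input, computed immediately) and
# backward-w (gradient w.r.t. the weights, deferred to a later stage).
import torch

deferred_weight_grads = []  # work queue for the second backward stage


class TwoStageLinear(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, weight):
        ctx.save_for_backward(x, weight)
        return x @ weight.t()

    @staticmethod
    def backward(ctx, grad_out):
        x, weight = ctx.saved_tensors
        # Stage 1 (backward-p): input gradient, returned right away so it can
        # be passed to the previous pipeline stage without waiting.
        grad_x = grad_out @ weight
        # Stage 2 (backward-w): defer the weight-gradient computation.
        deferred_weight_grads.append((weight, x.detach(), grad_out.detach()))
        return grad_x, None


def run_deferred_weight_grads():
    # Run later, e.g. during pipeline bubbles that would otherwise leave the
    # accelerator idle, accumulating parameter gradients manually.
    while deferred_weight_grads:
        weight, x, grad_out = deferred_weight_grads.pop()
        grad_w = grad_out.t() @ x
        weight.grad = grad_w if weight.grad is None else weight.grad + grad_w


# Usage sketch:
#   y = TwoStageLinear.apply(x, weight); loss.backward()   # stage 1 only
#   run_deferred_weight_grads()                             # stage 2, deferred
```

In a real pipeline schedule, the deferred weight-gradient work would be interleaved with forward and backward-p computations of other micro-batches rather than run in one batch at the end.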

Community

Seems like zero bubble PP: https://huggingface.co/papers/2401.10241

Paper author

It does the same thing. I've been working on and off on 2BP since August 2023; their paper was released in November 2023 (which was a bit of a blow). However, since we test on model architectures beyond transformers and suggest ways to use 2BP to increase GPU occupancy, we still thought it was worth publishing.

There's a simple-English rewrite of the paper here; feedback from the authors is welcome! https://www.aimodels.fyi/papers/arxiv/2bp-2-stage-backpropagation

The Innovation of 2-Stage Backpropagation: Faster DNN Training!

👉 Subscribe: https://www.youtube.com/@Arxflix
👉 Twitter: https://x.com/arxflix
👉 LMNT (Partner): https://lmnt.com/

By Arxflix


Models citing this paper: 0
Datasets citing this paper: 0
Spaces citing this paper: 0
Collections including this paper: 8