CUDA out of memory error.

#67
by ibrim - opened

I'm using a DGX A100 machine with multiple 80 GB GPUs, but I keep getting the following error:

return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 1024.00 MiB (GPU 1; 79.35 GiB total capacity; 75.93 GiB already allocated; 1.86 GiB free; 75.93 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

I've tried reducing the batch size, setting CUDA_VISIBLE_DEVICES to include all the GPUs, and wrapping the model in DataParallel, but nothing works.
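The error message itself suggests setting max_split_size_mb when reserved memory far exceeds allocated memory. A minimal sketch of that suggestion follows; the value 128 is an illustrative guess to tune, and the variable must be set before torch initializes CUDA (i.e. before `import torch` runs):

```python
import os

# PYTORCH_CUDA_ALLOC_CONF controls the caching allocator; max_split_size_mb
# caps the size of splittable blocks to reduce fragmentation.
# Set this at the very top of the training script, before importing torch.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"  # 128 MiB is an assumed starting value
```

This only helps with fragmentation; if the model's activations genuinely exceed 80 GB on one device, the batch size or sequence length still has to come down.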

+-----------------------------------------------------------------------------+
| NVIDIA-SMI 450.80.02 Driver Version: 450.80.02 CUDA Version: 11.0 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|===============================+======================+======================|
| 0 Graphics Device On | 00000000:01:00.0 Off | 0 |
| N/A 50C P0 68W / 275W | 7095MiB / 81252MiB | 0% Default |
| | | Disabled |
+-------------------------------+----------------------+----------------------+
| 1 Graphics Device On | 00000000:47:00.0 Off | 0 |
| N/A 51C P0 72W / 275W | 535MiB / 81252MiB | 0% Default |
| | | Disabled |
+-------------------------------+----------------------+----------------------+
| 2 Graphics Device On | 00000000:81:00.0 Off | 0 |
| N/A 50C P0 69W / 275W | 2131MiB / 81252MiB | 0% Default |
| | | Disabled |
+-------------------------------+----------------------+----------------------+
| 3 DGX Display On | 00000000:C1:00.0 Off | N/A |
| 35% 47C P8 N/A / 50W | 1MiB / 3911MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
| 4 Graphics Device On | 00000000:C2:00.0 Off | 0 |
| N/A 49C P0 65W / 275W | 3MiB / 81252MiB | 0% Default |
| | | Disabled |
+-------------------------------+----------------------+----------------------+

GPU details

Test with batch size 1 first.
Make sure your max sequence length is 2048.
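The two suggestions above can be sketched framework-agnostically as follows; `samples` and the helper `batches` are illustrative stand-ins for the tokenized dataset and data loader:

```python
# Enforce the 2048-token limit and iterate with batch size 1.
MAX_SEQ_LEN = 2048
BATCH_SIZE = 1

# Dummy "tokenized" samples: one over the limit, one under it.
samples = [list(range(3000)), list(range(1500))]

def batches(data, batch_size=BATCH_SIZE, max_len=MAX_SEQ_LEN):
    """Yield mini-batches, truncating each sequence to max_len tokens."""
    for i in range(0, len(data), batch_size):
        yield [seq[:max_len] for seq in data[i:i + batch_size]]

all_batches = list(batches(samples))
# The 3000-token sample is cut to 2048; the 1500-token one is untouched.
```

If batch size 1 with 2048-token sequences still OOMs, the model itself does not fit on one GPU and you need model parallelism rather than DataParallel, which replicates the full model on every device.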
