Question: Measure how long it takes to complete 1 epoch of training using different batch sizes on a single GPU. Start from batch size 32 and increase 4-fold for each measurement (i.e., 32, 128, 512, ...) until a single GPU's memory can no longer hold the batch. For each configuration, run 2 epochs: the first epoch is used to warm up the CPU/GPU caches, and you should report the training time of the 2nd epoch only (excluding data I/O, but including data movement from CPU to GPU, gradient calculation, and weight updates).
Step by Step Solution
Answer: Here is Python-like pseudocode to illustrate the process, using `time` and `torch`.
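The original answer is truncated, so below is a minimal self-contained sketch of the measurement loop, assuming a placeholder model and a synthetic dataset (the real model, dataset, and loss would be substituted in). It pre-fetches all batches into CPU memory so disk I/O is excluded, times only the second epoch, and includes the CPU-to-GPU transfer, backward pass, and optimizer step in the measurement. `torch.cuda.synchronize()` is needed before reading the clock because CUDA kernels launch asynchronously.

```python
import time
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Toy stand-ins: replace with your real model, loss, and dataset.
model = nn.Linear(1024, 10).to(device)
criterion = nn.CrossEntropyLoss()
data = TensorDataset(torch.randn(4096, 1024), torch.randint(0, 10, (4096,)))

def time_one_epoch(batch_size):
    """Run 2 epochs at the given batch size; return the 2nd epoch's wall time."""
    loader = DataLoader(data, batch_size=batch_size, shuffle=False)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    # Materialize all batches in CPU RAM first so data I/O is excluded,
    # while the CPU->GPU copy below remains inside the timed region.
    batches = [(x, y) for x, y in loader]
    elapsed = 0.0
    for epoch in range(2):  # epoch 0 warms up caches; epoch 1 is measured
        if device.type == "cuda":
            torch.cuda.synchronize()
        start = time.perf_counter()
        for x, y in batches:
            x, y = x.to(device), y.to(device)  # CPU->GPU movement (included)
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()                    # gradient calculation
            optimizer.step()                   # weight update
        if device.type == "cuda":
            torch.cuda.synchronize()           # wait for queued GPU work
        elapsed = time.perf_counter() - start  # overwritten: keeps 2nd epoch
    return elapsed

# Sketch uses a fixed list for illustration; in practice, keep multiplying
# the batch size by 4 until torch.cuda.OutOfMemoryError is raised.
for batch_size in [32, 128, 512]:
    try:
        t = time_one_epoch(batch_size)
        print(f"batch_size={batch_size}: {t:.3f}s")
    except torch.cuda.OutOfMemoryError:
        print(f"batch_size={batch_size}: out of GPU memory, stopping")
        break
```

Note that catching `torch.cuda.OutOfMemoryError` (available in recent PyTorch releases) is what terminates the sweep on a real GPU; on a CPU-only machine the loop simply runs all listed sizes.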
