Question: Measure how long it takes to complete 1 epoch of training using different batch sizes on a single GPU. Start from batch size 32 and increase 4-fold for each measurement (i.e., 32, 128, 512, ...) until a single GPU's memory can no longer hold the batch. For each batch size, run 2 epochs: the first epoch is used to warm up the CPU/GPU caches, and you should report the training time of the 2nd epoch (excluding data I/O, but including data movement from CPU to GPU, gradient calculation, and weight updates).

Answer: Here is Python-like pseudocode to illustrate the process, using `time` and `torch`.
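The original answer is truncated, so the following is a minimal sketch of the timing harness under stated assumptions: PyTorch with a CUDA device, and placeholder names (`make_loader`, `model`, `opt`, `loss_fn`) that you would supply yourself. Batches are pre-fetched before the timer starts so that data I/O is excluded, while CPU-to-GPU transfers, the backward pass, and the optimizer step all fall inside the timed region.

```python
import time

def batch_size_schedule(start=32, factor=4, limit=8192):
    """Yield batch sizes 32, 128, 512, ... up to `limit` (an assumed cap;
    in practice you stop when the GPU runs out of memory)."""
    b = start
    while b <= limit:
        yield b
        b *= factor

def time_one_epoch(model, loader, optimizer, loss_fn, device):
    """Time one epoch of compute: includes CPU->GPU copies, the forward and
    backward passes, and the weight update; excludes data I/O by
    pre-fetching all batches before starting the clock."""
    import torch
    batches = [(x, y) for x, y in loader]   # pre-fetch: disk I/O not timed
    torch.cuda.synchronize(device)          # drain pending GPU work
    start = time.perf_counter()
    for x, y in batches:
        x, y = x.to(device), y.to(device)   # timed: CPU -> GPU movement
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()                     # timed: gradient calculation
        optimizer.step()                    # timed: weight update
    torch.cuda.synchronize(device)          # wait for queued GPU kernels
    return time.perf_counter() - start

# Usage sketch (runs only with CUDA; model/loader/optimizer are yours):
#   device = torch.device("cuda")
#   for bs in batch_size_schedule():
#       loader = make_loader(batch_size=bs)          # hypothetical helper
#       try:
#           time_one_epoch(model, loader, opt, loss_fn, device)  # warm-up
#           t = time_one_epoch(model, loader, opt, loss_fn, device)
#           print(f"batch size {bs}: {t:.2f}s")
#       except torch.cuda.OutOfMemoryError:
#           break  # GPU memory cannot hold this batch size; stop here
```

The explicit `torch.cuda.synchronize` calls matter because CUDA kernel launches are asynchronous: without them, `time.perf_counter()` would measure only how fast work was queued, not how long it actually ran.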

