Question: An algorithm takes 0.5 seconds to run on an input of size 100. How long will it take to run on an input of size 1000 if its running time is linear? Log-linear? Quadratic? Cubic?
Step-by-Step Solution
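If the running time is T(n) = c·f(n) for some growth function f, the constant c cancels when comparing two input sizes: T(1000) = T(100) · f(1000)/f(100). Going from n = 100 to n = 1000 is a 10× increase, so linear gives 10 × 0.5 = 5 s; log-linear gives 10 · (log 1000 / log 100) = 15×, i.e. 7.5 s (the logarithm base cancels in the ratio); quadratic gives 100×, i.e. 50 s; cubic gives 1000×, i.e. 500 s. A minimal sketch of that arithmetic:

```python
import math

base_n, base_t = 100, 0.5   # measured: 0.5 s on an input of size 100
new_n = 1000

# Assuming T(n) = c * f(n), the predicted time at new_n is
# base_t * f(new_n) / f(base_n); the constant c cancels.
growth = {
    "linear":     lambda n: n,
    "log-linear": lambda n: n * math.log2(n),  # base is irrelevant in the ratio
    "quadratic":  lambda n: n ** 2,
    "cubic":      lambda n: n ** 3,
}

predicted = {}
for name, f in growth.items():
    predicted[name] = base_t * f(new_n) / f(base_n)
    print(f"{name:11s} -> {predicted[name]:.2f} s")
```

Running this prints 5.00 s (linear), 7.50 s (log-linear), 50.00 s (quadratic), and 500.00 s (cubic).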
