Question:
An algorithm takes 0.5 milliseconds for an input size of 100. How long will it take for input sizes of 500, 1000, 10000, 100000, if the running time is the following?
- Linear
- O(N log N)
- Quadratic
- Cubic
- Exponential
Hints:
- 1000 milliseconds = 1 second
- You can abbreviate "milliseconds" as "ms" and "seconds" as "s".
- Units: each numerical answer must include the units (either ms or s).
- Show your work
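One way to work these out (a sketch, under the usual assumption that the running time scales by the ratio of the growth functions, T(N) = T(100) · f(N)/f(100), with T(100) = 0.5 ms):

```python
import math

def scaled_time(f, n, base_n=100, base_ms=0.5):
    """Scale the known time at base_n by the growth-function ratio f(n)/f(base_n)."""
    return base_ms * f(n) / f(base_n)

# Growth functions for the polynomial and N log N cases.
# The base of the logarithm cancels in the ratio, so log2 is an arbitrary choice.
growth = {
    "linear":    lambda n: n,
    "N log N":   lambda n: n * math.log2(n),
    "quadratic": lambda n: n ** 2,
    "cubic":     lambda n: n ** 3,
}

for name, f in growth.items():
    for n in (500, 1000, 10000, 100000):
        ms = scaled_time(f, n)
        print(f"{name:9s} N={n:6d}: {ms:12.2f} ms = {ms / 1000:.5f} s")

# Exponential (taking f(N) = 2**N, an assumption since the base is not given):
# the ratio is 2**(N - 100), so even N = 500 gives 0.5 * 2**400 ms,
# far beyond any practical timescale; it is clearer to leave the answer symbolic.
for n in (500, 1000, 10000, 100000):
    print(f"exponential N={n:6d}: 0.5 * 2**{n - 100} ms")
```

For example, the linear case gives 2.5 ms, 5 ms, 50 ms, and 500 ms, since the input grows by factors of 5, 10, 100, and 1000; the quadratic case squares those factors (12.5 ms, 50 ms, 5 s, 500 s).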