Question: Suppose a program takes 20 milliseconds to process input of size 10,000. If the runtime is T(n) = k·log(n), where n is the input size, how long will the program take to process input of size (a) 100,000,000, (b) 100, and (c) 10?
T(n) = k·log(n)

If we square the input size: T(n²) = k·log(n²) = 2k·log(n) = 2T(n).
a) 100,000,000? Since 100,000,000 = 10,000², squaring the input doubles the runtime: T(10,000²) = 2·T(10,000) = 2 × 20 ms = 40 ms.
b) 100? Since 100 = √10,000, taking the square root of the input halves the runtime: T(100) = ½·T(10,000) = 10 ms.
c) 10? Solve for k first: 20 ms = k·log₁₀(10,000) = 4k, so k = 5 ms. Then T(10) = 5·log₁₀(10) = 5 ms.
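The arithmetic above can be checked with a short sketch. It fits k from the given data point (20 ms at n = 10,000) using base-10 logs; the choice of base only rescales k and does not change T(n):

```python
import math

# Given: T(n) = k * log(n) with T(10_000) = 20 ms.
# Fit k using base-10 logs: 20 = k * log10(10_000) = 4k, so k = 5 ms.
k = 20 / math.log10(10_000)

def T(n):
    """Runtime in milliseconds for input size n."""
    return k * math.log10(n)

print(T(100_000_000))  # (a) 40.0 ms -- squaring n doubles the runtime
print(T(100))          # (b) 10.0 ms -- square-rooting n halves it
print(T(10))           # (c) 5.0 ms
```

Using natural log instead would give a different k (20 / ln 10,000) but identical values of T(n), since the base cancels in the ratio.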