Question: There are two algorithms, Alg1 and Alg2, for a problem of size n. Alg1 runs in n^2 microseconds and Alg2 runs in 100 n log n microseconds. Alg1 can be implemented using 4 hours of programmer time and needs 2 minutes of CPU time; Alg2 requires 15 hours of programmer time and 6 minutes of CPU time. If programmers are paid 20 dollars per hour and CPU time costs 50 dollars per minute, how many times must a problem instance of size 500 be solved using Alg2 in order to justify its development cost?
Now, let x be the number of times a problem instance of size 500 is solved, and let c1(x) and c2(x) be the total costs (in dollars) of using Alg1 and Alg2. Counting only programmer time in the fixed (development) cost, Alg1 costs 4 * 20 = 80 dollars to develop and Alg2 costs 15 * 20 = 300 dollars. With running times in microseconds and CPU time at 50 dollars per minute, the total costs are

c1(x) = n^2 * (50/60) * 10^(-6) * x + 80
c2(x) = 100 * n * log(n) * (50/60) * 10^(-6) * x + 300

Putting n = 500 (with log taken to base 10) and setting c1(x) = c2(x):

500^2 * (50/60) * 10^(-6) * x + 80 = 100 * 500 * log(500) * (50/60) * 10^(-6) * x + 300

Multiplying both sides by 24 to simplify the coefficients:

5x + 1920 = log(500) * x + 7200
5x - log(500) * x = 5280
(5 - 2.699) * x = 5280
2.301x = 5280
x ≈ 2294.6

Rounding up to a whole number of runs, x = 2295.

Hence, the program must be run 2295 times to justify the development cost of Alg2.
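As a sanity check, here is a minimal Python sketch of the same break-even calculation, under the same assumptions as the hand calculation above: running times in microseconds, log taken to base 10, and only programmer time counted in the fixed development cost. The constant names and the helper run_cost_dollars are just for illustration, not part of the original answer.

```python
import math

CPU_DOLLARS_PER_MINUTE = 50
MICROSECONDS_PER_MINUTE = 60 * 10**6
PROGRAMMER_DOLLARS_PER_HOUR = 20

def run_cost_dollars(microseconds):
    """Dollar cost of the CPU time for one run taking the given microseconds."""
    return microseconds / MICROSECONDS_PER_MINUTE * CPU_DOLLARS_PER_MINUTE

n = 500
alg1_per_run = run_cost_dollars(n ** 2)                   # ~$0.2083 per run
alg2_per_run = run_cost_dollars(100 * n * math.log10(n))  # ~$0.1125 per run

alg1_fixed = 4 * PROGRAMMER_DOLLARS_PER_HOUR    # $80 development cost
alg2_fixed = 15 * PROGRAMMER_DOLLARS_PER_HOUR   # $300 development cost

# Break-even x where alg1_per_run*x + alg1_fixed == alg2_per_run*x + alg2_fixed
x = (alg2_fixed - alg1_fixed) / (alg1_per_run - alg2_per_run)
print(f"break-even at x ~ {x:.1f}, so {math.ceil(x)} runs")  # ~2294.6 -> 2295
```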
"Compile time was not accounted in your fixed cost."
