[Maximum mark: 17]
The time taken for a computer to process a task is determined by the number of operations it needs to perform. The number of operations, p, and the time taken, t seconds, for a specific algorithm were recorded. This information is summarized as follows:
When p = 100, t = 2 s;
When p = 500, t = 10 s;
When p = 1000, t = 40 s.
This information was used to create Model C, where t is a function of p.

Model C: t(p) = rp² + sp, where r, s ∈ ℤ
At 100 operations, Model C can be represented by the equation:
100r + s = 2
(a) Write down a second equation to represent Model C when the number of operations is 500.
(b) Find the value of r and of s.
(c) Find the coordinates of the vertex of the graph of y = t(p).
(d) Using the values given and your answer to part (b), sketch the graph of y = t(p) for 0 ≤ p ≤ 1000 and 0 ≤ t ≤ 50, clearly showing the vertex.
(e) Hence, identify why Model C may not be appropriate for very large numbers of operations. [1]
Additional data was used to create Model D, a revised model for the time taken by the algorithm.
Model D: t(p) = 0.8p² − 2.5p
(f) Use Model D to calculate an estimate for the time taken for 2000 operations. [2]
(g) The actual time taken for 2000 operations is 3200 seconds. Calculate the percentage error in the estimate in part (f). [2]
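As a quick check of the arithmetic for the estimate in part (f) and its percentage error, here is a minimal Python sketch. It assumes the standard percentage error formula, |estimate − actual| / |actual| × 100; the name model_d is just a label introduced for this sketch.

```python
# Model D from the question: t(p) = 0.8p^2 - 2.5p
def model_d(p):
    return 0.8 * p**2 - 2.5 * p

estimate = model_d(2000)  # estimated time for 2000 operations, in seconds
actual = 3200             # actual time stated in the question, in seconds

# Percentage error: |estimate - actual| / |actual| * 100
pct_error = abs(estimate - actual) / abs(actual) * 100

print(estimate)   # 3195000.0
print(pct_error)  # 99743.75
```

The estimate is compared against the stated actual time of 3200 seconds.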
It is found that once an algorithm starts, there is an average delay of 0.5 seconds before the first operation is executed. An algorithm is given a task with p operations and must complete within 5000 seconds.
(h) Using Model D and taking the delay into account, calculate the maximum number of operations the algorithm can handle. [3]
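For the final part, one way to find the maximum number of operations can be sketched in Python, assuming the 0.5-second delay simply adds to Model D's time, so the constraint is 0.8p² − 2.5p + 0.5 ≤ 5000. The sketch solves the corresponding quadratic equation for its positive root and rounds down to a whole number of operations.

```python
import math

# Constraint under Model D plus the 0.5 s start-up delay:
#   0.8p^2 - 2.5p + 0.5 <= 5000
# Solve 0.8p^2 - 2.5p + (0.5 - 5000) = 0 for its positive root.
a, b, c = 0.8, -2.5, 0.5 - 5000
p_positive_root = (-b + math.sqrt(b**2 - 4 * a * c)) / (2 * a)

# The number of operations must be a whole number, so round down.
max_operations = math.floor(p_positive_root)
print(max_operations)  # 80
```

Rounding down is the design choice here: the next integer up would push the total time over the 5000-second limit.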