
The programmer involved in question 1 realizes that the program needs debugging and modifies it so that 1% of the statements now print a value to the terminal. Suppose it takes 100 nanoseconds (100*10^-9 seconds) to perform the corresponding I/O operation for each print statement, and no other statement from the program can be executed until the I/O has completed. How long would the program now take to complete? (Assume the total number of statements is the same; just 1% of them now perform I/O.)
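A minimal sketch of the added I/O cost, in Python. It assumes the program still executes 5*10^12 statements (stated in the next question); the base CPU time comes from question 1, which is not restated on this page, so only the I/O overhead is computed here.

total_statements = 5 * 10**12   # from question 2's setup below (assumption)
io_fraction = 0.01              # 1% of statements now print to the terminal
io_time_per_print = 100e-9      # 100 ns of blocking I/O per print statement

extra_io_seconds = total_statements * io_fraction * io_time_per_print
print(extra_io_seconds)         # 5000.0 seconds of added I/O time

# Total runtime = (CPU time from question 1) + 5000 seconds, since
# no other statement can execute while an I/O is in progress.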
The programmer from question 2 successfully debugs the program and removes all of the I/O statements (still leaving 5*10^12 statements). Wanting to speed up execution, they move the program to a (non-pipelined) machine with a 4 GHz processor. However, this computer has a 3-step instruction cycle, so it can only complete 1 instruction every 3 clock cycles. Will the program run faster on this new computer? Why or why not?
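A sketch of the new machine's throughput, again in Python. The old machine's clock rate comes from question 1 and is not restated here, so the comparison at the end is stated as a condition rather than a definitive answer.

statements = 5 * 10**12
new_clock_hz = 4e9                    # 4 GHz clock
cycles_per_instruction = 3            # non-pipelined, 3-step instruction cycle

new_rate = new_clock_hz / cycles_per_instruction   # ~1.33e9 instructions/s
new_seconds = statements / new_rate
print(new_seconds)                    # 3750.0 seconds on the new machine

# The program runs faster only if this effective rate of 4/3 * 10^9
# instructions per second beats the old machine's rate from question 1.
# For example, a hypothetical 2 GHz processor completing 1 instruction
# per cycle (2 * 10^9 instructions/s) would make the old machine faster
# despite its lower clock speed.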
