Question 1
Consider the following code segment that examines the elements of two lists:
matches = 0
for i in range(len(lst1)) :
    for j in range(len(lst2)) :
        if lst1[i] == lst2[j] :
            matches = matches + 1
What can you conclude about the running time of this code segment if both lists contain n elements?
Question options:
Its running time will be O(n).
Its running time will be O(n²).
Its running time will be O(log n).
Its running time will be O(n log n).
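The segment can be tried out directly. The sketch below uses hypothetical sample lists of length n = 4 and adds a comparisons counter purely for illustration, to show how many times the inner comparison runs.

# Minimal sketch with assumed sample lists; counts the comparisons
# performed by the nested loops.
lst1 = [1, 2, 3, 4]
lst2 = [3, 4, 5, 6]
matches = 0
comparisons = 0
for i in range(len(lst1)) :
    for j in range(len(lst2)) :
        comparisons = comparisons + 1
        if lst1[i] == lst2[j] :
            matches = matches + 1
print(matches)       # 2
print(comparisons)   # 16 = n * n comparisons for n = 4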
Question 2
In the textbook, we found that the number of element visits for merge sort totaled n + 5n log₂ n. Which of the following is the appropriate big-Oh notation for merge sort?
Question options:
5n log₂ n
n + log₂ n
n + 5n
n log₂ n
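To see which term dominates, the expression can be evaluated for growing n. This is a quick numeric check, not part of the question; the values of n are arbitrary.

# Ratio of n + 5n*log2(n) to n*log2(n) for a few assumed values of n.
from math import log2
for n in [10, 100, 1000, 10**6] :
    visits = n + 5 * n * log2(n)
    print(n, visits / (n * log2(n)))   # ratio settles near the constant 5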
Question 3
Consider the swap function shown below, from the SelectionSorter class. If we modified it as shown in the swap2 function that follows, what would be the effect on the sort function?
def swap(values, i, j) :
    temp = values[i]
    values[i] = values[j]
    values[j] = temp

def swap2(values, i, j) :
    values[i] = values[j]
    values[j] = values[i]
Question options:
There would be no effect
Some list elements would be overwritten
It would sort the list in reverse order
It would still be correct, but run a little faster
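The two versions can be compared on a small test list, assuming the swap and swap2 definitions above have been run; the lists below are hypothetical and chosen only to make the difference visible.

# Contrast swap and swap2 on assumed test lists.
a = [10, 20, 30]
swap(a, 0, 2)
print(a)    # [30, 20, 10]

b = [10, 20, 30]
swap2(b, 0, 2)
print(b)    # [30, 20, 30] -- the original b[0] is lost before it is copied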
Question 4
A particular algorithm visits n³ + n log(n) + n! elements in order to perform its task on a list of n elements. Which big-Oh expression best describes the growth rate of this algorithm?
Question options:
O(n)
O(n³)
O(n!)
O(n log(n))
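A rough numeric comparison makes the growth rates of the three terms concrete; the values of n below are arbitrary.

# Compare the three terms for a few assumed values of n.
from math import log, factorial
for n in [5, 10, 15, 20] :
    print(n, n**3, n * log(n), factorial(n))   # n! quickly dwarfs the other terms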
Question 5
Consider the following function, which correctly merges two lists:
def mergeLists(first, second, values) :
    iFirst = 0
    iSecond = 0
    j = 0
    while iFirst < len(first) and iSecond < len(second) :
        if first[iFirst] < second[iSecond] :
            values[j] = first[iFirst]
            iFirst = iFirst + 1
        else :
            values[j] = second[iSecond]
            iSecond = iSecond + 1
        j = j + 1
    while iFirst < len(first) :
        values[j] = first[iFirst]
        iFirst = iFirst + 1
        j = j + 1
    while iSecond < len(second) :
        values[j] = second[iSecond]
        iSecond = iSecond + 1
        j = j + 1
What is the big-Oh complexity of this algorithm, where n is the total number of elements in first and second?
Question options:
O(1)
O(log(n))
O(n)
O(n²)
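The function can be exercised on small sorted inputs, assuming the mergeLists definition above has been run; the input lists are hypothetical.

# Exercise mergeLists on assumed sorted inputs; each of the n elements
# is copied into values exactly once.
first = [1, 4, 7]
second = [2, 3, 9, 10]
values = [0] * (len(first) + len(second))
mergeLists(first, second, values)
print(values)   # [1, 2, 3, 4, 7, 9, 10]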