How do I analyze the running time of this recursive code — is it O(n)?
A = [10, 8, 7, 6, 5]

def Algorithm(A):
    ai = max(A)              # find largest integer
    i = A.index(ai)
    A[i] = 0
    aj = max(A)              # find second largest integer
    A[i] = abs(ai - aj)      # update A[i]
    j = A.index(aj)
    A[j] = 0                 # replace A[j] by 0
    if aj == 0:              # if the second largest item equals
        return ai            # zero, return the largest integer
    return Algorithm(A)      # call Algorithm(A) with updated A
Posted on 2018-11-18 06:04:17

Here is the line-by-line breakdown:
def Algorithm(A):
    ai = max(A)              # O(n)
    i = A.index(ai)          # O(n)
    A[i] = 0                 # O(1)
    aj = max(A)              # O(n)
    A[i] = abs(ai - aj)      # O(1)
    j = A.index(aj)          # O(n)
    A[j] = 0                 # O(1)
    if aj == 0:              # O(1)
        return ai            # O(1)
    return Algorithm(A)      # recursive call, made up to n times
As long as max(A) is not 0, the recursive call on the last line is taken; in the worst case, when all entries are positive, this happens up to n times, because each call sets at least one of the remaining nonzero entries to zero. Everything above the last line costs O(n) per call, and the last line causes that work to be repeated up to n times, so the total is O(n^2).
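A quick way to see the quadratic total in practice is to count scans instead of timing. The following is only a sketch (count_scans and the 4 * n accounting are illustrative and not part of the original code, and the recursion is unrolled into a loop so the counter does not depend on Python's recursion limit); it tallies roughly how many elements max() and index() inspect across all calls:

def count_scans(A):
    """Total number of elements inspected by max()/index() over all calls."""
    scanned = 0
    while True:
        n = len(A)
        ai = max(A)                 # scans n elements
        i = A.index(ai)             # scans up to n elements
        A[i] = 0
        aj = max(A)                 # scans n elements
        A[i] = abs(ai - aj)
        j = A.index(aj)             # scans up to n elements
        A[j] = 0
        scanned += 4 * n            # O(n) work charged to this call
        if aj == 0:
            return scanned

for n in (100, 200, 400):
    print(n, count_scans(list(range(1, n + 1))))
    # doubling n roughly quadruples the count, consistent with O(n^2)

Doubling n should roughly quadruple the printed count, which is exactly what the O(n^2) bound predicts.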
Posted on 2018-11-18 06:10:22

At first I was a bit skeptical about whether your algorithm really runs in O(n), so I used the program below.
import timeit, random
import matplotlib.pyplot as plt

code = """
def Algorithm(A):
    ai = max(A)              # find largest integer
    i = A.index(ai)
    A[i] = 0
    aj = max(A)              # find second largest integer
    A[i] = abs(ai - aj)      # update A[i]
    j = A.index(aj)
    A[j] = 0                 # replace A[j] by 0
    if aj == 0:              # if second largest item equals
        return ai            # zero, return the largest integer
    return Algorithm(A)      # call Algorithm(A) with updated A
Algorithm(%s)
"""

x, y = [], []
lst = [random.randint(-1000, 10000)]
for i in range(1000):
    lst.append(random.randint(-1000, 10000))
    time = timeit.timeit(stmt=code % lst, number=10)
    x.append(i)
    y.append(time)
plt.plot(x, y)
plt.show()
It measures (and then plots) the algorithm's running time for randomly generated lists of increasing length. The resulting curve clearly shows non-linear growth. In other words, the measurements agree with the O(n^2) analysis above, and they rule out the claim that the algorithm runs in O(n).
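If you prefer a numeric check to reading a plot, a rough doubling test works too. This is a minimal sketch under the same setup as above (the sizes, number=20, and the use of timeit with a callable are my choices, not part of the original answer); it reruns the Algorithm from the question on fresh copies of random data:

import timeit, random

def Algorithm(A):
    ai = max(A)
    i = A.index(ai)
    A[i] = 0
    aj = max(A)
    A[i] = abs(ai - aj)
    j = A.index(aj)
    A[j] = 0
    if aj == 0:
        return ai
    return Algorithm(A)

for n in (100, 200, 400):
    data = [random.randint(1, 10000) for _ in range(n)]
    # copy the list each run, since Algorithm mutates its input
    t = timeit.timeit(lambda: Algorithm(list(data)), number=20)
    print(n, round(t, 4))

For an O(n^2) algorithm the printed times should roughly quadruple each time n doubles, whereas an O(n) algorithm would only double.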
https://stackoverflow.com/questions/53355560