In this assignment, you will compare the characteristics and performance of different classifiers, namely logistic regression, k-nearest neighbours, and naive Bayes. You will experiment with these classifiers and extend the provided code. Note that you should understand the code first instead of using it as a black box. Python versions of the code have been provided; you are free to work with whichever you wish.
Logistic regression, k-nearest neighbours, and naive Bayes are three fundamental machine-learning algorithms. This assignment provides basic implementations of all three, but the unfinished test functions must be implemented to fit the scheduling logic of the test framework. The assignment is engineering-oriented: working through the debugging will deepen your understanding of the algorithms.
Below is an implementation of the check_grad function, which compares an analytic gradient against a centred finite-difference estimate:
import numpy as np
import numpy.linalg as LA

def check_grad(func, X, epsilon, *args):
    """Compare the analytic gradient returned by func with a
    centred finite-difference estimate at the point X."""
    if len(X.shape) != 2 or X.shape[1] != 1:
        raise ValueError("X must be a column vector")
    y, dy = func(X, *args)[:2]     # func returns (value, gradient, ...)
    dh = np.zeros((len(X), 1))     # numerical gradient estimate
    for j in range(len(X)):
        dx = np.zeros((len(X), 1))
        dx[j] += epsilon
        y2 = func(X + dx, *args)[0]        # f(x + eps * e_j)
        y1 = func(X - dx, *args)[0]        # f(x - eps * e_j)
        dh[j] = (y2 - y1) / (2 * epsilon)  # centred difference
    print(np.hstack((dy, dh)))     # analytic vs numerical, side by side
    # relative error: norm of the difference divided by norm of the sum
    d = LA.norm(dh - dy) / LA.norm(dh + dy)
    return d
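As a quick sanity check, check_grad should report a relative error close to machine precision for a function whose gradient is known in closed form. The quadratic test function below is my own illustration, not part of the assignment code; check_grad is repeated in compact form so the snippet runs standalone:

```python
import numpy as np
import numpy.linalg as LA

def check_grad(func, X, epsilon, *args):
    # Same routine as above, without the debug print.
    y, dy = func(X, *args)[:2]
    dh = np.zeros((len(X), 1))
    for j in range(len(X)):
        dx = np.zeros((len(X), 1))
        dx[j] += epsilon
        y2 = func(X + dx, *args)[0]
        y1 = func(X - dx, *args)[0]
        dh[j] = (y2 - y1) / (2 * epsilon)
    return LA.norm(dh - dy) / LA.norm(dh + dy)

def quadratic(X):
    # f(x) = x^T x has the exact gradient 2x
    return float(X.T @ X), 2 * X

X = np.array([[1.0], [-2.0], [3.0]])
err = check_grad(quadratic, X, 1e-5)
print(err)
```

For a quadratic the centred difference is exact up to floating-point rounding, so the reported relative error should be tiny (well below 1e-6); a large value would indicate a bug in the analytic gradient.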
Original-work statement: this article is published on the Tencent Cloud Developer Community with the author's authorization and may not be reproduced without permission. For infringement concerns, contact cloudcommunity@tencent.com for removal.