
Python Assignment Help | Machine Learning Assignment Help - Assignment 4: k-Means, Gaussian Mixture Models, EM


A partial excerpt of the assignment follows.

Submission Instructions. You must typeset the answers to the theory questions using LaTeX or Microsoft Word and compile them into a single PDF file. Add your NetID to every filename you submit. Create a ZIP file containing both the PDF file and the completed Jupyter notebook. Name it ⟨Your-NetID⟩ hw4.zip. Submit the ZIP file on NYU Classes. The due date is May 6, 2020, 11:55 PM.

Show your work in the problems below!

1. k-Means learning with Gradient Descent. (10 points) k-means clustering can be reinterpreted in a probabilistic setting as a set of observed data points $\{x_i\}_{i=1}^{n}$ and a latent (unobserved) cluster assignment for each of those points, $\{z_i\}_{i=1}^{n}$ (Lecture Notes §4.2).

Your task is to describe the procedure for training this model via gradient descent. You must clearly write the update steps for both W (the cluster centers) and Z (the assignments) and discuss the algorithm, either in words or using pseudocode; a minimal sketch of one such scheme is given below.

Please read this paper about k-Means and gradient descent (it can help with this task): https://papers.nips.cc/paper/989-convergence-properties-of-the-k-means-algorithms.pdf
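For orientation, here is a minimal NumPy sketch of one such alternating scheme, assuming the usual objective J(W, Z) = sum_i ||x_i - w_{z_i}||^2: the Z-step is a hard nearest-center assignment, and the W-step takes one gradient step on J with the assignments held fixed. The function name kmeans_gd and the hyperparameters lr and n_iters are illustrative choices, not part of the assignment.

import numpy as np

def kmeans_gd(X, k, lr=0.01, n_iters=200, seed=0):
    """Sketch: k-means via alternating hard assignments (Z) and
    gradient steps on the centers (W), for J = sum_i ||x_i - w_{z_i}||^2."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Initialize the centers W from k distinct data points.
    W = X[rng.choice(n, size=k, replace=False)].astype(float)
    Z = np.zeros(n, dtype=int)
    for _ in range(n_iters):
        # Z-step: z_i = argmin_k ||x_i - w_k||^2 (minimizes J over Z with W fixed).
        dists = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2)  # shape (n, k)
        Z = dists.argmin(axis=1)
        # W-step: one gradient step; dJ/dw_k = -2 * sum_{i: z_i = k} (x_i - w_k).
        for j in range(k):
            members = X[Z == j]
            if len(members) > 0:
                grad = -2.0 * (members - W[j]).sum(axis=0)
                W[j] -= lr * grad
    return W, Z

As the linked Bottou and Bengio paper analyzes, choosing a per-cluster step size of 1/(2 n_k), where n_k is the size of cluster k, turns this W-step into the classic mean recomputation, so standard batch k-means can be read as gradient descent with that particular learning-rate schedule.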


2. Gaussian Mixture Models. (10 points)


3. Expectation Maximization. (10 points) Write a step-by-step derivation of the covariance matrix update in the M-step (Maximization step) of Gaussian Mixture Model learning.

Additional details (to make the derivation easier; a sketch of the expected shape of the result is given after this list):

1. Start with the EM objective function and take its derivative with respect to the elements of the covariance matrix.

2. Assume the covariance matrix is diagonal.

3. Assume the new mean vector has already been computed in each iteration.

4. See the pages around page 56 of the lecture notes for further details.
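As an illustration only (the notation below, with responsibilities γ_{ik} from the E-step and mixing weights π_k, is an assumption and may differ from the lecture notes), the derivation typically starts from the expected complete-data log-likelihood:

% Sketch under the stated assumptions: diagonal Sigma_k, responsibilities
% gamma_{ik} fixed by the E-step, new means mu_k already computed.
\begin{align*}
Q &= \sum_{i=1}^{n}\sum_{k=1}^{K}\gamma_{ik}
     \bigl[\log\pi_k + \log\mathcal{N}(x_i \mid \mu_k, \Sigma_k)\bigr],\\
\log\mathcal{N}(x_i \mid \mu_k, \Sigma_k)
  &= -\frac{1}{2}\sum_{d=1}^{D}\Bigl[\log\bigl(2\pi\sigma_{kd}^{2}\bigr)
     + \frac{(x_{id}-\mu_{kd})^{2}}{\sigma_{kd}^{2}}\Bigr]
     \quad\text{for}\ \Sigma_k=\operatorname{diag}(\sigma_{k1}^{2},\dots,\sigma_{kD}^{2}),\\
\frac{\partial Q}{\partial \sigma_{kd}^{2}}
  &= \sum_{i=1}^{n}\gamma_{ik}\Bigl[-\frac{1}{2\sigma_{kd}^{2}}
     + \frac{(x_{id}-\mu_{kd})^{2}}{2\sigma_{kd}^{4}}\Bigr]=0
  \;\Longrightarrow\;
  \sigma_{kd}^{2}=\frac{\sum_{i}\gamma_{ik}\,(x_{id}-\mu_{kd})^{2}}{\sum_{i}\gamma_{ik}}.
\end{align*}

Setting the derivative to zero for each diagonal element and solving gives the familiar responsibility-weighted variance; your own derivation should justify each of these steps explicitly.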
