Machine Learning Course: Lab 7

(Kernels & Neural Network Introduction)

Goals. The goals of this exercise are to:

•  Gain a better understanding of properties of valid kernel functions.

•  Familiarize you with the cross-entropy loss for multi-class classification.

•  Introduce you to the PyTorch deep learning framework.

•  Explore the representational capacity of neural networks by approximating 2d functions.


Theory Exercises

Problem 1 (Kernels):

In class we have seen that many kernel functions k(x,x′) can be written as inner products ϕ(x)⊤ϕ(x′) for a suitably chosen vector-valued function ϕ(·) (often called a feature map). Let us say that such a kernel function is valid. We further discussed many operations on valid kernel functions that result again in valid kernel functions; here are two more. (A small numerical illustration of the feature-map view follows at the end of this problem.)

1.   Let k1(x,x′) be a valid kernel function. Let f be a polynomial with positive coefficients. Show that k(x,x′) = f(k1(x,x′)) is a valid kernel.

2.   Show that k(x,x′) = exp(k1(x,x′)) is a valid kernel, assuming that k1(x,x′) is a valid kernel. Hint: You can use the following property: if (Kn)n≥0 is a sequence of valid kernels and if there exists a function K : X × X → R such that K(x,x′) = limn→∞ Kn(x,x′) for all x,x′ ∈ X, then K is a valid kernel.
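To make the notion of a valid kernel concrete, here is a small numerical check (an illustrative sketch, not part of the exercise) that the polynomial kernel k(x,x′) = (x⊤x′)² on R² indeed arises from an explicit feature map:

    import numpy as np

    # For k(x, x') = (x^T x')^2 on R^2, one explicit feature map is
    # phi(x) = [x1^2, x2^2, sqrt(2)*x1*x2], so k(x, x') = phi(x)^T phi(x').
    def k(x, xp):
        return np.dot(x, xp) ** 2

    def phi(x):
        return np.array([x[0] ** 2, x[1] ** 2, np.sqrt(2) * x[0] * x[1]])

    x, xp = np.array([1.0, 2.0]), np.array([3.0, -1.0])
    print(k(x, xp), phi(x) @ phi(xp))  # both print 1.0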

Problem 2 (Softmax Cross Entropy):

In this exercise, we study multi-class classification with the softmax cross-entropy loss (or simply cross-entropy), which can be seen as a generalization of the logistic loss to more than 2 classes. First, we define the softmax of a vector x = [x1,...,xd]⊤ as the vector z = [z1,...,zd]⊤ with:

                                zi = exp(xi) / Σj exp(xj),  for i = 1,...,d.                                (1)

The label y is an integer denoting the target class. To turn y into a probability distribution for use with cross-entropy, we use one-hot encoding:

                                onehot(y) = y = [y1,...,yd]⊤  where  yk = 1 if k = y, and yk = 0 otherwise.                                (2)

The cross-entropy is given by:

                                L(x, y) = −Σk yk log(zk),  where z = softmax(x).                                (3)
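To make Equations 1–3 concrete, here is a minimal NumPy sketch (illustrative only; the function names are ours, not part of the exercise):

    import numpy as np

    # Naive implementations of Equations 1-3.
    def softmax(x):
        e = np.exp(x)              # Eq. 1; see Problem 2.1 for a stabler form
        return e / e.sum()

    def onehot(y, d):
        v = np.zeros(d)            # Eq. 2
        v[y] = 1.0
        return v

    def cross_entropy(x, y):
        z = softmax(x)
        return -np.sum(onehot(y, len(x)) * np.log(z))   # Eq. 3

    x = np.array([2.0, 1.0, 0.1])
    print(cross_entropy(x, 0))   # small loss: class 0 already dominates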

We ask you to do the following:

1.   Equation 1 potentially computes exp of large positive numbers, which is numerically unstable. Modify Eq. 1 so that the argument of exp is never positive. Hint: Use maxj(xj). (A sketch of this trick appears after this list.)

2.   Derive the gradient ∂L/∂xi of the cross-entropy loss (3). You may assume that y is a one-hot vector.

3.   What values of xi minimize the softmax cross-entropy loss? To avoid complications, practitioners sometimes use a trick called label smoothing, where y is replaced by ŷ = (1 − ϵ)y + (ϵ/d)1 (with 1 the all-ones vector) for some small value ϵ, e.g. ϵ = 0.1.
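For reference when checking your answers, here is one possible numerically stable formulation together with label smoothing (a minimal NumPy sketch; the function names are ours, and ϵ = 0.1 is just a typical choice):

    import numpy as np

    def stable_softmax(x):
        # Subtracting max_j x_j leaves Eq. 1 unchanged (the common factor
        # exp(-max) cancels) but keeps every argument of exp <= 0.
        e = np.exp(x - np.max(x))
        return e / e.sum()

    def smooth_labels(y, d, eps=0.1):
        # Label smoothing: (1 - eps) * onehot(y) + eps/d on every entry.
        v = np.full(d, eps / d)
        v[y] += 1.0 - eps
        return v

    print(stable_softmax(np.array([1000.0, 1000.0])))  # [0.5, 0.5], no overflow
    print(smooth_labels(0, 3))  # approx. [0.933, 0.033, 0.033]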


Programming Exercises

Problem 3 (PyTorch Introduction and Neural Network Training):

The accompanying Jupyter Notebook contains a brief introduction to PyTorch along with two neural network exercises. You will explore the representational capacity of neural networks by approximating 2d functions and train a digit classifier. Note that some details, like the backpropagation algorithm, will be explained in detail next week. For now, you can use PyTorch autograd as a black box that returns the gradients needed for optimization.
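As a tiny illustration of autograd as such a black box (a minimal sketch, not taken from the notebook):

    import torch

    # A scalar function of w; autograd computes its gradient for us.
    w = torch.tensor([1.0, -2.0], requires_grad=True)
    x = torch.tensor([0.5, 3.0])
    loss = torch.sum((w * x - 1.0) ** 2)
    loss.backward()   # fills w.grad with d(loss)/dw
    print(w.grad)     # tensor([ -0.5000, -42.0000])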

We recommend running the notebook on Google Colab which provides you with a free GPU and does not require installing any packages.

1.   Open the Colab link for lab 7:

https://colab.research.google.com/github/epfml/ML_course/blob/master/labs/ex07/template/ex07.ipynb

2.   To save your progress, click on “File > Save a Copy in Drive” to get your own copy of the notebook.

3.   Click ‘Connect’ at the top right to make the notebook executable (or ‘Open in playground’).

4.   Work your way through the introduction and exercises.

Alternatively, you can download the notebook from GitHub and install PyTorch locally; see the instructions on pytorch.org.

Additional Tutorials: If you plan on using PyTorch in your own projects, we recommend additionally going through the official tutorials after the exercise session:

•  Deep Learning with PyTorch: a 60-minute Blitz

•  Learning PyTorch with Examples


