# DM Part 3: Neural Networks Homework

## Data Mining Homework 3

Submit to Blackboard in electronic form before 10 am, November 25, 2010. For questions, please contact the teaching assistants:

- Spyros Martzoukos: S.Martzoukos@uva.nl (English only!)
- Jiyin He: j.he@uva.nl (English only!)

### Exercise 1: Information Gain and Attributes with Many Values

Information gain is defined as

$$\mathrm{gain}(S, A) = H(S) - \sum_{v \in \mathrm{values}(A)} \frac{|S_v|}{|S|}\, H(S_v)$$

According to this definition, information gain favors attributes with many values. Why? Give an example.

### Exercise 2: Missing Attribute Values

Consider the following set of training instances. Instance 2 has a missing value for attribute a1.

| instance | a1    | a2    | class |
|----------|-------|-------|-------|
| 1        | true  | true  | +     |
| 2        | ?     | true  | +     |
| 3        | true  | false | -     |
| 4        | false | false | +     |

Apply at least two different strategies for dealing with missing attribute values and show how they work in this concrete example.

### Exercise 3: Perceptrons

3.1 What is the function of the learning rate in the perceptron training rule?

3.2 What kinds of Boolean functions can be modeled with perceptrons, which Boolean functions cannot be modeled, and why?
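The definition above can be sketched directly in code. This is a minimal illustration, not part of the assignment: the function names `entropy` and `information_gain` are my own, and the sample data reuses the table from Exercise 2 (treating instance 2's `?` as just another attribute value for simplicity).

```python
# Entropy and information gain for a small categorical dataset.
from collections import Counter
from math import log2

def entropy(labels):
    """H(S) = -sum_c p(c) * log2(p(c)) over the class labels in S."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """gain(S, A) = H(S) - sum_v |S_v|/|S| * H(S_v)."""
    n = len(labels)
    gain = entropy(labels)
    # Partition the labels by the value of attribute A.
    by_value = {}
    for row, label in zip(rows, labels):
        by_value.setdefault(row[attr_index], []).append(label)
    for subset in by_value.values():
        gain -= (len(subset) / n) * entropy(subset)
    return gain

# Data from Exercise 2: columns (a1, a2), classes +/-.
rows = [('true', 'true'), ('?', 'true'),
        ('true', 'false'), ('false', 'false')]
labels = ['+', '+', '-', '+']
print(information_gain(rows, labels, 1))   # gain for a2, ≈ 0.311
```

As a hint toward the "why" question: an attribute with a distinct value per instance (e.g. an instance ID) partitions S into singleton, pure subsets, so every term H(S_v) is zero and the gain reaches its maximum H(S).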
3.3 Assume the following set of instances, with the weights w0 = 0.4 and w1 = 0.8. The threshold is 0.

| instance | x0  | x1   | target class |
|----------|-----|------|--------------|
| 1        | 1.0 | 1.0  | 1            |
| 2        | 1.0 | 0.5  | 1            |
| 3        | 1.0 | -0.8 | -1           |
| 4        | 1.0 | -0.2 | -1           |

What are the output values for each instance before the threshold function is applied? What is the accuracy of the model when applying the threshold function?

### Exercise 4: Gradient Descent

Consider the data in Exercise 3.3. Apply the gradient descent algorithm and compute the weight updates for one iteration. You can assume the same initial weights, threshold, and learning rate as in Exercise 3.3.

### Exercise 5: Stochastic Gradient Descent

Consider the data in Exercise 3.3. Apply the stochastic gradient descent algorithm and compute the weight updates for one iteration. You can assume the same initial weights, threshold, and learning rate as in Exercise 3.3.
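The computations in Exercises 3.3 and 4 can be sketched as follows. This is an illustrative sketch, not the assignment's solution: it uses the batch delta rule (Δw_i = η Σ (t − o) x_i with the unthresholded output o = w·x), and the learning rate η = 0.1 is an assumption of mine, since the sheet does not state one explicitly.

```python
# Pre-threshold outputs and one batch gradient-descent iteration
# on the data of Exercise 3.3.
data = [  # (x0, x1, target)
    (1.0,  1.0,  1),
    (1.0,  0.5,  1),
    (1.0, -0.8, -1),
    (1.0, -0.2, -1),
]
w = [0.4, 0.8]  # initial weights w0, w1 from Exercise 3.3
eta = 0.1       # assumed learning rate (not given on the sheet)

# Output before the threshold function: o = w0*x0 + w1*x1.
outputs = [w[0] * x0 + w[1] * x1 for x0, x1, _ in data]
print(outputs)  # [1.2, 0.8, -0.24, 0.24] (up to float rounding)

# Batch update: accumulate dw_i = eta * sum_d (t_d - o_d) * x_i,d
# over ALL examples, then update the weights once.
dw = [0.0, 0.0]
for (x0, x1, t), o in zip(data, outputs):
    dw[0] += eta * (t - o) * x0
    dw[1] += eta * (t - o) * x1
w = [w[0] + dw[0], w[1] + dw[1]]
print(w)  # [0.2, 0.8756] (up to float rounding)
```

Stochastic gradient descent (Exercise 5) differs only in that the weights are updated after each individual example rather than once per pass, so the later examples see already-updated weights and the result of one iteration differs from the batch case.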