This document summarizes a machine learning project that applies logistic regression. The script loads the data and plots it, marking each example as admitted or not admitted, then initializes the parameters and computes the initial cost and gradient. Later parts predict admission probabilities for new data and compute predictions and accuracies. Helper functions are defined for the sigmoid, prediction, plotting the data, and computing the cost function and gradient.
1. Guided By: Dr Furqan Aziz
Presented By: Islam Ud Islam
Algorithm used: Logistic Regression
Date: May 11, 2018
Machine Learning
2. clear; close all; clc
% Load the data: the first two columns are exam scores, the third is the label
data = load('Data.txt');
X = data(:, [1, 2]); y = data(:, 3);
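For reference, the same loading-and-splitting step can be sketched in Python/NumPy (not part of the original Octave script; the toy rows below stand in for the contents of Data.txt):

```python
import numpy as np

# A tiny stand-in for Data.txt: two exam scores per row plus a 0/1 label.
data = np.array([
    [34.6, 78.0, 0.0],
    [60.2, 86.3, 1.0],
    [79.0, 75.3, 1.0],
])

X = data[:, 0:2]   # feature columns (Exam 1 score, Exam 2 score)
y = data[:, 2]     # admission label (1 = admitted, 0 = not admitted)
```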
3. %% ==================== Part 1: Plotting ====================
fprintf(['Plotting data with + indicating (y = 1) examples and o ' ...
         'indicating (y = 0) examples.\n']);
plotData(X, y);
hold on;
xlabel('Exam 1 score')
ylabel('Exam 2 score')
legend('Admitted', 'Not admitted')
hold off;
4. %% ============ Part 2: Compute Cost and Gradient ============
% In this part, we will implement the cost and gradient
[m, n] = size(X);
% Add intercept term to x and X_test
X = [ones(m, 1) X];
% Initialize fitting parameters
initial_theta = zeros(n + 1, 1);
% Compute and display initial cost and gradient
[cost, grad] = costFunction(initial_theta, X, y);
fprintf('Cost at initial theta (zeros): %f\n', cost);
fprintf('Gradient at initial theta (zeros):\n');
fprintf(' %f \n', grad);
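As a cross-check on what costFunction computes, here is a minimal Python/NumPy sketch of the standard (unregularized) logistic-regression cost and gradient; the data below are made up for illustration, mirroring the intercept column added by `X = [ones(m, 1) X]`:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_function(theta, X, y):
    """Logistic-regression cost J(theta) and its gradient."""
    m = len(y)
    h = sigmoid(X @ theta)                              # hypothesis h_theta(x)
    cost = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    grad = X.T @ (h - y) / m                            # one partial per parameter
    return cost, grad

# Toy data with an intercept column prepended, as in the Octave script.
X = np.array([[1.0, 45.0, 85.0],
              [1.0, 30.0, 40.0],
              [1.0, 80.0, 90.0]])
y = np.array([1.0, 0.0, 1.0])

cost, grad = cost_function(np.zeros(3), X, y)
print(cost)  # at theta = 0 every h is 0.5, so the cost is log(2) ~ 0.6931
```

At theta = zeros the sigmoid outputs 0.5 for every example, so the initial cost is always log(2) regardless of the data, which makes it a handy sanity check.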
5. %% ============== Part 4: Predict and Accuracies ==============
% theta is the fitted parameter vector from the optimization step
prob = sigmoid([1 45 85] * theta);
fprintf(['For a student with scores 45 and 85, we predict an admission ' ...
         'probability of %f\n\n'], prob);
p = predict(theta, X);
6. function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z
g = 1 ./ (1 + exp(-z));
end
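The same elementwise sigmoid can be sketched in Python/NumPy (an equivalent of the Octave function above, not part of the original script):

```python
import numpy as np

def sigmoid(z):
    """Elementwise sigmoid, matching g = 1 ./ (1 + exp(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(0.0))                            # 0.5 exactly
print(sigmoid(np.array([-10.0, 0.0, 10.0])))   # values near 0, 0.5, near 1
```

Because NumPy's `exp` broadcasts, the same function handles scalars, vectors, and matrices, just as the `./` form does in Octave.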
7. function p = predict(theta, X)
m = size(X, 1); % Number of training examples
p = round(sigmoid(X * theta));
end
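A Python/NumPy sketch of the prediction step follows, using a 0.5 threshold on the sigmoid output (equivalent to the `round` used above whenever the probability is away from exactly 0.5). The theta values here are hypothetical, chosen only to illustrate the two outcomes:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(theta, X):
    """Label 1 when the predicted admission probability is at least 0.5."""
    return (sigmoid(X @ theta) >= 0.5).astype(int)

# Hypothetical fitted parameters, for illustration only.
theta = np.array([-25.0, 0.2, 0.2])
X = np.array([[1.0, 45.0, 85.0],   # strong scores -> predicted admitted
              [1.0, 20.0, 30.0]])  # weak scores   -> predicted not admitted
print(predict(theta, X))  # [1 0]
```

Comparing these 0/1 predictions against the true labels with `mean(double(p == y)) * 100` (in the Octave version) gives the training-set accuracy mentioned in the summary.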