Math 156
Homework 3 (Cui Yi), ID number: 605068398
Contents: Q1 to Q7, each followed by its solution.
Q1.
Solution
The simplest representation of a linear discriminant function is obtained by taking a linear function of the input vector, so that

y(x) = w^T x + w0,

where w is the weight vector and w0 is the bias. We can avoid the difficulties that arise from combining several two-class classifiers by considering a single K-class discriminant comprising K linear functions of the form

yk(x) = wk^T x + wk0.

The decision boundary between class Ck and class Cj is therefore given by yk(x) = yj(x), and hence corresponds to a (D − 1)-dimensional hyperplane defined by

(wk − wj)^T x + (wk0 − wj0) = 0.

Consider two points xA and xB, both of which lie on the decision surface. Because y(xA) = y(xB) = 0, we have w^T (xA − xB) = 0, and hence the vector w is orthogonal to every vector lying within the decision surface, so w determines the orientation of the decision surface. Similarly, if x is a point on the decision surface, then y(x) = 0, and so the normal distance from the origin to the decision surface is given by

w^T x / ||w|| = −w0 / ||w||.
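As a quick numerical illustration of the last formula (a minimal sketch with made-up values of w and w0, not part of the assignment code), we can check in MATLAB that the point of the boundary closest to the origin lies at distance |w0| / ||w||:
% Illustrative values only (not from the homework):
w  = [2; -1];                        % example weight vector
w0 = 3;                              % example bias
x_closest = -w0 * w / (w' * w);      % projection of the origin onto the boundary w'*x + w0 = 0
disp(w' * x_closest + w0)            % 0, so x_closest indeed lies on the boundary
disp(norm(x_closest))                % distance from the origin to the boundary
disp(abs(w0) / norm(w))              % same value, |w0| / ||w||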
Figure 1. Scatter plot
Q2.
Solution
Because the two Gaussian clusters have well-separated means relative to their spread, it makes sense to classify this data with a linear classifier.
% Generate 50 samples from each of two 2-D Gaussian distributions
mu = [1 2]; Sigma = [.1 .05; .05 .2];
r = mvnrnd(mu, Sigma, 50);           % class 1 samples
scatter(r(:,1),r(:,2),'b.');
hold on
mu1 = [2 4]; Sigma1 = [.2 -.1; -.1 .3];
r1 = mvnrnd(mu1, Sigma1, 50);        % class 2 samples
scatter(r1(:,1),r1(:,2),'r+');
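As a quick visual check (a small sketch of my own, not part of the original code), one can overlay the empirical class means on the scatter plot; the large gap between them relative to the covariances is what makes a linear boundary reasonable here:
% Not in the original solution: mark the empirical class means
plot(mean(r(:,1)),  mean(r(:,2)),  'bs', 'MarkerSize', 10, 'LineWidth', 2);
plot(mean(r1(:,1)), mean(r1(:,2)), 'rs', 'MarkerSize', 10, 'LineWidth', 2);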
Figure 2. Scatter plot and boundary
Q3.
Solution
% Least-squares linear classifier: augment the data with a bias column and
% regress the +1/-1 class labels onto it
X = [[r; r1] ones(2*50, 1)];
b = [ones(50, 1); -ones(50, 1)];
coef = lscov(X, b);
% plot the decision boundary coef(1)*x + coef(2)*y + coef(3) = 0 over the training data
xline = [0; 3];
yline = (-coef(3) - coef(1).*xline)./coef(2);
plot( xline, yline, '-k' );
xlim([0 3]); ylim([1 6]);
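For reference (a minimal cross-check, assuming X has full column rank), lscov with unit weights solves the same ordinary least-squares problem as MATLAB's backslash operator, so the coefficients can be verified with:
coef_alt = X \ b;                  % normal-equations solution, same as lscov(X, b) with unit weights
disp(max(abs(coef - coef_alt)))    % should be numerically negligible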
Q4.
Solution
% Draw a fresh test set from the same two distributions as in Q2
mu = [1 2]; Sigma = [.1 .05; .05 .2];
r2 = mvnrnd(mu, Sigma, 50);
scatter(r2(:,1),r2(:,2),'b.');
hold on
mu1 = [2 4]; Sigma1 = [.2 -.1; -.1 .3];
r3 = mvnrnd(mu1, Sigma1, 50);
scatter(r3(:,1),r3(:,2),'r+');
%size=50;
covar_mat2 = 8.*eye(2);              % (unused in this part)
% Shuffle the test points and keep the matching true labels from b
R = [r2; r3];
new_inds = randperm( 2*50 );
Correct = b(new_inds);
R = R( new_inds, : );
R_ALL = [R, ones(2*50, 1)];
% Evaluate the learned linear function from Q3 and threshold at zero
classify = sum( bsxfun( @times, R_ALL, coef(:)' ), 2);
classify(classify < 0) = -1;
classify(classify >= 0) = 1;
Accuracy = sum(classify == Correct)./(2*50)
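The boundary shown in Figure 3 can be reproduced by reusing the Q3 coefficients on the same axes (a short sketch; the axis limits are the same illustrative values used in Q3):
xb = [0; 3];
yb = (-coef(3) - coef(1).*xb)./coef(2);   % boundary coef(1)*x + coef(2)*y + coef(3) = 0
plot(xb, yb, '-k');
xlim([0 3]); ylim([1 6]);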
Figure 3. Scatter plot and boundary
Accuracy =
0.9900
Figure 4
Q5.
Solution
% Generate three overlapping 2-D Gaussian clusters (r4, r5, r6) of 50 points each
mu = [2 2]; Sigma = [.2 .05; .05 .3];
r4 = mvnrnd(mu, Sigma, 50);
scatter(r4(:,1),r4(:,2),'b.');
hold on
mu1 = [2 4]; Sigma1 = [.4 -.1; -.1 .3];
r5 = mvnrnd(mu1, Sigma1, 50);
scatter(r5(:,1),r5(:,2),'r+');
hold on
mu2 = [3 3]; Sigma2 = [.5 -.3; -.3 .4];
r6 = mvnrnd(mu2, Sigma2, 50);
scatter(r6(:,1),r6(:,2),'kd');
Figure 5. Scatter plot and boundary
Figure 6. Scatter plot and boundary (after adjusting)
When calculating the accuracy between r4 and r5 (R = [r4; r5]): Accuracy = 0.9400
When calculating the accuracy between r4 and r6 (R = [r4; r6]): Accuracy = 0.7200
When calculating the accuracy between r5 and r6 (R = [r5; r6]): Accuracy = 0.5000
The overall accuracy = 0.7200
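These pairwise numbers follow the same recipe as Q3 and Q4 applied to each pair of sets, and the reported overall accuracy equals the mean of the three pairwise accuracies. A minimal sketch of one way to reproduce them (the loop structure and variable names are my own, and the evaluation here is on the training points themselves):
pairs = { [r4; r5], [r4; r6], [r5; r6] };   % the three two-class problems
acc = zeros(1, 3);
for p = 1:3
    Rp = pairs{p};
    Xp = [Rp ones(100, 1)];                 % 50 points per class, plus bias column
    bp = [ones(50,1); -ones(50,1)];         % +1 / -1 labels
    cp = lscov(Xp, bp);                     % least-squares coefficients
    pred = sign(Xp * cp);
    pred(pred == 0) = 1;
    acc(p) = mean(pred == bp);              % pairwise accuracy
end
overall = mean(acc)                         % reported "overall accuracy"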
Figure 7. New points
Q6.
Solution
We then test the performance of the classifiers by generating new test sets from the same distributions.
When calculating the accuracy between r4 and r5 (R = [r4; r5]): Accuracy = 0.8400
When calculating the accuracy between r4 and r6 (R = [r4; r6]): Accuracy = 0.5400
When calculating the accuracy between r5 and r6 (R = [r5; r6]): Accuracy = 0.5200
The overall accuracy = 0.6333
These success rates are not satisfactory given the distribution of the data, which implies that least-squares discriminant functions are a poor fit when the classes overlap as heavily as r4, r5, and r6 do. It also shows that least squares is highly sensitive to outliers.
Q7.
Solution
For each point x in the three sets D1, D2, D3 (that is, r4, r5, r6), we run a k-nearest-neighbor search for k = 1 to 15.
%mu = [2 2]; Sigma = [.2 .05; .05 .3];
%r4 = mvnrnd(mu, Sigma, 50);
scatter(r4(:,1),r4(:,2),'b.');
hold on
%mu1 = [2 4]; Sigma1 = [.4 -.1; -.1 .3];
%r5 = mvnrnd(mu1, Sigma1, 50);
scatter(r5(:,1),r5(:,2),'r+');
hold on
%mu2 = [3 3]; Sigma2 = [.5 -.3; -.3 .4];
%r6 = mvnrnd(mu2, Sigma2, 50);
scatter(r6(:,1),r6(:,2),'kd');
legend('r4','r5','r6','Location','best');
%Using the Euclidean distance
figure
accuracy = zeros(50,15);                     % (preallocated but not used below)
the_point_in_cluster = zeros(50,15);         % fraction of each point's k neighbors that lie in D1
x = [r4; r5; r6];                            % D1 = rows 1-50, D2 = rows 51-100, D3 = rows 101-150
Mdl = KDTreeSearcher(x);                     % k-d tree for Euclidean nearest-neighbor queries
for k=1:15
    for cycy=1:50
        newpoint = r4(cycy,:);               % query the cycy-th point of D1
        %line(newpoint(1),newpoint(2),'marker','x','color','k',...
        %'markersize',k,'linewidth',2)
        [n,d] = knnsearch(Mdl,newpoint,'k',k);
        %line(x(n,1),x(n,2),'color',[.5 .5 .5],'marker','o',...
        % 'linestyle','none','markersize',10) ;
        % count how many of the k neighbors come from each set
        cx(cycy,:)=0;
        cy(cycy,:)=0;
        cz(cycy,:)=0;
        for i = 1:k
            if n(i) <= 50
                cx(cycy,:)=cx(cycy,:)+1;     % neighbor belongs to D1
            elseif n(i) <= 100
                cy(cycy,:)=cy(cycy,:)+1;     % neighbor belongs to D2
            else
                cz(cycy,:)=cz(cycy,:)+1;     % neighbor belongs to D3
            end
        end
    end
    classify = [cx cy cz];
    the_point_in_cluster(:,k) = cx./(cx+cy+cz);   % fraction of neighbors in D1
    subplot(3,5,k)
    plot(the_point_in_cluster(:,k),'o')
    title(sprintf('k = %d',k))
end
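The subplots only show the fraction of neighbors drawn from D1. To turn the neighbor counts into an actual k-NN decision, the natural rule is a majority vote over the three counts; a minimal sketch (the vote and acc_D1 variables are my own additions, not part of the original script):
% Majority vote over the neighbor counts for the last value of k:
% each query point (all drawn from D1) is assigned to the set that
% contributed the most neighbors, so the vote is correct when D1 wins.
[~, vote] = max([cx cy cz], [], 2);   % 1 = D1, 2 = D2, 3 = D3
acc_D1 = mean(vote == 1)              % accuracy on the 50 points of D1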
Figure 8. The percentage of each point's neighbors that lie in set D1
Figure 9. The 10 nearest points of the first row of D1
For example, when we look at the 10 nearest points of the first row of D1, 8 of the 10 points are in D1, so the point is classified into D1; this also illustrates the correctness of the KNN classification.
Figure 10. 10 nearest points of the first row of D2
Figure 11. 10 nearest points of the first row of D3
When we look at the 10 nearest points of the first row of D2, 6 of the 10 points are in D2, so the point is classified into D2; this also illustrates the correctness of the KNN classification.
Figure 12. The percentage of each point's neighbors that lie in set D2
Figure 13. The percentage of each point's neighbors that lie in set D3
When we look at the 10 nearest points of the first row of D3, all 10 points are in D3, so the point is classified into D3; this also illustrates the correctness of the KNN classification.
Adjusting the parameters of the function, we obtain the results below.
Finally, I calculate the overall accuracy of the KNN classification.
The failure of least squares should not surprise us when we recall that it corresponds to maximum likelihood under the assumption of a Gaussian conditional distribution, whereas binary target vectors clearly have a distribution that is far from Gaussian. By adopting a k-nearest-neighbor classifier instead, we find that the KNN algorithm is quite useful for multi-class classification problems.
Whether we use a linear method or a nonlinear method like KNN depends on the number of classes to be classified, the dimension of the data points, and the degree of intersection and overlap between the classes.
KNN is useful when the data have a relatively high level of intersection and overlap. It can also solve nonlinear problems that a linear classifier cannot.
%After using "fitcknn" and "predict" to obtain the predicted labels "lab"
%Calculate the accuracy: count correct predictions in each class
Acc_1 = sum(lab(1:50)==1);      % correct predictions for D1
Acc_2 = sum(lab(51:100)==2);    % correct predictions for D2
Acc_3 = sum(lab(101:150)==3);   % correct predictions for D3
Accuracy(i) = (Acc_1 + Acc_2 + Acc_3)./150;   % fraction correct over all 150 points
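The surrounding loop that produces the accuracies tabulated below is not shown. A minimal sketch of how it might look, assuming labels 1/2/3 for r4/r5/r6, evaluation on fresh points drawn from the same three distributions, and the standard fitcknn/predict interface from the Statistics and Machine Learning Toolbox:
% Sketch only, not the original script
train_data   = [r4; r5; r6];
train_labels = [ones(50,1); 2*ones(50,1); 3*ones(50,1)];
test_data    = [mvnrnd([2 2], [.2 .05; .05 .3], 50); ...
                mvnrnd([2 4], [.4 -.1; -.1 .3], 50); ...
                mvnrnd([3 3], [.5 -.3; -.3 .4], 50)];
Accuracy = zeros(1, 15);
for i = 1:15
    model = fitcknn(train_data, train_labels, 'NumNeighbors', i);   % k-NN model with k = i
    lab   = predict(model, test_data);                              % predicted labels on the test set
    Acc_1 = sum(lab(1:50)==1);
    Acc_2 = sum(lab(51:100)==2);
    Acc_3 = sum(lab(101:150)==3);
    Accuracy(i) = (Acc_1 + Acc_2 + Acc_3)./150;
end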
k     Accuracy
1     0.8133
2     0.8200
3     0.8333
4     0.8400
5     0.8533
6     0.8600
7     0.8600
8     0.8533
9     0.8533
10    0.8533
11    0.8600
12    0.8600
13    0.8400
14    0.8533
15    0.8400
