# Machine Learning.pdf

This lecture introduces machine learning, linear regression, and logistic regression.



1. Machine Learning. Engr. Muhammad Suleman Memon, Assistant Professor & HoD, Department of Information Technology. Email: engrsuleman14@gmail.com, Cell#: 923332941065
2. What is Machine Learning? Machine learning is a branch of artificial intelligence. It is designed to make computers learn by themselves and perform operations without human intervention. This means a computer or system built with machine learning will identify, analyse, and adapt accordingly, giving the expected output when it encounters a new pattern of data, without any need for human input.
3. How Does Machine Learning Work?
4. Examples of Machine Learning
5. Machine Learning Methods: Supervised Learning, Unsupervised Learning, Semi-Supervised Learning, Reinforcement Learning
6. Machine Learning Algorithms: Neural networks, decision trees, random forests, support vector machines, nearest-neighbor mapping, k-means clustering, self-organizing maps, expectation maximization, Bayesian networks, kernel density estimation, principal component analysis, singular value decomposition
7. Machine Learning Tools and Libraries
8. Which Industries Use Machine Learning? ▪ Pharmaceuticals ▪ Banks and Financial Services ▪ Health Care and Treatments ▪ Online Sales ▪ Mining, Oil and Gas ▪ Government Schemes
9. Linear Regression in Machine Learning. Linear Regression is a supervised Machine Learning algorithm. It models a relationship that predicts the outcome of an event from independent-variable data points. The relation is usually a straight line that fits the different data points as closely as possible. The output is continuous, i.e., a numerical value; for example, the output could be revenue or sales in currency, the number of products sold, etc.
10. Linear Regression Equation. Linear regression can be expressed mathematically as y = β0 + β1x + ε, where y is the dependent variable, x the independent variable, β0 the intercept of the line, β1 the linear regression coefficient (slope of the line), and ε the random error.
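The equation above can be illustrated with a short sketch (not from the slides): it generates synthetic data with known β0 and β1, then recovers the coefficients by ordinary least squares using NumPy.

```python
import numpy as np

# Illustrative sketch: estimating the intercept β0 and slope β1 of
# y = β0 + β1*x + ε from noisy synthetic data (true β0 = 2, β1 = 3).
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 + 3.0 * x + rng.normal(scale=0.5, size=x.size)

# Design matrix with a column of ones for the intercept term.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1 = beta
print(f"b0 ~ {b0:.2f}, b1 ~ {b1:.2f}")  # estimates close to 2 and 3
```

The recovered coefficients land near the true values because the random error ε has zero mean, which is exactly the setting the equation describes.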
11. Types of Linear Regression: Simple Linear Regression, Multiple Linear Regression, Non-Linear Regression
12. Simple Linear Regression. Simple Linear Regression uses a straight-line equation y = mx + c, where y denotes the output, x is the independent variable, m is the slope (dy/dx), and c is the intercept (the value of y when x = 0).
13. Multiple Linear Regression ▪ When there is more than one independent variable, the governing linear equation takes the form y = c + m1x1 + m2x2 + … + mnxn, where each coefficient mi captures the impact of the corresponding independent variable xi (x1, x2, etc.).
14. Non-Linear Regression ▪ When the best-fitting line is not a straight line but a curve, it is referred to as Non-Linear Regression.
15. Advantages of Linear Regression. For linear datasets, Linear Regression performs well at finding the nature of the relationship among different variables. Linear Regression algorithms are easy to train, and the resulting models are easy to implement. Linear Regression models are prone to over-fitting, but this can be mitigated with techniques such as regularization (L1 and L2) and cross-validation.
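The L2 regularization mentioned above (ridge regression) can be sketched in closed form; this is an illustration under assumed synthetic data, with the penalty strength exposed as `lam`.

```python
import numpy as np

# Hedged sketch of L2 (ridge) regularization: the closed-form solution is
# beta = (X^T X + lam*I)^(-1) X^T y, where lam shrinks coefficients toward
# zero and so counteracts over-fitting.
def ridge_fit(X, y, lam=1.0):
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

print(ridge_fit(X, y, lam=0.0).round(2))   # lam = 0 recovers ordinary least squares
print(ridge_fit(X, y, lam=50.0).round(2))  # larger lam gives shrunken coefficients
```

In practice the intercept is usually left unpenalized and `lam` is chosen by cross-validation, which ties this back to the other remedy the slide lists.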
16. Disadvantages of Linear Regression. An important disadvantage is that Linear Regression assumes a straight-line (linear) relationship between the dependent and independent variables, which real-world data rarely satisfies. It is also prone to noise and overfitting. In datasets where the number of observations is smaller than the number of attributes, Linear Regression may be a poor choice, as the algorithm can start fitting the noise while building the model.
17. Key Benefits of Linear Regression: Easy to Implement, Scalability, Interpretability, Applicability in Real Time
18. Use Cases of Linear Regression ➢ Agriculture ➢ Banking ➢ Finance ➢ Education ➢ Marketing
19. Agriculture ▪ Can be used to predict the amount of rainfall and crop yield.
20. Banking ▪ It is used to predict the probability of loan defaults.
21. Finance Sector ▪ Used to predict stock prices and assess associated risks.
22. Healthcare Sector ▪ Helpful in modelling healthcare costs and predicting patients' length of stay in hospitals.
23. Sports Analytics ▪ Can be used to predict the performance of players in upcoming games.
24. Education ▪ Can be used to predict student performance in different courses.
25. Business ▪ Used to forecast product demand, predict product sales, decide on marketing and advertising strategies, and so on.
26. Best Practices for Linear Regression: 1. Follow the Assumptions 2. Start with a Simple Model First 3. Use Visualizations 4. Start with a Sample Dataset 5. Shifting to Multiple Linear Regression 6. Applying the Linear Regression Model to Real-life Problems 7. Choosing Appropriate Data
27. Frequently Asked Questions (FAQs): 1. What is the output of Linear Regression in machine learning? 2. What are the benefits of using Linear Regression? 3. How do you explain a Linear Regression model? 4. Which type of dataset is used for Linear Regression? 5. Which ML model is best for regression?
28. Logistic Regression. Logistic Regression is a popular statistical model used for binary classification, that is, for predictions of the type this or that, yes or no, A or B, etc. (it can, however, be extended to multiclass classification). The two classes are labelled 0 (negative class) and 1 (positive class). Some examples of classification: Email: spam / not spam; Online transactions: fraudulent / not fraudulent; Tumor: malignant / not malignant.
29. Logistic Regression Hypothesis
30. How Does Logistic Regression Work? ➢ Logistic Regression passes its output through a non-linear function called the 'sigmoid function', also known as the 'logistic function', instead of using a linear function directly. ➢ The hypothesis of logistic regression limits its output to between 0 and 1. A linear function fails to represent this, since it can take values greater than 1 or less than 0, which is not possible under the logistic regression hypothesis.
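The sigmoid function described above is small enough to state directly; this minimal sketch shows how it squashes any real-valued input into the interval (0, 1).

```python
import math

# The sigmoid (logistic) function: maps any real z into (0, 1),
# which is why logistic regression's output can serve as a probability.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

print(sigmoid(0))   # 0.5: the midpoint, where the model is maximally uncertain
print(sigmoid(6))   # close to 1
print(sigmoid(-6))  # close to 0
```

Large positive inputs saturate toward 1 and large negative inputs toward 0, which is exactly the bounded behaviour the slide says a linear function cannot provide.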
31. How Does Logistic Regression Work?
32. Decision Boundary. The prediction function returns a probability score between 0 and 1. To map this to a discrete class (true/false, yes/no), you select a threshold value: scores at or above the threshold are classified into class 1, and those below into class 0. That is, p ≥ 0.5 gives class = 1, and p < 0.5 gives class = 0. For example, if the threshold is 0.5 and the prediction function returns 0.7, the sample is classified as positive; if it returns 0.2, which is below the threshold, it is classified as negative.
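The thresholding rule above reduces to a one-line function; this sketch reproduces the slide's worked example with a configurable threshold.

```python
# Thresholding a probability score into a discrete class, as described above:
# scores at or above the threshold map to class 1, those below map to class 0.
def classify(p, threshold=0.5):
    return 1 if p >= threshold else 0

print(classify(0.7))  # 1: 0.7 >= 0.5, classified as positive (slide's example)
print(classify(0.2))  # 0: 0.2 < 0.5, classified as negative
```

Moving the threshold away from 0.5 trades false positives against false negatives, which is why it is treated as a tunable value rather than a fixed constant.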
33. Decision Boundary
34. Linear vs Logistic Regression. Outcome: in linear regression, the outcome (dependent variable) is continuous and can take any of an infinite number of possible values; in logistic regression, the outcome has only a limited number of possible values. Response variable: linear regression is used when the response variable is continuous (e.g., weight, height, number of hours); logistic regression is used when the response variable is categorical (e.g., yes/no, true/false, red/green/blue, 1st/2nd/3rd/4th).
35. Linear vs Logistic Regression. Independent variables: in Linear Regression, the independent variables may be correlated with each other; in Logistic Regression, they should not be correlated with each other (no multicollinearity). Equation: linear regression gives an equation of the form Y = mX + C, i.e., an equation of degree 1; logistic regression gives an equation of the sigmoid form Y = 1 / (1 + e^(-X)).
36. Linear vs Logistic Regression. Coefficient interpretation: in linear regression, the interpretation of the independent variables' coefficients is quite straightforward (holding all other variables constant, a unit increase in a variable is expected to increase or decrease the dependent variable by that coefficient's value). In logistic regression, the interpretation depends on the family (binomial, Poisson, etc.) and link function (log, logit, inverse-log, etc.) used. Error minimization: linear regression uses the ordinary least squares method to minimize the errors and arrive at the best possible fit, while logistic regression uses the maximum likelihood method; with the logistic loss function, large errors are penalized to an asymptotic constant.
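The maximum-likelihood fitting contrasted above has no closed form, unlike ordinary least squares; a common approach, sketched here on assumed synthetic data, is gradient descent on the mean log loss (whose minimum coincides with the maximum of the likelihood).

```python
import numpy as np

# Hedged sketch: fitting logistic regression by maximum likelihood via
# gradient descent on the mean log loss, in contrast to the closed-form
# least squares solution available for linear regression.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 2))
true_w = np.array([2.0, -1.0])
# Bernoulli labels drawn with probability sigmoid(X @ true_w).
y = (sigmoid(X @ true_w) > rng.uniform(size=200)).astype(float)

w = np.zeros(2)
lr = 0.1
for _ in range(2000):
    grad = X.T @ (sigmoid(X @ w) - y) / len(y)  # gradient of mean log loss
    w -= lr * grad
print(w.round(1))  # estimates near the true weights (2.0, -1.0)
```

The gradient has the same `X.T @ (prediction - y)` shape as in linear regression; only the prediction is passed through the sigmoid, which is why the two models feel so closely related despite the different estimation principles.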
37. Linear vs Logistic Regression