  • 1. Introduction to
    XLMiner and Microsoft Office are registered trademarks of the respective owners.
Prediction is the process of estimating the value of a numeric response variable from one or more predictor variables, or of studying the relationship between the response variable and the predictors.
    XLMiner provides different tools to perform this task:
Multiple linear regression
    K-nearest neighbors
    Regression tree
    Neural Network
  • 3. PREDICTION- Multiple linear regression
    This procedure performs linear regression on the selected dataset. This fits a linear model of the form
    Y= b0 + b1X1 + b2X2+ .... + bkXk+ e 
where Y is the dependent variable (response), X1, X2, ..., Xk are the independent variables (predictors), and e is the random error. b0, b1, b2, ..., bk are the regression coefficients, which are estimated from the data. The multiple linear regression algorithm in XLMiner chooses the coefficients that minimize the sum of squared differences between the predicted and actual values.
Linear regression is performed either to predict the response variable from the predictor variables, or to study the relationship between the response variable and the predictors. For example, using linear regression, the crime rate of a state can be modelled as a function of demographic factors such as population, education and the male-to-female ratio.
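The deck walks through this in XLMiner's dialogs; outside Excel, the same least-squares fit can be sketched with NumPy. The numbers below are invented for illustration and constructed so that y = 1 + 2*x1 + 0.5*x2 holds exactly:

```python
import numpy as np

# Illustrative data only: two predictors (say, population and an
# education index) and a response (crime rate), generated to satisfy
# y = 1 + 2*x1 + 0.5*x2 exactly.
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
y = np.array([4.0, 5.5, 9.0, 10.5, 13.5])

# Prepend a column of ones so the intercept b0 is estimated too.
A = np.column_stack([np.ones(len(X)), X])

# Ordinary least squares: choose b to minimize ||A @ b - y||^2,
# i.e. the squared difference between predicted and actual values.
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)  # recovers b0=1.0, b1=2.0, b2=0.5 for this exact-fit data
```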
  • 4. PREDICTION- Multiple linear regression
    Check those options that you want to be displayed in the output
  • 5. PREDICTION- Multiple linear regression(output)
    Use the navigator to view other outputs
  • 6. PREDICTION- Multiple linear regression(output)
  • 7. PREDICTION- K-nearest neighbors
The k-NN technique predicts the value for a new record from the records nearest to it in predictor space; in distance-weighted variants, nearer neighbours carry more weight than distant ones.
For prediction, the predicted value is the average of the response values of the k nearest neighbours; for classification, the record is assigned the class most common among those k neighbours.
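The averaging rule above can be sketched in a few lines of Python (unweighted average, Euclidean distance; the data are invented for illustration):

```python
import numpy as np

def knn_predict(X_train, y_train, x_new, k=3):
    """Predict a numeric response as the mean of the response values
    of the k training records nearest to x_new (Euclidean distance)."""
    dists = np.linalg.norm(X_train - x_new, axis=1)
    nearest = np.argsort(dists)[:k]
    return y_train[nearest].mean()

# Toy 1-D example: the 3 nearest neighbours of 1.5 are 1.0, 2.0 and 0.0,
# so the prediction is their mean, 1.0.
X_train = np.array([[0.0], [1.0], [2.0], [10.0]])
y_train = np.array([0.0, 1.0, 2.0, 10.0])
print(knn_predict(X_train, y_train, np.array([1.5]), k=3))  # 1.0
```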
  • 8. PREDICTION- K-nearest neighbors
  • 9. PREDICTION- K-nearest neighbors(output)
  • 10. PREDICTION- Regression tree
A regression tree has a single numeric output (prediction) variable and one or more input (predictor) variables, which can be a mixture of continuous and categorical variables. A regression tree is a decision tree in which each internal node tests the value of a predictor variable, and each leaf node holds a predicted value of the output variable.
A regression tree is built through a process known as binary recursive partitioning. This is an iterative process that splits the data into two partitions, and then splits each resulting branch further.
    Since the tree is grown from the training data set, when it has reached full structure it usually suffers from over-fitting (i.e. it is "explaining" random elements of the training data that are not likely to be features of the larger population of data). This results in poor performance on real life data. Therefore, it is pruned using the validation data set.
    Regression trees are not used for classification; rather, they are used to approximate real-valued functions.
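One step of the binary recursive partitioning described above can be sketched in plain Python: for a single numeric predictor, pick the split threshold that minimizes the summed squared error of the two partitions. Growing a full tree just repeats this on each branch; the names and data here are illustrative, not XLMiner's internals.

```python
def best_split(x, y):
    """One binary split: choose the threshold on predictor x that
    minimizes the summed squared error of the two resulting
    partitions of the response y."""
    def sse(vals):
        mean = sum(vals) / len(vals)
        return sum((v - mean) ** 2 for v in vals)

    order = sorted(range(len(x)), key=lambda i: x[i])
    xs = [x[i] for i in order]
    ys = [y[i] for i in order]

    best_err, best_thr = None, None
    for i in range(1, len(xs)):
        err = sse(ys[:i]) + sse(ys[i:])
        if best_err is None or err < best_err:
            best_err, best_thr = err, (xs[i - 1] + xs[i]) / 2
    return best_thr

# Two clear clusters in the response: the best split falls between
# the predictor values 3 and 10, i.e. at threshold 6.5.
print(best_split([1, 2, 3, 10, 11, 12], [1, 1, 1, 9, 9, 9]))  # 6.5
```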
  • 11. PREDICTION- Regression tree
  • 12. PREDICTION- Regression tree
  • 13. PREDICTION- Regression tree
  • 14. PREDICTION- Regression tree-Full tree
  • 15. PREDICTION- Regression tree- Pruned Tree
  • 16. PREDICTION- Neural Networks
An artificial neural network model is built by processing the records from the data set one at a time: the computed output for each record is compared with its actual value, and the difference (error) is fed back into the model to adjust its weights.
This can go on for many iterations. XLMiner offers a multilayer feed-forward architecture for modelling a neural network.
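The feedback loop above can be sketched as per-record error correction on a single linear neuron. This is a toy stand-in for XLMiner's multilayer feed-forward network, not its actual algorithm, and the training data are invented:

```python
def train_neuron(xs, ts, lr=0.05, epochs=500):
    """Process records one at a time: compute the output, compare it
    with the actual value, and feed the difference back into the
    weights (stochastic gradient descent on one linear unit)."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, t in zip(xs, ts):
            out = w * x + b       # computed output for this record
            error = t - out       # difference from the actual value
            w += lr * error * x   # feed the error back into the model
            b += lr * error
    return w, b

# Records following t = 2*x + 1: many iterations of the feedback loop
# should drive the weights toward w = 2, b = 1.
w, b = train_neuron([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
print(round(w, 2), round(b, 2))
```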
  • 17. PREDICTION- Neural Networks
  • 18. PREDICTION- Neural Networks
  • 19. PREDICTION- Neural Networks
  • 20. Thank you
    For more visit:
  • 21. Visit more self help tutorials
    Pick a tutorial of your choice and browse through it at your own pace.
The tutorials section is free and self-guided, and does not include any additional support.
    Visit us at