BAQMaR - Conference DM: Presentation Transcript

    • Annual Conference 2010
    • Data Mining Track (WIFI: ibahn_conference, CODE: 01A3D9)
    • Walking In Circles
    • IT’S GREAT BEING A MODELER However…
    • SO, WHAT DO YOU DO FOR A LIVING? Something with data Some stuff with statistics and numbers It’s complicated… (do you have an hour?)
    • SO, WHAT DO YOU DO FOR A LIVING? However…
    • THE CRM CIRCLE
    • FURTHER READING…
      • K. Coussement and D. Van den Poel (2008), “Improving Customer Complaint Management by Automatic Email Classification Using Linguistic Style Features as Predictors,” Decision Support Systems, 44 (4), pp. 370-382.
    • FURTHER READING…
      • Buckinx W., Geert Verstraeten, and Dirk Van den Poel (2007), “Predicting customer loyalty using the internal transactional database,” Expert Systems with Applications, 32 (1).
    • IF IT DOESN’T EXIST…
      • ESTIMATE IT!
    • THE MODEL FARM
      • Easy analyses, high value
      • Outsourcing to Romania, India, …
      • Modeling automation
      •  MODEL FARM
    • Modeling Automation
    • MODEL AUTOMATION
    • MODEL FARMERS ?
    • MODEL ARCHITECTS
    • IS AUTOMATION A THREAT?
    • OR AN OPPORTUNITY?
    • LET’S EVALUATE RATIONALLY
      • The charm argument…
        • bOb van Reeth
      • Performance…
        • Statistical and/or financial?
      • Model farm…
        • Design is more important than performance
    • RE!SET ?
      • Evolutionary
      • First Movers
      • Laggards
    • Data Mining Track (WIFI: ibahn_conference, CODE: 01A3D9)
    • Web Analytics 2.0: state of the art, challenges and opportunities
      Prof. dr. Bart Baesens
      Department of Decision Sciences and Information Management, K.U.Leuven (Belgium)
      Vlerick Leuven Ghent Management School (Belgium)
      School of Management, University of Southampton (United Kingdom)
      [email_address]
      www.dataminingapps.com
      Twitter: DataMiningApps
      Facebook: Data Mining with Bart
    • Overview
      • Introduction and example applications
      • Data collection on the Web
      • Web usage metrics and analysis challenges
      • Example advanced data mining applications
      • Conclusions
    • Web Intelligence
      • Web Intelligence: advanced analysis techniques applied to the Web; often referred to as Web mining
      • Common categories of Web mining
        • Web usage mining: discovering interesting patterns in how visitors use a Web site
          • E.g. association rules for visited pages
        • Web content mining: extracting useful information or discovering knowledge from Web page contents
          • E.g. information retrieval/extraction, automatic document categorization, etc.
        • Web structure mining: mining the hyperlink structure of the Web
          • E.g. identifying authoritative pages, web community detection, etc.
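As a toy illustration of the association-rule example mentioned under Web usage mining, the sketch below counts how often pairs of pages co-occur within the same visit, the first step towards frequent itemsets and association rules. The visit data is hypothetical.

```python
# Count page pairs co-occurring in the same visit (made-up sessions).
from collections import Counter
from itertools import combinations

sessions = [
    {"home", "shop", "checkout"},
    {"home", "shop"},
    {"home", "blog"},
]

pair_counts = Counter()
for pages in sessions:
    pair_counts.update(combinations(sorted(pages), 2))

# Support of each page pair: fraction of visits containing both pages.
for pair, count in pair_counts.most_common(3):
    print(pair, count / len(sessions))
```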
    • Example goals of Web analytics & mining
      • Improve web site design
        • e.g. how do (segments of) customers navigate through my site and do they find what they are looking for?
      • Measure the effectiveness of marketing strategies, SEM (SEO/PPC) activities, and advertising campaigns
        • e.g. monitor effect of new strategy on traffic and (more importantly) conversion rate, effectiveness of banner campaigns, search engine visibility, etc.
        • Landing (destination) page optimization: try design variations & test differences in conversion, etc.
      • Personalization: deliver content specific to each visitor
        • e.g. recommender systems, targeted offers, etc.
      • Etc.
    • Data collection on the Web
      • Web server logs
      • Page tagging (client side)
        • “tagging” the web page with a code snippet referencing a separate JavaScript file
      • Cookies
        • small text string that a Web server can send to a visitor's Web browser (as part of its HTTP response)
      • Web beacons
      195.162.218.155 - - [27/Jun/2002:00:01:54 +0200] "GET /dutch/shop/detail.html HTTP/1.1" 200 38890 "http://www.msn.be/shopping/food/" "Mozilla/4.0 (MSIE 6.0)"
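To make the log line above concrete, here is a small parsing sketch for this combined log format; the regular expression and field names are illustrative choices, not taken from the slides.

```python
# Parse one combined-log-format line into named fields for preprocessing.
import re

LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<size>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

line = ('195.162.218.155 - - [27/Jun/2002:00:01:54 +0200] '
        '"GET /dutch/shop/detail.html HTTP/1.1" 200 38890 '
        '"http://www.msn.be/shopping/food/" "Mozilla/4.0 (MSIE 6.0)"')

hit = LOG_PATTERN.match(line).groupdict()
print(hit["ip"], hit["path"], hit["status"], hit["referrer"])
```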
    • Example web usage metrics (Kaushik, 2009)
      • Page Views
        • not that meaningful on its own
        • Page definition issues
      • Sessions/visits
        • Pages visited within one session
        • Based on IP address/ user agent/ cookies…
      • New/Repeat/Return visitors
      • Top entry/exit pages/destinations, …
      • Time on page, Time on site
      • Site abandonment rate
      • Average visits/days to purchase
      • Referrers
      • Search terms
      • Engagement
      • …
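A hedged sketch of how sessions/visits can be reconstructed from raw hits when only an IP address and user agent identify the visitor; the 30-minute inactivity timeout and the field layout are assumptions for illustration.

```python
# Group hits per visitor and cut a new session after 30 minutes of inactivity.
from datetime import datetime, timedelta

def sessionize(hits, timeout=timedelta(minutes=30)):
    """hits: list of (visitor_id, timestamp, page), sorted by timestamp."""
    sessions = {}
    for visitor, ts, page in hits:
        visitor_sessions = sessions.setdefault(visitor, [])
        # Start a new session if none exists yet or the idle gap exceeds the timeout.
        if not visitor_sessions or ts - visitor_sessions[-1]["end"] > timeout:
            visitor_sessions.append({"start": ts, "end": ts, "pages": []})
        visitor_sessions[-1]["end"] = ts
        visitor_sessions[-1]["pages"].append(page)
    return sessions

hits = [
    ("1.2.3.4|MSIE", datetime(2002, 6, 27, 0, 1), "/home"),
    ("1.2.3.4|MSIE", datetime(2002, 6, 27, 0, 5), "/shop"),
    ("1.2.3.4|MSIE", datetime(2002, 6, 27, 9, 0), "/home"),   # long gap: new session
]
print(len(sessionize(hits)["1.2.3.4|MSIE"]))                   # 2 sessions
```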
    • Analysis challenges
      • Extremely messy data
        • Extensive preprocessing needed (e.g. irrelevant requests, sessionization, …)
      • Information overload: too many metrics!
      • Focus on actionable metrics
        • Bounce rate: ratio of visits where the visitor left instantly
        • Conversion rate: percentage of visits or of unique visitors for which we observed the event (e.g. purchase, pdf download, registration, …)
      • Event driven Web analytics (e.g. AJAX )
        • Impact on existing metrics (e.g. bounce rate)
        • Definition of new metrics
        • Granularity of event capture
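A minimal sketch of the two actionable metrics named above, computed over sessionized visits. It uses the common single-page definition of a bounce, and the target event name "purchase" is an assumption for illustration.

```python
# Bounce rate and conversion rate from sessionized visits (toy data).
def bounce_rate(sessions):
    """Share of sessions that saw exactly one page."""
    return sum(1 for s in sessions if len(s) == 1) / len(sessions)

def conversion_rate(sessions, target="purchase"):
    """Share of sessions in which the target event/page occurs."""
    return sum(1 for s in sessions if target in s) / len(sessions)

sessions = [["home"], ["home", "shop", "purchase"], ["home", "shop"]]
print(bounce_rate(sessions), conversion_rate(sessions))   # 0.33..., 0.33...
```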
    • Analysis challenges (contd.)
      • Imperfect data
        • Cookies blocked/deleted
        • Users sharing computers, IP addresses, …
      • Focus on
        • Trends
          • Compare against previous period
          • Upward/downward trend, time series analysis, …
        • Segmentation
          • Bounce rates can be segmented by, e.g., traffic source (which sources are sending you bad traffic?), referral page, search engine, top landing pages, countries (geography), …
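A small sketch of the segmentation idea just described: bounce rate broken down by traffic source to see which sources send bad traffic. The session records are hypothetical.

```python
# Bounce rate per traffic source (made-up session records).
from collections import defaultdict

sessions = [
    {"source": "search", "pages": 1},
    {"source": "search", "pages": 4},
    {"source": "banner", "pages": 1},
    {"source": "banner", "pages": 1},
]

by_source = defaultdict(list)
for s in sessions:
    by_source[s["source"]].append(s["pages"] == 1)   # True = bounce

for source, bounces in by_source.items():
    print(source, sum(bounces) / len(bounces))        # which sources send bad traffic?
```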
    • Dashboards
    • Example advanced data mining applications
      • Navigation analysis using sequence mining
      • Multivariate testing using experimental design
      • Semi-supervised learning for message/document labeling
      • Recommender systems using collaborative filtering
      • (Social) network based learning
    • Navigation analysis
      • Path analysis: analysis of frequent navigation patterns
        • From a given page, which other pages does a group of users visit next in x% of cases? (a small counting sketch follows after this slide)
      • Funnel: focus on a pre-determined sequence
        • Scope of one visit: e.g. stages of the checkout process
        • Conversion funnel over longer period of time
      • Page overlay / click density analysis:
        • clicks or other metrics overlaid directly on actual pages
        • can traverse through website as groups of users navigated through it
        • can also show conversion rate for each link on page
      • Heat maps: coloring indicates click frequencies
      • Apply segmentation where possible!
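A minimal path-analysis sketch, as referenced above: for a given page, count which page visitors go to next across sessionized clickstreams. The sessions here are made up.

```python
# Distribution of the next page visited after a given page (toy clickstreams).
from collections import Counter

def next_page_distribution(sessions, page):
    counts = Counter()
    for pages in sessions:
        for current, nxt in zip(pages, pages[1:]):
            if current == page:
                counts[nxt] += 1
    total = sum(counts.values())
    return {p: c / total for p, c in counts.items()}   # "x% of the time"

sessions = [["home", "shop", "checkout"], ["home", "blog"], ["home", "shop"]]
print(next_page_distribution(sessions, "home"))        # {'shop': 0.67, 'blog': 0.33}
```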
    • SAS Web Analytics: funnel example
    • SAS Web Analytics: site overlay
    • Experiment and test
      • Present different pages, page elements, etc. to random sample of actual visitors
      • Statistically compare metric of interest
        • For example, is conversion rate, bounce rate, etc. significantly better for one page design than other?
      • Example of pages to optimize:
        • Landing page (the page you land on after clicking an ad)
        • Page in checkout process
        • Most popular pages
        • Pages with high bounce rates
      • Design of Experiments!
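One way to perform the statistical comparison mentioned above is a two-proportion z-test on conversion rates from an A/B test; the counts below are made up and the test choice is an illustration, not prescribed by the slides.

```python
# Two-proportion z-test: is design B's conversion rate significantly better than A's?
from math import sqrt
from statistics import NormalDist

def ab_test(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error under H0
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))             # two-sided p-value
    return p_a, p_b, z, p_value

print(ab_test(conv_a=120, n_a=4000, conv_b=160, n_b=4100))   # e.g. old vs new landing page
```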
    • Multivariate testing
      • Variables are sections or elements of the page for which you want to test different variations
        • X1: version a or b
        • X2: version a or b or c
        • X3: version a or b
        • X4: a (blue) or b (green)
      Figure labels: X1: headline; X2: sales copy; X3: button text; X3: image (e.g. “hero shot”); X4: button color
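A sketch of how the full factorial design over the listed elements could be enumerated; with the variation counts given above (2 x 3 x 2 x 2) this yields 24 page variants. Element names and values are placeholders, not the actual test content.

```python
# Enumerate all combinations of page-element variations (full factorial design).
from itertools import product

factors = {
    "X1_headline":     ["a", "b"],
    "X2_sales_copy":   ["a", "b", "c"],
    "X3_button_text":  ["a", "b"],
    "X4_button_color": ["blue", "green"],
}

variants = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(variants))    # 24 page variants to split traffic over
print(variants[0])
```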
    • Semi-Supervised Classification
      • Motivation: data labeling can be expensive, difficult and unreliable especially in a Web data context
      • Examples (Joachims, 1999)
        • Social Media Analytics
          • Sentiment analysis using Twitter Tweets
        • Netnews filtering
          • User labels some news articles as interesting or not (training set)
        • Spam e-mail detection
        • Web page classification
      • Data mining techniques needed (e.g. transductive SVM)
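A minimal semi-supervised sketch. The slides mention transductive SVMs (Joachims, 1999); scikit-learn does not ship a TSVM, so LabelPropagation is used here purely to illustrate learning from a few labels plus many unlabelled examples, on synthetic data.

```python
# Semi-supervised classification with very few labelled samples (synthetic data).
import numpy as np
from sklearn.semi_supervised import LabelPropagation

X = np.random.RandomState(0).rand(200, 5)        # e.g. document features
y = np.full(200, -1)                             # -1 marks unlabelled samples
y[:10] = (X[:10, 0] > 0.5).astype(int)           # only 10 labelled examples

model = LabelPropagation().fit(X, y)
print(model.transduction_[:20])                  # inferred labels for the unlabelled data
```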
    • Example Recommender System
    • Collaborative Filtering: Methods
      • When identifying buying patterns, make recommendation decisions for a specific user based on the judgments of users with similar interests (Resnick et al. 1994)
      • User-User methods
        • Identify like-minded users (e.g. k-nearest neighbor)
      • Item-Item methods
        • Correlation analysis
        • Regression analysis
        • Association rule mining
        • Bayesian belief networks
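A small user-user collaborative-filtering sketch in the k-nearest-neighbour spirit mentioned above: recommend an item rated highly by the users whose rating vectors are most similar (cosine similarity) to the target user's. The rating matrix is made up.

```python
# User-user collaborative filtering with cosine similarity (toy ratings).
import numpy as np

ratings = np.array([            # rows = users, columns = items, 0 = unrated
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
])

def recommend(user, k=1):
    norms = np.linalg.norm(ratings, axis=1)
    sims = ratings @ ratings[user] / (norms * norms[user])   # cosine similarity
    sims[user] = -1                                          # exclude the user themselves
    neighbours = np.argsort(sims)[-k:]                       # k most similar users
    scores = ratings[neighbours].mean(axis=0)
    scores[ratings[user] > 0] = -1                           # skip already-rated items
    return int(np.argmax(scores))

print(recommend(0))   # item index suggested for user 0
```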
    • Example Bayesian Network
    • Social Networks Applications
      • Social networks
        • E-mail traffic
        • Research papers connected by citations
        • Telephone calls
        • LinkedIn, Facebook, MySpace, Friendster, Xing, …
      • Applications
        • Web community mining
        • Fraud detection
        • Terrorism detection (suspicion scoring)
        • Product recommendations
        • Churn detection
        • Epidemiology (spread of illness)
        • Protein-Protein interactions
    • Twitter metrics
      • Number of followers
          • Calculate using Twitter API or e.g. twittercounter.com
      • Churn rate: number of followers you lose in a given period
      • Message amplification: number of retweets of your messages, e.g. number of retweets per thousand followers in the last week, retweet quotient (# retweets/# tweets), …
          • measures the virality of your tweets
      • Average shared links click-through rate for links you share on Twitter
          • also calculate conversion rate for users clicking those links
      • Conversation rate: replies sent per day, replies received per day, tweets sent per day, average tweet length, …
      • Composite metrics: e.g. Klout score (klout.com)
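The Twitter metrics above are simple ratios; a short sketch with made-up numbers:

```python
# Basic Twitter metrics from the definitions above (hypothetical counts).
followers_start, followers_end = 1200, 1150
tweets, retweets_received = 40, 90

churn_rate = (followers_start - followers_end) / followers_start    # share of followers lost
retweet_quotient = retweets_received / tweets                       # retweets per tweet
amplification = retweets_received / (followers_end / 1000)          # retweets per 1000 followers

print(churn_rate, retweet_quotient, amplification)
```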
    • Components of a Network Learning System
      • Non-relational (local) classifiers
          • Only uses local (e.g., customer-specific) information
          • Can be estimated using traditional machine learning methods (nearest neighbor, decision trees, …)
          • Used to generate the priors for the relational learning and collective inference
      • Relational model
          • Makes use of the relations/links in the network
      • Collective inference
          • Determines how the unknown values are estimated together, influencing each other
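A compact sketch of the three components working together, in the spirit of a weighted-vote relational neighbour classifier with iterative collective inference; the toy graph and seed labels are hypothetical.

```python
# Network-based learning: local prior + relational model + collective inference.
def collective_inference(graph, known, iterations=10, prior=0.5):
    """graph: {node: [neighbours]}, known: {node: class score in [0, 1]}."""
    # 1) Local/non-relational step: unknown nodes start from a prior score.
    scores = {n: known.get(n, prior) for n in graph}
    # 2) Relational model + collective inference: repeatedly set each unknown
    #    node's score to the mean of its neighbours' scores, so that estimates
    #    influence each other across the network.
    for _ in range(iterations):
        for node in graph:
            if node in known:
                continue
            neigh = graph[node]
            if neigh:
                scores[node] = sum(scores[m] for m in neigh) / len(neigh)
    return scores

# Toy usage: two seed nodes (known churner / non-churner) propagate scores.
g = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b", "d"], "d": ["c"]}
print(collective_inference(g, known={"a": 1.0, "d": 0.0}))
```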
    • Example (figure: a network in which the class of most nodes is unknown and must be inferred)
    • Conclusions
      • Lots of (imperfect) data in a web-based context
      • Data preprocessing
      • Lots of opportunities for advanced data mining
      • Large scale, on-line, real-time data mining
      • Integration of on-line/off-line data
    • Annual Conference 2010