Analyse your SEO Data with R and Kibana
June 10th, 2016
Vincent Terrasi
Vincent Terrasi
--
SEO Director - Groupe M6Web
CuisineAZ, PasseportSanté, MeteoCity, …
--
Join the OVH adventure in July 2016
Blog : data-seo.com
Agenda
Mission : Build a Real-Time Log Analysis Tool
1. Using Screaming Frog to crawl a website
2. Using R for SEO Analysis
3. Using PaasLogs to centralize logs
4. Using Kibana to build fancy dashboards
5. Test !
3
“The world is full of obvious things which nobody by any chance ever observes.”
Sherlock Holmes Quote
Real-Time Log Analysis Tool 4
• Screaming Frog
• Google Analytics
• R
Crawler
• IIS Logs
• Apache Logs
• Nginx Logs
Logs
Using Screaming Frog
Screaming Frog : Export Data 6
Add your URL and click the Start button.
When the crawl is finished, click the Export button and save the XLSX file.
Screaming Frog : Data ! 7
"Address"
"Content"
"Status Code"
"Status"
"Title 1"
"Title 1 Length"
"Title 1 Pixel Width"
"Title 2"
"Title 2 Length"
"Title 2 Pixel Width"
"Meta Description 1"
"Meta Description 1 Length“
"Meta Description 1 Pixel Width"
"Meta Keyword 1"
"Meta Keywords 1 Length"
"H1-1"
"H1-1 length"
"H2-1"
"H2-1 length"
"H2-2"
"H2-2 length"
"Meta Robots 1“
"Meta Refresh 1"
"Canonical Link Element 1"
"Size"
"Word Count"
"Level"
"Inlinks"
"Outlinks"
"External Outlinks"
"Hash"
"Response Time"
"Last Modified"
"Redirect URI“
"GA Sessions"
"GA % New Sessions"
"GA New Users"
"GA Bounce Rate"
"GA Page Views Per Sesssion"
"GA Avg Session Duration"
"GA Page Value"
"GA Goal Conversion Rate All"
"GA Goal Completions All"
"GA Goal Value All"
"Clicks"
"Impressions"
"CTR"
"Position"
"H1-2"
"H1-2 length"
Using R
Why R ?
Scriptable
Big Community
Mac / PC / Unix
Open Source
7,500+ packages
9
Documentation
WheRe ? How ?
https://www.cran.r-project.org/
10
Rgui RStudio
Using R : Step 1
Export All Urls
11
"request“;"section“;"active“;
"speed“;"compliant“;"depth“;"inlinks"
Packages (setup sketch below) :
stringr
ggplot2
dplyr
readxl
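A minimal setup sketch for installing and loading these packages (stringi is added here because stri_detect_fixed(), used in the section-classification example, comes from it):

# One-time install, then load the libraries in every session
install.packages(c("readxl", "dplyr", "ggplot2", "stringr", "stringi"))
library(readxl)   # read_excel() for the Screaming Frog XLSX export
library(dplyr)    # select(), mutate(), filter(), group_by(), summarise()
library(ggplot2)  # ggplot(), geom_bar(), ggsave()
library(stringi)  # stri_detect_fixed() for URL pattern matching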
R Examples
- Crawl via Screaming Frog
- Classify URLs by :
  - Section
  - Load Time
  - Number of Inlinks
- Detect Active Pages
  - Min 1 visit per month
- Detect Compliant Pages
  - Canonical Not Equal
  - Meta No-index
  - Bad HTTP Status Code
- Detect Duplicate Meta
12
R : read files 13
# Read xlsx file
urls <- read_excel("internal_html_blog.xlsx",
sheet = 1,
col_names = TRUE,
skip=1)
# Read csv file
urls <- read.csv2("internal_html_blog.csv", sep=";", header = TRUE)
Detect Active Pages 14
#default
urls_select$Active <- FALSE
urls_select$Active[ which(urls_select$`GA Sessions` > 0) ] <- TRUE
#factor
urls_select$Active <- as.factor(urls_select$Active)
Classify URLs by Section 15
# stri_detect_fixed() comes from the stringi package
schemas <- read.csv("conf.csv", header = FALSE, col.names = "schema", stringsAsFactors = FALSE)
urls_select$Cat <- "no match"
for (j in 1:nrow(schemas))
{
  urls_select$Cat[ which(stri_detect_fixed(urls_select$Address, schemas$schema[j])) ] <- schemas$schema[j]
}
Conf.csv :
/agenda/sorties-cinema/
/agenda/parutions/
/agenda/evenements/
/agenda/programme-tv/
/encyclopedie/
Classify URLs By Load Time 16
urls_select$Speed <- NA
urls_select$Speed[ which(urls_select$`Response Time` < 0.501 ) ] <- "Fast"
urls_select$Speed[ which(urls_select$`Response Time` >= 0.501
                   & urls_select$`Response Time` < 1.001) ] <- "Medium"
urls_select$Speed[ which(urls_select$`Response Time` >= 1.001
                   & urls_select$`Response Time` < 2.001) ] <- "Slow"
urls_select$Speed[ which(urls_select$`Response Time` >= 2.001) ] <- "Slowest"
urls_select$Speed <- as.factor(urls_select$Speed)
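The same buckets can be built more compactly with base R's cut(); a sketch producing the same labels (cut() already returns a factor):

urls_select$Speed <- cut(urls_select$`Response Time`,
                         breaks = c(-Inf, 0.501, 1.001, 2.001, Inf),
                         labels = c("Fast", "Medium", "Slow", "Slowest"),
                         right = FALSE)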
Classify URLs By Number of Inlinks 17
urls_select$`Group Inlinks` <- "URLs with No Follow Inlinks"
urls_select$`Group Inlinks`[ which(urls_select$`Inlinks` < 1 ) ] <- "URLs with No Follow Inlinks"
urls_select$`Group Inlinks`[ which(urls_select$`Inlinks` == 1 ) ] <- "URLs with 1 Follow Inlink"
urls_select$`Group Inlinks`[ which(urls_select$`Inlinks` > 1
                             & urls_select$`Inlinks` < 6) ] <- "URLs with 2 to 5 Follow Inlinks"
urls_select$`Group Inlinks`[ which(urls_select$`Inlinks` >= 6
                             & urls_select$`Inlinks` < 11 ) ] <- "URLs with 6 to 10 Follow Inlinks"
urls_select$`Group Inlinks`[ which(urls_select$`Inlinks` >= 11) ] <- "URLs with more than 10 Follow Inlinks"
urls_select$`Group Inlinks` <- as.factor(urls_select$`Group Inlinks`)
Detect Compliant Pages 18
# A page is compliant unless it has:
# - a canonical URL not equal to its address
# - a meta robots noindex
# - a bad HTTP status code
urls_select$Compliant <- TRUE
urls_select$Compliant[ which(urls_select$`Status Code` != 200
| urls_select$`Canonical Link Element 1` != urls_select$Address
| urls_select$Status != "OK"
| grepl("noindex",urls_select$`Meta Robots 1`)
) ] <- FALSE
urls_select$Compliant <- as.factor(urls_select$Compliant)
Detect Duplicate Meta 19
urls_select$`Status Title` <- 'Unique'
urls_select$`Status Title`[ which(urls_select$`Title 1 Length` == 0) ] <- "Not Set"
urls_select$`Status Description` <- 'Unique'
urls_select$`Status Description`[ which(urls_select$`Meta Description 1 Length` == 0) ] <- "Not Set"
urls_select$`Status H1` <- 'Unique'
urls_select$`Status H1`[ which(urls_select$`H1-1 length` == 0) ] <- "Not Set"
urls_select$`Status Title`[ which(duplicated(urls_select$`Title 1`)) ] <- 'Duplicate'
urls_select$`Status Description`[ which(duplicated(urls_select$`Meta Description 1`)) ] <- 'Duplicate'
urls_select$`Status H1`[ which(duplicated(urls_select$`H1-1`)) ] <- 'Duplicate'
urls_select$`Status Title` <- as.factor(urls_select$`Status Title`)
urls_select$`Status Description` <- as.factor(urls_select$`Status Description`)
urls_select$`Status H1` <- as.factor(urls_select$`Status H1`)
Generate CSV 20
urls_light <- select(urls_select, Address, Cat, Active, Speed, Compliant, Level, Inlinks) %>%
  mutate(Address = gsub("http://moniste.fr", "", Address))
colnames(urls_light) <- c("request","section","active","speed","compliant","depth","inlinks")
write.csv2(urls_light, "file.csv", row.names = FALSE)
Package dplyr : select and mutate
Edit colnames
Use write.csv2
R : ggplot2 command 21
DATA
Create the ggplot object and populate it with data (always a data frame)
ggplot( mydata, aes( x=section,y=count, fill=active ))
LAYERS
Add layer(s)
+ geom_point()
FACET
Used for conditioning on variable(s)
+ facet_grid(~rescode)
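Putting the three pieces together, a minimal self-contained sketch (the data frame and its values are made up for illustration, using the same column names as above):

library(ggplot2)
# Hypothetical summary table: URL count per section, split by active status and status code
mydata <- data.frame(
  section = rep(c("agenda", "encyclopedie"), each = 2),
  active  = rep(c(TRUE, FALSE), times = 2),
  count   = c(120, 80, 40, 200),
  rescode = rep(c(200, 301), times = 2)
)
ggplot(mydata, aes(x = section, y = count, fill = active)) +  # DATA
  geom_bar(stat = "identity", position = "stack") +           # LAYER
  facet_grid(~ rescode)                                        # FACET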
ggplot2 : Geometry 22
R Chart : Active Pages 23
urls_level_active <- group_by(urls_select,Level,Active) %>%
summarise(count = n()) %>%
filter(Level<12)
# aes() = aesthetic, geom_bar() = geometry
p <- ggplot(urls_level_active, aes(x=Level, y=count, fill=Active) ) +
geom_bar(stat = "identity", position = "stack") +
scale_fill_manual(values=c("#e5e500", "#4DBD33")) +
labs(x = "Depth", y ="Crawled URLs")
#display
print(p)
# save in file
ggsave(file="chart.png")
R Chart : GA Sessions 24
urls_cat_gasessions <- aggregate( urls_select$`GA Sessions`,
by=list(Cat=urls_select$Cat, urls_select$Compliant), FUN=sum, na.rm=TRUE)
colnames(urls_cat_gasessions) <- c("Category","Compliant","GA Sessions")
p <- ggplot(urls_cat_gasessions, aes(x=Category, y=`GA Sessions`,
fill=Compliant))+
geom_bar(stat = "identity", position = "stack") +
theme(axis.text.x = element_text(angle = 90, hjust = 1)) +
labs(x = "Section", y ="Sessions") +
scale_fill_manual(values=c("#e5e500","#4DBD33"))
#display
print(p)
# save in file
ggsave(file="chart.png")
R Chart : Compliant 25
urls_cat_compliant_statuscode <- group_by(urls_select,Cat,
Compliant,`Status Code`) %>%
summarise(count = n()) %>%
filter(grepl(200,`Status Code`) | grepl(301,`Status Code`))
p <- ggplot(urls_cat_compliant_statuscode, aes(x=Cat, y=count,
fill= Compliant ) ) +
geom_bar(stat = "identity", position = "stack") +
theme(axis.text.x = element_text(angle = 90, hjust = 1)) +
facet_grid(`Status Code` ~ .) +
labs(x = "Section", y ="Crawled URLs") +
scale_fill_manual(values=c("#e5e500","#4DBD33"))
R : SEO Cheat Sheet 26
Package dplyr
select() allows you to rapidly zoom in on a useful subset of columns
mutate() a data frame by adding new or replacing existing columns
filter() allows you to select a subset of rows in a data frame
Package ggplot2
aes - geom
ggsave()
Package readxl
read_excel()
Base R
read.csv2()
write.csv2()
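A short illustration chaining these verbs on the crawl data (a sketch; urls_select and its columns are the ones built in the previous slides):

library(dplyr)
compliant_fast <- urls_select %>%
  filter(Compliant == "TRUE", Speed == "Fast") %>%  # rows: compliant and fast pages only
  select(Address, Cat, `GA Sessions`, Inlinks) %>%  # columns: keep a useful subset
  mutate(has_traffic = `GA Sessions` > 0)           # add a derived column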
ELK
Architecture 28
Hard to monitor and optimize host server performance
Architecture 29
Using PaasLogs
PaasLogs 31
PaasLogs 32
164 nodes in the Elasticsearch cluster
180 connected machines
Between 100,000 and 300,000 logs processed per second
12 billion logs flowing through every day
211 billion documents stored
8 clicks and 3 copy/paste to use it !
PaasLogs : Step 1 33
PaasLogs : Step 2 34
PaasLogs 35
PaasLogs : Streams 36
Streams are the recipients of your logs. When you send a log with the right
stream token, it automatically arrives in your stream, in an awesome piece of
software named Graylog.
PaasLogs : Dashboards 37
A Dashboard is the global view of your logs. It is an efficient way to exploit
your logs and to see global information, such as metrics and trends about your
data, without being overwhelmed by log details.
PaasLogs : Aliases 38
Aliases let you access your data directly from your Kibana or through an
Elasticsearch query.
DON’T FORGET TO ENABLE KIBANA INDICES AND WRITE YOUR USER PASSWORD
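For example, a direct query on the alias could look like this (a sketch: logsDataSEO is the alias used later in the Logstash elasticsearch filter, ra-logs-33078 the Kibana user, and <password> a placeholder):

curl -u ra-logs-33078:<password> \
  "https://laas.runabove.com:9200/logsDataSEO/_search?q=type:csv_infos&size=5&pretty"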
PaasLogs : Inputs 39
Inputs let you ask OVH to host your own dedicated collector,
such as Logstash or Flowgger.
PaasLogs : Network Configuration 40
PaasLogs : Plugins Logstash 41
OVHCOMMONAPACHELOG %{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\]
"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion_num:float})?|%{DATA:rawrequest})"
%{NUMBER:response_int:int} (?:%{NUMBER:bytes_int:int}|-)
OVHCOMBINEDAPACHELOG %{OVHCOMMONAPACHELOG} "%{NOTSPACE:referrer}" %{QS:agent}
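For reference, a made-up Apache combined log line of the kind OVHCOMBINEDAPACHELOG is designed to parse:

66.249.66.1 - - [10/Jun/2016:06:25:17 +0200] "GET /agenda/sorties-cinema/ HTTP/1.1" 200 15230 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"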
PaasLogs : Config Logstash 42
if [type] == "apache" {
  grok {
    match => [ "message", "%{OVHCOMBINEDAPACHELOG}" ]
    patterns_dir => "/opt/logstash/patterns"
  }
}
if [type] == "csv_infos" {
  csv {
    columns => ["request", "section", "active", "speed",
                "compliant", "depth", "inlinks"]
    separator => ";"
  }
}
How to send Logs to PaasLogs ? 43
Use Filebeat 44
Filebeat : Install 45
Install filebeat
curl -L -O https://download.elastic.co/beats/filebeat/filebeat_1.2.1_amd64.deb
sudo dpkg -i filebeat_1.2.1_amd64.deb
https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-installation.html
Filebeat : Edit filebeat.yml 46
filebeat:
  prospectors:
    -
      paths:
        - /home/ubuntu/lib/apache2/log/access.log
      input_type: log
      fields_under_root: true
      document_type: apache
    -
      paths:
        - /home/ubuntu/workspace/csv/crawled-urls-filebeat-*.csv
      input_type: log   # filebeat 1.x reads the CSV as plain lines; parsing happens in Logstash
      fields_under_root: true
      document_type: csv_infos
output:
  logstash:
    hosts: ["c002-5717e1b5d2ee5e00095cea38.in.laas.runabove.com:5044"]
    worker: 1
    tls:
      certificate_authorities: ["/home/ubuntu/workspace/certificat/key.crt"]
Filebeat : Start 47
Copy / Paste Key.crt
-----BEGIN CERTIFICATE-----
MIIDozCCAougAwIBAgIJALxR4fTZlzQMMA0GCSqGSIb3DQEBCwUAMGgxCzAJBgNVBAYTAkZSMQ
8wDQYDVQQIDAZGcmFuY2UxDjAMBgNVBAcMBVBhcmlzMQwwCgYDVQQKDANPVkgxCzAJBgNVB
AYTAkZSMR0wGwYDVQQDDBRpbi5sYWFzLnJ1bmFib3ZlLmNvbTAeFw0xNjAzMTAxNTEzMDNaFw0
xNzAzMTAxNTEzMDNaMGgxCzAJBgNVBAYTAkZSMQ8wDQYDVQQIDAZGcmFuY2UxDjAMBgNVBA
cMBVBhcmlzMQwwCgYDVQQKDANPVkgx
-----END CERTIFICATE-----
Start Filebeat
sudo /etc/init.d/filebeat start
sudo /etc/init.d/filebeat stop
How to combine multiple sources ? 48
PaasLogs : Plugins ES 49
Description : Copies fields from previous log events in Elasticsearch to current events
if [type] == "apache" {
  elasticsearch {
    hosts => "laas.runabove.com"
    index => "logsDataSEO" # alias
    ssl => true
    query => 'type:csv_infos AND request:"%{[request]}"'
    fields => [["speed","speed"], ["compliant","compliant"],
               ["section","section"], ["active","active"],
               ["depth","depth"], ["inlinks","inlinks"]]
  }
}
# TIP : fields => [[src,dest],[src,dest]]
Using Kibana
Kibana : Install 51
Download Kibana 4.1
• Download and unzip Kibana 4
• Extract your archive
• Open config/kibana.yml in an editor
• Set the elasticsearch.url to point at your Elasticsearch instance
• Run ./bin/kibana (or bin\kibana.bat on Windows)
• Point your browser at http://yourhost.com:5601
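A minimal sketch of those steps on Linux (archive name and version are assumptions; check www.elastic.co/downloads/kibana for the exact 4.1.x link):

curl -L -O https://download.elastic.co/kibana/kibana/kibana-4.1.1-linux-x64.tar.gz
tar xzf kibana-4.1.1-linux-x64.tar.gz
cd kibana-4.1.1-linux-x64
# edit config/kibana.yml (next slide), then start Kibana
./bin/kibana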
Kibana : Edit kibana.yml 52
Update kibana.yml
server.port: 8080
server.host: "0.0.0.0"
elasticsearch.url: "https://laas.runabove.com:9200"
elasticsearch.preserveHost: true
kibana.index: "ra-logs-33078"
kibana.defaultAppId: "discover"
elasticsearch.username: "ra-logs-33078"
elasticsearch.password: "rHftest6APlolNcc6"
Kibana : Line Chart 53
Number of active pages crawled by Google over a period of time
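In the Kibana search bar, such a chart is typically filtered with a Lucene query; a sketch using field names from the enriched Apache events (the exact match on agent depends on how the field is analyzed):

type:apache AND active:TRUE AND agent:*Googlebot*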
Kibana : Vertical Bar Chart 54
Kibana : Pie Chart 55
How to compare two periods ? 56
Kibana : Use Date Range 57
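Kibana's Date Range aggregation corresponds to an Elasticsearch date_range aggregation; a sketch of two buckets comparing last week with the week before (the @timestamp field name is an assumption):

{
  "aggs": {
    "periods": {
      "date_range": {
        "field": "@timestamp",
        "ranges": [
          { "from": "now-14d/d", "to": "now-7d/d" },
          { "from": "now-7d/d",  "to": "now/d" }
        ]
      }
    }
  }
}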
Final Architecture 58
IIS / Apache / Nginx / HA Proxy logs -> Filebeat -> PaasLogs -> Kibana
Soft real time
--
Old logs
Test yourself 59
Use Screaming Frog Spider Tool
www.screamingfrog.co.uk
Learn R
www.datacamp.com
www.data-seo.com
www.moise-le-geek.fr/push-your-hands-in-the-r-introduction/
Test PaasLogs
www.runabove.com
Install Kibana
www.elastic.co/downloads/kibana
TODO List 60
- Create a GitHub Repository with all source code
- Add Plugin Logstash to do a reverse DNS lookup
- Schedule A Crawl By Command Line
- Upload Screaming Frog File to web server
Thank you
Keep in touch
June 10th, 2016
@vincentterrasi Vincent Terrasi