MuleSoft with ELK (Elasticsearch, Logstash, Kibana)
1. All contents © MuleSoft, LLC
Bangalore MuleSoft Meetup
Mule with ELK
28th August 2021, 4 PM IST
2. Agenda
● Introduction of Moderators and Speaker
● Community Details
● MuleSoft with ELK
● ELK Demonstration
● Queries
● Trivia Quiz
3. Speaker: Gaurav Sethi
Moderators: Rajesh Kumar, Pruthvi Raj Nagaraju K
4. Community Details
● Register for Meetups: https://meetups.mulesoft.com/
● Register for the official training platform: https://training.mulesoft.com/
● Register for the upcoming PKO: https://library.mulesoft.com/l/pko-2021-on-demand
● MuleSoft Forum: https://help.mulesoft.com/
● For all blogs: https://blogs.mulesoft.com/
● Anypoint Platform: http://anypoint.mulesoft.com/
● Join Exam Readiness Sessions
5. Mule with ELK
Gaurav Sethi
Senior Integration Specialist, CSG
6. Introduction
Use the Elastic Stack (ELK stack) to analyze business data and API analytics.
You can use Logstash or Filebeat to process Anypoint Platform log files, insert them
into an Elasticsearch database, and then analyze them with Kibana.
7. Elastic Stack Overview
ELK stands for the three Elastic products: Elasticsearch, Logstash, and Kibana.
To understand what the Elastic core products do, we will use a simple architecture:
1. The logs are created by an application and pushed into an AWS SQS queue.
2. Logstash aggregates the logs from different sources and processes them.
3. Elasticsearch stores and indexes the data so that it can be searched.
4. Kibana is the visualization tool that makes sense of the data.
9. Logstash Overview
Logstash is a data collection tool. It consists of three elements: input, filters, and
output.
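As a rough sketch of those three elements (this is an illustrative pipeline, not the configuration used later in this deck), a minimal Logstash pipeline can read from the console and print back to it:

```conf
# Minimal Logstash pipeline sketch: input -> filter -> output.
input {
  stdin { }                       # read events typed on the console
}
filter {
  mutate {
    add_field => { "source" => "demo" }   # tag each event with an extra field
  }
}
output {
  stdout { codec => rubydebug }   # pretty-print each processed event
}
```

The same three-section shape is used by the SQS-to-Elasticsearch configuration shown later in this deck.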
10. Elasticsearch and Kibana Overview
ES (Elasticsearch) is a NoSQL database based on the Lucene search engine. ES
provides RESTful APIs to search and analyze the data. It can store different data
types such as numbers, text, and geo data, whether structured or unstructured.
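For illustration (a hypothetical index, not one defined in this deck), an Elasticsearch mapping can mix those types in a single index; this JSON body would be sent as `PUT /demo-index`:

```json
{
  "mappings": {
    "properties": {
      "price":    { "type": "integer" },
      "message":  { "type": "text" },
      "location": { "type": "geo_point" }
    }
  }
}
```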
Kibana is a data visualization tool. It helps you to quickly get insight into the data and
offers capabilities like diagrams, dashboards, etc. Kibana works on the data stored in
Elasticsearch.
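As a hedged example of those RESTful APIs (the index name `mule-sqs` comes from the Logstash output configured later in this deck), a search is a JSON body sent to `GET /mule-sqs/_search`:

```json
{
  "query": {
    "match": { "message": "Hello" }
  },
  "size": 10
}
```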
11. Why we need to push logs into Kibana
MuleSoft CloudHub stores up to 100 MB of log data per application per worker, or up
to 30 days of logs, whichever limit is reached first.
Because of this, we cannot preserve all the logs for a long time, and searching them
is laborious. Hence the need to push the logs into ELK, which can retain the logs for a
long time and visualize them.
13. Procedure
1. Install Logstash, Elasticsearch, and Kibana.
2. Configure Kibana.
3. Configure Elasticsearch.
4. Configure Logstash.
5. Create a Hello World application and configure Log4j.
6. Visualize the logs in Kibana.
14. ELK Installation
Logstash: The Logstash binaries are available from https://www.elastic.co/downloads
Download the Logstash installation file for your host environment (TAR.GZ, DEB, ZIP,
or RPM) and unpack it. Do not install Logstash into a directory path that contains
colon (:) characters.
Elasticsearch: The Elasticsearch binaries are available from the following link.
https://www.elastic.co/guide/en/elasticsearch/reference/current/zip-windows.html
This comes with an elasticsearch-service.bat command, which sets up Elasticsearch to
run as a service.
Kibana: Download the .zip Windows archive for Kibana v7.13.1 from the following link.
https://artifacts.elastic.co/downloads/kibana/kibana-7.13.1-windows-x86_64.zip
Please use the latest available version.
15. Configure Elasticsearch
Step 1. Open the Elasticsearch folder and go to the config folder.
Step 2. Open the file "elasticsearch.yml" and verify the port number. (Default port:
9200)
Note: Set the same port number in the Logstash output configuration (logstash-sqs.conf -> output).
Step 3. Run Elasticsearch. { Elasticsearch -> bin -> elasticsearch.bat }
Step 4. Verify the Elasticsearch logs as below:
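The relevant elasticsearch.yml entries look roughly like this (the values shown are the defaults, and the cluster name is illustrative):

```yaml
# config/elasticsearch.yml (relevant lines)
cluster.name: my-cluster   # illustrative name
network.host: localhost    # bind address
http.port: 9200            # default HTTP port, referenced by Logstash and Kibana
```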
17. Configure Kibana
Step 1. Open the Kibana folder and open the kibana.yml file in the config folder.
Step 2. Verify the Kibana host and port. {Default host: localhost, default port: 5601}
Step 3. Verify the Elasticsearch URLs in the same file. {In the Elasticsearch config we
set the host as localhost and the port as 9200 – see Configure Elasticsearch, Step 2.}
Step 4. Run the Kibana application. { Kibana -> bin -> kibana.bat }
Step 5. Verify the Kibana logs as below:
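The corresponding kibana.yml entries (defaults shown):

```yaml
# config/kibana.yml (relevant lines)
server.port: 5601                               # default Kibana port
server.host: "localhost"                        # bind address
elasticsearch.hosts: ["http://localhost:9200"]  # must match the Elasticsearch config
```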
18. Configure Kibana
Note: Elasticsearch must be up and running before you start Kibana. If Elasticsearch fails to run,
you will not be able to run the Kibana application.
19. Configure Logstash
Step 1. Open the config folder in Logstash.
Step 2. Check whether a *.conf file is already available.
20. Configure Logstash
Step 3. If only a sample config file is available, ignore it and create a new file named
logstash-sqs.conf
Step 4. Update the file with the following content:
input {
  sqs {
    region => "eu-central-1"              # SQS region
    queue => "MuleSoftLogs"               # SQS queue name
    access_key_id => "XXXXXXXXXXXXXX"     # AWS access key
    secret_access_key => "XXXXXXXXXXXXXX" # AWS secret key
  }
}
21. Configure Logstash
filter {
  json {
    # Parses the incoming JSON message into fields.
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    codec => "json"
    index => "mule-sqs"
    #user => "elastic"
    #password => "changeme"
  }
}
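Conceptually, the json filter above does what this Python sketch does: it parses the JSON string carried in the message field and promotes its keys to top-level fields (the sample payload is made up for illustration):

```python
import json

# A raw event as Logstash might receive it from SQS (hypothetical payload).
event = {"message": '{"app": "hello-world", "level": "INFO", "text": "Hello"}'}

# What the json filter does: parse the "message" string and merge its fields.
parsed = json.loads(event["message"])
event.update(parsed)

print(event["app"], event["level"])  # the parsed fields are now top-level
```

This is why the Log4j appender should emit JSON: the filter can then turn each log line into searchable Elasticsearch fields.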
22. Configure Logstash
Step 5. Go to the Logstash bin folder and run the logstash.bat file.
Note: The Elasticsearch host and port in the output section must match the values set
in the Configure Elasticsearch section (default: localhost:9200).
Step 6. Create the index pattern for Kibana. (Kibana requires an index pattern to access the
Elasticsearch data that you want to explore. An index pattern selects the data to use
and allows you to define properties of the fields. An index pattern can point to a
specific index, for example, your log data from yesterday, or to all indices that contain
your data.)
Step 7. The default port for the Logstash monitoring API is 9600 (on localhost).
24. Configure Log4j
Step 1. Create the application whose logs you need to push into Kibana.
Step 2. Open the log4j2.xml file of the Mule project and update the appender as
follows.
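The original slide shows the appender only as a screenshot. As a placeholder sketch: SQS appenders for Log4j2 come from third-party plugins, so the element name and attributes below are hypothetical and depend on the library you add to your project (only JsonLayout is a standard Log4j2 component):

```xml
<!-- Hypothetical third-party SQS appender; element/attribute names vary by plugin. -->
<Appenders>
  <SqsAppender name="SQS"
               awsRegion="eu-central-1"
               queueName="MuleSoftLogs">
    <!-- Emit log events as JSON so the Logstash json filter can parse them. -->
    <JsonLayout compact="true" eventEol="true"/>
  </SqsAppender>
</Appenders>
```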
25. Configure Log4j
Step 3. Update the appender reference for SQS (the appender name will appear in the
AppenderRef).
This completes the Log4j configuration to push the logs into Kibana.
Step 4. Deploy the project to CloudHub, and while deploying the application,
disable the CloudHub logs.
27. Configure Log4j
Note: Once we disable the CloudHub logs, the logs will no longer appear in
CloudHub; they will go to the AWS SQS service instead. If you need the logs in both
places (CloudHub and SQS), update the Log4j configuration as below.
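To keep the logs in both places, reference both appenders from the root logger. A sketch (the appender names are illustrative; CLOUDHUB stands for the default CloudHub appender already present in a Mule project's log4j2.xml):

```xml
<Loggers>
  <AsyncRoot level="INFO">
    <AppenderRef ref="CLOUDHUB"/>  <!-- keep logs visible in CloudHub -->
    <AppenderRef ref="SQS"/>       <!-- also send logs to the SQS queue -->
  </AsyncRoot>
</Loggers>
```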
28. Visualize Logs into Kibana
Step 1. Open the Kibana application and click the hamburger icon.
Step 2. Go to Management -> Stack Management.
Step 3. Go to Kibana -> Index Patterns.
Step 4. Create the index pattern.
30. Visualize Logs into Kibana
Step 5. Provide the same index name that we defined in the Logstash output.
Note: The index name must match the index setting in the Logstash output section
(Configure Logstash -> Step 4).
31. Visualize Logs into Kibana
Step 6. Configure the settings for index management as per the requirement, and
create the index pattern.
32. Visualize Logs into Kibana
Step 7. The index pattern is created successfully.
Step 8. Click the hamburger icon -> Kibana -> Discover.
Step 9. Select the valid index pattern as below:
33. Visualize Logs into Kibana
Step 10. Run the application. Once logs are generated, they will be visible in your
Kibana application as below:
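Once events appear in Discover, they can be narrowed down with a KQL query in the search bar, for example (the field names depend on what your application actually logs):

```text
message : "Hello world" and level : "INFO"
```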
34. References
1. https://docs.mulesoft.com/runtime-manager/custom-log-appender
2. https://www.elastic.co/elasticsearch/
3. https://www.elastic.co/kibana
4. https://www.elastic.co/logstash
37. Trivia Quiz
Q1. What is Logstash used for?
A. Storage of logs/data
B. Visualization
C. Analysis
D. Data collection and parsing
38. Q2. What changes are required in the Log4j file?
39. Q3. How do you push logs into Kibana from multiple sources such as SQS and S3?