Hello folks, this is an extension of the previous post, where we discussed the ELK components and their advantages one by one. In this post we’ll explore an example to understand how the ELK stack works in practice. We’ll also integrate Elasticsearch with Grafana, which is also one of the popular monitoring tools nowadays.
Let’s dive into the example now.
Procedure for monitoring application logs using the ELK stack:
Here I am taking the server logs generated by nginx for my application.
- Start the ELK stack products Elasticsearch and Kibana.
- Note: start Elasticsearch first and Kibana next, because Kibana depends on Elasticsearch.
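- A minimal sketch of the startup commands on Windows (assuming the default zip distributions extracted to the folders shown; the paths and version numbers here are placeholders, adjust them to your installation):
F:\elasticsearch-7.6.2\bin\elasticsearch.bat
F:\kibana-7.6.2\bin\kibana.bat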
- Edit the nginx config file so that it captures the application logs of your project, i.e. the proxy settings, the server name and the log_format. If you are running your application directly on your local system server or any other server, you can ignore this step.
- nginx provides a default log_format, and that format may not be required or suitable for some kinds of projects.
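- A hedged sketch of the relevant part of nginx.conf (the server_name, upstream address and log path are placeholders for illustration). The log_format here mirrors nginx’s default “combined” layout, which is what the grok filter further below expects, so if you customize it remember to adjust the grok pattern as well:
http {
  log_format main '$remote_addr - $remote_user [$time_local] '
                  '"$request" $status $body_bytes_sent '
                  '"$http_referer" "$http_user_agent"';
  server {
    listen 80;
    server_name myproject.local;          # placeholder
    access_log logs/access.log main;      # a *.log file under the nginx logs folder
    location / {
      proxy_pass http://127.0.0.1:8080;   # placeholder for your application server
    }
  }
}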
- Start Nginx and your application servers.
- Write the Logstash pipeline in a config file with the extension .conf:
input {
  file {
    path => "F:/nginx-1.17.9/logs/*.log"   # path to the nginx log files
    type => "logs"
    start_position => "end"
  }
}
filter {
  grok {
    match => { "message" => "%{IPORHOST:remote_addr} - - \[%{HTTPDATE:time_local}\] \"%{WORD:http_method} %{DATA:url} HTTP/%{NUMBER:http_version}\" %{NUMBER:response_code} %{NUMBER:body_sent_bytes} \"%{DATA:referrer}\" \"%{DATA:agent}\"" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "myproject"
  }
}
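For reference, a typical nginx access-log line in this (default “combined”) format looks like the following; the values are purely illustrative, and each field maps to one of the names captured by the grok pattern above:
127.0.0.1 - - [22/Mar/2020:10:15:32 +0530] "GET /home HTTP/1.1" 200 612 "-" "Mozilla/5.0"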
- Logstash File Explanation:
- Mention the source you are getting the logs from, using input plugins like file, http, cloudwatch, etc.
- Mention the type of the logs and the start_position for parsing.
- start_position is "beginning" for static logs and "end" for dynamic logs, because there is a continuous flow of logs while the application is running.
- Write the filters for your logs in the desired format using filter plugins like grok.
- The filter part is optional; it is used to parse your logs. The grok filter in the above example parses nginx access logs.
- Write the destinations (outputs) to which the logs should be delivered.
- In the case of Elasticsearch, the output plugin name is elasticsearch.
- Mention the host as localhost:9200 and the desired index name inside the elasticsearch output plugin.
- Save the file and place it anywhere. A common practice is to place it inside the bin folder of Logstash, because the Logstash batch file is also located there.
Start Logstash by passing the written config file (e.g. cfile.conf) to the Logstash batch file.
Command : pathToLogstashBatchfileInsideBinFolder\logstash.bat -f pathWhereTheConfigFileLocated\Filename.conf
Example: .\logstash -f cfile.conf (Both files are located inside the bin folder only)
After a successful run of the Logstash config file, an index is created with the index name given in the config file.
We can check whether the index has been created or not using the address: http://localhost:9200/_cat/indices/
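For example, from a terminal (assuming curl is available; the exact columns returned vary slightly by Elasticsearch version):
curl "http://localhost:9200/_cat/indices/myproject?v"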
Move to Kibana and create an index pattern with the same name given in the config file by clicking on the Management tab in Kibana.
Go to the Discover tab.
Select the index pattern and visualize the logs transported from our application to Elasticsearch through Logstash while your application is under usage.
For better visualization, create graphs, pie charts, etc.
To search for a particular type of log, click on the search bar and enter the keywords.
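For example, with the field names produced by the grok filter above, a query like the one below shows only failed GET requests (KQL syntax; if your Kibana is set to the Lucene query syntax, use response_code:404 AND http_method:GET instead):
response_code: 404 and http_method: "GET"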
Check the logs for a range of time by using the calendar (time range) option in the Kibana dashboard.
Procedure for visualizing the logs of Elasticsearch indices in Grafana:
- Start Grafana. By default, Grafana runs on port 3000.
- Create an Elasticsearch-type data source.
- Give a name for the data source. Mention the HTTP URL where Elasticsearch operates. Default: http://localhost:9200
- Give the index pattern name for which you need to create the visualizations and the Elasticsearch version that you’re using on your local machine.
- Provide a data link if needed (optional).
- Click on the Save & Test button.
- If all the information provided for creating the data source is fine, it reports that the index and the data source are OK.
- Otherwise, check the index pattern name; sometimes it has to be wrapped in square brackets, like [pattern_name].
- Go to create a dashboard.
- Create a panel, select the newly created Elasticsearch data source in the Query section, and choose the metric as Logs.
- Give a name to the panel; you’ll then be able to view our application logs in the Grafana dashboard.
- Alerts can be created on graphs if required.
After adding the Elasticsearch data source, we can look into the logs and the details of the alerts we set, graphically, in the Grafana dashboard.
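If you prefer to set up the data source from a file instead of clicking through the UI, Grafana also supports provisioning. A minimal sketch, assuming the index and Elasticsearch used in this example, placed as a .yaml file under Grafana’s provisioning/datasources folder (the data source name and the esVersion value are assumptions to adjust for your setup):
apiVersion: 1
datasources:
  - name: myproject-es              # any name you like
    type: elasticsearch
    access: proxy
    url: http://localhost:9200
    database: myproject             # the index created by Logstash
    jsonData:
      esVersion: 70                 # adjust to your Elasticsearch version
      timeField: "@timestamp"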
That’s it about visualizing the logs in Kibana and Grafana, guys. For any queries on this article, do comment here or reach out to me at narsimhulu.464@gmail.com or on LinkedIn at B Narsimha.
Thank You!!!