Quick Install Guide to ELK on AWS EC2

This one is for anyone out there who wants to set up their own all-in-one ELK stack server on AWS. The powers of ElasticSearch, Logstash and Kibana combined create the ELK stack, which is used for parsing, sorting and storing logs. The best part is that the software is free. The aim of this article is to get your very own ELK stack up and running on an AWS EC2 instance in under 15 minutes, so you can evaluate it for yourself. As an IT systems engineer, knowing your way around ELK is essential, so I'm hoping this gives some people an enjoyable introduction. I went through a bit of pain setting this up myself, and this guide will help you skip all that. If you want more detail on the actual usage of ELK, I have listed some resources in the references section.

Quick note: AWS does offer its own managed ElasticSearch service, but there is a significant cost difference. For example, as of writing this post in Oct 2017, a t2.medium cost $0.058 per hour (On Demand) while the ElasticSearch service cost $0.112 for similar specifications.

STEP 1: Create an EC2 Instance

Create a t2.medium AWS instance with the Amazon Linux OS and 20 GB of storage. Anything smaller may cause some of the components to not work properly. One thing about ELK is that it's very resource-hungry; memory, CPU and storage are all considerations you will need to make before productionising ELK. Unfortunately this means no free tier.

When creating a security group, allow the following access:

Port 22 (SSH) from your local IP address, plus TCP ports 5000 for Logstash access, 5601 for Kibana access and 9200 for ElasticSearch access.
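If you prefer the AWS CLI over the console, the same rules can be added to an existing security group. A minimal sketch — the group ID and IP address below are placeholders you would replace with your own values:

```shell
# Placeholders: substitute your own security group ID and public IP
SG_ID="sg-0123456789abcdef0"
MY_IP="203.0.113.10/32"

# SSH from your own IP only
aws ec2 authorize-security-group-ingress --group-id "$SG_ID" --protocol tcp --port 22 --cidr "$MY_IP"

# Logstash (5000), Kibana (5601) and ElasticSearch (9200)
for port in 5000 5601 9200; do
  aws ec2 authorize-security-group-ingress --group-id "$SG_ID" --protocol tcp --port "$port" --cidr 0.0.0.0/0
done
```

Opening 0.0.0.0/0 is fine for a 15-minute evaluation, but lock these ports down to known addresses before putting anything real through the stack.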

Figure 1: Security Group Configuration

Figure 2: EC2 instance summary

STEP 2: Install the packages

This is the main bit that I have trimmed down for everyone. SSH into the EC2 instance, switch user to root, and you should be able to copy and paste the below commands in sequence. Just keep in mind that after running the “alternatives” command there is a step where you will have to select Java 1.8.
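The snippet itself doesn't render well here, so as a rough illustration of what the install involves on Amazon Linux (run as root) — the repo definition follows Elastic's published 5.x yum repository, but treat this as a sketch and prefer the actual install file:

```shell
# Install Java 1.8 and select it (this is the "alternatives" selection mentioned above)
yum install -y java-1.8.0-openjdk
alternatives --config java

# Add Elastic's GPG key and the 5.x yum repository
rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch
cat > /etc/yum.repos.d/elastic.repo <<'EOF'
[elastic-5.x]
name=Elastic repository for 5.x packages
baseurl=https://artifacts.elastic.co/packages/5.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
EOF

# Install and start the three components
yum install -y elasticsearch logstash kibana
service elasticsearch start
service kibana start
service logstash start
```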

(Note: sorry everyone, but the snippet formatting in LinkedIn misbehaves when I paste the sed commands in. It might be safer to copy the contents of the install file from my GitHub repository. Click the link below.)

That’s pretty much it. You have completed the install. Give yourself a pat on the back!!! Now you have your very own ELK stack you can play around with.

The following sections give you some practical tips on how you can begin testing the ELK stack you just created.

Test ElasticSearch

ElasticSearch is what indexes and organises all the data you pump into the ELK stack.

SSH onto the ELK server you just created and run the command below; if the install was successful, you should get a similar response.

Command:

curl -X GET "http://localhost:9200/?pretty"

Successful Response Example:

{
 "name" : "PBYAswt",
 "cluster_name" : "elasticsearch",
 "cluster_uuid" : "uYbHKgAYS9Gii3MnX8J1mg",
 "version" : {
   "number" : "5.6.2",
   "build_hash" : "57e20f3",
   "build_date" : "2017-09-23T13:16:45.703Z",
   "build_snapshot" : false,
   "lucene_version" : "6.6.1"
 },
 "tagline" : "You Know, for Search"
}

FYI an unsuccessful response may look as follows:

curl: (7) Failed to connect to localhost port 9200: Connection refused
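If you hit connection refused, the usual suspects are the service not having finished starting (ElasticSearch can take a minute) or a startup error. A quick check, assuming the standard RPM install locations:

```shell
# Is the service actually running?
service elasticsearch status

# Recent startup errors, if any (default RPM log location)
tail -n 50 /var/log/elasticsearch/elasticsearch.log
```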

You should also check that the “?pretty” endpoint works via the server's public DNS name, which is available in the AWS console under the EC2 details for the instance you started. An example of a successful response via a browser is shown below.

Figure 3: ElasticSearch response via browser

Test Logstash

Logstash allows you to parse data from different inputs and pump it into ElasticSearch.

Step A:

Run the following command to list the current indexes in ElasticSearch.

curl -X GET "http://localhost:9200/_cat/indices?v"

The following output (a header row with no entries) means there are no indexes in ElasticSearch:

health status index               uuid                   pri rep docs.count docs.deleted store.size pri.store.size

Step B:

Run the following command to append the line “test-message” to the log file Logstash is configured to watch.

echo "test-message" >> /tmp/logstash.txt

IMPORTANT!!: This is not the only way you can get logs sent via Logstash. There are many different ways, but that's a discussion for later.
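For the echo above to end up in ElasticSearch, Logstash needs a pipeline that tails that file. A minimal sketch of such a config — the filename under /etc/logstash/conf.d/ is an assumption, but the options are standard file-input and elasticsearch-output settings:

```
# /etc/logstash/conf.d/logfile.conf (assumed filename)
input {
  file {
    path => "/tmp/logstash.txt"        # the file we echo test-message into
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]        # creates the daily logstash-YYYY.MM.dd index
  }
}
```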

Step C:

Re-run the ElasticSearch index list command and the output will show that one index has been created, as follows:

curl -X GET "http://localhost:9200/_cat/indices?v"

health status index               uuid                   pri rep docs.count docs.deleted store.size pri.store.size
yellow open   logstash-2017.10.18 hQwB9WzRROaBuTr9-8mYMg   5   1          1            0      5.4kb          5.4kb

Step D:

List the items in the index using the following command.

Format:

curl -X GET "http://localhost:9200/[your index name]/_search?size=[number of records you want returned]&pretty=1"

Example Command and Output showing the test-message:

curl -X GET "http://localhost:9200/logstash-2017.10.18/_search?size=1000&pretty=1"

{
 "took" : 42,
 "timed_out" : false,
 "_shards" : {
  "total" : 5,
  "successful" : 5,
  "skipped" : 0,
  "failed" : 0
 },
 "hits" : {
  "total" : 1,
  "max_score" : 1.0,
  "hits" : [
   {
    "_index" : "logstash-2017.10.18",
    "_type" : "logs",
    "_id" : "AV8vQlDh9tfzGLSrHMf5",
    "_score" : 1.0,
    "_source" : {
     "@version" : "1",
     "host" : "ip-172-31-2-132",
     "path" : "/tmp/logstash.txt",
     "@timestamp" : "2017-10-18T11:33:15.826Z",
     "message" : "test-message"
    }
   }
  ]
 }
}
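Beyond dumping everything, the same _search endpoint accepts a Lucene query string via the q parameter, which is handy for fishing a specific message out of a busier index:

```shell
# Search all logstash-* indexes for the test message
curl -X GET "http://localhost:9200/logstash-*/_search?q=message:test-message&pretty"
```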

Test Kibana

Kibana is the Web interface that will allow you to perform analytics on the data that has been gathered into the ELK stack.

Step A:

Navigate to http://<server public IP address>:5601/ in your browser

Step B:

Make sure the index pattern is logstash-* and the Time Filter field name is @timestamp, and click “Create”. This will create a logstash-* index pattern.

Figure 4: Working Kibana Web UI

Step C:

Navigate to the “Discover” link in the navigation menu and select the “logstash-*” index.

If you see the following screen with no records, try expanding the time period using the time selector in the top right (indicated in the example with a clock and “Last 15 minutes”) so that it includes the time you inserted the Logstash entry. Once the time period covers the entry, you will see something like the following example:

Figure 5: Kibana showing log entry “test-message”

Figure 6: No results found page. Change the time period

Summary

This is just the tip of the iceberg. ELK is a wonderful product that I've seen used in almost every IT organisation in some capacity to organise, analyse and visualise data, and it's practically a necessity for people in IT at the moment. I hope this post gives you a smooth start to your ELK journey. Good luck!!!

Resources

  • Awesome YouTube tutorial by Misuk Heo: https://www.youtube.com/watch?v=69OoC7haeeA&list=PLVNY1HnUlO25m5tT06HaiHPs2nV3cLhUD
  • Official Elastic website: https://www.elastic.co/
  • ELK install for Ubuntu: https://www.rosehosting.com/blog/install-and-configure-the-elk-stack-on-ubuntu-16-04/
