Bitnami Stack Virtual Machines

This guide assumes your local machine runs an Ubuntu desktop or laptop, or macOS.

We used the Bitnami ELK stack on Ubuntu 16.04 or 18.04 as the server of choice. It is relatively easy to install ELK on an AWS or Alibaba Cloud server instance, provided you follow a few minimum procedures. Elasticsearch has its own limits on the number of processes and open files, and the Java VM needs swap disabled (or kept small) to keep its garbage collection under control. Thanks to Bitnami, most of this tuning work has been done for us.


Installing ELK (Elasticsearch, Logstash and Kibana) on AWS

Create a t2.medium instance. Anything less, such as a micro or small, will run out of resources. Elasticsearch runs the Lucene engine on Java and is extremely resource hungry.

In IAM, set a policy that allows you to access S3 buckets. In AWS, S3 is the resource where the logs are kept.
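A minimal read-only policy for the log bucket can be sketched as below; `your-s3-elk-bucket` and the file name are placeholders, not values from this guide, and you would attach the JSON through the IAM console or CLI:

```shell
# Write a minimal read-only S3 policy to a local file (bucket name is a placeholder).
cat > elk-s3-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetObject"],
      "Resource": [
        "arn:aws:s3:::your-s3-elk-bucket",
        "arn:aws:s3:::your-s3-elk-bucket/*"
      ]
    }
  ]
}
EOF
# Sanity-check that the file is valid JSON before attaching it in IAM.
python3 -m json.tool elk-s3-policy.json > /dev/null && echo "policy OK"
```

`s3:ListBucket` applies to the bucket ARN itself, while `s3:GetObject` needs the `/*` object ARN; both are required for Logstash to list and fetch the log files.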

Once the instance is created, download the key pair (.pem file).

SSH into the new instance. You can also use the Java client provided by AWS.

        $ ssh -i ~/Downloads/elkpair.pem ubuntu@<instance-public-ip>
        $ sudo swapoff -a
        $ sudo nano /etc/security/limits.conf

        # add the following lines
        * soft memlock unlimited
        * hard memlock unlimited
        * soft nofile 65536
        * hard nofile 65536
        # save the file ctrl-o, ctrl-x
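After the reboot described below, the new limits can be sanity-checked from a shell; the exact numbers depend on your instance:

```shell
# Soft open-file limit; should report 65536 once limits.conf is in effect.
ulimit -n
# Max locked-memory limit; should report "unlimited" for Elasticsearch's memory locking.
ulimit -l
```

If `ulimit -n` still shows the old value, log out and back in: limits.conf is applied at session start, not retroactively.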

        $ sudo nano /etc/sysctl.conf
        # add the following line (required by Elasticsearch)
        vm.max_map_count=262144
        # save the file ctrl-o, ctrl-x
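Elasticsearch refuses to start when the kernel's mmap-count limit is too low (it typically requires at least 262144); the live value can be checked like this:

```shell
# Read the kernel's current mmap-count limit; compare it against the
# 262144 that Elasticsearch typically requires.
cat /proc/sys/vm/max_map_count
```

Running `sudo sysctl -p` applies the change from /etc/sysctl.conf immediately, without waiting for the reboot.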

Reboot the server instance so these changes take effect. Once the server is up and ready, SSH into it again and run the installation for ELK and the other prerequisites.

        $ ssh -i ~/Downloads/elkpair.pem ubuntu@<instance-public-ip>
        # --- Install nginx 
        $ sudo apt update
        $ sudo apt install nginx
        # --- Now install the ELK stack from Bitnami
        # --- (the installer URL and file name vary with the version you download)
        $ wget <bitnami-elk-installer-url>
        $ chmod +x <bitnami-elk-installer>.run
        # --- Run the installer in text mode
        $ ./<bitnami-elk-installer>.run --mode text
        # --- The installation should complete and be able to start the servers
        # --- Add awscli to the ec2 instance 
        $ sudo apt install awscli

Create an S3 bucket and upload the logs into it. Create a role that has access to S3.

From within the instance, see if you can access S3.

      $ aws configure
      # --- enter the access key ID and secret access key to access the S3 bucket from within the server.
      $ aws s3 ls
      # --- this will list all the buckets within your account
      $ aws s3 ls s3://your-s3-elk-bucket
      # --- this will list the log files in the bucket
      $ JAVA_HOME=/home/ubuntu/elk-6.4.3-0/java
      $ export JAVA_HOME
      $ cd ~/elk-6.4.3-0/elasticsearch
      $ ./bin/elasticsearch-plugin install repository-s3
      # --- now Elasticsearch has the repository-s3 plugin needed to reach S3
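Because the `export JAVA_HOME` above only lasts for the current session, it can be persisted to `~/.bashrc`; the path assumes the Bitnami `elk-6.4.3-0` layout used in this guide:

```shell
# Persist JAVA_HOME for future login shells; this appends unconditionally,
# so run it once (or check ~/.bashrc first to avoid duplicates).
echo 'export JAVA_HOME="$HOME/elk-6.4.3-0/java"' >> ~/.bashrc
```

A new SSH session will then pick up `JAVA_HOME` automatically, which matters if the plugin command is rerun later or from a script.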

      $ cd ~/elk-6.4.3-0
      $ ./ctlscript.sh stop logstash
      $ cd ~/elk-6.4.3-0/logstash/conf

      # --- keep a backup of the Logstash configuration file
      $ cp logstash.conf logstash.conf_hold
      $ nano logstash.conf
      # --- replace the configuration with this in logstash.conf
        input {
            s3 {
                bucket => "your-s3-elk-bucket"
                region => "us-east-1"
                secret_access_key => "6xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxLP2hXqNQe"
                access_key_id => "AxxxxxxxxxxxxxxxxxxA"
                additional_settings => {
                    force_path_style => true
                    follow_redirects => false
                }
            }
        }
        output {
            elasticsearch {
                hosts => ["127.0.0.1:9200"]
                # index names must be lowercase
                index => "aws"
            }
        }
        # --- save the file ctrl-o, ctrl-x
        $ cd ~/elk-6.4.3-0
        $ ./ctlscript.sh stop
        # --- give it a few minutes
        $ cd ~/elk-6.4.3-0
        $ ./ctlscript.sh start