How to Set Up the ELK Stack

Introduction

The ELK Stack is a powerful and popular open-source solution for searching, analyzing, and visualizing large volumes of data in real time. ELK stands for Elasticsearch, Logstash, and Kibana, three components that work together to help organizations collect, process, and visualize log data and metrics from various sources. Setting up the ELK Stack enables you to gain valuable insights into system performance, detect anomalies, and improve operational intelligence.

This tutorial provides a comprehensive, step-by-step guide on how to install and configure the ELK Stack, best practices to optimize its use, essential tools and resources, real-world examples, and answers to frequently asked questions. Whether you are a system administrator, developer, or data analyst, mastering ELK Stack setup is crucial for effective log management and data analysis.

Step-by-Step Guide

Step 1: System Requirements and Preparation

Before installing the ELK Stack, ensure your system meets the hardware and software requirements. The ELK Stack runs on various Linux distributions, Windows, and macOS, but Linux is preferred for production environments due to stability and performance.

  • Operating System: Ubuntu 20.04 LTS or later, CentOS 7 or later, Debian 10 or later
  • Memory: Minimum 4GB RAM (8GB or more recommended for production)
  • Disk Space: At least 10GB free space for data storage
  • Java: Elasticsearch and Logstash require Java. OpenJDK 11 or later is recommended.
  • Network: Open ports for communication (default Elasticsearch: 9200, Kibana: 5601, Logstash: 5044)
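If a host firewall such as ufw is active, these ports must be reachable from the machines that need them. A minimal sketch for Ubuntu (the trusted subnet shown is an illustrative assumption; exposing 9200 to untrusted networks is not recommended):

  sudo ufw allow 5601/tcp   # Kibana web interface
  sudo ufw allow 5044/tcp   # Beats shippers to Logstash
  sudo ufw allow from 10.0.0.0/8 to any port 9200 proto tcp   # Elasticsearch API, restricted to a trusted subnet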

Update your system packages before installation:

sudo apt update && sudo apt upgrade -y
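Elasticsearch 7.x ships with a bundled JDK, so a separate Java installation is often unnecessary; Logstash may still require a system JDK depending on the version, so installing OpenJDK 11 is a safe default on Ubuntu:

sudo apt install -y openjdk-11-jre-headless
java -version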

Step 2: Install Elasticsearch

Elasticsearch is a distributed search and analytics engine at the core of the ELK Stack. Follow the steps below to install Elasticsearch on Ubuntu:

  1. Download and install the Elasticsearch public signing key:

     wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -

  2. Add the Elasticsearch repository:

     sudo sh -c 'echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" > /etc/apt/sources.list.d/elastic-7.x.list'

  3. Update package lists and install Elasticsearch:

     sudo apt update
     sudo apt install elasticsearch

  4. Enable and start the Elasticsearch service:

     sudo systemctl enable elasticsearch
     sudo systemctl start elasticsearch

  5. Verify Elasticsearch is running by querying its REST API:

     curl -X GET "localhost:9200/"
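A healthy node replies with a JSON document similar to the abridged sketch below; the node name, cluster name, and exact version on your system will differ:

{
  "name" : "elk-node-1",
  "cluster_name" : "elasticsearch",
  "version" : {
    "number" : "7.17.0"
  },
  "tagline" : "You Know, for Search"
}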

Step 3: Install Logstash

Logstash is a data processing pipeline that ingests data from multiple sources, transforms it, and sends it to a "stash" like Elasticsearch.

  1. Install Logstash from the same Elastic repository:

     sudo apt install logstash

  2. Enable and start the Logstash service:

     sudo systemctl enable logstash
     sudo systemctl start logstash

  3. Configure a basic Logstash pipeline. Create a configuration file /etc/logstash/conf.d/logstash-simple.conf with the following content:

     input {
       beats {
         port => 5044
       }
     }

     filter {
       # Add filters as needed
     }

     output {
       elasticsearch {
         hosts => ["localhost:9200"]
         index => "logstash-%{+YYYY.MM.dd}"
       }
       stdout { codec => rubydebug }
     }

  4. Restart Logstash to apply the configuration:

     sudo systemctl restart logstash
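After any pipeline edit, you can also validate the configuration syntax without restarting the service. Logstash ships a config-test flag; a sketch assuming the default Debian/Ubuntu package paths:

sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit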

Step 4: Install Kibana

Kibana is the visualization layer of the ELK Stack, providing a web interface to explore and visualize data stored in Elasticsearch.

  1. Install Kibana:

     sudo apt install kibana

  2. Enable and start the Kibana service:

     sudo systemctl enable kibana
     sudo systemctl start kibana

  3. Configure Kibana to connect to Elasticsearch by editing /etc/kibana/kibana.yml if necessary:

     server.port: 5601
     server.host: "0.0.0.0"
     elasticsearch.hosts: ["http://localhost:9200"]

  4. Restart Kibana after changes:

     sudo systemctl restart kibana

  5. Access Kibana in a web browser at http://your_server_ip:5601.
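To confirm Kibana is up without opening a browser, you can query its status API; a quick sketch from the server itself:

curl -s http://localhost:5601/api/status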

Step 5: Install and Configure Beats (Optional but Recommended)

Beats are lightweight data shippers used to send data from your servers to Logstash or Elasticsearch.

  1. Install Filebeat to collect log files:

     sudo apt install filebeat

  2. Enable the Filebeat system module:

     sudo filebeat modules enable system

  3. Configure Filebeat to send data to Logstash by editing /etc/filebeat/filebeat.yml (comment out the default output.elasticsearch section so only one output is active):

     output.logstash:
       hosts: ["localhost:5044"]

  4. Enable and start Filebeat:

     sudo systemctl enable filebeat
     sudo systemctl start filebeat
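Filebeat ships with self-test subcommands that are useful at this point; a quick sketch:

sudo filebeat test config    # validates filebeat.yml syntax
sudo filebeat test output    # checks connectivity to the configured Logstash output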

Step 6: Verify the ELK Stack Setup

After installation and configuration, verify that data flows correctly through the ELK Stack:

  • Check Elasticsearch indices:

    curl -X GET "localhost:9200/_cat/indices?v"

  • Access Kibana and create an index pattern matching your Logstash indices, for example: logstash-*
  • Create visualizations and dashboards in Kibana to analyze your data.

Best Practices

Optimize Elasticsearch Configuration

Elasticsearch performance depends heavily on configuration. Use the following best practices:

  • Heap Size: Set JVM heap size to 50% of available RAM but not more than 32GB. Edit /etc/elasticsearch/jvm.options (see the sketch after this list).
  • Node Roles: Assign roles such as master, data, and ingest nodes for distributed setups.
  • Shard Management: Avoid creating too many shards; use a shard count aligned with your data volume.
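For example, on a server with 16GB of RAM you might pin the heap to 8GB by setting matching minimum and maximum values in /etc/elasticsearch/jvm.options (the size here is illustrative; keep both values equal to avoid heap resizing):

-Xms8g
-Xmx8g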

Secure the ELK Stack

Security is critical when exposing ELK Stack services:

  • Enable authentication and role-based access control using Elastic Security features (see the sketch after this list).
  • Use TLS/SSL encryption for Elasticsearch and Kibana communications.
  • Restrict network access using firewalls and VPNs.
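As a starting point on 7.x, authentication is toggled in /etc/elasticsearch/elasticsearch.yml, after which built-in user passwords can be generated. A minimal sketch (a full TLS setup additionally requires generating certificates, which is beyond this snippet):

# /etc/elasticsearch/elasticsearch.yml
xpack.security.enabled: true

sudo systemctl restart elasticsearch
sudo /usr/share/elasticsearch/bin/elasticsearch-setup-passwords interactive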

Logstash Pipeline Efficiency

Keep your Logstash pipelines efficient by:

  • Using conditionals to route events and reduce unnecessary processing (see the sketch after this list).
  • Applying filters only when needed to minimize CPU usage.
  • Monitoring pipeline performance and tuning worker threads.
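For instance, a conditional can restrict an expensive grok filter to only the events that need it. A sketch assuming logs shipped by the Filebeat system module (the field names depend on your shipper):

filter {
  if [fileset][name] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGLINE}" }
    }
  }
}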

Regular Data Management

Manage your Elasticsearch indices to prevent storage issues:

  • Implement index lifecycle management (ILM) policies to automate rollover and deletion (see the example after this list).
  • Regularly monitor disk usage and cluster health.
  • Back up critical data periodically.
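A basic ILM policy can be created through the Elasticsearch REST API. A sketch that rolls indices over at 30GB or 7 days and deletes them after 30 days (the policy name and thresholds are illustrative):

curl -X PUT "localhost:9200/_ilm/policy/logs-policy" -H 'Content-Type: application/json' -d'
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": { "max_size": "30gb", "max_age": "7d" }
        }
      },
      "delete": {
        "min_age": "30d",
        "actions": { "delete": {} }
      }
    }
  }
}'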

Tools and Resources

Official Documentation

The Elastic website offers comprehensive documentation for all ELK Stack components at https://www.elastic.co/guide/.

Community Forums and Support

Engage with the Elastic community through forums and discussion boards for troubleshooting and best practices.

Third-Party Tools

  • ElasticHQ: Elasticsearch management and monitoring tool.
  • Grafana: Alternative visualization platform supporting Elasticsearch data sources.
  • Curator: Tool for managing Elasticsearch indices.

Learning Platforms

Several online platforms provide courses and tutorials on the ELK Stack:

  • Udemy, Coursera, Pluralsight
  • YouTube technical channels

Real Examples

Example 1: Centralized Server Log Monitoring

A company with multiple Linux servers deployed Filebeat on each to forward system logs to Logstash. Logstash filtered and enriched logs before storing them in Elasticsearch. Kibana dashboards visualized CPU usage, error rates, and login activity to improve incident response times.

Example 2: Application Performance Monitoring

Developers integrated application logs with Logstash and used custom filters to parse JSON logs. Kibana visualizations helped identify slow API endpoints and monitor transaction rates, enabling proactive optimization.

Example 3: Security Event Analysis

The security team used Winlogbeat to collect Windows event logs. Combined with threat intelligence feeds processed by Logstash, they created Kibana dashboards for real-time detection of suspicious login attempts and firewall alerts.

FAQs

What is the difference between Elasticsearch and Logstash?

Elasticsearch is a search and analytics engine that stores and indexes data, while Logstash is a data processing pipeline that collects, transforms, and forwards data to Elasticsearch or other destinations.

Can I install the ELK Stack on Windows?

Yes, ELK Stack components can be installed on Windows, but Linux is generally preferred for production due to better support and performance.

How much hardware does the ELK Stack require?

Resource requirements depend on data volume and usage. A minimum of 4GB RAM and 10GB disk space is recommended for small setups, with significantly more for production environments.

Is it possible to secure the ELK Stack with user authentication?

Yes, Elastic offers built-in security features including user authentication, role-based access control, and TLS encryption to protect your ELK Stack deployment.

How do I handle large volumes of log data?

Implement index lifecycle management, optimize shard configurations, scale Elasticsearch clusters horizontally, and use efficient data ingestion pipelines to manage large datasets.

Conclusion

Setting up the ELK Stack provides a comprehensive solution for collecting, analyzing, and visualizing log and event data. By carefully following the installation steps, adhering to best practices, and utilizing available tools and resources, you can unlock valuable insights from your data. The ELK Stack's flexibility and scalability make it suitable for a wide range of use cases, from infrastructure monitoring to security analytics.

Mastering ELK Stack setup empowers organizations to improve operational efficiency, detect issues faster, and make data-driven decisions. Start your ELK Stack journey today and transform raw logs into actionable intelligence.