How to Forward Logs to Elasticsearch: A Comprehensive Tutorial

Introduction

In today's data-driven world, managing and analyzing log data efficiently is critical for maintaining the health and security of IT environments. Forwarding logs to Elasticsearch has become a popular approach for centralizing log data, enabling powerful search, analysis, and visualization capabilities. Elasticsearch, part of the Elastic Stack, is a distributed, RESTful search and analytics engine that excels at storing and querying large volumes of data in near real-time.

This tutorial provides a detailed, step-by-step guide on how to forward logs from various sources to Elasticsearch. Whether you are an IT administrator, DevOps engineer, or data analyst, understanding how to set up log forwarding can dramatically improve your ability to monitor systems, troubleshoot issues, and gain actionable insights.

Step-by-Step Guide

1. Understanding the Log Forwarding Process

Before diving into configurations, it's important to understand the typical log forwarding workflow:

  • Log Generation: Logs are generated by servers, applications, network devices, or cloud services.
  • Log Collection: A log shipper or forwarder collects the logs from the source.
  • Log Processing: Logs are parsed, enriched, and sometimes transformed to a structured format.
  • Log Forwarding: Processed logs are sent to Elasticsearch for indexing and storage.
  • Visualization and Analysis: Tools like Kibana query Elasticsearch to visualize and analyze the logs.

2. Prerequisites

Make sure you have the following ready before starting:

  • An Elasticsearch cluster or single node instance running (version 7.x or later recommended).
  • A log shipper such as Filebeat, Logstash, or Fluentd installed on the source machines.
  • Basic knowledge of JSON and Elasticsearch query DSL.
  • Access to Kibana for viewing and visualizing logs (optional but recommended).

3. Installing Elasticsearch

If Elasticsearch is not yet installed, follow these steps for a basic installation on a Linux system:

Step 1: Download and install the Elasticsearch package.

wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-7.17.0-linux-x86_64.tar.gz
tar -xzf elasticsearch-7.17.0-linux-x86_64.tar.gz
cd elasticsearch-7.17.0/
./bin/elasticsearch

Step 2: Verify Elasticsearch is running by navigating to http://localhost:9200 in your web browser.
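
If you are working on a headless server with no browser, you can run the same check from the command line. This assumes the default port 9200 and that security/TLS has not been enabled:

curl -X GET "localhost:9200"

A JSON response containing the cluster name and version number confirms the node is up.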

4. Installing and Configuring Filebeat

Filebeat is a lightweight log shipper developed by Elastic. It is commonly used to forward logs to Elasticsearch directly or via Logstash.

Step 1: Install Filebeat on the machine generating logs.

wget https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-7.17.0-linux-x86_64.tar.gz
tar -xzf filebeat-7.17.0-linux-x86_64.tar.gz
cd filebeat-7.17.0-linux-x86_64/

Step 2: Configure Filebeat by editing filebeat.yml to specify log paths and Elasticsearch output.

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/*.log

output.elasticsearch:
  hosts: ["localhost:9200"]

Step 3: Start Filebeat.

./filebeat -e
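
Before leaving Filebeat running, it is worth validating the setup. Filebeat ships with built-in test subcommands:

./filebeat test config
./filebeat test output

The first checks filebeat.yml for syntax errors; the second attempts a connection to the configured Elasticsearch host.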

5. Forwarding Logs via Logstash (Optional)

Logstash allows advanced log processing and is often used as an intermediary between Filebeat and Elasticsearch.

Step 1: Install Logstash.

wget https://artifacts.elastic.co/downloads/logstash/logstash-7.17.0-linux-x86_64.tar.gz
tar -xzf logstash-7.17.0-linux-x86_64.tar.gz
cd logstash-7.17.0/

Step 2: Create a Logstash pipeline configuration file logstash.conf:

input {
  beats {
    port => 5044
  }
}

filter {
  # Add any parsing or enrichment here if needed
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logs-%{+YYYY.MM.dd}"
  }
}

Step 3: Start Logstash with the pipeline:

./bin/logstash -f logstash.conf
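
Optionally, validate the pipeline syntax first; Logstash provides a test-and-exit flag for this:

./bin/logstash -f logstash.conf --config.test_and_exit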

Step 4: Update Filebeat to send its output to Logstash instead of Elasticsearch. Filebeat supports only one output at a time, so comment out the output.elasticsearch section in filebeat.yml and add:

output.logstash:
  hosts: ["localhost:5044"]

Restart Filebeat for the change to take effect.

6. Verifying Logs in Elasticsearch

Once logs are forwarded, verify their presence using Elasticsearch's REST API:

curl -X GET "localhost:9200/logs-*/_search?pretty"

You should see documents containing your log entries indexed under the specified index.
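
You can also list the matching indices and their document counts; the _cat API gives a quick overview:

curl -X GET "localhost:9200/_cat/indices/logs-*?v"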

7. Visualizing Logs with Kibana

To easily explore and visualize logs, install and configure Kibana:

  • Download and install Kibana from the Elastic website (a minimal command sketch follows this list).
  • Configure Kibana to connect to your Elasticsearch instance.
  • Create index patterns matching your log indices (e.g., logs-*).
  • Use Kibana's Discover, Dashboard, and Visualization features to analyze logs.
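
A minimal sketch of the first two steps, assuming the 7.17.0 tarball to match the Elasticsearch version used above (download URLs may differ for your platform):

wget https://artifacts.elastic.co/downloads/kibana/kibana-7.17.0-linux-x86_64.tar.gz
tar -xzf kibana-7.17.0-linux-x86_64.tar.gz
cd kibana-7.17.0-linux-x86_64/
# Point Kibana at Elasticsearch in config/kibana.yml:
#   elasticsearch.hosts: ["http://localhost:9200"]
./bin/kibana

By default, Kibana serves its UI at http://localhost:5601.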

Best Practices

1. Use Structured Logging

Whenever possible, configure your applications to emit logs in structured formats like JSON. This enhances parsing accuracy and search efficiency in Elasticsearch.
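
For example, with JSON-formatted application logs, Filebeat can decode each line into fields at collection time. A minimal sketch using Filebeat's JSON options for the log input (the log path is a hypothetical example):

filebeat.inputs:
- type: log
  paths:
    - /var/log/myapp/*.json    # hypothetical application log path
  json.keys_under_root: true   # promote decoded JSON fields to the top level
  json.add_error_key: true     # flag lines that fail to parse

Indexed this way, a field such as level or service can be filtered directly in Kibana instead of being buried in a plain-text message.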

2. Implement Log Rotation and Retention Policies

Control disk usage on source machines and Elasticsearch by rotating logs regularly and defining retention periods to delete old data.

3. Secure Your Log Data

Use TLS encryption for connections between log shippers and Elasticsearch. Implement authentication and role-based access control in Elasticsearch and Kibana to protect log data.
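
As a sketch, a Filebeat output secured with TLS and basic authentication might look like this (the hostname, user, and certificate path are placeholders; the ${ES_PASSWORD} reference resolves from the Filebeat keystore):

output.elasticsearch:
  hosts: ["https://elasticsearch.example.com:9200"]
  username: "filebeat_writer"
  password: "${ES_PASSWORD}"
  ssl.certificate_authorities: ["/etc/filebeat/certs/ca.crt"]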

4. Monitor Log Forwarding Agents

Continuously monitor the health of Filebeat, Logstash, or other agents to ensure logs are being forwarded reliably without loss.
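
One lightweight check: Filebeat can expose its internal metrics over a local HTTP endpoint. Assuming http.enabled: true is set in filebeat.yml (the endpoint is disabled by default and listens on port 5066):

curl -X GET "localhost:5066/stats"

The response includes counters such as events published and output errors, which you can scrape into your monitoring system.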

5. Optimize Elasticsearch Indices

Use appropriate index templates and mappings to optimize storage and query performance. Consider time-based indices and use lifecycle management policies.
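
For example, a minimal ILM policy that rolls indices over daily and deletes them 30 days later can be created via the REST API (the policy name, rollover thresholds, and retention period here are illustrative):

curl -X PUT "localhost:9200/_ilm/policy/logs-30d-delete" -H 'Content-Type: application/json' -d'
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": { "max_age": "1d", "max_size": "50gb" }
        }
      },
      "delete": {
        "min_age": "30d",
        "actions": { "delete": {} }
      }
    }
  }
}'

Attach the policy to your log indices through an index template so new indices pick it up automatically.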

Tools and Resources

1. Filebeat

A lightweight shipper for forwarding and centralizing log data. Supports various input types and outputs.

2. Logstash

A powerful data processing pipeline that can ingest data from multiple sources, transform it, and send it to Elasticsearch.

3. Fluentd

An alternative log forwarder with a rich plugin ecosystem for collecting and forwarding logs.

4. Elasticsearch

The core search and analytics engine used to index and query log data.

5. Kibana

The visualization and exploration tool designed to work with Elasticsearch data.

6. Elastic Documentation

Official Elastic documentation provides extensive guides and references for all Elastic Stack components: https://www.elastic.co/guide/index.html

Real Examples

Example 1: Forwarding Apache Access Logs to Elasticsearch Using Filebeat

Configure Filebeat to ship Apache logs:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/apache2/access.log

output.elasticsearch:
  hosts: ["localhost:9200"]

Start Filebeat and verify the logs appear in Elasticsearch under the default filebeat-* index.

Example 2: Using Logstash to Parse and Forward Nginx Logs

Sample Logstash pipeline for parsing Nginx access logs:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "nginx-logs-%{+YYYY.MM.dd}"
  }
}

Because Nginx's default access log format matches the Apache combined format, the COMBINEDAPACHELOG grok pattern applies; this setup stores logs with structured fields and a properly parsed timestamp.
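
To confirm the parsed fields arrived, query the new index and inspect a single document:

curl -X GET "localhost:9200/nginx-logs-*/_search?pretty&size=1"

Each hit should contain fields extracted by grok (such as clientip and response) alongside the parsed @timestamp.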

FAQs

Q1: Can I forward logs directly from Filebeat to Elasticsearch without Logstash?

Yes, Filebeat supports direct forwarding to Elasticsearch. However, Logstash provides advanced filtering and enrichment capabilities that may be necessary for complex log formats.

Q2: How do I secure log data during transmission?

Enable TLS encryption on both the log shipper and Elasticsearch endpoints. Use the Elastic Stack's security features to require authentication and restrict access.

Q3: What log formats does Elasticsearch support?

Elasticsearch can index any text data, but structured formats like JSON are preferred for efficient searching and aggregation.

Q4: How much disk space will log forwarding consume?

Disk usage depends on log volume, retention period, and indexing strategy. Consider using Elasticsearch's Index Lifecycle Management (ILM) to automate data retention.

Q5: What if my logs contain sensitive information?

Implement data masking or filtering in Logstash or Filebeat before forwarding. Use encryption and strict access controls to protect sensitive data.
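
As a sketch, the Logstash mutate filter's gsub option can redact matching patterns before they are indexed; the regular expression below (for bare 16-digit card-like numbers) is illustrative only and should be adapted to your data:

filter {
  mutate {
    # Redact sequences that look like 16-digit card numbers
    gsub => [ "message", "\d{16}", "[REDACTED]" ]
  }
}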

Conclusion

Forwarding logs to Elasticsearch is a powerful technique to centralize and analyze log data, enabling faster troubleshooting, improved security, and better operational insights. This tutorial covered the essential steps to set up log forwarding using Filebeat and Logstash, best practices to follow, and useful tools to leverage. By implementing these strategies, organizations can build a robust logging infrastructure that scales with their needs and unlocks the full potential of their log data.