OpenSearch - Monitor your infra (with Logstash or Fluent Bit)
Find out how to set up Logstash and Fluent Bit with your OpenSearch databases
Last updated 17th December 2021
OpenSearch is an open source search and analytics suite. In order to push relevant data to OpenSearch, several methods are available. You can upload a file easily, but for real-time data such as metrics and logs, collectors are required.
This tutorial explains how to configure two famous collectors, Logstash and Fluent Bit, to connect and forward data logs to your OpenSearch database service.
This tutorial requires:
Make sure that you have a user account with enough privileges to write to the OpenSearch database, and that the IP addresses that you will use to run your agents (Logstash and Fluent Bit) are part of the authorized list. Check our Getting Started guide for more information.
The main software components used to create this tutorial are:
Because OpenSearch is a fork of Elasticsearch, it is recommended to verify the version of the software that you will use against the OpenSearch compatibility matrices.
To collect data, you need to install the collector agent on the data source. In our case, it's an Ubuntu Linux virtual machine, but it can be anything.
Please refer to the official OpenSearch Logstash installation documentation.
As detailed in the compatibility matrix, select and download a compatible OSS release from Elasticsearch: download OSS past releases.
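As a sketch, installing a Logstash OSS release and the OpenSearch output plugin on Ubuntu could look like the following. The version number is illustrative; pick one from the compatibility matrix:

```shell
# Version shown is illustrative -- choose one from the OpenSearch compatibility matrix
wget https://artifacts.elastic.co/downloads/logstash/logstash-oss-7.16.2-amd64.deb
sudo dpkg -i logstash-oss-7.16.2-amd64.deb
# Install the OpenSearch output plugin used in the pipeline below
sudo /usr/share/logstash/bin/logstash-plugin install logstash-output-opensearch
```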
The default configuration is sufficient for this tutorial, but if you need to review it, go to /etc/logstash and look at the configuration files (logstash.yml).
Every Logstash pipeline has three phases: inputs, filters and outputs. For this tutorial, let's assume an Apache service is running and configure a pipeline that collects its access logs:
Configuration file: /etc/logstash/conf.d/apache2.conf
input {
  file {
    path => "/var/log/apache2/access.log"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  opensearch {
    hosts => ["https://opensearch-682faf00-682faf00.database.cloud.ovh.net:20184"]
    index => "index_test-%{+YYYY.MM.dd}"
    user => "admin"
    password => "2fakeSVV5wvyPykF"
  }
}
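Before restarting Logstash, the pipeline file can be syntax-checked. These commands assume the default Debian/Ubuntu package paths:

```shell
# Verify the pipeline configuration without starting Logstash
sudo /usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/apache2.conf
# Restart the service so the new pipeline is loaded
sudo systemctl restart logstash
```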
To see examples of the commonly used grok patterns, or to test more specific ones, you can use tools like the Grok Debugger:
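For reference, here is the kind of line the %{COMBINEDAPACHELOG} pattern matches (the values below are hypothetical sample data):

```shell
# A sample Apache "combined" format access log line (hypothetical values)
LOG_LINE='127.0.0.1 - frank [17/Dec/2021:10:05:03 +0000] "GET /index.html HTTP/1.1" 200 2326 "-" "curl/7.68.0"'
# COMBINEDAPACHELOG splits it into fields such as clientip, timestamp, verb,
# request, response, bytes, referrer and agent.
# Quick sanity check: the client IP is the first whitespace-separated field
echo "$LOG_LINE" | awk '{print $1}'
```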
Now that Logstash is running and parsing the Apache access log file, let's make a few new requests to the Apache service and verify that Logstash forwards all access data to OpenSearch. Let's also connect to the OpenSearch Dashboard to check that the corresponding index has been created and populated with its first documents.
Use the Dev Tools to query the indices:
Then, in the console, execute a GET /_cat/indices command to get the index list with the number of documents for each:
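The same listing is available over the REST API. As a sketch, using the tutorial's example endpoint (replace the host, port and credentials with your own):

```shell
# Tutorial's example endpoint -- replace host, port and credentials with your own
OS_URL="https://opensearch-682faf00-682faf00.database.cloud.ovh.net:20184"
REQUEST="${OS_URL}/_cat/indices?v"
echo "$REQUEST"
# curl -u admin:YOUR_PASSWORD "$REQUEST"
```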
To aggregate all corresponding daily indices, we have to create an index pattern from the Stack Management menu:
To create the first index pattern:
As we want to consolidate all our daily indices, let's define the pattern as index_test:
Then define the timestamp:
OpenSearch will then define all fields available and their type directly from stored documents in the index:
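To see why a single pattern covers all the daily indices: the Logstash output names them index_test-YYYY.MM.dd, so today's index name can be computed as follows (a minimal sketch):

```shell
# Today's daily index, following the index => "index_test-%{+YYYY.MM.dd}" output setting
TODAY_INDEX="index_test-$(date +%Y.%m.%d)"
echo "$TODAY_INDEX"
# A wildcard pattern such as index_test* matches every daily index:
case "$TODAY_INDEX" in
  index_test*) echo "matched by pattern index_test*";;
esac
```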
In the OpenSearch Dashboard / Visualize, create a new visualization:
Select the visualization type:
Then select the data source (the index pattern you just created):
You now have to define all criteria that you need to visualize:
Click on Update if there are any changes in the data definition, or Refresh if you change the data time scale.
Save your visualization:
Let's make a dashboard with the visualization you've just saved.
Create a new dashboard:
As we have already created a visualization, click Add an existing object:
Search and select the one we just created:
Reorganize your panels, add new or existing ones as wanted:
We now have a real-time dashboard showing the number of unique IP addresses per day, found in the Apache logs.
Multiple agents exist. Fluent Bit is a strong, fully open-source alternative to Logstash, and works natively with OpenSearch: see the official Fluent Bit website.
Let's now use this agent to parse the Apache access logs and push this information to the OpenSearch database.
To install Fluent Bit on our Linux Ubuntu instance, we will install the td-agent-bit package, as described in the Fluent Bit installation process from the Official Documentation.
Check that the service is running and is enabled to start automatically when the system boots:
sudo systemctl enable td-agent-bit.service
sudo systemctl status td-agent-bit.service
Let's modify the configuration file /etc/td-agent-bit/td-agent-bit.conf, editing the INPUT and OUTPUT sections:
[INPUT]
    Name              tail
    Tag               test.file
    Path              /var/log/apache2/access.log
    DB                /var/log/apache2_access.db
    Path_Key          filename
    Parser            apache2
    Mem_Buf_Limit     8MB
    Skip_Long_Lines   On
    Refresh_Interval  30

[OUTPUT]
    Name              es
    Match             *
    Host              opensearch-682faf00-682faf00.database.cloud.ovh.net
    Port              20184
    tls               On
    tls.verify        Off
    HTTP_User         admin
    HTTP_Passwd       2FakeSVV5wvyPykF
    Logstash_Format   On
    Logstash_Prefix   my-fluent
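After saving the file, restart the agent so the new INPUT and OUTPUT sections take effect, and watch its log to confirm the es output connects (commands shown for the td-agent-bit package used above):

```shell
# Reload the new configuration
sudo systemctl restart td-agent-bit.service
# Follow the agent's log to confirm it tails the file and reaches OpenSearch
sudo journalctl -u td-agent-bit.service -f
```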
If required, please refer to parameters in the Fluent Bit Official Documentation.
All Apache access logs will now be pushed to the OpenSearch database under the "my-fluent" index.
As we did for Logstash, you can create an index pattern to aggregate all the daily data logs stored in OpenSearch, then visualize that data or create dashboards. In this example, the indices will be named my-fluent-YYYY.MM.DD, so you can define a pattern like my-fluent-*.
Congratulations, you are now able to collect data from multiple sources and push it to Public Cloud Databases for OpenSearch!
OVHcloud documentation on managed Public Cloud Databases
OpenSearch Official documentation
OpenSearch Dashboard Official documentation
Fluent Bit Official Documentation
Visit our dedicated Discord channel: https://discord.gg/ovhcloud. Ask questions, provide feedback and interact directly with the team that builds our databases services.
Discuss with the OVHcloud community