Questions tagged [logstash]

I have an event in logstash that looks like:
{
'terms' : { 'A' : 1, 'B' : 0.5, 'c' : 1.6 }
}
I would like to change it to:
{
'terms' : [ 'A', 'B', 'C' ]
}
I didn't find any documentation about a for loop or about getting the keys of a dictionary.
I would like to do something like:
filter {
for key in [terms]{
m...
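The Logstash config language has no for loop, but a ruby filter can iterate over a hash and replace it with its keys. A minimal sketch (using the pre-5.x `event[...]` API that matches this era of Logstash; on 5.x+ use `event.get`/`event.set` instead):

```
filter {
  ruby {
    code => "
      # Replace the 'terms' hash with the array of its keys
      terms = event['terms']
      event['terms'] = terms.keys if terms.is_a?(Hash)
    "
  }
}
```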

I've installed the logstash-plugin for Jenkins and configured it to use Elasticsearch as the indexer.
While executing the jobs I see the following error:
16:56:12 [logstash-plugin]: Failed to send log data to ELASTICSEARCH:http://localhost:9200.
16:56:12 [logstash-plugin]: No Further logs will be s...

Thanks in advance for your help. I would like to reload some logs to customize additional fields. I have noticed that the registry file in the Filebeat configuration keeps track of the files already picked up. However, if I remove the content of that file, I am not getting the old logs back. I have tried also...

I'm using a FIX filter plugin to process some of our FIX logs. In those messages we receive multiple custom fields. This is outside of the grok filter; I pass the messages I care about into this secondary FIX plugin.
Some of our messages for example look like this:
'unknown_fields' => [
[0] '5000',...

I am using Elasticsearch 2.0 and Logstash 1.5.4 for my project. The moment I upgraded Elasticsearch from version 1.7 to 2.0, it started showing some garbage indexes in my Elasticsearch.
# curl 'localhost:9200/_cat/indices?v'
health status index pri rep docs.co...

I'm trying to run the ELK for NGINX Plus (JSON) example on a Windows 10 machine, and am hung up on finding a Windows equivalent of the following Unix command for ingesting the log data into Elasticsearch:
cat nginxplus_json_logs | /bin/logstash -f nginxplus_json_logstash.conf
From this question and...
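On Windows, `type` plays the role of `cat`, so `type nginxplus_json_logs | bin\logstash.bat -f nginxplus_json_logstash.conf` should be a close equivalent. Alternatively, the pipe can be avoided entirely with a file input; a sketch, assuming the sample log lives at a hypothetical `C:/logs` path:

```
input {
  file {
    # Hypothetical location; adjust to wherever the sample file was saved
    path => "C:/logs/nginxplus_json_logs"
    start_position => "beginning"
    # "NUL" is the Windows equivalent of /dev/null, forcing a full re-read each run
    sincedb_path => "NUL"
    codec => "json"
  }
}
```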

We have successfully set up the ELK stack in our production environment. We can also see the (unstructured) log output on our Kibana server.
Everything is working fine for us. But the only thing we are concerned about is that the messages in Kibana are structured for every single line writt...

I'm experimenting with ELK to analyze our log files. Following the available documentation, I managed to set up the stack on my PC. Now I'm facing an issue with Elasticsearch index creation. Previously I was using the filebeat -> logstash -> elasticsearch -> kibana combination and using the following...

Question 1 -
56dd573d.5edd is my session ID. I have a grok filter like
%{WORD:session_id}.%{WORD:session_id} - this reads the session ID, and the output looks like this:
'session_id': [
[
'56dd573d',
'5edd'
]
]
Is there any way I can get output like:
'session_id': [
[
'56dd573d...
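Repeating the field name in the pattern is what produces the two-element array, and the unescaped `.` matches any character rather than a literal dot. One named capture spanning both halves keeps the whole ID in a single value; a sketch, assuming the session ID sits in `message`:

```
filter {
  grok {
    # One capture for the whole ID; the dot is escaped to match a literal '.'
    match => { "message" => "(?<session_id>%{WORD}\.%{WORD})" }
  }
}
```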

I've been trying to do this all day. I want to send logs from Docker to Fluentd via the fluentd logging driver, and then from Fluentd send those logs to Logstash for processing.
I keep getting this error from Logstash, though:
{:timestamp=>'2016-03-09T23:29:19.388000+0000',
:message=>'An error occurre...
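One possible cause is a codec mismatch on the Logstash side: Fluentd's forward output speaks a msgpack-based protocol, and there is a `fluent` codec (logstash-codec-fluent) intended for exactly this. A sketch, with the port number as an assumption to be matched against the Fluentd forward output config:

```
input {
  tcp {
    port  => 4000
    # The fluent codec decodes fluentd's msgpack-based forward protocol
    codec => fluent
  }
}
```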

Below is a field that is part of a log line from the application. The client-id attribute's value needs to be split on the delimiter '#'.
client-id=ABC-SYNC_Foo#qrkmguv4p995b3kqk1jaupocl2
Here is how I want it:
source=ABC-SYNC_Foo
id=qrkmguv4p995b3kqk1jaupocl2
I need help with regex on how to spli...
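A grok filter with custom named captures can do this split. A sketch, assuming the value is already in a field called `client-id`:

```
filter {
  grok {
    # Everything before the first '#' becomes 'source'; the rest becomes 'id'
    match => { "client-id" => "(?<source>[^#]+)#(?<id>.+)" }
  }
}
```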

I'm trying to read log4j v2.3 JSON output via a Logstash TCP socket using the logstash JSON codec, but Logstash is treating each line as a separate event to be indexed rather than reading each JSON object as an event.
log4j config
... removed for brevity ...
... removed for brevity ...
Logstash Conf...
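If log4j emits one complete JSON object per line, the `json_lines` codec (rather than the plain `json` codec, which expects a single complete payload) is usually the right fit on a streaming TCP input; if the objects are pretty-printed across several lines, a multiline codec has to stitch them back together first. A sketch, with the port number as an assumption:

```
input {
  tcp {
    port  => 4560
    # One complete JSON object per line
    codec => json_lines
  }
}
```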

I'm new to this ELK stuff. I've been trying to create visualizations using this stack, but I'm not able to use fields such as verb, response, request, etc. I'm only able to select a few available fields:
However, in the Discover section I'm perfectly able to work with those fields. Here is a sample...

I have trouble getting Logstash to work. The Basic Logstash Example works, but then I struggle with the Advanced Pipeline Example. Perhaps it could also be a problem with Elasticsearch.
Now I just want to check whether a simple example works:
input: read textfile-a
output: generate new textfile-b wit...
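A minimal file-to-file pipeline for that check could look like this (paths are placeholders):

```
input {
  file {
    path => "/tmp/textfile-a"
    start_position => "beginning"
    # Re-read from the start on every run while testing
    sincedb_path => "/dev/null"
  }
}
output {
  file {
    path => "/tmp/textfile-b"
  }
}
```

If lines from textfile-a show up as JSON events in textfile-b, the Logstash side is fine and the problem lies with the Elasticsearch output.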

I have a Logstash instance, version 2.3.1, which isn't running using the command
sudo service logstash start
Whenever I run this command, it returns "logstash started", but after a few moments when I check the status, I find that Logstash isn't running. However, when I start Logstash from opt to get...

I have configured Filebeat as described at elastic.co.
The problem is that when I add a new log file, the data is not uploaded to Logstash. What could be the problem?
I have already tried different config variations, but it didn't work at all.
################### Filebeat Configuration Example #############...
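A common cause is a prospector path that names files individually rather than using a glob, so newly created files are never matched. A sketch in the Filebeat 1.x config layout of that era (paths and Logstash host are placeholders):

```
filebeat:
  prospectors:
    -
      # A glob so files created later are also picked up
      paths:
        - /var/log/myapp/*.log
      input_type: log
output:
  logstash:
    hosts: ["localhost:5044"]
```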

Our server is hosted on Solaris (OS), but we are not able to install Filebeat to forward the logs to the desired port, as Filebeat is not supported on Solaris. Can someone here suggest a way to solve this problem? Please note we were told not to install Logstash on the server host machine.
Your advices a...

I'm using Logstash to send our log data to an Elasticsearch Service in AWS. Now I have some business logic defined in Spark Streaming that I want to apply to the log data in real time, so I'm thinking about using Amazon SQS or Apache Kafka in the middle.
Is it right to use Kafka in this scenario...

I am attempting to import a MySQL table into Elasticsearch. It is a table containing 10 different columns, with an 8-digit VARCHAR set as the primary key. The MySQL database is located on a remote host.
To transfer data from MySQL into Elasticsearch, I've decided to use Logstash and the JDBC MySQL driver.
I a...
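The usual shape of such a pipeline is a jdbc input feeding an elasticsearch output. A sketch with placeholder paths, host, credentials, and column names:

```
input {
  jdbc {
    # Placeholder paths and credentials; adjust to the actual environment
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://remote-host:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    statement => "SELECT * FROM my_table"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my_table"
    # Reuse the VARCHAR primary key as the document id so re-runs don't duplicate rows
    document_id => "%{primary_key_column}"
  }
}
```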

I have started Logstash with multiple workers (> 16).
I have multiline messages, such as Java exceptions/stack traces, that I want to merge into a single event. Earlier it was working as expected, but after upgrading my ELK stack it's breaking :-(
My Logstash filter:
filter {
multiline {
pattern => '(^[...
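The multiline filter is not safe with multiple worker threads, because lines of one stack trace can be handed to different workers; the usual fix is to do the merging with the multiline codec on the input instead, before events fan out to workers. A sketch with a placeholder path and an example pattern to adjust to the actual log format:

```
input {
  file {
    path => "/var/log/app/app.log"
    codec => multiline {
      # Lines not starting with a timestamp belong to the previous event
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => true
      what => "previous"
    }
  }
}
```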

I am using Filebeat to parse XML files on Windows and sending them to Logstash for filtering and forwarding to Elasticsearch.
The Filebeat job worked perfectly and I'm getting XML blocks into Logstash, but it looks like I misconfigured the Logstash filter to parse XML blocks into separate fields and enc...
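The xml filter handles this step. A sketch, assuming each complete XML block arrives in the `message` field:

```
filter {
  xml {
    # 'message' is assumed to hold one complete XML block from Filebeat
    source => "message"
    # Parsed fields land under this target; xpath options can extract specific nodes
    target => "parsed_xml"
    store_xml => true
  }
}
```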

I've been tasked with managing our ELK stack and writing rules for elastalert, but need a specific part of one field I already have as its own field in order to use elastalert's query_key functionality on that field. We're using these rules here:
https://github.com/hpcugent/logstash-patterns/blob/ma...

I have an ELK stack with Elasticsearch, Logstash and Kibana installed on 3 different instances.
Now I want to make a 3-node cluster of Elasticsearch.
I will make one node the master and 2 data nodes.
I want to know, in the Logstash config
elasticsearch {
hosts => 'http://es01:9200'
Which address I need to ente...
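The `hosts` option accepts an array, so Logstash can be pointed at several nodes (typically the data nodes rather than only the master) and will distribute requests across them. A sketch, with es02/es03 as assumed hostnames for the two data nodes:

```
output {
  elasticsearch {
    # List several nodes; Logstash load-balances across them and
    # keeps working if one of them goes down
    hosts => ["http://es01:9200", "http://es02:9200", "http://es03:9200"]
  }
}
```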

We will be feeding different logs from different sources into Kafka from Logstash. There would be a particular set (or format) of messages that consumers of Kafka will be more interested in than other messages. How do we prioritise the messages?
One thought is to create two topics in kafka - 1) high_p...
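On the Logstash side, the two-topic idea can be implemented with a conditional around two kafka outputs. A sketch, where the `priority` field is an assumption (it would be set by an upstream filter that recognizes the interesting format), and option names may vary between kafka output plugin versions:

```
output {
  if [priority] == "high" {
    kafka {
      topic_id => "high_priority"
      bootstrap_servers => "localhost:9092"
    }
  } else {
    kafka {
      topic_id => "low_priority"
      bootstrap_servers => "localhost:9092"
    }
  }
}
```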

I have the following Grok patterns defined in a pattern file
HOSTNAME \b(?:[0-9A-Za-z][0-9A-Za-z-]{0,62})(?:\.(?:[0-9A-Za-z][0-9A-Za-z-]{0,62}))*(\.?|\b)
EMAILLOCALPART [a-zA-Z][a-zA-Z0-9_.+-=:]+
EMAILADDRESS %{EMAILLOCALPART}@%{HOSTNAME}
For some reason this doesn't compile when run against http://...

I need a query that shows me all the DNS requests that have more than 20 characters in the domain name.
I want to see anything like “mybadwebsite.hacker.com” (23 characters long); most sites are under 20.
How do I do that?
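One approach is to compute the length at index time in Logstash, so Kibana can filter on it with a plain range query. A sketch using a ruby filter and the pre-5.x event API, assuming the DNS name sits in a hypothetical field called `query`:

```
filter {
  ruby {
    # Store the domain length as its own field so it can be queried later,
    # e.g. query_length:>20 in Kibana
    code => "event['query_length'] = event['query'].length if event['query']"
  }
}
```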

I am new to Logstash and grok filters. I am trying to parse a string from an Apache access log with a grok filter in Logstash, where the username is part of the access log in the following format:
name1.name2.name3.namex.id
I want to build a new field called USERNAME where it is name1.name2.name...

What is the most appropriate name for the timestamp when utilizing Logstash to parse logs into Elasticsearch, then visualizing with Kibana?
I am defining the timestamp using date in a filter:
date {
match => [ 'logtime', 'yy-MM-dd HH:mm:ss' ]
}
Logstash automatically puts this into the @timestamp fi...

I want to read events from Instagram. I was wondering if I can do it using Logstash, similar to reading events from Twitter using the Twitter input plugin, but there is no input plugin for Instagram. Is there any other way to collect Instagram data using Logstash?
Thanks!

I am trying to get Filebeat (for Logstash forwarding) on a CentOS 7 environment to run under my created user account, filebeat, instead of root. I tried editing the /etc/rc.d/init.d/filebeat file to the following, but to no avail. I might be doing something wrong, but I'm still a bit new to Bash scriptin...

I have a single-node Kafka running on my server machine.
I used the following command to create a topic: 'bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test'.
I have two Logstash instances running. The first one reads data from some Java application log fi...

I am using Logstash to create a pipeline from Postgres to CockroachDB. Below is the config.
The input plugin (source is Postgres) is working fine, but I am unable to establish a connection in the output plugin (CockroachDB) using JDBC. I am facing the below error.
JDBC - Connection is not valid. Pleas...
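Since CockroachDB speaks the PostgreSQL wire protocol, the PostgreSQL JDBC driver is usually the one to load in the community logstash-output-jdbc plugin; a frequent cause of "Connection is not valid" is pointing the connection string at the wrong port or driver. A heavily hedged sketch with placeholder host, port, database, and statement, and option names as used by that community plugin:

```
output {
  jdbc {
    # CockroachDB speaks the PostgreSQL wire protocol, so use the pg driver.
    # Host, port (26257 is CockroachDB's default), database, and table are placeholders.
    driver_class => "org.postgresql.Driver"
    connection_string => "jdbc:postgresql://localhost:26257/mydb?user=root&sslmode=disable"
    statement => [ "INSERT INTO logs (message) VALUES(?)", "message" ]
  }
}
```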