Logstash to Remote Elasticsearch #516
-
The secondary instance is only if you want the data sent to another Elasticsearch cluster in addition to the primary one. It won't have anything to do with your Logstash container's health. I'm going to be on vacation until December 2nd, but I will follow up here when I return.
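For context, the secondary store effectively becomes an additional output block in the Logstash pipeline, so each event is written to both clusters. A minimal sketch of the idea (not Malcolm's actual generated config; the hosts and index below are placeholders):

```
output {
  # primary document store
  elasticsearch {
    hosts => ["https://192.168.1.10:9200"]
    index => "malcolm-beats"
    ssl   => true
  }
  # optional secondary document store; Logstash sends every event to both outputs
  elasticsearch {
    hosts => ["https://second-cluster.example.org:9200"]
    index => "malcolm-beats"
    ssl   => true
  }
}
```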
-
Attached: screenshots (GNU nano 4.8) of the Logstash pipeline configs output.conf, 01_beats_input.conf, and internal.conf.
-
That all looks correct to me. What specific error are you experiencing?
-
From the Logstash log:

[WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch
[WARN ][org.logstash.plugins.pipeline.AbstractPipelineBus] Attempted to send event to 'zeek-parse' but that address was unavailable. Maybe the destination pipeline is down or stopping? Will Retry.
[WARN ][logstash.inputs.beats ] You are using a deprecated config setting "ssl_verify_mode" set in beats. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Set 'ssl_client_authentication' instead. If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"ssl_verify_mode", :plugin=><LogStash::Inputs::Beats ssl_certificate=>"/certs/server.crt", ssl_key=>"/certs/server.key", ssl_verify_mode=>"none", port=>5044, host=>"0.0.0.0", id=>"input_beats", ssl=>true, ssl_certificate_authorities=>["/certs/ca.crt"

Could not having SSL certificates in my Elasticsearch cluster be the reason the malcolm-beats index is not copied?
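The deprecation warning itself names the replacement setting, so, assuming a reasonably recent Logstash, the beats input stanza would change roughly like this (values taken from the log line above):

```
input {
  beats {
    port                        => 5044
    host                        => "0.0.0.0"
    ssl                         => true
    ssl_certificate             => "/certs/server.crt"
    ssl_key                     => "/certs/server.key"
    ssl_certificate_authorities => ["/certs/ca.crt"]
    # replaces the deprecated ssl_verify_mode => "none"
    ssl_client_authentication   => "none"
  }
}
```

Note that this warning only concerns the deprecated setting name; the "Could not index event to Elasticsearch" warning comes from the output side of the pipeline, so the indexing failure itself is between Logstash and the cluster.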
-
Now my Elastic cluster with Kibana has X.509 certificates. Where in Malcolm do I put the file with the CA authorities?
-
Converting this to a troubleshooting discussion instead of an issue.
-
During the installation phase I answered this question like this:
Should Malcolm use and maintain its own OpenSearch instance? (Y / n): n
1: opensearch-local - local OpenSearch
2: opensearch-remote - remote OpenSearch
3: elasticsearch-remote - remote Elasticsearch
Select primary Malcolm document store (opensearch-local): 3
Enter primary remote Elasticsearch connection URL https://192.168.1.10:9200 (in my case)
Require SSL certificate validation for communication with remote Elasticsearch instance? (y / N): n
Enter Kibana connection URL https://10.9.0.215:5601
You must run auth_setup after configure to store data store connection credentials.
But at the following prompts I don't know what to do:
Forward Logstash logs to a secondary remote document store? (y / N): ?
1: opensearch-remote - remote OpenSearch
2: elasticsearch-remote - remote Elasticsearch
Select secondary Malcolm document store: ?
Enter secondary remote OpenSearch connection URL (?)
Require SSL certificate validation for communication with secondary remote OpenSearch instance? (y / N): n
You must run auth_setup after configure to store data store connection credentials.
Can you point me to the correct steps? I ask because, after doing an installation answering no to the Logstash forwarding question, the container now shows as unhealthy.
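For comparison, here is a hypothetical way those prompts could be answered if you did want to forward to a second Elasticsearch cluster (the URL is a placeholder, and the prompt text presumably says Elasticsearch rather than OpenSearch once option 2 is chosen); answering n to the first question should skip the rest:

```
Forward Logstash logs to a secondary remote document store? (y / N): y
1: opensearch-remote - remote OpenSearch
2: elasticsearch-remote - remote Elasticsearch
Select secondary Malcolm document store: 2
Enter secondary remote Elasticsearch connection URL: https://192.168.1.20:9200
Require SSL certificate validation for communication with secondary remote Elasticsearch instance? (y / N): n
You must run auth_setup after configure to store data store connection credentials.
```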