I am trying to build a pipeline based on this tutorial, in which Kafka reads a file through a file source connector. Using these Docker images for the Elastic Stack, I want to register Logstash as a consumer of the "quickstart-data" topic, but so far I have failed.
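For reference, the file source side of such a tutorial is usually a FileStreamSource connector. A minimal sketch of its properties could look like this (the connector name and file path are placeholders I am assuming; only the topic comes from the question):

name=local-file-source
connector.class=FileStreamSource
tasks.max=1
file=/tmp/quickstart-data.txt
topic=quickstart-data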
Here is my logstash.conf file:
input {
  kafka {
    bootstrap_servers => 'localhost:9092'
    topics => 'quickstart-data'
  }
}
output {
  elasticsearch {
    hosts => [ 'elasticsearch' ]
    user => 'elastic'
    password => 'changeme'
  }
  stdout {}
}
The connection to Elasticsearch itself works, because I tested it with a heartbeat input. The error messages I get are the following:
Connection to node -1 could not be established. Broker may not be available.
Give up sending metadata request since no node is available
Any ideas?
Posted on 2018-04-25 18:45:29
I would suggest you keep it simple and use Kafka Connect to land the data in Elasticsearch as well: https://docs.confluent.io/current/connect/connect-elasticsearch/docs/elasticsearch_connector.html#quick-start
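A rough sketch of the sink configuration from that quick start, adapted to the topic in the question (the connector name is a placeholder, and exact properties can differ between connector versions):

name=elasticsearch-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
topics=quickstart-data
key.ignore=true
connection.url=http://localhost:9200
type.name=kafka-connect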
Posted on 2020-02-18 05:29:06
There may be a better way to do this, but here is how I fixed the problem:
zookeeper:
  image: confluentinc/cp-zookeeper:latest
  ports:
    - "2181:2181"
  environment:
    ZOOKEEPER_CLIENT_PORT: 2181
    ZOOKEEPER_TICK_TIME: 2000
  networks:
    - stack

kafka:
  image: confluentinc/cp-kafka:latest
  ports:
    - "9092:9092"
  environment:
    KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
    # PLAINTEXT (kafka:29092) is the listener advertised to clients inside the Docker network;
    # PLAINTEXT_HOST (localhost:9092) is the listener advertised to clients on the host machine.
    KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
    KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
    KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
    KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
  depends_on:
    - zookeeper
  networks:
    - stack
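Logstash also needs to run on the same Docker network so that the hostname kafka resolves. A sketch of such a service, assuming an Elastic-provided image tag and the default pipeline path (both are my assumptions, not part of the original answer):

logstash:
  image: docker.elastic.co/logstash/logstash:7.6.2
  volumes:
    - ./logstash.conf:/usr/share/logstash/pipeline/logstash.conf
  depends_on:
    - kafka
  networks:
    - stack

With that in place, the Kafka input points at the broker's internal listener instead of localhost: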
input {
  stdin {}
  kafka {
    id => "my_kafka_1"
    # Use the broker's internal Docker-network listener, not localhost:9092
    bootstrap_servers => "kafka:29092"
    topics => "test"
  }
}
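Note that this answer consumes a topic named "test"; for the pipeline in the original question only the topic value would change, roughly:

kafka {
  bootstrap_servers => "kafka:29092"
  topics => "quickstart-data"
}

The elasticsearch output from the question can be kept as it is.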
https://stackoverflow.com/questions/50018611