1 year ago
#349098
Ruben Oubiña Mosteiro
Kafka producer: NoBrokersAvailable in docker
I'm starting out with Kafka, and my project consists of getting data and putting it into a Kafka topic.
I tried it without Docker first, running the data-generator in a local Python environment, and it works perfectly, but inside Docker it does not.
My docker-compose file is the following:
version: '3.8'
services:
  data-generator:
    build: ./data-generator
    container_name: data-generator
    depends_on:
      - kafka
  zkpr:
    image: confluentinc/cp-zookeeper:5.5.3
    container_name: zkpr
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
    healthcheck: { test: nc -z localhost 2181, interval: 1s, start_period: 120s }
  kafka:
    container_name: kafka
    image: confluentinc/cp-enterprise-kafka:5.5.3
    depends_on: [zkpr]
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zkpr:2181
      KAFKA_LISTENERS: INTERNAL://0.0.0.0:9092,OUTSIDE://0.0.0.0:9094
      KAFKA_ADVERTISED_LISTENERS: INTERNAL://kafka:9092,OUTSIDE://localhost:9094
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INTERNAL:PLAINTEXT,OUTSIDE:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: INTERNAL
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
    ports:
      - 9092:9092
      - 9094:9094
    healthcheck: { test: nc -z localhost 9092, interval: 1s, start_period: 120s }
  materialized:
    image: materialize/materialized:v0.20.0
    container_name: materialized
    command: -w1
    ports:
      - 6875:6875
  mzcli:
    image: materialize/cli
    container_name: mzcli
    depends_on:
      - materialized
  dbt:
    build:
      context: ./dbt
      target: dbt-third-party
      args:
        - build_for=${ARCH}
    container_name: dbt
    ports:
      - 8000:8080
    volumes:
      - ./dbt/profiles.yml:/root/.dbt/profiles.yml
      - ./dbt/:/usr/app/dbt
    stdin_open: true
    tty: true
    depends_on:
      - materialized
  metabase:
    image: ${MIMG}/metabase
    container_name: metabase
    depends_on:
      - materialized
    ports:
      - 3030:3000
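My understanding of the listener setup (please correct me if I'm wrong) is that clients inside the compose network should use the INTERNAL listener at kafka:9092, while anything running on the host should use the OUTSIDE listener at localhost:9094. Roughly, this is what I expect to work from each side (the addresses are just taken from my config above):

from kafka import KafkaProducer

# Inside the compose network (e.g. the data-generator container):
# the broker advertises INTERNAL://kafka:9092 to these clients.
internal_producer = KafkaProducer(bootstrap_servers='kafka:9092')

# From the host machine (how I ran the generator without Docker):
# port 9094 is published and advertised as OUTSIDE://localhost:9094.
external_producer = KafkaProducer(bootstrap_servers='localhost:9094')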
The producer I'm using is a Python script with the following code:
import sys

from kafka import KafkaProducer

# Producer instance
prod = KafkaProducer(bootstrap_servers='kafka:9092')

try:
    urlData = "url"
    jsonData = getResponse(urlData)
    for i in jsonData["resources"]:
        ## code
        prod.send(topic='topic', key=i["key"].encode('utf-8'), value=json_value.encode('utf-8'))
        prod.flush()
    # code
except Exception as e:
    print("Exception: %s" % str(e), file=sys.stderr)
    sys.exit(1)
# code
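One thing I'm wondering about is a startup race: data-generator only has a plain depends_on, which as far as I know does not wait for the Kafka healthcheck, so the producer may start before the broker is listening. A retry sketch I've been considering looks roughly like this (the retry count, sleep time, and the make_producer helper name are just placeholders of mine):

import sys
import time

from kafka import KafkaProducer
from kafka.errors import NoBrokersAvailable

def make_producer(bootstrap, retries=30, delay=2):
    """Try to create a producer, retrying while the broker is still starting."""
    for attempt in range(retries):
        try:
            return KafkaProducer(bootstrap_servers=bootstrap)
        except NoBrokersAvailable:
            print("Broker not ready (attempt %d), retrying..." % (attempt + 1),
                  file=sys.stderr)
            time.sleep(delay)
    raise RuntimeError("Kafka broker at %s never became available" % bootstrap)

prod = make_producer('kafka:9092')

I have also seen depends_on with condition: service_healthy mentioned, but I'm not sure whether that is supported with the compose file version I'm using, so the retry loop felt safer.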
The problem occurs when the producer connects to Kafka (maybe the ports or the IP are wrong); there are no problems getting the data or processing it.
If someone knows how to solve this, I would be really grateful.
python
docker
apache-kafka
docker-compose
kafka-producer-api