Single-instance Kafka deployment on k8s: SASL_PLAINTEXT username/password authentication

1. Introduction

  Set up Kafka's SASL_PLAINTEXT username/password authentication mechanism.

  For details, see the project: https://github.com/duruo850/kafka

  A plain Kafka deployment (without external exposure) is covered in a previous article; this post focuses on how to enable SASL_PLAINTEXT username/password authentication.

2. Server configuration

2.1. Creating the JAAS file

kafka_server_jaas.conf

KafkaServer {
    org.apache.kafka.common.security.plain.PlainLoginModule required
        serviceName="kafka"
        username="admin"
        password="123456"
        user_admin="123456";
};

This sets the password of user admin to 123456. The username/password pair is the account the broker itself uses for inter-broker connections, while each user_<name>="<password>" entry declares a client account (here user_admin gives client user admin the password 123456).
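
Additional client accounts can be declared with more user_<name> entries. A minimal sketch, assuming a hypothetical second user alice that is not part of the original setup:

KafkaServer {
    org.apache.kafka.common.security.plain.PlainLoginModule required
        serviceName="kafka"
        username="admin"
        password="123456"
        user_admin="123456"
        user_alice="alice-secret";  // hypothetical extra client account
};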

2.2. Setting the KAFKA_OPTS environment variable at broker startup

Set the environment variables in the Kafka deployment manifest:

          env:
            - name: KAFKA_HEAP_OPTS
              value: "-Xmx512M -Xms512M"
            - name: KAFKA_OPTS
              value: "-Dlogging.level=INFO -Djava.security.auth.login.config=/opt/kafka/config/kafka_server_jaas.conf"

2.3. Related properties configuration

          --override listeners=SASL_PLAINTEXT://:9092 \
          --override advertised.listeners=SASL_PLAINTEXT://10.0.22.120:9092 \
          --override security.inter.broker.protocol=SASL_PLAINTEXT \
          --override sasl.mechanism.inter.broker.protocol=PLAIN \
          --override sasl.enabled.mechanisms=PLAIN \

listeners is the address the broker binds to internally, while advertised.listeners is the address published to external clients. This address must be exactly the same as the bootstrap_servers value the client connects with, otherwise the connection will time out (see the Python producer in section 4.4, which connects to 10.0.22.120:9092).

For a single-instance (single-broker) deployment, offsets.topic.replication.factor must be set to 1, as shown below (see also section 5.6).
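
The corresponding broker argument (the same line appears in section 5.6) is:

          --override offsets.topic.replication.factor=1 \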

 

3. Client configuration

 

3.1. Creating the JAAS file

kafka_client_jaas.conf

KafkaClient {
org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="123456";
};

This configures the client to authenticate as user admin with password 123456.

3.2. Setting KAFKA_OPTS for the console scripts

Before running the following two scripts, set the environment variable:

  kafka-console-consumer.sh
  kafka-console-producer.sh

export KAFKA_OPTS="-Djava.security.auth.login.config=/opt/kafka/config/kafka_client_jaas.conf"

3.3. Related properties configuration

For the console clients, edit consumer.properties and producer.properties under the config directory and add the following:

security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
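
As an alternative (not used in the original setup), the credentials themselves can also be placed in these properties files via the standard client property sasl.jaas.config, which removes the need for the separate kafka_client_jaas.conf file and the KAFKA_OPTS export from sections 3.1 and 3.2:

# inline JAAS configuration; same admin/123456 account as defined on the broker
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="admin" password="123456";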

 

4. Testing

4.1. Starting a consumer with the script inside the Kafka container

/opt/kafka/bin/kafka-console-consumer.sh --topic test --bootstrap-server localhost:9092 --consumer.config /opt/kafka/config/consumer.properties

qiteck@server:~/program/kafka$ sudo kubectl exec -it kafka-0 -n kafka -- /bin/bash
root@kafka-0:/opt/kafka_2.13-3.3.2# /opt/kafka/bin/kafka-console-consumer.sh --topic test --bootstrap-server localhost:9092 --consumer.config /opt/kafka/config/consumer.properties
999

4.2. Starting a consumer with a Python script

Python consumer code:

from kafka import KafkaConsumer

my_topic = "test"

consumer = KafkaConsumer(
    my_topic,
    bootstrap_servers='10.0.22.120:9092',
    auto_offset_reset='latest',
    api_version=(0, 10, 1),
    security_protocol='SASL_PLAINTEXT',
    sasl_mechanism='PLAIN',
    sasl_plain_username='admin',
    sasl_plain_password='123456',
)

for msg in consumer:  # blocks and loops forever, printing each new message
    print(msg.value)

The SASL username is admin and the password is 123456.

4.3. Starting a producer with the script inside the Kafka container

 
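The original post does not list the command here. By analogy with the consumer command in section 4.1, using the paths from this article (and producer.properties containing the additions from section 3.3), it would look like this:

# analogous to the consumer command in 4.1; this exact command is not from the original post
export KAFKA_OPTS="-Djava.security.auth.login.config=/opt/kafka/config/kafka_client_jaas.conf"
/opt/kafka/bin/kafka-console-producer.sh --topic test --bootstrap-server localhost:9092 --producer.config /opt/kafka/config/producer.properties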

4.4. Starting a producer with a Python script

Python producer code:

from kafka import KafkaProducer
import datetime
import json

# Producer
producer = KafkaProducer(
    bootstrap_servers='10.0.22.120:9092',
    security_protocol='SASL_PLAINTEXT',
    sasl_mechanism='PLAIN',
    sasl_plain_username='admin',
    sasl_plain_password='123456',
)
my_topic = "test"

for i in range(5):
    data = {'num': i, 'data': datetime.datetime.now().strftime('%Y-%m-%d %H:%M:%S')}
    producer.send(my_topic, json.dumps(data).encode('utf-8')).get(timeout=300)

The SASL username is admin and the password is 123456.

4.5. Output received by the console consumer in the container

4.6. Output received by the Python consumer

Data is received normally in both cases; done.

 

5. Troubleshooting

5.1. requirement failed: inter.broker.listener.name must be a listener name defined in advertised.listeners.

  The protocol in listeners must match the one in advertised.listeners:

--override listeners=SASL_PLAINTEXT://:9092 \
--override advertised.listeners=SASL_PLAINTEXT://10.0.22.120:9092 \

5.2. java.lang.IllegalArgumentException: No serviceName defined in either JAAS or Kafka config

  Add serviceName="kafka" to kafka_server_jaas.conf:

KafkaServer {
    org.apache.kafka.common.security.plain.PlainLoginModule required
        serviceName="kafka"
        username="admin"
        password="123456"
        user_admin="123456";
};

5.3. Python reports: kafka.errors.NoBrokersAvailable: NoBrokersAvailable

  The cause is that the SASL username and password were not supplied; include them:

producer = KafkaProducer(
    bootstrap_servers='10.0.22.120:9092',
    security_protocol='SASL_PLAINTEXT',
    sasl_mechanism='PLAIN',
    sasl_plain_username='admin',
    sasl_plain_password='123456',
)

5.4. Only one of inter.broker.listener.name and security.inter.broker.protocol should be set.

Configure only one of inter.broker.listener.name and security.inter.broker.protocol, not both; this article uses security.inter.broker.protocol (see section 2.3), as summarized below.
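
For reference, these are the two mutually exclusive broker settings; this article keeps only the first one, and inter.broker.listener.name is listed purely as the alternative:

# set exactly ONE of the following two broker properties, never both
security.inter.broker.protocol=SASL_PLAINTEXT
inter.broker.listener.name=SASL_PLAINTEXT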

 

5.5. {test=LEADER_NOT_AVAILABLE} (org.apache.kafka.clients.NetworkClient) 

Wait a while: this warning typically appears right after the topic is auto-created on the first produce, before a partition leader has been elected, and it goes away once the leader is available.
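
To check whether the partition leader is available yet, the topic can be described from inside the container. This is a diagnostic sketch using the paths from this article (the command is not from the original post); consumer.properties is reused here simply as an admin-client config carrying the SASL settings from section 3.3:

export KAFKA_OPTS="-Djava.security.auth.login.config=/opt/kafka/config/kafka_client_jaas.conf"
/opt/kafka/bin/kafka-topics.sh --bootstrap-server localhost:9092 --describe --topic test --command-config /opt/kafka/config/consumer.properties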

5.6. InvalidReplicationFactorException: Replication factor: 3 larger than available brokers: 1

Set offsets.topic.replication.factor to 1:

          --override offsets.topic.replication.factor=1 \

 
