Sr. Confluent Kafka Administrator

Total-TECH Co.

Job Description

  1. Kafka Administration: Manage the installation, configuration, and administration of Confluent Kafka clusters in both cloud and on-premises environments.
  2. Monitoring and Alerts: Set up monitoring and alerting using tools such as Prometheus, Grafana, and Confluent Control Center to track Kafka cluster performance and health metrics.
  3. Cluster Scaling: Plan and implement the horizontal and vertical scaling of Kafka clusters to handle increased data throughput and storage requirements.
  4. Security Management: Implement and maintain security protocols for Kafka, including SSL/TLS encryption, Kerberos, and role-based access control (RBAC).
  5. Backup and Recovery: Develop and manage Kafka backup, disaster recovery, and failover strategies to ensure data integrity and high availability.
  6. Performance Tuning: Optimize Kafka brokers, ZooKeeper, producers, consumers, and connectors to improve performance, reduce latency, and manage data retention policies.
  7. Kafka Connect and Stream Processing: Manage Kafka Connect for integrating data from various sources, and optimize Kafka Streams applications for real-time data processing.
  8. Cluster Upgrades: Plan and execute cluster upgrades and patching of Kafka brokers, ZooKeeper, and Confluent components, ensuring minimal downtime.
  9. Automation and Scripting: Develop automation scripts using Bash, Python, or Ansible to streamline Kafka operations such as cluster deployment, scaling, and monitoring.
  10. Kafka Topics Management: Create, manage, and optimize Kafka topics, partitions, and replication settings, ensuring efficient use of cluster resources.
  11. Troubleshooting: Diagnose and resolve issues related to Kafka performance, ZooKeeper, consumer groups, broker failures, and message processing delays.
  12. Data Governance and Auditing: Implement data governance, audit logging, and compliance monitoring for Kafka topics and streams.
  13. Collaboration: Work closely with development and DevOps teams to support Kafka-based applications, ensure smooth integration, and provide guidance on best practices.
  14. Documentation: Maintain up-to-date documentation for Kafka environments, processes, and procedures, including incident response plans and operational guidelines.
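As a concrete illustration of the security duties listed above (SSL/TLS encryption, SASL authentication, and ACL-based authorization), a broker's `server.properties` might include entries like the following. This is a minimal sketch: the hostname, keystore paths, and passwords are placeholders, and the exact settings depend on the cluster's security design.

```properties
# Listener secured with TLS and SASL/SCRAM authentication
listeners=SASL_SSL://broker1.example.com:9093
security.inter.broker.protocol=SASL_SSL
sasl.enabled.mechanisms=SCRAM-SHA-512
sasl.mechanism.inter.broker.protocol=SCRAM-SHA-512

# TLS keystore/truststore (paths and passwords are placeholders)
ssl.keystore.location=/var/ssl/private/broker1.keystore.jks
ssl.keystore.password=<keystore-password>
ssl.truststore.location=/var/ssl/private/broker1.truststore.jks
ssl.truststore.password=<truststore-password>

# ACL-based authorization: deny by default, admin as super user
authorizer.class.name=kafka.security.authorizer.AclAuthorizer
allow.everyone.if.no.acl.found=false
super.users=User:admin
```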

    Requirements:

  • Kafka Expertise: In-depth experience with administering Apache Kafka and the Confluent Platform, including Kafka Streams, Kafka Connect, and Schema Registry.
  • Confluent Tools: Strong experience with Confluent-specific tools such as Confluent Control Center, Confluent Schema Registry, and Confluent REST Proxy.
  • ZooKeeper Administration: Solid understanding of ZooKeeper and its role in Kafka cluster management, including tuning and maintaining ZooKeeper ensembles.
  • Performance Optimization: Expertise in optimizing Kafka brokers, topics, partitions, and producers/consumers for high-throughput, low-latency messaging.
  • Scripting and Automation: Proficiency in scripting languages such as Bash, Python, and automation tools like Ansible for automating Kafka cluster tasks.
  • Security: Strong understanding of Kafka security configurations, including encryption (SSL/TLS), authentication (SASL/Kerberos), and authorization (ACLs, RBAC).
  • Cloud Deployments: Experience deploying and managing Kafka clusters in cloud environments such as AWS, Azure, or GCP, including running containerized Kafka on Kubernetes.
  • Troubleshooting: Proven ability to troubleshoot Kafka performance issues, consumer lag, replication lag, and ZooKeeper synchronization problems.
  • Data Integration: Experience with Kafka Connect, integrating various data sources (e.g., databases, message queues) and sinks with Kafka clusters.
  • Monitoring and Logging: Experience with Kafka monitoring tools such as Prometheus and Grafana, and with the ELK stack (Elasticsearch, Logstash, Kibana) for Kafka log aggregation and analysis.
  • Containerization: Familiarity with Docker and Kubernetes for containerized Kafka deployments.
  • DevOps Practices: Experience with CI/CD pipelines and infrastructure-as-code tools such as Terraform to automate Kafka infrastructure management.
  • Cloud Services: Hands-on experience with Confluent Cloud or Kafka on managed cloud platforms.
  • Data Governance: Experience with Kafka topic schema evolution and management using Confluent Schema Registry.
  • Certifications: Confluent Certified Administrator or relevant Kafka certifications.
  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • 5+ years of experience as a Kafka Administrator or in a similar role, with at least 2 years working with Confluent Platform.
  • Proven track record in managing large-scale, enterprise-grade Kafka environments.
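The scripting and automation expected here can be as simple as wrapping the standard Kafka CLI in an Ansible task. The sketch below is illustrative, not a production playbook: the host group, topic name, and sizing values are placeholders, and it assumes the Kafka CLI tools are on the target host's PATH.

```yaml
# Sketch: create a Kafka topic idempotently from an Ansible play
# (host group, topic name, and sizing are placeholder values)
- hosts: kafka_brokers[0]
  gather_facts: false
  tasks:
    - name: Create the orders topic if it does not already exist
      ansible.builtin.command: >
        kafka-topics --bootstrap-server localhost:9092
        --create --if-not-exists --topic orders
        --partitions 12 --replication-factor 3
```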
