Kafka Engineer (Azure & GCP Focus)
**Job Title:** Confluent Kafka Engineer (Azure & GCP Focus)

**Location:** [Bangalore or Hyderabad]

**Role Overview**

We are seeking an experienced **Confluent Kafka Engineer** with hands-on expertise in deploying, administering, and securing Kafka clusters in **Microsoft Azure** and **Google Cloud Platform (GCP)** environments. The ideal candidate will be skilled in cluster setup and administration, RBAC, cluster linking, and monitoring with Prometheus and Grafana, and will have a strong understanding of cloud-native best practices.

**Key Responsibilities**

- **Kafka Cluster Administration (Azure & GCP):**
  - Deploy, configure, and manage Confluent Kafka clusters on Azure and GCP virtual machines or managed infrastructure.
  - Plan and execute cluster upgrades, scaling, and disaster recovery strategies in cloud environments.
  - Set up and manage cluster linking for cross-region and cross-cloud data replication.
  - Monitor and maintain the health and performance of Kafka clusters, proactively identifying and resolving issues.
- **Security & RBAC:**
  - Implement and maintain security protocols, including SSL/TLS encryption and role-based access control (RBAC).
  - Configure authentication and authorization (Kafka ACLs) across Azure and GCP environments.
  - Set up and manage **Active Directory (AD) plain authentication** and **OAuth** for secure user and application access.
  - Ensure compliance with enterprise security standards and cloud provider best practices.
- **Monitoring & Observability:**
  - Set up and maintain monitoring and alerting with Prometheus and Grafana, integrating with Azure Monitor and GCP-native monitoring as needed.
  - Develop and maintain dashboards and alerts for Kafka performance and reliability metrics.
  - Troubleshoot and resolve performance and reliability issues using cloud-native and open-source monitoring tools.
- **Integration & Automation:**
  - Develop and maintain automation scripts (Bash, Python, Terraform, Ansible) for cluster deployment, scaling, and monitoring.
  - Build and maintain infrastructure as code for Kafka environments in Azure and GCP.
  - Configure and manage **Kafka connectors** for integration with external systems, including the **BigQuery Sink connector** and connectors for Azure and GCP data services (such as Azure Data Lake, Cosmos DB, and BigQuery).
- **Documentation & Knowledge Sharing:**
  - Document standard operating procedures, architecture, and security configurations for cloud-based Kafka deployments.
  - Provide technical guidance and conduct knowledge transfer sessions for internal teams.

**Required Qualifications**

- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of hands-on experience with Confluent Platform and Kafka in enterprise environments.
- Demonstrated experience deploying and managing Kafka clusters on **Azure** and **GCP** (not just using pre-existing clusters).
- Strong expertise in cloud networking, security, and RBAC in Azure and GCP.
- Experience configuring **AD plain authentication** and **OAuth** for Kafka.
- Proficiency with monitoring tools (Prometheus, Grafana, Azure Monitor, GCP Monitoring).
- Hands-on experience with Kafka connectors (including the BigQuery Sink connector), Schema Registry, KSQL, and Kafka Streams.
- Scripting and automation skills (Bash, Python, Terraform, Ansible).
- Familiarity with infrastructure-as-code practices.
- Excellent troubleshooting and communication skills.

**Preferred Qualifications**

- Confluent Certified Developer or Administrator certification.
- Experience with cross-cloud Kafka streaming and integration scenarios.
- Familiarity with Azure and GCP data services (Azure Data Lake, Cosmos DB, BigQuery).
- Experience with other streaming technologies (e.g., Spark Streaming, Flink).
- Experience with data visualization and analytics tools.
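To make the Monitoring & Observability responsibilities above concrete: a common pattern is to expose broker JMX metrics through the Prometheus JMX Exporter and scrape them from Prometheus, which then feeds Grafana dashboards. A minimal scrape job might look like the sketch below; the hostnames and port 7071 are illustrative placeholders, not values from this posting:

```yaml
# Illustrative Prometheus scrape job for Kafka brokers that expose
# metrics via the Prometheus JMX Exporter (port is a placeholder).
scrape_configs:
  - job_name: "kafka-brokers"
    scrape_interval: 30s
    static_configs:
      - targets:
          - "kafka-broker-1.internal:7071"
          - "kafka-broker-2.internal:7071"
          - "kafka-broker-3.internal:7071"
```

Grafana would then use this Prometheus instance as a data source for the dashboards and alerts described above.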
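For the connector work described above, a BigQuery sink is typically registered with Kafka Connect as a JSON configuration. The sketch below assumes the open-source (WePay-derived) BigQuery Sink connector shipped with Confluent; the connector name, topic, GCP project, dataset, and credential path are placeholders, and exact property names can vary between connector versions:

```json
{
  "name": "bigquery-sink-orders",
  "config": {
    "connector.class": "com.wepay.kafka.connect.bigquery.BigQuerySinkConnector",
    "tasks.max": "2",
    "topics": "orders",
    "project": "my-gcp-project",
    "defaultDataset": "kafka_landing",
    "keyfile": "/etc/kafka-connect/gcp-sa.json"
  }
}
```

Such a configuration is normally submitted to the Kafka Connect REST API (by default on port 8083) with a `POST` to the `/connectors` endpoint.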