About the Role
We are looking for an experienced Data Streaming Engineer with strong hands-on expertise in Confluent Kafka and Apache Flink. The ideal candidate will have a solid background in building, managing, and optimizing real-time data streaming pipelines for large-scale systems.

Key Responsibilities
• Design, develop, and maintain real-time streaming applications using Confluent Kafka and Apache Flink.
• Build end-to-end streaming architectures with a focus on scalability, reliability, and low latency.
• Manage Kafka clusters, including topics, partitions, and connectors.
• Optimize Flink jobs for high performance, fault tolerance, and throughput.
• Integrate streaming pipelines with downstream systems such as data lakes, warehouses, and microservices.
• Monitor and troubleshoot streaming workflows to ensure high availability.
• Work closely with data engineering, DevOps, and product teams to deliver robust streaming solutions.

Required Skills & Experience
• 5+ years of experience in data engineering or data streaming roles.
• Strong expertise in the Confluent Kafka Platform, including Connect, Schema Registry, and KSQL.
• Proven experience with Apache Flink (streaming and batch).
• Strong understanding of distributed systems and event-driven architectures.
• Experience building and scaling streaming ETL pipelines.
• Proficiency in Java, Scala, or Python for developing Flink applications (a minimal PyFlink sketch follows this posting).
• Hands-on experience with cloud platforms (AWS, Azure, or GCP) is a strong plus.
• Good understanding of CI/CD, Docker, Kubernetes, and monitoring tools.

Nice to Have
• Experience with other messaging systems such as Apache Pulsar, RabbitMQ, or Amazon MSK.
• Familiarity with Snowflake, BigQuery, Redshift, or other cloud data warehouses.
• Understanding of microservices-based architectures and API integrations.
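For context on the kind of application this role builds, here is a minimal sketch of a Flink streaming job consuming from Kafka, written with PyFlink since the posting accepts Python. The broker address, topic, consumer group, and connector JAR path are hypothetical placeholders, not values from this posting.

```python
from pyflink.common.serialization import SimpleStringSchema
from pyflink.common.watermark_strategy import WatermarkStrategy
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.datastream.connectors.kafka import KafkaOffsetsInitializer, KafkaSource

env = StreamExecutionEnvironment.get_execution_environment()

# The Kafka connector is not bundled with PyFlink; point this at a real
# flink-sql-connector-kafka JAR on your machine (hypothetical path shown).
env.add_jars("file:///opt/flink/lib/flink-sql-connector-kafka.jar")

# Hypothetical broker, topic, and consumer group for illustration.
source = (
    KafkaSource.builder()
    .set_bootstrap_servers("localhost:9092")
    .set_topics("orders")
    .set_group_id("orders-enricher")
    .set_starting_offsets(KafkaOffsetsInitializer.earliest())
    .set_value_only_deserializer(SimpleStringSchema())
    .build()
)

stream = env.from_source(source, WatermarkStrategy.no_watermarks(), "kafka-orders")

# A trivial transformation standing in for real enrichment logic.
stream.map(lambda record: record.upper()).print()

env.execute("orders-streaming-sketch")
```

In production the same skeleton would add checkpointing, an event-time watermark strategy, and a Kafka or warehouse sink in place of print(), which is where the fault-tolerance and throughput tuning mentioned above comes in.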
Company Description
EZData Advisory is a consultancy specializing in data engineering and analytics, offering modern solutions tailored to diverse industries. The company provides certified data experts to ensure reliable, prompt project delivery with no ramp-up time. Our services include aligning data strategies with long-term organizational goals, ensuring secure, scalable, and insight-driven outcomes. EZData also delivers pre-built, customizable data solutions designed to save time and fit seamlessly into your tech stack. Throughout every project, we prioritize efficiency, transparency, and measurable results.

Role Description
Experience: 3-5 years
Location: Remote (preferably Pune)
Shift Timings: Australian time (7:00 AM to 2:00 PM)
BGV (background verification): Mandatory

Description: We are seeking a candidate with 5+ years of experience in data modeling, particularly dimensional modeling, to support our industrial data initiatives. The ideal candidate will play a key role in designing scalable data models, collaborating with cross-functional teams, and driving data architecture best practices across CCEP domains.

Key Responsibilities:
• Participate in requirement workshops and backlog refinement sessions to understand the project's data requirements.
• Build and maintain conceptual, logical, and physical data models using industry-standard tools (e.g., ER/Studio, erwin).
• Collaborate closely with Data Analysts, Data Stewards, and the Data Governance team to ensure data models align with business requirements and governance standards.
• Translate business requirements into scalable data architecture solutions that support real-time and historical analytics.

Mandatory Skills & Experience:
• 3+ years of experience in data architecture, data modeling, and data analysis.
• Familiarity with dimensional data modeling, Kimball modeling techniques, and data warehouse design (a minimal star-schema sketch follows this posting).
• Experience with SQL for data transformation, automation, and querying.
• Familiarity with ETL/ELT processes.
• Familiarity with data lakes and Lakehouse architecture.
• Familiarity with data integration tools and data pipeline orchestration.
• Strong communication and stakeholder management skills, with the ability to work in agile teams and participate in sprint ceremonies.
• Experience with Azure Databricks, Azure Data Lake, and related cloud solutions is desirable.
• Exposure to SAP ERP data is desirable.
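As an illustration of the Kimball-style dimensional modeling this role asks for, here is a minimal star-schema sketch in PySpark, chosen because the posting mentions Azure Databricks and it keeps the example in Python. All table and column names are hypothetical and do not reflect CCEP's actual model.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

# Hypothetical raw sales records standing in for a source extract.
raw = spark.createDataFrame(
    [
        ("2024-01-05", "SKU-1", "Cola 330ml", 120.0),
        ("2024-01-05", "SKU-2", "Water 500ml", 80.0),
        ("2024-01-06", "SKU-1", "Cola 330ml", 95.0),
    ],
    ["order_date", "product_code", "product_name", "amount"],
)

# Product dimension: one row per natural key, plus a surrogate key,
# so descriptive attributes live only in the dimension.
dim_product = (
    raw.select("product_code", "product_name")
    .dropDuplicates(["product_code"])
    .withColumn("product_key", F.monotonically_increasing_id())
)

# Fact table: measures plus foreign keys into the dimensions.
fact_sales = raw.join(dim_product, "product_code").select(
    "product_key", "order_date", "amount"
)

# Analysts then query facts through dimension attributes.
(
    fact_sales.join(dim_product, "product_key")
    .groupBy("product_name")
    .agg(F.sum("amount").alias("total_sales"))
    .show()
)
```

A real warehouse model would extend this with a date dimension and slowly changing dimension handling (e.g., SCD Type 2) so historical analytics, as called for above, reflect attribute values as they were at the time of each fact.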