Posted: 3 weeks ago
Work from Office
Full Time
We are seeking a highly skilled ETL Architect Powered by AI (Apache NiFi/Kafka) to join our team. The ideal candidate will have expertise in managing, automating, and orchestrating data flows using Apache NiFi. In this role, you will design, implement, and maintain scalable data pipelines that handle real-time and batch data processing. The role also involves integrating NiFi with various data sources, performing data transformation tasks, and ensuring data quality and governance.

Key Responsibilities:

Real-Time Data Integration (Apache NiFi & Kafka):
- Design, develop, and implement real-time data pipelines leveraging Apache NiFi for seamless data flow.
- Build and maintain Kafka producers and consumers for effective streaming data management across systems (an illustrative sketch appears at the end of this posting).
- Ensure the scalability, reliability, and performance of data streaming platforms using NiFi and Kafka.
- Monitor, troubleshoot, and optimize data flow within Apache NiFi and Kafka clusters.
- Manage schema evolution and support data serialization formats such as Avro, JSON, and Protobuf.
- Set up, configure, and optimize Kafka topics, partitions, and brokers for high availability and fault tolerance.
- Implement backpressure handling, prioritization, and flow control strategies in NiFi data flows.
- Integrate NiFi flows with external services (e.g., REST APIs, HDFS, RDBMS) for efficient data movement.
- Establish and maintain secure data transmission, access controls, and encryption mechanisms in NiFi and Kafka environments.
- Develop and maintain batch ETL pipelines using tools like Informatica, Talend, and custom Python/SQL scripts.
- Continuously optimize and refactor existing ETL workflows to improve performance, scalability, and fault tolerance.
- Implement job scheduling, error handling, and detailed logging mechanisms for data pipelines (see the sketch at the end of this posting).
- Conduct data quality assessments and design frameworks to ensure high-quality data integration.
- Design and document both high-level and low-level data architectures for real-time and batch processing.
- Lead technical evaluations of emerging tools and platforms for potential adoption into existing systems.

Qualifications we seek in you:

Minimum Qualifications / Skills:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Significant experience in IT with a focus on data architecture and engineering.
- Proven experience in technical leadership, driving data integration projects and initiatives.
- Certifications in relevant technologies (e.g., AWS Certified Solutions Architect, Microsoft Certified: Azure Data Engineer) are a plus.
- Strong analytical skills and the ability to translate business requirements into effective technical solutions.
- Proficiency in communicating complex technical concepts to non-technical stakeholders.

Preferred Qualifications / Skills:
- Extensive hands-on experience as a Data Architect.
- In-depth experience with Apache NiFi, Apache Kafka, and related ecosystem components (e.g., Kafka Streams, Schema Registry).
- Ability to develop and optimize NiFi processors to handle various data sources and formats.
- Proficiency in creating reusable NiFi templates for common data flows and transformations.
- Familiarity with integrating NiFi and Kafka with big data technologies like Hadoop, Spark, and Databricks.
- At least two end-to-end implementations of data integration solutions in a real-world environment.
- Experience in metadata management frameworks and scalable data ingestion processes.
- Solid understanding of data platform design patterns and best practices for integrating real-time data systems.
- Knowledge of ETL processes, data integration tools, and data modeling techniques.
- Demonstrated experience in Master Data Management (MDM) and data privacy standards.
- Experience with modern data platforms such as Snowflake, Databricks, and big data tools.
- Proven ability to troubleshoot complex data issues and implement effective solutions.
- Strong project management skills with the ability to lead data initiatives from concept to delivery.
- Familiarity with AI/ML frameworks and their integration with data platforms is a plus.
- Excellent communication and interpersonal skills, with the ability to collaborate effectively across cross-functional teams.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.

Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
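For illustration only, here is a minimal sketch of the kind of Kafka producer/consumer work described in the responsibilities above. It assumes the open-source kafka-python client, a broker at localhost:9092, and a hypothetical "events" topic; none of these are requirements of the role.

```python
# Minimal sketch of a JSON-serializing Kafka producer and consumer.
# Assumes: pip install kafka-python, a broker at localhost:9092,
# and a hypothetical "events" topic (all illustrative, not from the posting).
import json

from kafka import KafkaConsumer, KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),  # JSON on the wire
    acks="all",   # wait for in-sync replicas, favoring durability
    retries=3,    # retry transient broker errors
)
producer.send("events", {"order_id": 42, "status": "created"})
producer.flush()  # block until buffered records are delivered

consumer = KafkaConsumer(
    "events",
    bootstrap_servers="localhost:9092",
    group_id="etl-demo",           # consumer-group id used for offset tracking
    auto_offset_reset="earliest",  # start from the beginning if no committed offset
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)
for record in consumer:  # blocks and polls until interrupted
    print(record.topic, record.partition, record.offset, record.value)
```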
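Likewise, a minimal sketch of the job error handling and detailed logging expected in the batch ETL responsibilities, using only the Python standard library; the extract and transform steps are hypothetical placeholders, not a real pipeline.

```python
# Minimal sketch of a batch ETL step with retries and structured logging.
# The extract/transform bodies are hypothetical placeholders.
import logging
import time

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s %(message)s",
)
log = logging.getLogger("batch_etl")

def run_step(name, fn, max_retries=3, backoff_seconds=5):
    """Run one pipeline step, retrying transient failures with linear backoff."""
    for attempt in range(1, max_retries + 1):
        try:
            log.info("step=%s attempt=%d starting", name, attempt)
            result = fn()
            log.info("step=%s attempt=%d succeeded", name, attempt)
            return result
        except Exception:
            log.exception("step=%s attempt=%d failed", name, attempt)
            if attempt == max_retries:
                raise  # surface the failure to the scheduler (cron, Airflow, etc.)
            time.sleep(backoff_seconds * attempt)

def extract():
    # Placeholder: read rows from a source system.
    return [{"id": 1, "amount": 100.0}, {"id": 2, "amount": None}]

def transform(rows):
    # Placeholder data-quality rule: drop rows with missing amounts.
    return [r for r in rows if r["amount"] is not None]

if __name__ == "__main__":
    rows = run_step("extract", extract)
    clean = run_step("transform", lambda: transform(rows))
    log.info("kept %d of %d extracted rows", len(clean), len(rows))
```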