Job Description
Job Purpose

Vi is seeking an experienced Apache NiFi Developer to join our data engineering team. In this role, you will be responsible for designing, building, and managing data flows using Apache NiFi. The ideal candidate has a strong background in data integration and transformation, along with experience building real-time data pipelines in enterprise environments. This role requires hands-on experience with NiFi and related data ingestion and processing technologies.

Key Result Areas/Accountabilities

Data Flow Design and Development: Create, configure, and manage data flows in Apache NiFi to support data integration from diverse sources, such as databases, cloud storage, and APIs.
Data Transformation and Routing: Develop and implement data transformation, routing, and enrichment workflows within NiFi to ensure data consistency and quality.
Data Ingestion and Processing: Set up and manage data ingestion pipelines that enable real-time and batch data processing, using NiFi processors for custom integrations and transformations.
Monitoring and Optimization: Monitor, troubleshoot, and optimize NiFi workflows for performance, reliability, and scalability.

Core Competencies, Knowledge, Experience

6+ years of overall experience in database/NBI development, including a minimum of 2 years managing and integrating the Kafka layer.
Strong hands-on experience with Apache Kafka (setup, configuration, and tuning).
Experience with Kafka Streams and/or Kafka Connect for real-time data processing and integration.
Proficiency in Kafka producer/consumer development (using Java, Scala, Python, or other Kafka-compatible languages).
Familiarity with NoSQL (e.g., Cassandra, MongoDB) and SQL databases.
Solid understanding of message queuing, event-driven architecture, and pub/sub systems.

Must have technical / professional qualifications

Bachelor's degree in Computer Science with 4+ years of experience with Apache NiFi for data integration and flow management.
Background in real-time data processing and data pipeline management.
Familiarity with cloud platforms (AWS, Azure, Google Cloud) and cloud-based storage solutions.