236 NiFi Jobs - Page 10

Set up a Job Alert
JobPe aggregates listings for easy access; you apply directly on the original job portal.

5 - 8 years

0 Lacs

Vadodara, Gujarat, India

Hybrid

Linkedin logo

Note: If shortlisted, we’ll contact you via WhatsApp and email. Please check both and respond promptly.

Work Mode: Hybrid (3 days onsite/week)
Location: Vadodara, Gujarat - 390003
Shift: US Timings – EST / CST
Salary: INR 1500000 – 2500000

About The Role
We are seeking a highly skilled Senior Java Developer – Microservices to join our dynamic team and contribute to the modernization of enterprise applications. You will play a key role in transforming a monolithic architecture into a scalable, microservices-driven ecosystem.

Key Responsibilities
- Design, develop, and maintain Java-based applications using Spring Boot
- Refactor legacy systems into a microservices architecture
- Work on private or hybrid cloud infrastructure, ensuring scalability and compliance
- Integrate with Apache Kafka for real-time streaming and message processing
- Use Oracle databases for transactional data management and support data migration via Apache NiFi
- Collaborate in an Agile team environment
- Optimize application performance and conduct root-cause analysis
- Maintain technical documentation and enforce coding standards

Required Skills And Qualifications
- 5+ years of experience in Java (8 or higher) development
- Strong expertise with Spring Boot
- Hands-on experience with Apache Kafka and inter-service communication
- Proficient in Oracle Database for transactional systems
- Experience with private/hybrid cloud infrastructure and security
- Strong analytical and problem-solving abilities
- Excellent communication and time management skills

Must-Haves
- Minimum Bachelor's / Master's in Computer Science or a related field
- No job gaps or frequent job changes (must have good stability)
- Notice Period: Immediate to 30 days
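
For orientation, here is a minimal sketch of the Kafka consume-and-process loop this role describes. It is written in Python with the confluent-kafka client purely for brevity (the role itself is Java/Spring Boot); the broker address, consumer group, and topic name are hypothetical placeholders, not taken from the posting.

```python
# Illustrative sketch only: the consume-and-process pattern behind
# "real-time streaming and message processing" with Apache Kafka.
# Broker, group id, and topic are placeholders.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # assumed local broker
    "group.id": "order-processors",          # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])               # hypothetical topic

try:
    while True:
        msg = consumer.poll(1.0)             # wait up to 1 s for a record
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        # In the real service this is where a transactional write to Oracle
        # (or a call into another microservice) would happen.
        print(f"processing {msg.key()} -> {msg.value()!r}")
finally:
    consumer.close()
```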

Posted 1 month ago

Apply

5 - 8 years

0 Lacs

Vapi, Gujarat, India

On-site

Linkedin logo

Overview
We are seeking a highly skilled Senior Data Engineer to lead the design, development, and optimization of large-scale data infrastructure, pipelines, and platforms. This role requires expertise across the full spectrum of data engineering, including cloud and on-premise systems, real-time streaming, orchestration frameworks, and robust data governance practices. You will be a key contributor in shaping the organization’s data ecosystem, enabling analytics, machine learning, and real-time decision-making at scale. This position demands not just competence but mastery of the core disciplines and tools in modern data engineering.

Responsibilities

Data Platform Architecture
- Design, implement, and manage hybrid data platforms across on-premise and cloud environments.
- Build scalable and reliable data lake and warehouse solutions using best-in-class storage formats and compute frameworks.
- Collaborate with cross-functional teams to define data architecture and technology strategy aligned with business objectives.

Ingestion and Orchestration
- Develop and maintain robust data ingestion workflows using tools such as Apache NiFi, Kafka, and custom connectors.
- Implement data orchestration pipelines with tools like Dagster, Airflow, or Prefect for both batch and streaming data.
- Build modular, maintainable workflows that adhere to best practices in monitoring, error handling, and retries.

ETL/ELT Pipeline Development
- Design and optimize ETL/ELT pipelines that process data at scale from multiple systems into analytical environments.
- Ensure data workflows are highly performant, idempotent, and compliant with SLAs.
- Use Spark, dbt, or custom code to transform, enrich, and validate data.

Data Modeling and Warehousing
- Create and maintain normalized and denormalized schemas for analytical workloads using star and snowflake models.
- Work with cloud and on-premise databases and warehouses including PostgreSQL, Redshift, BigQuery, Snowflake, and Hive.
- Define partitioning, bucketing, and indexing strategies to ensure query efficiency.

Infrastructure and DevOps
- Deploy and maintain infrastructure using Terraform, Ansible, or shell scripting for both cloud and on-premise systems.
- Implement CI/CD pipelines for data services using Jenkins, GitLab CI, or similar tools.
- Utilize Docker and optionally Kubernetes to package and manage data applications.

Data Governance and Quality
- Define and enforce data quality policies using tools like Great Expectations or Deequ.
- Establish lineage and metadata tracking through solutions like Apache Atlas, Amundsen, or Collibra.
- Implement access control, encryption, and audit policies to ensure data security and compliance.

Monitoring and Optimization
- Monitor pipeline health, job performance, and system metrics using Prometheus, Grafana, or ELK.
- Continuously optimize workflows and queries to minimize cost and latency.
- Perform root cause analysis and troubleshooting for data issues in production systems.

Collaboration and Leadership
- Mentor junior and mid-level data engineers, participate in technical reviews, and help define team standards.
- Work closely with data scientists, analysts, software engineers, and product managers to gather requirements and translate them into robust data solutions.
- Promote a culture of high quality, documentation, reusability, and operational excellence.

Qualifications

Required
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- At least 5 years of experience as a data engineer, with expertise in both on-premise and cloud environments.
- Deep experience with Apache NiFi, Dagster, and orchestration frameworks such as Airflow or Prefect.
- Proficiency in Python, SQL, and optionally Scala or Java.
- Strong understanding of distributed systems, including Hadoop, Spark, and Kafka.
- Demonstrated experience building secure, scalable, and maintainable data pipelines and infrastructure.
- Familiarity with modern data stack tools and infrastructure automation.

Preferred
- Experience with real-time data processing and CDC pipelines.
- Exposure to regulatory and high-security environments (e.g., healthcare, finance, industrial systems).
- Certifications in AWS, GCP, or Azure for data engineering or analytics.
- Contributions to open-source data tooling or internal platform development.

What We Offer
- A high-impact role in a data-driven organization focused on innovation and scalability.
- Flexible working environment and strong support for personal development.
- Competitive compensation and benefits, including performance-based incentives.
- Opportunities to work on high-visibility projects and influence enterprise data architecture.

How to Apply
Please share your updated CV along with the following details: Current CTC, Expected CTC, and Notice Period. Email to: jignesh.pandoriya@merillife.com
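
As a point of reference for the orchestration work described above, here is a minimal sketch of a daily batch pipeline expressed as an Airflow DAG (one of the schedulers the posting names). The DAG id, task names, and the extract/transform/load bodies are hypothetical placeholders, and Airflow 2.4+ is assumed.

```python
# Illustrative Airflow DAG sketch for a daily batch pipeline of the kind the
# posting describes. Names and task bodies are placeholders, not from the posting.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # e.g. pull the day's records from a source system or a NiFi-fed landing zone
    print("extracting for", context["ds"])


def transform(**context):
    # e.g. run a Spark or dbt job to clean and enrich the extracted data
    print("transforming for", context["ds"])


def load(**context):
    # e.g. load the curated output into the warehouse (Snowflake, Redshift, ...)
    print("loading for", context["ds"])


with DAG(
    dag_id="daily_sales_pipeline",      # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                  # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task   # run strictly in order
```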

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

About The Role: OSTTRA India

The Role: Enterprise Architect - Integration

The Team: The OSTTRA Technology team is composed of Capital Markets Technology professionals who build, support and protect the applications that operate our network. The technology landscape includes high-performance, high-volume applications as well as compute-intensive applications, leveraging contemporary microservices and cloud-based architectures.

The Impact: Together, we build, support, protect and manage high-performance, resilient platforms that process more than 100 million messages a day. Our services are vital to automated trade processing around the globe, managing peak volumes and working with our customers and regulators to ensure the efficient settlement of trades and effective operation of global capital markets.

What’s in it for you: The current objective is to identify individuals with 16+ years of experience and deep expertise to join an existing team of experts spread across the world. This is your opportunity to start at the beginning and get the advantages of rapid early growth. The role is based in Gurgaon and is expected to work with different teams and colleagues across the globe. This is an excellent opportunity to be part of a team based out of Gurgaon and to work with colleagues across multiple regions globally.

Responsibilities
- Establish, maintain, socialise and realise the target-state integration strategy for OSTTRA’s FX & Securities post-trade businesses. This encompasses the post-trade lifecycle of our businesses, including connectivity with clients, the markets ecosystem, and OSTTRA’s post-trade family of networks, platforms and products.
- Partner with product architects, product managers, delivery heads and teams to refactor deliveries towards the target state.
- Take responsibility for the efficiency, optimisation, oversight and troubleshooting of current-day integration solutions, platforms and deliveries, in addition to the target-state focus.
- Produce and maintain an integration architecture blueprint covering the current state and proposing a rationalised view of the target state of end-to-end integration flows and patterns.
- Provide for and enable the technology platforms/tools and engineering methods needed to realise the strategy.
- Enable standardisation of protocols/formats (at least within the OSTTRA world) and tools, and reduce duplication and non-differentiated heavy lift in systems.
- Enable the documentation of flows and the capture of standard message models. The integration strategy shall also include a transformation strategy, which is vital in a multi-lateral/party/system post-trade world.
- Partner with other architects, strategies and programmes, and enable the demands of UI, application, and data strategies.

What We’re Looking For
- Rich domain experience in the financial services industry, preferably with financial markets, pre/post-trade lifecycles and large-scale buy/sell/brokerage organisations.
- Experience leading integration strategies and delivering integration design and architecture for complex programmes and financial enterprises catering to key variances of latency/throughput.
- Experience with API management platforms (such as AWS API Gateway, Apigee, Kong, MuleSoft Anypoint) and key management concepts (API lifecycle management, versioning strategies, developer portals, rate limiting, policy enforcement).
- Adept with integration and transformation methods, technologies and tools.
- Experience with domain modelling for messages/events/streams and APIs.
- Rich experience of architectural patterns such as event-driven architectures, microservices, event streaming, message processing/orchestration, CQRS, and event sourcing.
- Experience with protocols and integration technologies such as FIX, SWIFT, MQ, FTP, and APIs, including knowledge of authentication patterns (OAuth, mTLS, JWT, API keys), authorization mechanisms, data encryption (in transit and at rest), secrets management, and security best practices.
- Experience with messaging formats and paradigms such as XSD, XML, XSLT, JSON, Protobuf, REST, gRPC, and GraphQL.
- Experience with technologies like Kafka or AWS Kinesis, Spark streams, Kubernetes/EKS, and AWS EMR.
- Experience with languages like Java and Python, and message orchestration frameworks like Apache Camel, Apache NiFi, and AWS Step Functions.
- Experience designing and implementing traceability/observability strategies for integration systems, and familiarity with relevant framework tooling.
- Experience with engineering methods like CI/CD, build/deploy automation, infrastructure as code, and integration testing methods and tools.
- Appetite to review and code for complex problems, and genuine interest in design discussions and reviews.
- Experience and strong understanding of multi-cloud integration patterns.

The Location: Gurgaon, India

About Company Statement
OSTTRA is a market leader in derivatives post-trade processing, bringing innovation, expertise, processes and networks together to solve the post-trade challenges of global financial markets. OSTTRA operates cross-asset post-trade processing networks, providing a proven suite of Credit Risk, Trade Workflow and Optimisation services. Together these solutions streamline post-trade workflows, enabling firms to connect to counterparties and utilities, manage credit risk, reduce operational risk and optimise processing to drive post-trade efficiencies. OSTTRA was formed in 2021 through the combination of four businesses that have been at the heart of post-trade evolution and innovation for the last 20+ years: MarkitServ, Traiana, TriOptima and Reset. These businesses have an exemplary track record of developing and supporting critical market infrastructure and bring together an established community of market participants comprising all trading relationships and paradigms, connected using powerful integration and transformation capabilities.

About OSTTRA
Candidates should note that OSTTRA is an independent firm, jointly owned by S&P Global and CME Group. As part of the joint venture, S&P Global provides recruitment services to OSTTRA; however, successful candidates will be interviewed and directly employed by OSTTRA, joining our global team of more than 1,200 post-trade experts. OSTTRA is a joint venture, owned 50/50 by S&P Global and CME Group. With an outstanding track record of developing and supporting critical market infrastructure, our combined network connects thousands of market participants to streamline end-to-end workflows - from trade capture at the point of execution, through portfolio optimization, to clearing and settlement. Joining the OSTTRA team is a unique opportunity to help build a bold new business with an outstanding heritage in financial technology, playing a central role in supporting global financial markets. Learn more at www.osttra.com.

What’s In It For You? Benefits
We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you, and your career, need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), BSMGMT203 - Entry Professional (EEO Job Group)
Job ID: 315309
Posted On: 2025-05-12
Location: Gurgaon, Haryana, India
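
For context on the message-orchestration patterns this role covers (Camel/NiFi-style content-based routing and canonical transformation), here is a minimal sketch in Python. The message fields, asset classes, and topic names are invented for illustration and are not part of the posting.

```python
# Illustrative sketch (not from the posting): content-based routing plus
# format transformation of an inbound trade message into a canonical model.
# Field names, venues, and topic names are hypothetical.
import json

ROUTES = {
    "FX": "post-trade.fx.confirmations",            # hypothetical downstream topics
    "SECURITIES": "post-trade.securities.confirmations",
}

def transform(raw: bytes) -> dict:
    """Normalise an inbound JSON trade message into a canonical model."""
    msg = json.loads(raw)
    return {
        "tradeId": msg["id"],
        "assetClass": msg.get("asset_class", "UNKNOWN").upper(),
        "notional": float(msg["notional"]),
        "currency": msg["ccy"],
    }

def route(canonical: dict) -> str:
    """Pick the destination channel based on message content."""
    return ROUTES.get(canonical["assetClass"], "post-trade.dead-letter")

if __name__ == "__main__":
    inbound = b'{"id": "T-1001", "asset_class": "fx", "notional": "2500000", "ccy": "USD"}'
    canonical = transform(inbound)
    print(route(canonical), canonical)
```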

Posted 1 month ago

Apply

12 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

About the Company
We are Mindsprint! A leading-edge technology and business services firm that provides impact-driven solutions to businesses, enabling them to outpace the speed of change. For over three decades we have been accelerating technology transformation for the Olam Group and their large base of global clients. Working with leading technologies and empowered with the freedom to create new solutions and better existing ones, we have been inspiring businesses with pioneering initiatives.

Awards bagged in recent years:
- Great Place To Work® Certified™ for 2023-2024
- Best Shared Services in India Award by Shared Services Forum – 2019
- Asia’s No.1 Shared Services in Process Improvement and Value Creation by Shared Services and Outsourcing Network Forum – 2019
- International Innovation Award for Best Services and Solutions – 2019
- Kincentric Best Employer India – 2020
- Creative Talent Management Impact Award – SSON Impact Awards 2021
- The Economic Times Best Workplaces for Women – 2021 & 2022
- #SSFExcellenceAward for Delivering Business Impact through Innovative People Practices – 2022

For more info: https://www.mindsprint.org/
Follow us on LinkedIn: Mindsprint

Position: Associate Director

Responsibilities
- Lead, mentor, and manage the Data Architects, Apps DBA, and DB Operations teams.
- Possess strong experience and a deep understanding of major RDBMS, NoSQL, and Big Data technologies, with expertise in system design and advanced troubleshooting in high-pressure production environments.
- Core technologies include SQL Server, PostgreSQL, MySQL, TigerGraph, Neo4j, Elasticsearch, ETL concepts, and a high-level understanding of data warehouse platforms such as Snowflake, ClickHouse, etc.
- Define, validate, and implement robust data models and database solutions for clients across sectors such as Agriculture, Supply Chain, and Life Sciences.
- Oversee end-to-end database resource provisioning in the cloud, primarily on Azure, covering IaaS, PaaS, and SaaS models, along with proactive cost management and optimization.
- Hands-on expertise in data migration strategies between on-premises and cloud environments, ensuring minimal downtime and secure transitions.
- Experienced in database performance tuning: identifying and resolving SQL code bottlenecks, code review, optimization for high throughput, and regular database maintenance including defragmentation.
- Solid understanding of High Availability (HA) and Disaster Recovery (DR) solutions, with experience in setting up failover, replication, backup, and recovery strategies.
- Expertise in implementing secure data protection measures such as encryption (at rest and in transit), data masking, access controls, DLP strategies, and ensuring regulatory compliance with GDPR, PII, PCI-DSS, HIPAA, etc.
- Skilled in managing data integration, data movement, and data report pipelines using tools like Azure Data Factory (ADF), Apache NiFi, and Talend.
- Fair understanding of database internals, storage engines, indexing strategies, and partitioning for optimal resource and performance management.
- Strong knowledge in Master Data Management (MDM), data cataloging, metadata management, and building comprehensive data lineage frameworks.
- Proven experience in implementing monitoring and alerting systems for database health and capacity planning using tools like Azure Monitor, Grafana, or custom scripts.
- Exposure to DevOps practices for database management, including CI/CD pipelines for database deployments, version control of database schemas, and Infrastructure as Code (IaC) practices (e.g., Terraform, ARM templates).
- Experience collaborating with data analytics teams to provision optimized environments as data is shared between RDBMS, NoSQL, and Snowflake layers.
- Knowledge of security best practices for multi-tenant database environments and data segmentation strategies.
- Ability to guide the evolution of data governance frameworks, defining policies, standards, and best practices for database environments.

Job Location: Chennai
Notice Period: 15 Days / Immediate / Currently Serving Notice Period - Max 30 Days
Shift: Day Shift
Experience: Min 12 Years
Work Mode: Hybrid
Grade: D1 Associate Director
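
To illustrate the "custom scripts" side of the monitoring and alerting responsibility above (alongside Azure Monitor or Grafana), here is a minimal health-check sketch against PostgreSQL. The connection string, thresholds, and host names are hypothetical placeholders.

```python
# Illustrative sketch (not from the posting): flag long-running queries and
# report replication lag on a PostgreSQL instance. DSN and thresholds are
# placeholders; psycopg2 and PostgreSQL 10+ are assumed.
import psycopg2

DSN = "host=db.example.internal dbname=appdb user=monitor password=***"  # placeholder

LONG_RUNNING_SQL = """
    SELECT pid, now() - query_start AS runtime, left(query, 80)
    FROM pg_stat_activity
    WHERE state = 'active' AND now() - query_start > interval '5 minutes';
"""

REPLICATION_LAG_SQL = """
    SELECT client_addr, write_lag, flush_lag, replay_lag
    FROM pg_stat_replication;
"""

def main() -> None:
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        cur.execute(LONG_RUNNING_SQL)
        for pid, runtime, query in cur.fetchall():
            print(f"ALERT long-running query pid={pid} runtime={runtime}: {query}")

        cur.execute(REPLICATION_LAG_SQL)
        for addr, write_lag, flush_lag, replay_lag in cur.fetchall():
            print(f"replica {addr}: write={write_lag} flush={flush_lag} replay={replay_lag}")

if __name__ == "__main__":
    main()
```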

Posted 1 month ago

Apply

3 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Job Description
Senior Associate / Manager – NiFi Developer

Job Location: Pan India

Candidates should possess 8 to 12 years of experience, of which 3+ years should be relevant.

Roles & Responsibilities:
- Design, develop, and manage data pipelines using Apache NiFi
- Integrate with systems like Kafka, HDFS, Hive, Spark, and RDBMS
- Monitor, troubleshoot, and optimize data flows
- Ensure data quality, reliability, and security
- Work with cross-functional teams to gather requirements and deliver data solutions

Skills Required:
- Strong hands-on experience with Apache NiFi
- Knowledge of data ingestion, streaming, and batch processing
- Experience with Linux, shell scripting, and cloud environments (AWS/GCP is a plus)
- Familiarity with REST APIs, JSON/XML, and data transformation

Other Information
Role: NiFi Developer - SA/M
Industry Type: IT/Computers - Software
Functional Area: IT-Software
Required Education: Any Graduates
Employment Type: Full Time, Permanent
Key Skills: NIFI DEVELOPER, DESIGN, DEVELOPMENT
Job Code: GO/JC/21435/2025
Recruiter Name: SPriya
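
One common way to cover the "monitor, troubleshoot, and optimize data flows" responsibility is polling NiFi's REST API. A minimal sketch follows; the base URL is a placeholder and an unsecured (HTTP, no authentication) NiFi instance is assumed for brevity.

```python
# Illustrative sketch (not from the posting): poll the NiFi REST API for a
# controller-level flow health summary. Base URL is a placeholder.
import requests

NIFI = "http://localhost:8080/nifi-api"   # placeholder base URL

def flow_status() -> dict:
    """Return the controller-level status summary (threads, queued FlowFiles, ...)."""
    resp = requests.get(f"{NIFI}/flow/status", timeout=10)
    resp.raise_for_status()
    return resp.json()["controllerStatus"]

if __name__ == "__main__":
    status = flow_status()
    print("active threads :", status.get("activeThreadCount"))
    print("queued         :", status.get("queued"))        # e.g. "120 / 4.5 MB"
    print("flowfiles      :", status.get("flowFilesQueued"))
```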

Posted 1 month ago

Apply

5 - 8 years

0 Lacs

Pune, Maharashtra, India

Hybrid

Linkedin logo

About Company: Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. Headquartered in Bengaluru, it has gross revenue of ₹222.1 billion, a global workforce of 234,054, is listed on the NASDAQ, operates in over 60 countries, and serves clients across various industries, including financial services, healthcare, manufacturing, retail, and telecommunications. The company consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line. Major delivery centers in India include Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida.

- Job Title: ETL Testing + Java + API Testing + UNIX Commands
- Location: Pune [Kharadi] (Hybrid)
- Experience: 7+ yrs [with 7+ years of relevant experience in ETL Testing]
- Job Type: Contract to hire
- Notice Period: Immediate joiners

Mandatory Skills: ETL Testing, Java, API Testing, UNIX Commands

Job Description:

1) Key Responsibilities:
- Extensive experience in validating ETL processes, ensuring accurate data extraction, transformation, and loading across multiple environments.
- Proficient in Java programming, with the ability to understand and write Java code when required.
- Advanced skills in SQL for data validation, querying databases, and ensuring data consistency and integrity throughout the ETL process.
- Expertise in utilizing Unix commands to manage test environments, handle file systems, and execute system-level tasks.
- Proficient in creating shell scripts to automate testing processes, enhancing productivity and reducing manual intervention.
- Ensuring that data transformations and loads are accurate, with strong attention to identifying and resolving discrepancies in the ETL process.
- Focused on automating repetitive tasks and optimizing testing workflows to increase overall testing efficiency.
- Write and execute automated test scripts using Java to ensure the quality and functionality of ETL solutions.
- Utilize Unix commands and shell scripting to automate repetitive tasks and manage system processes.
- Collaborate with cross-functional teams, including data engineers, developers, and business analysts, to ensure the ETL processes meet business requirements.
- Ensure that data transformations, integrations, and pipelines are robust, secure, and efficient.
- Troubleshoot data discrepancies and perform root cause analysis for failed data loads.
- Create comprehensive test cases, execute them, and document test results for all data flows.
- Actively participate in the continuous improvement of ETL testing processes and methodologies.
- Experience with version control systems (e.g., Git) and integrating testing into CI/CD pipelines.

2) Tools & Technologies (Good to Have):
- Experience with Hadoop ecosystem tools such as HDFS, MapReduce, Hive, and Spark for handling large-scale data processing and storage.
- Knowledge of NiFi for automating data flows, transforming data, and integrating different systems seamlessly.
- Experience with tools like Postman, SoapUI, or RestAssured to validate REST and SOAP APIs, ensuring correct data exchange and handling of errors.
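
To illustrate the source-vs-target validation at the heart of ETL testing described above, here is a minimal reconciliation sketch. The role itself calls for Java test automation; Python with in-memory SQLite is used only so the example is self-contained and runnable, and the table and column names are invented.

```python
# Illustrative sketch (not from the posting): compare row counts and a simple
# column checksum between a source and a target table to spot load discrepancies.
import sqlite3

def table_profile(conn: sqlite3.Connection, table: str) -> tuple[int, float]:
    """Row count plus a numeric checksum used to detect missing or altered rows."""
    cur = conn.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}")
    count, checksum = cur.fetchone()
    return count, float(checksum)

# Stand-in source and target databases with one deliberately dropped row.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
tgt.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 25.5), (3, 7.25)])
tgt.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])

src_profile = table_profile(src, "orders")
tgt_profile = table_profile(tgt, "orders")
if src_profile == tgt_profile:
    print("PASS: source and target reconcile", src_profile)
else:
    print(f"FAIL: ETL mismatch source={src_profile} target={tgt_profile}")
```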

Posted 1 month ago

Apply

5 - 8 years

0 Lacs

Pune, Maharashtra, India

Hybrid

Linkedin logo

Key Result Areas And Activities

ETL Pipeline Development and Maintenance
- Design, develop, and maintain ETL pipelines using Cloudera tools such as Apache NiFi, Apache Flume, and Apache Spark.
- Create and maintain comprehensive documentation for data pipelines, configurations, and processes.

Data Integration and Processing
- Integrate and process data from diverse sources including relational databases, NoSQL databases, and external APIs.

Performance Optimization
- Optimize performance and scalability of Hadoop components (HDFS, YARN, MapReduce, Hive, Spark) to ensure efficient data processing.
- Identify and resolve issues related to data pipelines, system performance, and data integrity.

Data Quality and Transformation
- Implement data quality checks and manage data transformation processes to ensure accuracy and consistency.

Data Security and Compliance
- Apply data security measures and ensure compliance with data governance policies and regulatory requirements.

Essential Skills
- Proficiency in Cloudera Data Platform (CDP) - Cloudera Data Engineering.
- Proven track record of successful data lake implementations and pipeline development.
- Knowledge of data lakehouse architectures and their implementation.
- Hands-on experience with Apache Spark and Apache Airflow within the Cloudera ecosystem.
- Proficiency in programming languages such as Python, Java, Scala, and Shell.
- Exposure to containerization technologies (e.g., Docker, Kubernetes) and a system-level understanding of data structures, algorithms, distributed storage, and compute.

Desirable Skills
- Experience with other CDP services like DataFlow and Stream Processing
- Familiarity with cloud environments such as AWS, Azure, or Google Cloud Platform
- Understanding of data governance and data quality principles
- CCP Data Engineer certification

Qualifications
- 7+ years of experience in Cloudera/Hadoop/Big Data engineering or related roles
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field

Qualities
- Can influence and implement change; demonstrates confidence, strength of conviction and sound decisions.
- Believes in dealing with a problem head-on; approaches it in a logical and systematic manner; is persistent and patient; can independently tackle the problem; is not over-critical of the factors that led to a problem and is practical about it; follows up with developers on related issues.
- Able to consult, write, and present persuasively.
- Able to work in a self-organized and cross-functional team.
- Able to iterate based on new information, peer reviews, and feedback.
- Able to work seamlessly with clients across multiple geographies.
- Research-focused mindset.
- Proficiency in English (read/write/speak) and communication over email.
- Excellent analytical, presentation, reporting, documentation, and interactive skills.
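
As a small reference for the transform-and-validate step described above, here is a minimal PySpark sketch (Spark being part of the Cloudera stack named in the posting). The input/output paths, column names, and quality rules are hypothetical placeholders.

```python
# Illustrative sketch (not from the posting): read raw data, apply a basic
# data-quality filter, and write a partitioned, curated output with Spark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_curation_sketch").getOrCreate()

raw = spark.read.option("header", True).csv("/data/raw/orders/")   # placeholder path

curated = (
    raw
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("order_id").isNotNull() & (F.col("amount") > 0))  # basic quality gate
    .withColumn("load_date", F.current_date())
)

bad_rows = raw.count() - curated.count()
print(f"rejected {bad_rows} rows failing quality checks")

curated.write.mode("overwrite").partitionBy("load_date").parquet("/data/curated/orders/")

spark.stop()
```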

Posted 1 month ago

Apply

6 - 11 years

18 - 30 Lacs

Gurugram

Work from Office

Naukri logo

Application-layer technologies including Tomcat/Node.js, Netty, Spring Boot, Hibernate, Elasticsearch, Kafka, and Apache Flink.
Frontend technologies including ReactJS, Angular, and Android/iOS.
Data storage technologies like Oracle, S3, Postgres, and MongoDB.

Posted 1 month ago

Apply

- 2 years

3 - 8 Lacs

Lucknow

Hybrid

Naukri logo

Develop and maintain scalable data pipelines. Collaborate with data scientists and analysts to support business needs. Work with cloud platforms like AWS, Azure, or Google Cloud. Work effectively with cross-functional teams. Perform data modelling.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies