4841 Apache Jobs - Page 20

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

2.0 - 8.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Source: Naukri

Specialism: Data, Analytics & AI
Management Level: Senior Associate

Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.
Responsibilities: You will work alongside other developers, testers, BAs, designers and product owners, so you need to be able to communicate complex technical issues and ask hard questions at the right time. You will work in small teams where collaboration and relationship building are key. We are interested in people who enjoy a dynamic, rapidly changing environment and, importantly, who want to drive change in the organization. You will work in a "build it and run it" environment where teams build, deploy, monitor and support the components they own.

Accountabilities: To grow and be successful in the role, you will ideally bring the following:
- Great communication skills: you are happy to work alongside a team where you talk openly and constructively about technical issues.
- Solid knowledge and experience in microservice development, preferably using Node.js (including frameworks like Fastify/Moleculer) and ES6/TypeScript.
- Experience in software and microservice design; familiarity with design patterns and best practices.
- API development and integration experience using REST/JSON, Kafka, and message queues.
- Experience with API service testing: unit, integration, acceptance, TDD/BDD, mocking and stubbing.
- Solid DevOps knowledge, including configuring continuous integration, deployment, and delivery tools like Jenkins or GitLab CI; container-based development using platforms like Docker, Kubernetes, and OpenShift; and instrumenting monitoring and logging of applications.
- Experience working with microservices on AWS (EKS, Codefresh, GitHub Actions).

Mandatory skill sets (must-have knowledge, skills and experience):
- Strong understanding of CI/CD pipelines and Infrastructure as Code principles such as Terraform.
- Experience with CI/CD tooling such as GitHub, Jenkins, Codefresh, Docker, and Kubernetes.
- Experienced in building RESTful APIs using Java (Spring Boot).
- Experienced in the AWS development environment and ecosystem.
- Cloud-native and digital solutioning leveraging emerging technologies, including containers, serverless, data, APIs and microservices.
- Experience with measuring, analysing, monitoring, and optimizing cloud performance, including cloud system reliability and availability.
- Understanding of storage solutions, networking, and security.
- Strong familiarity with cloud-platform-specific Well-Architected Frameworks.
- Production experience of running services in Kubernetes.

Preferred skill sets (good-to-have knowledge, skills and experience):
- Solid DevOps knowledge, including configuring continuous integration, deployment, and delivery tools like Jenkins or GitLab CI; container-based development using platforms like Docker, Kubernetes, and OpenShift; and instrumenting monitoring and logging of applications.
- Experience working with microservices on AWS (EKS, Codefresh, GitHub Actions).

Years of experience required: 7 to 8 years (2-3 years relevant).
Education qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above).
Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration, Master of Engineering.
Required Skills: Power BI, Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
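The API service testing bullet above (unit tests with mocking and stubbing) can be sketched in Python. The client function, URL, and response shape below are invented for illustration; the stubbing pattern with `unittest.mock` is the point.

```python
import unittest
from unittest.mock import Mock

# Hypothetical client under test: wraps an HTTP GET to a REST endpoint.
# The endpoint URL and payload fields are assumptions, not a real API.
def fetch_order_total(session, order_id):
    """Return the 'total' field of an order fetched over REST."""
    resp = session.get(f"https://api.example.com/orders/{order_id}")
    resp.raise_for_status()
    return resp.json()["total"]

class FetchOrderTotalTest(unittest.TestCase):
    def test_returns_total_from_json_body(self):
        # Stub the HTTP session so no network call is made.
        session = Mock()
        session.get.return_value = Mock(
            json=lambda: {"total": 42.5},
            raise_for_status=lambda: None,
        )
        self.assertEqual(fetch_order_total(session, "o-1"), 42.5)
        session.get.assert_called_once_with("https://api.example.com/orders/o-1")
```

Run with `python -m unittest`; the same stubbing approach carries over to integration-style tests where only the external boundary is faked.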

Posted 3 days ago

Apply

18.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Role: Enterprise Architect
Grade: VP
Location: Pune / Mumbai / Chennai
Experience: 18+ years
Organization: Intellect Design Arena Ltd. (www.intellectdesign.com)

About the Role: We are looking for a senior Enterprise Architect with strong leadership and deep technical expertise to define and evolve the architecture strategy for iGTB, our award-winning transaction banking platform. The ideal candidate will have extensive experience architecting large-scale, cloud-native enterprise applications within the BFSI domain, and will be responsible for driving innovation, ensuring engineering excellence, and aligning architecture with evolving business needs.

Mandatory Skills:
- Cloud-native architecture
- Microservices-based systems
- PostgreSQL, Apache Kafka, ActiveMQ
- Spring Boot / Spring Cloud, Angular
- Strong exposure to the BFSI domain

Key Responsibilities:
- Architectural Strategy & Governance: Define and maintain enterprise architecture standards and principles across iGTB product suites. Set up governance structures to ensure compliance across product lines.
- Technology Leadership: Stay updated on emerging technologies; assess and recommend adoption to improve scalability, security, and performance.
- Tooling & Automation: Evaluate and implement tools to improve developer productivity, code quality, and application reliability, including automation across testing, deployment, and monitoring.
- Architecture Evangelism: Drive adoption of architecture guidelines and tools across engineering teams through mentorship, training, and collaboration.
- Solution Oversight: Participate in the design of individual modules to ensure technical robustness and adherence to enterprise standards.
- Performance & Security: Oversee performance benchmarking and security assessments. Engage with third-party labs for certification as needed.
- Customer Engagement: Represent architecture in pre-sales, CXO-level interactions, and post-production engagements to demonstrate the product's technical superiority.
- Troubleshooting & Continuous Improvement: Support teams in resolving complex technical issues. Capture learnings and feed them back into architectural best practices.
- Automation Vision: Lead the end-to-end automation charter for iGTB across code quality, CI/CD, testing, monitoring, and release management.

Profile Requirements:
- 18+ years of experience in enterprise and solution architecture roles, preferably within BFSI or fintech
- Proven experience with mission-critical, scalable, and secure systems
- Strong communication and stakeholder management skills, including CXO interactions
- Demonstrated leadership in architecting complex enterprise products and managing teams of architects
- Ability to blend technical depth with business context to drive decisions
- Passion for innovation, engineering excellence, and architectural rigor
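The messaging stack this role names (Apache Kafka, ActiveMQ) exists to decouple producers from consumers. A minimal in-process sketch of that produce/consume pattern, with Python's `queue.Queue` standing in for the broker; the queue name and event fields are invented for illustration:

```python
import queue

def produce(broker, events):
    """Publish each event to the broker; producers need not know who consumes."""
    for e in events:
        broker.put(e)

def consume_all(broker, handler):
    """Drain the queue, applying handler to each event; returns the results."""
    results = []
    while not broker.empty():
        results.append(handler(broker.get()))
    return results

# "payments" plays the role of a topic/queue on a real broker.
payments = queue.Queue()
produce(payments, [{"txn": 1, "amount": 250}, {"txn": 2, "amount": 90}])
processed = consume_all(payments, lambda e: e["amount"])
# processed == [250, 90]: consumers see events in the order produced
```

A real broker adds durability, partitioning, and independent consumer groups on top of this same decoupling idea.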

Posted 3 days ago

Apply

7.0 - 12.0 years

5 - 13 Lacs

Pune

Hybrid

Source: Naukri

So, what’s the role all about?

NICE APA is a comprehensive platform that combines Robotic Process Automation, Desktop Automation, Desktop Analytics, and AI and Machine Learning solutions such as Neva Discover. NICE APA is more than just RPA: it's a full platform that brings together automation, analytics, and AI to enhance both front-office and back-office operations. It's widely used in industries like banking, insurance, telecom, healthcare, and customer service.

We are seeking a Senior/Specialist Technical Support Engineer with a strong understanding of RPA applications and exceptional troubleshooting skills. The ideal candidate will have hands-on experience in application support; the ability to inspect and analyze RPA solutions and application servers (e.g., Tomcat, authentication, certificate renewal); and a solid understanding of RPA deployments in both on-premises and cloud-based environments (such as AWS). You should be comfortable supporting hybrid RPA architectures and handling bot automation, licensing, and infrastructure configuration in various environments. Familiarity with cloud-native services used in automation (e.g., AMQ queues, storage, virtual machines, containers) is a plus. Additionally, you'll need a working knowledge of underlying databases and query optimization to assist with performance and integration issues. You will be responsible for diagnosing and resolving technical issues, collaborating with development and infrastructure teams, contributing to documentation and knowledge bases, and ensuring a seamless and reliable customer experience across multiple systems and platforms.

How will you make an impact?
- Interface with various R&D groups, customer support teams, business partners and customers globally to address and resolve product issues.
- Maintain quality and ongoing internal and external communication throughout your investigation.
- Provide a high level of support and minimize R&D escalations.
- Prioritize daily missions/cases and manage critical issues and situations.
- Contribute to the knowledge base, document troubleshooting and problem-resolution steps, and participate in educating/mentoring other support engineers.
- Willingness to perform on-call duties as required.
- Excellent problem-solving skills with the ability to analyze complex issues and implement effective solutions.
- Good communication skills with the ability to interact with technical and non-technical stakeholders.

Have you got what it takes?
- Minimum of 8 to 12 years of experience in supporting global enterprise customers.
- Monitor, troubleshoot, and maintain RPA bots in production environments.
- Monitor and troubleshoot system performance, application health, and resource usage using tools like Prometheus, Grafana, or similar.
- Data analytics: analyze trends, patterns, and anomalies in data to identify product bugs.
- Familiarity with ETL processes and data pipelines (advantage).
- Provide L1/L2/L3 support for the RPA application, ensuring timely resolution of incidents and service requests.
- Familiarity with applications running on Linux-based Kubernetes clusters; troubleshoot and resolve incidents related to pods, services, and deployments.
- Provide technical support for applications running on both Windows and Linux platforms, including troubleshooting issues, diagnosing problems, and implementing solutions to ensure optimal performance.
- Familiarity with authentication methods like WinSSO and SAML.
- Knowledge of Windows/Linux hardening, such as TLS enforcement, encryption enforcement, and certificate configuration.
- Working and troubleshooting knowledge of Apache software components like Tomcat, Apache HTTP Server and ActiveMQ.
- Working and troubleshooting knowledge of SVN/version-control applications.
- Knowledge of DB schemas, structure, SQL queries (DML, DDL) and troubleshooting.
- Collect and analyze logs from servers, network devices, applications, and security tools to identify environment/application issues.
- Knowledge of terminal servers (Citrix) (advantage).
- Basic understanding of AWS cloud systems.
- Network troubleshooting skills (working with different tools).
- Certification in RPA platforms and working knowledge of RPA application development/support (advantage).
- NICE certification and knowledge of RTI/RTS/APA products (advantage).
- Integrate NICE's applications with customers' on-prem and cloud-based third-party tools and applications to ingest/transform/store/validate data.

Shift: 24x7 rotational shift (includes night shifts).

Other Required Skills:
- Excellent verbal and written communication skills.
- Strong troubleshooting and problem-solving skills.
- Self-motivated and directed, with keen attention to detail.
- Team player: ability to work well in a team-oriented, collaborative environment.

Enjoy NICE-FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 7326
Reporting into: Tech Manager
Role Type: Individual Contributor
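The log-collection duty above can be illustrated with a small Python sketch that counts ERROR entries per component, so recurring environment/application issues stand out. The log line format and component names are assumptions for illustration, not from any NICE product.

```python
import re
from collections import Counter

# Assumed format: "<date> <time> <LEVEL> [<component>] <message>"
LOG_LINE = re.compile(r"^\S+ \S+ (?P<level>[A-Z]+) \[(?P<component>[^\]]+)\] (?P<msg>.*)$")

def error_counts(lines):
    """Count ERROR-level log lines per component; skips unparseable lines."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and m.group("level") == "ERROR":
            counts[m.group("component")] += 1
    return counts

sample = [
    "2024-05-01 10:00:01 INFO [tomcat] server started",
    "2024-05-01 10:00:05 ERROR [activemq] broker connection refused",
    "2024-05-01 10:00:09 ERROR [activemq] broker connection refused",
    "2024-05-01 10:00:12 ERROR [tomcat] certificate expired",
]
print(error_counts(sample))  # Counter({'activemq': 2, 'tomcat': 1})
```

In practice the same grouping is what log aggregators do at scale; a quick script like this is useful when triaging a raw log bundle from a customer environment.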

Posted 3 days ago

Apply

6.0 - 9.0 years

4 - 9 Lacs

Pune

Hybrid

Source: Naukri

So, what’s the role all about?

NICE APA is a comprehensive platform that combines Robotic Process Automation, Desktop Automation, Desktop Analytics, and AI and Machine Learning solutions such as Neva Discover. NICE APA is more than just RPA: it's a full platform that brings together automation, analytics, and AI to enhance both front-office and back-office operations. It's widely used in industries like banking, insurance, telecom, healthcare, and customer service.

We are seeking a Senior/Specialist Technical Support Engineer with a strong understanding of RPA applications and exceptional troubleshooting skills. The ideal candidate will have hands-on experience in application support; the ability to inspect and analyze RPA solutions and application servers (e.g., Tomcat, authentication, certificate renewal); and a solid understanding of RPA deployments in both on-premises and cloud-based environments (such as AWS). You should be comfortable supporting hybrid RPA architectures and handling bot automation, licensing, and infrastructure configuration in various environments. Familiarity with cloud-native services used in automation (e.g., AMQ queues, storage, virtual machines, containers) is a plus. Additionally, you'll need a working knowledge of underlying databases and query optimization to assist with performance and integration issues. You will be responsible for diagnosing and resolving technical issues, collaborating with development and infrastructure teams, contributing to documentation and knowledge bases, and ensuring a seamless and reliable customer experience across multiple systems and platforms.

How will you make an impact?
- Interface with various R&D groups, customer support teams, business partners and customers globally to address and resolve product issues.
- Maintain quality and ongoing internal and external communication throughout your investigation.
- Provide a high level of support and minimize R&D escalations.
- Prioritize daily missions/cases and manage critical issues and situations.
- Contribute to the knowledge base, document troubleshooting and problem-resolution steps, and participate in educating/mentoring other support engineers.
- Willingness to perform on-call duties as required.
- Excellent problem-solving skills with the ability to analyze complex issues and implement effective solutions.
- Good communication skills with the ability to interact with technical and non-technical stakeholders.

Have you got what it takes?
- Minimum of 5 to 7 years of experience in supporting global enterprise customers.
- Monitor, troubleshoot, and maintain RPA bots in production environments.
- Monitor and troubleshoot system performance, application health, and resource usage using tools like Prometheus, Grafana, or similar.
- Data analytics: analyze trends, patterns, and anomalies in data to identify product bugs.
- Familiarity with ETL processes and data pipelines (advantage).
- Provide L1/L2/L3 support for the RPA application, ensuring timely resolution of incidents and service requests.
- Familiarity with applications running on Linux-based Kubernetes clusters; troubleshoot and resolve incidents related to pods, services, and deployments.
- Provide technical support for applications running on both Windows and Linux platforms, including troubleshooting issues, diagnosing problems, and implementing solutions to ensure optimal performance.
- Familiarity with authentication methods like WinSSO and SAML.
- Knowledge of Windows/Linux hardening, such as TLS enforcement, encryption enforcement, and certificate configuration.
- Working and troubleshooting knowledge of Apache software components like Tomcat, Apache HTTP Server and ActiveMQ.
- Working and troubleshooting knowledge of SVN/version-control applications.
- Knowledge of DB schemas, structure, SQL queries (DML, DDL) and troubleshooting.
- Collect and analyze logs from servers, network devices, applications, and security tools to identify environment/application issues.
- Knowledge of terminal servers (Citrix) (advantage).
- Basic understanding of AWS cloud systems.
- Network troubleshooting skills (working with different tools).
- Certification in RPA platforms and working knowledge of RPA application development/support (advantage).
- NICE certification and knowledge of RTI/RTS/APA products (advantage).
- Integrate NICE's applications with customers' on-prem and cloud-based third-party tools and applications to ingest/transform/store/validate data.

Shift: 24x7 rotational shift (includes night shifts).

Other Required Skills:
- Excellent verbal and written communication skills.
- Strong troubleshooting and problem-solving skills.
- Self-motivated and directed, with keen attention to detail.
- Team player: ability to work well in a team-oriented, collaborative environment.

Enjoy NICE-FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 7556
Reporting into: Tech Manager
Role Type: Individual Contributor

Posted 3 days ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Gurugram

Work from Office

Source: Naukri

Company: Mercer

Description: We are seeking a talented individual to join our Technology team at Mercer. This role will be based in Gurugram. This is a hybrid role that requires working at least three days a week in the office.

Senior DevOps Engineer

We are looking for an ideal candidate with a minimum of 4 years of experience in DevOps. The candidate should have a strong and deep understanding of Amazon Web Services (AWS) and DevOps tools like Terraform, Ansible, and Jenkins.

Location: Gurgaon
Functional Area: Engineering
Education Qualification: Graduate/Postgraduate
Experience: 4-6 years

We will count on you to:
- Deploy infrastructure on the AWS cloud using Terraform
- Deploy updates and fixes
- Build tools to reduce the occurrence of errors and improve customer experience
- Perform root cause analysis of production errors and resolve technical issues
- Develop automation scripts
- Troubleshooting and maintenance

What you need to have:
- 4+ years of technical experience in the DevOps area
- Knowledge of the following technologies and applications: AWS, Terraform, Linux administration, shell scripting, Ansible, CI server (Jenkins), Apache/Nginx/Tomcat
- Good to have: experience with Python

What makes you stand out:
- Excellent verbal and written communication skills, comfortable interfacing with business users
- Good troubleshooting and technical skills
- Able to work independently

Why join our team: We help you be your best through professional development opportunities, interesting work and supportive leaders. We foster a vibrant and inclusive culture where you can work with talented colleagues to create new solutions and have impact for colleagues, clients and communities. Our scale enables us to provide a range of career opportunities, as well as benefits and rewards to enhance your well-being.
Mercer, a business of Marsh McLennan (NYSE: MMC), is a global leader in helping clients realize their investment objectives, shape the future of work and enhance health and retirement outcomes for their people. Marsh McLennan is a global leader in risk, strategy and people, advising clients in 130 countries across four businesses: Marsh, Guy Carpenter, Mercer and Oliver Wyman. With annual revenue of $23 billion and more than 85,000 colleagues, Marsh McLennan helps build the confidence to thrive through the power of perspective. For more information, visit mercer.com, or follow on LinkedIn and X.

The Mercer Assessments business, one of the fastest-growing verticals within the Mercer brand, is a leading global provider of talent measurement and assessment solutions. As part of Mercer, the world's largest HR consulting firm and a wholly owned subsidiary of Marsh McLennan, we are dedicated to delivering talent foresight that empowers organizations to make informed, critical people decisions. Leveraging a robust, cloud-based assessment platform, Mercer Assessments partners with over 6,000 corporations, 31 sector skill councils, government agencies, and more than 700 educational institutions across 140 countries. Our mission is to help organizations build high-performing teams through effective talent acquisition, development, and workforce transformation strategies. Our research-backed assessments, advanced technology, and comprehensive analytics deliver transformative outcomes for both clients and their employees. We specialize in designing tailored assessment solutions across the employee lifecycle, including pre-hire evaluations, skills assessments, training and development, certification exams, competitions and more. At Mercer Assessments, we are committed to enhancing the way organizations identify, assess, and develop talent.
By providing actionable talent foresight, we enable our clients to anticipate future workforce needs and make strategic decisions that drive sustainable growth and innovation.

Marsh McLennan is committed to embracing a diverse, inclusive and flexible work environment. We aim to attract and retain the best people and embrace diversity of age, background, caste, disability, ethnic origin, family duties, gender orientation or expression, gender reassignment, marital status, nationality, parental status, personal or social status, political affiliation, race, religion and beliefs, sex/gender, sexual orientation or expression, skin color, or any other characteristic protected by applicable law. Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one anchor day per week on which their full team will be together in person.

Posted 3 days ago

Apply

3.0 - 8.0 years

6 - 14 Lacs

Gurugram

Work from Office

Source: Naukri

Data Engineer: The ideal candidate will have strong expertise in Python, Apache Spark, and Databricks, along with experience in machine learning.

Posted 3 days ago

Apply

18.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

Role: Enterprise Architect
Grade: VP
Location: Pune / Mumbai / Chennai
Experience: 18+ years
Organization: Intellect Design Arena Ltd. (www.intellectdesign.com)

About the Role: We are looking for a senior Enterprise Architect with strong leadership and deep technical expertise to define and evolve the architecture strategy for iGTB, our award-winning transaction banking platform. The ideal candidate will have extensive experience architecting large-scale, cloud-native enterprise applications within the BFSI domain, and will be responsible for driving innovation, ensuring engineering excellence, and aligning architecture with evolving business needs.

Mandatory Skills:
- Cloud-native architecture
- Microservices-based systems
- PostgreSQL, Apache Kafka, ActiveMQ
- Spring Boot / Spring Cloud, Angular
- Strong exposure to the BFSI domain

Key Responsibilities:
- Architectural Strategy & Governance: Define and maintain enterprise architecture standards and principles across iGTB product suites. Set up governance structures to ensure compliance across product lines.
- Technology Leadership: Stay updated on emerging technologies; assess and recommend adoption to improve scalability, security, and performance.
- Tooling & Automation: Evaluate and implement tools to improve developer productivity, code quality, and application reliability, including automation across testing, deployment, and monitoring.
- Architecture Evangelism: Drive adoption of architecture guidelines and tools across engineering teams through mentorship, training, and collaboration.
- Solution Oversight: Participate in the design of individual modules to ensure technical robustness and adherence to enterprise standards.
- Performance & Security: Oversee performance benchmarking and security assessments. Engage with third-party labs for certification as needed.
- Customer Engagement: Represent architecture in pre-sales, CXO-level interactions, and post-production engagements to demonstrate the product's technical superiority.
- Troubleshooting & Continuous Improvement: Support teams in resolving complex technical issues. Capture learnings and feed them back into architectural best practices.
- Automation Vision: Lead the end-to-end automation charter for iGTB across code quality, CI/CD, testing, monitoring, and release management.

Profile Requirements:
- 18+ years of experience in enterprise and solution architecture roles, preferably within BFSI or fintech
- Proven experience with mission-critical, scalable, and secure systems
- Strong communication and stakeholder management skills, including CXO interactions
- Demonstrated leadership in architecting complex enterprise products and managing teams of architects
- Ability to blend technical depth with business context to drive decisions
- Passion for innovation, engineering excellence, and architectural rigor

Posted 3 days ago

Apply

5.0 - 6.0 years

55 - 60 Lacs

Pune

Work from Office

Source: Naukri

At Capgemini Invent, we believe difference drives change. As inventive transformation consultants, we blend our strategic, creative and scientific capabilities, collaborating closely with clients to deliver cutting-edge solutions. Join us to drive transformation tailored to our clients' challenges of today and tomorrow. Informed and validated by science and data. Superpowered by creativity and design. All underpinned by technology created with purpose.

Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets. Grade specific: this role supports the team in building and maintaining data infrastructure and systems within an organization.

Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS Code Pipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Big Table, GCP BigQuery, GCP Cloud Storage, GCP DataFlow, GCP DataProc, Git, Google Big Table, Google Data Proc, Greenplum, HQL, IBM Data Stage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, SPARK Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management

Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fuelled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
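The data-validation and pipeline competencies listed above can be sketched in plain Python (no PySpark dependency). The record fields and validation rules below are invented for illustration; the split into clean rows and rejects-with-reasons is the pattern a real pipeline's validate stage implements.

```python
def validate(record):
    """Return a list of rule violations for one raw record (rules are assumed)."""
    errors = []
    if not record.get("id"):
        errors.append("missing id")
    if not isinstance(record.get("amount"), (int, float)) or record["amount"] < 0:
        errors.append("bad amount")
    return errors

def run_pipeline(raw_records):
    """Split records into clean rows and (record, reasons) rejects."""
    clean, rejects = [], []
    for r in raw_records:
        errs = validate(r)
        if errs:
            rejects.append((r, errs))
        else:
            clean.append(r)
    return clean, rejects

rows = [{"id": "a1", "amount": 10.0}, {"id": "", "amount": -5}]
clean, rejects = run_pipeline(rows)
# clean keeps the valid row; rejects carries the bad row with both reasons
```

Frameworks like Spark or Airflow scale out and schedule this stage, but the validate-then-route logic is the same.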

Posted 3 days ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

POSITION: Sr. Devops Engineer Job Type: Work From Office (5 days) Location: Sector 16A, Film City, Noida / Mumbai Relevant Experience: Minimum 4+ year Salary: Competitive Education- B.Tech About the Company: Devnagri is a AI company dedicated to personalizing business communication and making it hyper-local to attract non-English speakers. We address the significant gap in internet content availability for most of the world’s population who do not speak English. For more detail - Visit www.devnagri.com We seek a highly skilled and experienced Senior DevOps Engineer to join our dynamic team. As a key member of our technology department, you will play a crucial role in designing and implementing scalable, efficient and robust infrastructure solutions with a strong focus on DevOps automation and best practices. Roles And Responsibilities Design, plan, and implement scalable, reliable, secure, and robust infrastructure architectures Manage and optimize cloud-based infrastructure components - Architect and implement containerization technologies, such as Docker, Kubernetes Implement the CI/CD pipelines to automate the build, test, and deployment processes Design and implement effective monitoring and logging solutions for applications and infrastructure. Establish metrics and alerts for proactive issue identification and resolution Work closely with cross-functional teams to troubleshoot and resolve issues. Implement and enforce security best practices across infrastructure components Establish and enforce configuration standards across various environments. Implement and manage infrastructure using Infrastructure as Code principles Leverage tools like Terraform for provisioning and managing resources. Stay abreast of industry trends and emerging technologies. 
Evaluate and recommend new tools and technologies to enhance infrastructure and operations. Must Have Skills: Cloud (AWS & GCP), Redis, MongoDB, MySQL, Docker, Bash scripting, Jenkins, Prometheus, Grafana, ELK Stack, Apache, Linux. Good To Have Skills: Kubernetes, collaboration and communication, problem solving, IAM, WAF, SAST/DAST. Interview Process: screening and shortlisting >> 3 technical rounds >> 1 managerial round >> HR closure, with a short success story about your journey in DevOps and tech. For more details, visit our website: https://www.devnagri.com Skills: DevOps, Linux/Unix, Apache, Amazon Web Services (AWS), Google Cloud Platform (GCP), Prometheus, Grafana, MongoDB, MySQL and CI/CD
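The monitoring-and-alerting responsibility described above can be sketched minimally. This is an illustration only; the metric names and thresholds are hypothetical, not this company's actual stack:

```python
# Illustrative threshold-based alerting, similar in spirit to a Prometheus
# alert rule evaluated over scraped samples. Metric names are hypothetical.
from dataclasses import dataclass

@dataclass
class AlertRule:
    metric: str
    threshold: float

def evaluate(rules, samples):
    """Return an alert message for every sample that breaches its rule."""
    alerts = []
    for rule in rules:
        value = samples.get(rule.metric)
        if value is not None and value > rule.threshold:
            alerts.append(f"ALERT {rule.metric}={value} exceeds {rule.threshold}")
    return alerts

rules = [AlertRule("cpu_usage_percent", 90.0), AlertRule("disk_usage_percent", 80.0)]
samples = {"cpu_usage_percent": 95.5, "disk_usage_percent": 40.0}
print(evaluate(rules, samples))  # one CPU alert, no disk alert
```

In a real setup the rule set would live in version-controlled config and the samples would come from an exporter, but the evaluate-and-notify loop is the same shape.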

Posted 3 days ago

Apply

18.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Role: Enterprise Architect Grade: VP Location: Pune / Mumbai / Chennai Experience: 18+ Years Organization: Intellect Design Arena Ltd. www.intellectdesign.com About the Role: We are looking for a senior Enterprise Architect with strong leadership and deep technical expertise to define and evolve the architecture strategy for iGTB, our award-winning transaction banking platform. The ideal candidate will have extensive experience architecting large-scale, cloud-native enterprise applications within the BFSI domain, and will be responsible for driving innovation, ensuring engineering excellence, and aligning architecture with evolving business needs. Mandatory Skills: Cloud-native architecture; microservices-based systems; PostgreSQL, Apache Kafka, ActiveMQ; Spring Boot / Spring Cloud, Angular; strong exposure to the BFSI domain. Key Responsibilities: Architectural Strategy & Governance: Define and maintain enterprise architecture standards and principles across iGTB product suites. Set up governance structures to ensure compliance across product lines. Technology Leadership: Stay updated on emerging technologies; assess and recommend adoption to improve scalability, security, and performance. Tooling & Automation: Evaluate and implement tools to improve developer productivity, code quality, and application reliability, including automation across testing, deployment, and monitoring. Architecture Evangelism: Drive adoption of architecture guidelines and tools across engineering teams through mentorship, training, and collaboration. Solution Oversight: Participate in the design of individual modules to ensure technical robustness and adherence to enterprise standards. Performance & Security: Oversee performance benchmarking and security assessments. Engage with third-party labs for certification as needed. Customer Engagement: Represent architecture in pre-sales, CXO-level interactions, and post-production engagements to demonstrate the product's technical superiority.
Troubleshooting & Continuous Improvement: Support teams in resolving complex technical issues. Capture learnings and feed them back into architectural best practices. Automation Vision: Lead the end-to-end automation charter for iGTB, across code quality, CI/CD, testing, monitoring, and release management. Profile Requirements: 18+ years of experience in enterprise and solution architecture roles, preferably within BFSI or fintech. Proven experience with mission-critical, scalable, and secure systems. Strong communication and stakeholder management skills, including CXO interactions. Demonstrated leadership in architecting complex enterprise products and managing teams of architects. Ability to blend technical depth with business context to drive decisions. Passion for innovation, engineering excellence, and architectural rigor.

Posted 3 days ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Data Engineer Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Databricks Unified Data Analytics Platform Good to have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. - Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Experience with data pipeline orchestration tools such as Apache Airflow or similar. - Strong understanding of ETL processes and data warehousing concepts. - Familiarity with cloud platforms like AWS, Azure, or Google Cloud. - Knowledge of programming languages such as Python or Scala for data manipulation.
Additional Information: - The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform. - This position is based at our Bengaluru office. - A 15 years full time education is required. Qualification: 15 years full time education
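The ETL pattern this role centres on can be sketched in plain Python. Neither Databricks nor Airflow is assumed here; the record shapes and names below are hypothetical, and the "warehouse" is just a list standing in for a table:

```python
# A minimal extract-transform-load pipeline illustrating the pattern the
# posting describes. Everything here is an illustrative stand-in.
import json

def extract(raw_lines):
    """Parse one JSON record per line, skipping malformed rows."""
    records = []
    for line in raw_lines:
        try:
            records.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # a real pipeline would route these to a dead-letter sink
    return records

def transform(records):
    """Keep completed orders and normalise the amount to a float."""
    return [
        {"order_id": r["order_id"], "amount": float(r["amount"])}
        for r in records
        if r.get("status") == "completed"
    ]

def load(rows, target):
    """Append rows to an in-memory target standing in for a warehouse table."""
    target.extend(rows)
    return len(rows)

raw = ['{"order_id": 1, "status": "completed", "amount": "19.99"}',
       '{"order_id": 2, "status": "pending", "amount": "5.00"}',
       'not-json']
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded, warehouse)
```

An orchestrator such as Airflow would schedule and retry each of the three stages; the stage boundaries themselves look the same.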

Posted 3 days ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Chennai

Work from Office

Naukri logo

Project Role: Data Engineer Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: PySpark Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. - Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in PySpark. - Good To Have Skills: Experience with Apache Kafka. - Strong understanding of data warehousing concepts and architecture. - Familiarity with cloud platforms such as AWS or Azure. - Experience in SQL and NoSQL databases for data storage and retrieval. Additional Information: - The candidate should have a minimum of 5 years of experience in PySpark. - This position is based in Chennai. - A 15 years full time education is required. Qualification: 15 years full time education
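The map/filter/reduceByKey style of transformation that PySpark expresses can be sketched in plain Python, since no Spark cluster is assumed here. The sample events are hypothetical:

```python
# The shape of a PySpark pipeline (map -> filter -> reduceByKey), written
# with plain Python collections for illustration only.
from collections import Counter

events = ["user1 click", "user2 view", "user1 click", "user3 click", "user2 click"]

# map + filter: emit a (user, 1) pair for click events only
pairs = [(e.split()[0], 1) for e in events if e.split()[1] == "click"]

# reduceByKey equivalent: sum the 1s per user
clicks_per_user = Counter()
for user, n in pairs:
    clicks_per_user[user] += n

print(dict(clicks_per_user))  # {'user1': 2, 'user3': 1, 'user2': 1}
```

In PySpark the same logic would run distributed over partitions; the per-key aggregation is what reduceByKey shuffles for.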

Posted 3 days ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Pune

Work from Office

Naukri logo

Project Role: Data Engineer Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Databricks Unified Data Analytics Platform Good to have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. - Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Experience with data pipeline orchestration tools such as Apache Airflow or similar. - Strong understanding of ETL processes and data warehousing concepts. - Familiarity with cloud platforms like AWS, Azure, or Google Cloud. - Knowledge of programming languages such as Python or Scala for data manipulation.
Additional Information: - The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform. - This position is based at our Pune office. - A 15 years full time education is required. Qualification: 15 years full time education

Posted 3 days ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Pune

Work from Office

Naukri logo

Project Role: Data Engineer Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: PySpark Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide innovative solutions to enhance data accessibility and usability. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. - Continuously evaluate and improve data processes to ensure efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in PySpark. - Good To Have Skills: Experience with Apache Kafka, Apache Airflow, and cloud platforms such as AWS or Azure. - Strong understanding of data modeling and database design principles. - Experience with SQL and NoSQL databases for data storage and retrieval. - Familiarity with data warehousing concepts and tools. Additional Information: - The candidate should have a minimum of 5 years of experience in PySpark. - This position is based in Pune. - A 15 years full time education is required. Qualification: 15 years full time education

Posted 3 days ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Hyderabad

Work from Office

Naukri logo

Project Role: Data Engineer Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Databricks Unified Data Analytics Platform Good to have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. - Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Experience with data pipeline orchestration tools such as Apache Airflow or similar. - Strong understanding of ETL processes and data warehousing concepts. - Familiarity with cloud platforms like AWS, Azure, or Google Cloud. - Knowledge of programming languages such as Python or Scala for data manipulation.
Additional Information: - The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform. - This position is based at our Hyderabad office. - A 15 years full time education is required. Qualification: 15 years full time education

Posted 3 days ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Chennai

Work from Office

Naukri logo

Description Ciklum is looking for a Full-Stack Java Engineer to join our team full-time in India. We are a custom product engineering company that supports both multinational organizations and scaling startups to solve their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts and product owners, we engineer technology that redefines industries and shapes the way people live. About the role: As a Full-Stack Java Engineer, become a part of a high-performing engineering team focused on delivering robust, scalable applications in the financial technology domain. This role offers the opportunity to work across backend services and frontend interfaces, contributing to mission-critical systems in a dynamic Agile environment. Responsibilities Design, develop, and maintain microservices using Java (11+), Spring Boot, REST APIs, and SOAP Web Services Build and enhance user-facing components using JSF and other Java-based UI frameworks Integrate applications with databases (primarily Oracle) and support deployment pipelines using Maven, Jenkins, and Tomcat Ensure code quality through unit testing, code reviews, and best development practices Collaborate with cross-functional teams (product, QA, DevOps) to deliver end-to-end features Participate in Agile ceremonies including backlog refinement, sprint planning, and retrospectives Assist in troubleshooting and resolving issues across development and production environments Requirements 4-6 years of experience in full-stack or backend-focused Java development Backend Expertise: Java (11+), Spring Boot, REST APIs, SOAP Web Services, Oracle, Maven, Jenkins, Tomcat Frontend Proficiency: Experience with JSF or other component-based Java UI frameworks Familiarity with microservices architecture and CI/CD workflows Strong analytical and debugging skills Solid understanding of SDLC, version control (Git), and Agile methodologies Excellent communication and collaboration abilities
Desirable Experience with Spring WebFlux, Kafka, MongoDB, Drools, Apache Freemarker, or Netty Exposure to cloud platforms such as AWS, Azure, or PCF (Pivotal Cloud Foundry) Hands-on experience working in a SAFe Agile setup or large-scale Agile programs What's in it for you Care: your mental and physical health is our priority. We ensure comprehensive company-paid medical insurance, as well as financial and legal consultation Tailored education path: boost your skills and knowledge with our regular internal events (meetups, conferences, workshops), Udemy licence, language courses and company-paid certifications Growth environment: share your experience and level up your expertise with a community of skilled professionals, locally and globally Flexibility: hybrid work mode at Chennai or Pune Opportunities: we value our specialists and always find the best options for them. Our Resourcing Team helps change a project if needed to help you grow, excel professionally and fulfil your potential Global impact: work on large-scale projects that redefine industries with international and fast-growing clients Welcoming environment: feel empowered with a friendly team, open-door policy, informal atmosphere within the company and regular team-building events
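A REST endpoint of the kind this role builds can be sketched with nothing but a standard library. The posting's stack is Java/Spring Boot; this Python sketch only illustrates the request/response shape of a small JSON resource (the path and payload are hypothetical):

```python
# A minimal JSON-over-HTTP endpoint plus a client call, stdlib only.
# Illustrative stand-in for a Spring Boot @RestController health endpoint.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "UP"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), HealthHandler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/health") as resp:
    payload = json.load(resp)
server.shutdown()
print(payload)  # {'status': 'UP'}
```

The Spring Boot equivalent would be a one-method controller; the contract (GET path in, JSON body and status code out) is the part that carries over.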

Posted 3 days ago

Apply

6.0 - 11.0 years

11 - 16 Lacs

Noida

Work from Office

Naukri logo

Assist in the upgrade of Java from version 8 to 17 and Spring Boot from version 1.5 to 3.2. - Develop, test, and maintain high-quality code using Java and Spring Boot. - Write unit tests using JUnit and Mockito. - Rich experience with Apache Maven as a build tool. - Strong experience with Java 17. - Hands-on experience with Spring Boot 3.2. - Experience in JUnit and Mockito for unit testing. - Familiarity with RESTful APIs and microservices architecture. Responsibilities - Participate in code reviews and contribute to the development of best practices. - Collaborate with senior developers and other team members to deliver software solutions. - Troubleshoot and debug applications. - Good understanding of software development best practices. - Strong analytical and problem-solving skills. - Good communication and teamwork skills. ** Good to have **: - Experience with circuit breaker patterns and caching techniques. - Experience with Docker. - Knowledge of CI/CD pipelines. - Familiarity with Agile methodologies. Mandatory Competencies Java - Core JAVA Fundamental Technical Skills - Spring Framework/Hibernate/JUnit etc. Fundamental Technical Skills - OOPS/Design Beh - Communication Beh - Communication and collaboration Database - SQL Others - Microservices Java Others - Spring Boot At Iris Software, we offer world-class benefits designed to support the financial, health and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.
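The JUnit/Mockito stub-and-verify workflow this posting asks for can be shown with Python's unittest.mock as an analog (the posting's stack is Java; OrderService and PaymentGateway below are hypothetical names):

```python
# Mockito-style mocking, sketched with Python's unittest.mock.
from unittest.mock import Mock

class OrderService:
    def __init__(self, gateway):
        self.gateway = gateway  # injected dependency, as in constructor injection

    def place_order(self, order_id, amount):
        # delegate the charge to the gateway; the test never hits a real one
        if self.gateway.charge(order_id, amount):
            return "CONFIRMED"
        return "DECLINED"

gateway = Mock()
gateway.charge.return_value = True  # stub, like Mockito's when(...).thenReturn(...)

service = OrderService(gateway)
result = service.place_order("A-1", 19.99)

# verify the interaction, like Mockito's verify(gateway).charge("A-1", 19.99)
gateway.charge.assert_called_once_with("A-1", 19.99)
print(result)  # CONFIRMED
```

The point in either language is the same: isolate the unit under test by stubbing its collaborator, then assert both the return value and the interaction.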

Posted 3 days ago

Apply

6.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Naukri logo

Job: Data Engineer Consultant Jobs in Hyderabad (J49138) - Job in Hyderabad Data Engineer Consultant (Job Code: J49138) Job Summary 6 - 8 Years Data Engineer Consultant BE-Comp/IT, BE-Other, BTech-Comp/IT, BTech-Other, MCA Stream of Study: Computer Science/IT IT-Software/Software Services IT Software - Client Server Key Skills: Job Post Date: Sunday, June 15, 2025 Company Description Our client is a global knowledge practice that provides consulting, technology, engineering, management and innovation services to leading businesses, governments, non-governmental organizations and not-for-profits. We focus on gaining, refining and sharing expertise in the energy and utility sector, then provide strategic advice and implement outcome-driven solutions. Working with customers across the utility value chain, we deliver sustainable and lasting improvements to their efficiency and performance, adding value to their bottom line. Demand for power, gas and water is consistently growing as the population of the planet expands. Our goal is to support large consumers of energy and water, and improve the sustainability of resources by increasing efficiency and optimizing existing operations. We also develop commercially successful ways to use renewable resources which deliver transformative advantages for our customers. As demand grows so does opportunity, something Enzen has seized on since its inception in 2006. The business has grown and developed across the globe, with a physical presence in the UK, India, Australia, USA, Spain, Turkey, Middle-East, Africa and Kazakhstan. As we push into the second decade of the company, we're super-charging our growth by delivering exceptional value and results to our customers. For people with the right mindset, the opportunity to develop and grow in the organization has never been greater.
Our expanding solutions, services and geographies mean we're always on the lookout for individuals who can drive positive change and are hungry for the success and rewards that go with it. Job Description: Strong programming and scripting skills in SQL and Python. Experience with data pipeline tools (e.g., Apache Airflow, Azure Data Factory, AWS Glue). Hands-on with cloud-based data platforms such as Azure, AWS. Familiarity with data modeling and warehousing concepts (e.g., star schema, snowflake).
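The star schema the description mentions can be illustrated with an in-memory SQLite database: one fact table keyed to a dimension, queried with the typical join-and-aggregate pattern. Table and column names are hypothetical:

```python
# A toy star schema: fact_sales joined to dim_customer, aggregated by a
# dimension attribute. Names and data are illustrative only.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES dim_customer(customer_id),
        amount REAL
    );
    INSERT INTO dim_customer VALUES (1, 'APAC'), (2, 'EMEA');
    INSERT INTO fact_sales VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")

# The canonical star-schema query: aggregate facts grouped by a dimension.
rows = con.execute("""
    SELECT c.region, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer c USING (customer_id)
    GROUP BY c.region ORDER BY c.region
""").fetchall()
print(rows)  # [('APAC', 150.0), ('EMEA', 75.0)]
```

A snowflake schema differs only in that the dimensions themselves are further normalised into sub-dimension tables.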

Posted 3 days ago

Apply

3.0 - 7.0 years

9 - 14 Lacs

Pune

Work from Office

Naukri logo

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist. In this role you will be responsible for the below: The Senior Data Engineer will be responsible for designing, building, and managing the data infrastructure and data pipeline processes for the bank. This role involves leading a team of data engineers, working closely with data scientists, analysts, and IT professionals to ensure data is accessible, reliable, and secure. The ideal candidate will have a strong background in data engineering, excellent leadership skills, and a thorough understanding of the banking industry's data requirements. Leadership and Team Management: Lead, mentor, and develop a team of data engineers. Establish best practices for data engineering and ensure team adherence. Coordinate with other IT teams, business units, and stakeholders. Data Pipelines Integration and Management: Design and implement scalable data architectures to support the bank's data needs. Develop and maintain ETL (Extract, Transform, Load) processes. Ensure the data infrastructure is reliable, scalable, and secure. Oversee the integration of diverse data sources into a cohesive data platform. Ensure data quality, data governance, and compliance with regulatory requirements. Develop and enforce data security policies and procedures.
  • Monitor and optimize data pipeline performance.
  • Troubleshoot and resolve data-related issues promptly.
  • Implement monitoring and alerting systems for data processes.

Requirements

To be successful in this role, you should meet the following (must-have) requirements:
  • 8+ years of experience in data engineering or a related field.
  • Strong experience with database technologies (SQL, NoSQL), data warehousing solutions, and big data technologies (Hadoop, Spark).
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Experience with cloud platforms (AWS, Azure, Google Cloud) and their data services.
  • Deep understanding of ETL processes and data pipeline orchestration tools (Airflow, Apache NiFi).
  • Knowledge of data modeling, data warehousing concepts, and data integration techniques.
  • Strong problem-solving skills and the ability to work under pressure.
  • Excellent communication and interpersonal skills.
  • Experience in the banking or financial services industry.
  • Familiarity with regulatory requirements related to data security and privacy in the banking sector.
  • Certifications in cloud platforms (AWS Certified Data Analytics, Google Professional Data Engineer, etc.).
  • Experience with machine learning and data science frameworks.

Location: Pune and Bangalore
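The ETL responsibilities described above can be sketched in miniature. The snippet below is a hypothetical extract-transform-load pipeline in plain Python; no specific orchestration framework such as Airflow or NiFi is assumed, and the sample records, field names, and data-quality rules are invented for illustration:

```python
# Minimal, hypothetical ETL pipeline: extract raw records, transform
# them (fix types, apply quality rules, normalise values), and load
# the result into an in-memory "warehouse" standing in for a real table.

def extract():
    # In a real pipeline this would read from a database, API, or file.
    return [
        {"account": "A-001", "balance": "1500.50", "currency": "inr"},
        {"account": "A-002", "balance": "-20.00", "currency": "inr"},
        {"account": "A-003", "balance": "990.00", "currency": "usd"},
    ]

def transform(records):
    # Clean types, normalise currency codes, and drop invalid rows.
    out = []
    for rec in records:
        balance = float(rec["balance"])
        if balance < 0:  # data-quality rule: reject negative balances
            continue
        out.append({
            "account": rec["account"],
            "balance": balance,
            "currency": rec["currency"].upper(),
        })
    return out

def load(records, warehouse):
    # In a real pipeline this would be an INSERT into a warehouse table.
    warehouse.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # 2 rows survive the quality checks
```

A production pipeline adds the things the job description calls out around this skeleton: scheduling, retries, monitoring, and alerting when a load fails or row counts drift.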

Posted 3 days ago

Apply

5.0 - 10.0 years

7 Lacs

Hyderabad

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Apache JMeter
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address various business needs and ensuring seamless application functionality.

Roles & Responsibilities:
  • Expected to be an SME
  • Collaborate with and manage the team to perform
  • Responsible for team decisions
  • Engage with multiple teams and contribute to key decisions
  • Provide solutions to problems for their immediate team and across multiple teams
  • Lead the application development process
  • Conduct code reviews and ensure coding standards are met
  • Implement best practices for application design and development

Professional & Technical Skills:
  • Must Have Skills: Proficiency in Apache JMeter
  • Strong understanding of performance testing methodologies
  • Experience in load testing and stress testing
  • Knowledge of scripting languages for test automation
  • Familiarity with performance monitoring tools

Additional Information:
  • The candidate should have a minimum of 5 years of experience in Apache JMeter
  • This position is based at our Hyderabad office
  • A 15 years full-time education is required

Qualification: 15 years full time education
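JMeter itself is a Java tool driven by test plans, but the core idea of the load testing mentioned above is simple: many concurrent workers issue requests while latencies are recorded, and the results are summarised as percentiles. The sketch below illustrates that idea in Python; the simulated request function, worker counts, and percentile choice are all invented for illustration, not a substitute for JMeter:

```python
import threading
import time
import statistics

def simulated_request():
    # Stand-in for an HTTP call to the system under test.
    time.sleep(0.01)

def worker(n_requests, latencies, lock):
    # Each worker thread issues requests in a loop and records latency.
    for _ in range(n_requests):
        start = time.perf_counter()
        simulated_request()
        elapsed = time.perf_counter() - start
        with lock:
            latencies.append(elapsed)

def run_load_test(n_workers=5, requests_per_worker=10):
    latencies, lock = [], threading.Lock()
    threads = [
        threading.Thread(target=worker,
                         args=(requests_per_worker, latencies, lock))
        for _ in range(n_workers)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    latencies.sort()
    return {
        "samples": len(latencies),
        "median_s": statistics.median(latencies),
        "p95_s": latencies[int(0.95 * len(latencies)) - 1],
    }

report = run_load_test()
print(report["samples"])  # 50 samples collected across all workers
```

This mirrors what a JMeter thread group plus a summary listener does at much larger scale: the "workers" correspond to JMeter threads, and the percentile summary to its aggregate report.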

Posted 3 days ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Gurugram

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Spring Boot
Good to have skills: Apache Spark
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning to align application development with business objectives, ensuring that the solutions provided are effective and efficient. Your role will require you to stay updated with industry trends and best practices to enhance the overall performance of the applications being developed.

Roles & Responsibilities:
  • Expected to be an SME.
  • Collaborate with and manage the team to perform.
  • Responsible for team decisions.
  • Engage with multiple teams and contribute to key decisions.
  • Provide solutions to problems for their immediate team and across multiple teams.
  • Facilitate knowledge sharing sessions to enhance team capabilities.
  • Monitor project progress and ensure adherence to timelines and quality standards.

Professional & Technical Skills:
  • Must Have Skills: Proficiency in Spring Boot.
  • Good To Have Skills: Experience with Apache Spark.
  • Strong understanding of microservices architecture and RESTful APIs.
  • Experience with cloud platforms such as AWS or Azure.
  • Familiarity with containerization technologies like Docker and Kubernetes.
  • The candidate should have a minimum of 5 years of experience in Spring Boot.

Additional Information:
  • This position is based at our Gurugram office.
  • A 15 years full time education is required.

Qualification: 15 years full time education

Posted 3 days ago

Apply

Exploring Apache Jobs in India

The Apache Software Foundation develops and maintains a wide range of open-source software projects. In India, demand for professionals with expertise in Apache tools and technologies is on the rise, and job seekers pursuing Apache-related roles have opportunities across many industries. Let's delve into the Apache job market in India to gain a better understanding of the landscape.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their thriving IT sectors and see a high demand for Apache professionals across different organizations.

Average Salary Range

The salary range for Apache professionals in India varies based on experience and skill level:

  • Entry-level: INR 3-5 lakhs per annum
  • Mid-level: INR 6-10 lakhs per annum
  • Experienced: INR 12-20 lakhs per annum

Career Path

In the Apache job market in India, a typical career path may progress as follows:

  1. Junior Developer
  2. Developer
  3. Senior Developer
  4. Tech Lead
  5. Architect

Related Skills

Besides expertise in Apache tools and technologies, professionals in this field are often expected to have skills in:

  • Linux
  • Networking
  • Database Management
  • Cloud Computing

Interview Questions

  • What is Apache HTTP Server and how does it differ from Apache Tomcat? (medium)
  • Explain the difference between Apache Hadoop and Apache Spark. (medium)
  • What is mod_rewrite in Apache and how is it used? (medium)
  • How do you troubleshoot common Apache server errors? (medium)
  • What is the purpose of .htaccess file in Apache? (basic)
  • Explain the role of Apache Kafka in real-time data processing. (medium)
  • How do you secure an Apache web server? (medium)
  • What is the significance of Apache Maven in software development? (basic)
  • Explain the concept of virtual hosts in Apache. (basic)
  • How do you optimize Apache web server performance? (medium)
  • Describe the functionality of Apache Solr. (medium)
  • What is the purpose of Apache Camel? (medium)
  • How do you monitor Apache server logs? (medium)
  • Explain the role of Apache ZooKeeper in distributed applications. (advanced)
  • How do you configure SSL/TLS on an Apache web server? (medium)
  • Discuss the advantages of using Apache Cassandra for data management. (medium)
  • What is the Apache Lucene library used for? (basic)
  • How do you handle high traffic on an Apache server? (medium)
  • Explain the concept of .htpasswd in Apache. (basic)
  • What is the role of Apache Thrift in software development? (advanced)
  • How do you troubleshoot Apache server performance issues? (medium)
  • Discuss the importance of Apache Flume in data ingestion. (medium)
  • What is the significance of Apache Storm in real-time data processing? (medium)
  • How do you deploy applications on Apache Tomcat? (medium)
  • Explain the concept of .htaccess directives in Apache. (basic)
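Several of the questions above (monitoring Apache server logs, troubleshooting server errors, handling high traffic) come down to reading the access log. As interview preparation, here is a small Python sketch that parses Apache's Common Log Format and tallies HTTP status codes; the sample log lines are invented for illustration:

```python
import re
from collections import Counter

# Regex for Apache's Common Log Format (CLF):
#   host ident user [timestamp] "request" status bytes
CLF = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\S+)'
)

def status_counts(lines):
    """Count HTTP status codes across access-log lines."""
    counts = Counter()
    for line in lines:
        m = CLF.match(line)
        if m:
            counts[m.group("status")] += 1
    return counts

# Hypothetical sample log lines for illustration.
sample = [
    '127.0.0.1 - - [10/Oct/2024:13:55:36 +0530] "GET /index.html HTTP/1.1" 200 2326',
    '127.0.0.1 - - [10/Oct/2024:13:55:37 +0530] "GET /missing HTTP/1.1" 404 209',
    '10.0.0.5 - - [10/Oct/2024:13:55:38 +0530] "POST /login HTTP/1.1" 200 512',
]

print(status_counts(sample))  # Counter({'200': 2, '404': 1})
```

In practice the same idea scales up: a spike in 5xx counts in the access log is often the first signal when troubleshooting errors or capacity problems on a busy Apache server.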

Conclusion

As you embark on your journey to explore Apache jobs in India, it is essential to stay updated on the latest trends and technologies in the field. By honing your skills and preparing thoroughly for interviews, you can position yourself as a competitive candidate in the Apache job market. Stay motivated, keep learning, and pursue your dream career with confidence!


Featured Companies