
5940 Apache Jobs - Page 6

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

18.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Role: Enterprise Architect
Grade: VP
Location: Pune / Mumbai / Chennai
Experience: 18+ years
Organization: Intellect Design Arena Ltd. (www.intellectdesign.com)

About the Role: We are looking for a senior Enterprise Architect with strong leadership and deep technical expertise to define and evolve the architecture strategy for iGTB, our award-winning transaction banking platform. The ideal candidate will have extensive experience architecting large-scale, cloud-native enterprise applications within the BFSI domain, and will be responsible for driving innovation, ensuring engineering excellence, and aligning architecture with evolving business needs.

Mandatory Skills:
- Cloud-native architecture
- Microservices-based systems
- PostgreSQL, Apache Kafka, ActiveMQ
- Spring Boot / Spring Cloud, Angular
- Strong exposure to the BFSI domain

Key Responsibilities:
- Architectural Strategy & Governance: Define and maintain enterprise architecture standards and principles across the iGTB product suites. Set up governance structures to ensure compliance across product lines.
- Technology Leadership: Stay updated on emerging technologies; assess and recommend adoption to improve scalability, security, and performance.
- Tooling & Automation: Evaluate and implement tools to improve developer productivity, code quality, and application reliability, including automation across testing, deployment, and monitoring.
- Architecture Evangelism: Drive adoption of architecture guidelines and tools across engineering teams through mentorship, training, and collaboration.
- Solution Oversight: Participate in the design of individual modules to ensure technical robustness and adherence to enterprise standards.
- Performance & Security: Oversee performance benchmarking and security assessments; engage with third-party labs for certification as needed.
- Customer Engagement: Represent architecture in pre-sales, CXO-level interactions, and post-production engagements to demonstrate the product's technical superiority.
- Troubleshooting & Continuous Improvement: Support teams in resolving complex technical issues; capture learnings and feed them back into architectural best practices.
- Automation Vision: Lead the end-to-end automation charter for iGTB across code quality, CI/CD, testing, monitoring, and release management.

Profile Requirements:
- 18+ years of experience in enterprise and solution architecture roles, preferably within BFSI or fintech
- Proven experience with mission-critical, scalable, and secure systems
- Strong communication and stakeholder management skills, including CXO interactions
- Demonstrated leadership in architecting complex enterprise products and managing teams of architects
- Ability to blend technical depth with business context to drive decisions
- Passion for innovation, engineering excellence, and architectural rigor

Posted 1 day ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Bengaluru

Work from Office

Source: Naukri

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: NA
Minimum Experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable enough to meet the demands of the organization.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform.
- Experience with data pipeline orchestration tools such as Apache Airflow or similar.
- Strong understanding of ETL processes and data warehousing concepts.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
- Knowledge of programming languages such as Python or Scala for data manipulation.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
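The ETL duties this listing describes (extract, transform, load, plus a data-quality gate) follow one well-known pattern. Below is a minimal stdlib-Python sketch of that pattern, not Databricks code: in practice the same steps would run as Spark jobs orchestrated by a tool like Airflow, and the column names and quality rule here are invented for illustration.

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalize types and drop incomplete records (a basic data-quality gate)."""
    clean = []
    for r in rows:
        if not r.get("amount"):
            continue  # quality rule (invented for the example): skip rows missing a required field
        clean.append({"city": r["city"].strip().title(), "amount": float(r["amount"])})
    return clean

def load(rows, conn):
    """Load: write transformed rows into the target store and report the row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (city TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (:city, :amount)", rows)
    return conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]

raw = "city,amount\nbengaluru,100\nchennai,\npune,250"
loaded = load(transform(extract(raw)), sqlite3.connect(":memory:"))
print(loaded)  # 2 -- the row with a missing amount is dropped by the quality gate
```

The same extract/transform/load split is what a Databricks notebook or Airflow DAG would decompose into separate tasks.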

Posted 1 day ago

Apply


15.0 - 20.0 years

17 - 22 Lacs

Chennai

Work from Office

Source: Naukri

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-Have Skills: PySpark
Good-to-Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable enough to meet the demands of the organization.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must have: proficiency in PySpark.
- Good to have: experience with Apache Kafka.
- Strong understanding of data warehousing concepts and architecture.
- Familiarity with cloud platforms such as AWS or Azure.
- Experience with SQL and NoSQL databases for data storage and retrieval.

Additional Information:
- The candidate should have a minimum of 5 years of experience in PySpark.
- This position is based in Chennai.
- 15 years of full-time education is required.
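The PySpark proficiency this listing asks for centers on distributed key-value transforms such as `reduceByKey`. Below is a plain-Python sketch of that pattern so it stays self-contained; in an actual PySpark job the equivalent would be `rdd.reduceByKey(lambda a, b: a + b)` or a DataFrame `groupBy`/`agg`, and the sample pairs here are invented.

```python
from collections import defaultdict
from functools import reduce

# Records as (key, value) pairs -- the shape a PySpark job shuffles between stages.
events = [("chennai", 3), ("pune", 5), ("chennai", 4), ("mumbai", 2)]

def reduce_by_key(pairs, fn):
    """Group values by key, then fold each group -- the reduceByKey pattern with stdlib primitives."""
    groups = defaultdict(list)
    for k, v in pairs:
        groups[k].append(v)
    return {k: reduce(fn, vs) for k, vs in groups.items()}

totals = reduce_by_key(events, lambda a, b: a + b)
print(totals["chennai"])  # 7
```

The point of the pattern is that the fold function is associative, which is what lets Spark apply it partially on each partition before the shuffle.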

Posted 1 day ago

Apply


15.0 - 20.0 years

17 - 22 Lacs

Pune

Work from Office

Source: Naukri

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: NA
Minimum Experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable enough to meet the demands of the organization.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform.
- Experience with data pipeline orchestration tools such as Apache Airflow or similar.
- Strong understanding of ETL processes and data warehousing concepts.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
- Knowledge of programming languages such as Python or Scala for data manipulation.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Posted 1 day ago

Apply


15.0 - 20.0 years

17 - 22 Lacs

Pune

Work from Office

Source: Naukri

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-Have Skills: PySpark
Good-to-Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide innovative solutions to enhance data accessibility and usability.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to ensure efficiency and effectiveness.

Professional & Technical Skills:
- Must have: proficiency in PySpark.
- Good to have: experience with Apache Kafka, Apache Airflow, and cloud platforms such as AWS or Azure.
- Strong understanding of data modeling and database design principles.
- Experience with SQL and NoSQL databases for data storage and retrieval.
- Familiarity with data warehousing concepts and tools.

Additional Information:
- The candidate should have a minimum of 5 years of experience in PySpark.
- This position is based in Pune.
- 15 years of full-time education is required.

Posted 1 day ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Hyderabad

Work from Office

Source: Naukri

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: NA
Minimum Experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable enough to meet the demands of the organization.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform.
- Experience with data pipeline orchestration tools such as Apache Airflow or similar.
- Strong understanding of ETL processes and data warehousing concepts.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
- Knowledge of programming languages such as Python or Scala for data manipulation.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 1 day ago

Apply


5.0 - 10.0 years

7 Lacs

Hyderabad

Work from Office

Source: Naukri

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Apache JMeter
Good-to-Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address various business needs and ensuring seamless application functionality.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the application development process.
- Conduct code reviews and ensure coding standards are met.
- Implement best practices for application design and development.

Professional & Technical Skills:
- Must have: proficiency in Apache JMeter.
- Strong understanding of performance testing methodologies.
- Experience in load testing and stress testing.
- Knowledge of scripting languages for test automation.
- Familiarity with performance monitoring tools.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Apache JMeter.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
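The load and stress testing this listing pairs with JMeter comes down to driving concurrent virtual users against a target and collecting per-request latencies (what JMeter models as a thread group plus listeners). A stdlib-Python sketch of that loop; `target_operation` is a stand-in assumption for a real HTTP call, and the user/request counts are arbitrary.

```python
import statistics
import threading
import time

def target_operation():
    """Stand-in for the system under test (a real plan would issue an HTTP request)."""
    time.sleep(0.01)  # simulate ~10 ms of service time

def run_load_test(users, requests_per_user):
    """Spawn concurrent 'users' (JMeter's thread group) and record each request's latency."""
    latencies = []
    lock = threading.Lock()

    def user():
        for _ in range(requests_per_user):
            start = time.perf_counter()
            target_operation()
            elapsed = time.perf_counter() - start
            with lock:
                latencies.append(elapsed)

    threads = [threading.Thread(target=user) for _ in range(users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return latencies

lat = run_load_test(users=5, requests_per_user=4)
print(len(lat), round(statistics.median(lat), 3))  # 20 samples and their median latency
```

From the raw latency list, the usual stress-test outputs (median, p95, throughput) are simple aggregations, which is essentially what JMeter's summary listeners compute.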

Posted 1 day ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Gurugram

Work from Office

Source: Naukri

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: Spring Boot
Good-to-Have Skills: Apache Spark
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning to align application development with business objectives, ensuring that the solutions provided are effective and efficient. Your role will require you to stay updated on industry trends and best practices to enhance the overall performance of the applications being developed.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure adherence to timelines and quality standards.

Professional & Technical Skills:
- Must have: proficiency in Spring Boot.
- Good to have: experience with Apache Spark.
- Strong understanding of microservices architecture and RESTful APIs.
- Experience with cloud platforms such as AWS or Azure.
- Familiarity with containerization technologies like Docker and Kubernetes.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Spring Boot.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.

Posted 1 day ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

Remote

Source: LinkedIn

Job Title: Senior Java Developer (Remote)
Experience: 6 to 8 years
Location: Remote
Employment Type: Full-Time
Notice Period: Immediate joiner

Job Summary: We are looking for a highly skilled and experienced Senior Java Developer to join our distributed team. The ideal candidate should have a strong background in developing scalable, enterprise-grade applications using Java and related technologies, with exposure to full-stack development, system integration, and performance optimization.

Key Responsibilities:
- Design and develop high-performance, scalable, and reusable Java-based applications.
- Build RESTful APIs with a strong understanding of RESTful architecture.
- Implement enterprise integration patterns using Apache Camel or Spring Integration.
- Ensure application security in compliance with OWASP guidelines.
- Write and maintain unit, integration, and BDD tests using JUnit, Cucumber, and Selenium.
- Conduct performance and load testing; optimize through memory and thread dump analysis.
- Collaborate with product owners, QA teams, and other developers in Agile/Scrum environments.
- Participate in code reviews, architecture discussions, and mentoring of junior developers.

Technical Skills & Experience Required:

Core Backend:
- Strong proficiency in Java (8 or higher)
- Proficient in Spring Boot, Spring Security, Spring MVC, and Spring Data
- Solid experience with REST API design, implementation, and testing using Postman and SoapUI
- Unit testing, integration testing, and BDD testing

Web Services and Integration:
- Experience with XML, web services (RESTful and SOAP), and Apache CXF
- Knowledge of enterprise integration patterns
- Exposure to Apache Camel or Spring Integration

Frontend & Full Stack:
- Familiarity with HTML5 and CSS3
- Experience with TypeScript, JavaScript, jQuery, and Node.js
- Working knowledge of Webpack and Gulp

Database & Data Streaming:
- Strong in RDBMS and database design (e.g., Oracle, PL/SQL)
- Exposure to MongoDB and NoSQL
- Understanding of Kafka architecture and Kafka as a data-streaming platform

Performance & Security:
- Experience in performance analysis and application tuning
- Understanding of security aspects and OWASP guidelines
- Experience with memory and thread dump analysis

Cloud & DevOps:
- Working knowledge of Kubernetes
- Familiarity with Elastic solutions at the enterprise level
- Experience with identity and access management tools such as ForgeRock

About IGT Solutions: IGT Solutions is a next-gen customer experience (CX) company, defining and delivering transformative experiences for global and innovative brands using digital technologies. By combining digital and human intelligence, IGT has become a preferred partner for managing end-to-end CX journeys across the travel and high-growth tech industries. We have a global delivery footprint spread across 30 delivery centers in China, Colombia, Egypt, India, Indonesia, Malaysia, the Philippines, Romania, South Africa, Spain, the UAE, the US, and Vietnam, with 25,000+ CX and technology experts from 35+ nationalities.

IGT's Digital team collaborates closely with our customers' business and technology teams to take solutions to market faster while sustaining quality, focusing on business value and improving the overall end-customer experience. Our offerings include industry solutions as well as digital services. We work with leading global enterprise customers to improve synergies between business and technology by enabling rapid business-value realization through digital technologies. These include lifecycle transformation and rapid development / technology solution delivery services, delivered using traditional as well as digital technologies, deep functional understanding, and software engineering expertise. IGT is ISO 27001:2013, CMMI SVC Level 5, and ISAE 3402 compliant for IT, and COPC® Certified v6.0, ISO 27001:2013, and PCI DSS 3.2 certified for BPO processes. The organization follows Six Sigma rigor for process improvements. It is our policy to provide equal employment opportunities to all individuals based on job-related qualifications and the ability to perform a job, without regard to age, gender, gender identity, sexual orientation, race, color, religion, creed, national origin, disability, genetic information, veteran status, citizenship, or marital status, and to maintain a non-discriminatory environment free from intimidation, harassment, or bias based upon these grounds.
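Thread dump analysis, listed above under performance skills, means snapshotting what every live thread is doing at a moment in time; on the JVM this is what `jstack` produces. A rough Python analogue of the idea (using the interpreter's `sys._current_frames`, an implementation-specific API; the "stuck" worker thread is invented for the example):

```python
import sys
import threading
import time
import traceback

def capture_thread_dump():
    """Snapshot every live thread's stack -- the same information a JVM thread dump gives."""
    names = {t.ident: t.name for t in threading.enumerate()}
    dump = {}
    for ident, frame in sys._current_frames().items():
        stack = "".join(traceback.format_stack(frame))
        dump[names.get(ident, str(ident))] = stack
    return dump

def worker(stop):
    while not stop.is_set():
        time.sleep(0.01)  # a thread parked in a sleep/wait, visible in the dump

stop = threading.Event()
t = threading.Thread(target=worker, args=(stop,), name="worker-1")
t.start()
time.sleep(0.05)           # let the worker reach its wait
dump = capture_thread_dump()
stop.set()
t.join()
print("worker-1" in dump)  # True
```

Reading such a dump for threads blocked on the same lock or stuck in I/O is the core of diagnosing deadlocks and latency stalls.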

Posted 1 day ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Kochi

Work from Office

Source: Naukri

As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS.
- Develop efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies built on the platform.
- Develop streaming pipelines.
- Work with Hadoop / AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on cloud data platforms on AWS
- Experience with AWS EMR / AWS Glue / Databricks, AWS Redshift, and DynamoDB
- Good to excellent SQL skills
- Exposure to streaming solutions and message brokers such as Kafka

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Certified Spark Developer
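The streaming pipelines mentioned in the responsibilities are commonly built as micro-batches over a continuous source, which is the model Spark Structured Streaming uses. A minimal stdlib sketch under that assumption; the source records and batch size here are invented, and a real pipeline would read from a Kafka topic rather than a list.

```python
from itertools import islice

def stream(records):
    """Stand-in source; in production this would be a Kafka topic or file stream."""
    yield from records

def micro_batches(source, size):
    """Slice a continuous stream into fixed-size micro-batches."""
    it = iter(source)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

def process(batch):
    """Per-batch transform: filter bad records, then aggregate."""
    good = [r for r in batch if r >= 0]  # invented quality rule: drop negative readings
    return sum(good)

totals = [process(b) for b in micro_batches(stream([1, 2, -1, 3, 4, 5, 6]), size=3)]
print(totals)  # [3, 12, 6]
```

Keeping the per-batch transform a pure function of its input is what makes such a pipeline easy to retry and to test, in Spark as in this sketch.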

Posted 1 day ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Mumbai

Work from Office


Project Role: Application Developer. Project Role Description: Design, build, and configure applications to meet business process and application requirements. Must-have skills: Apache Spark. Good-to-have skills: Java Enterprise Edition. Minimum 5 years of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of the projects you are involved in, ensuring that the applications you develop are efficient and effective in meeting user needs.

Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for the immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge. - Continuously evaluate and improve application performance and user experience.

Professional & Technical Skills: - Must-have skills: Proficiency in Apache Spark. - Good-to-have skills: Experience with Java Enterprise Edition. - Strong understanding of distributed computing principles. - Experience with data processing frameworks and tools. - Familiarity with cloud platforms and services.

Additional Information: - The candidate should have a minimum of 5 years of experience in Apache Spark. - This position is based in Mumbai. - 15 years of full-time education is required.

Posted 1 day ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Kochi

Work from Office


Develop user-friendly web applications using Java and React.js while ensuring high performance. Design, develop, test, and deploy robust and scalable applications. Build and consume RESTful APIs. Collaborate with the design and development teams to translate UI/UX design wireframes into functional components. Optimize applications for maximum speed and scalability. Stay up to date with the latest Java and React.js trends, techniques, and best practices. Participate in code reviews to maintain code quality and ensure alignment with coding standards. Identify and address performance bottlenecks and other issues as they arise. Help us shape the future of event-driven technologies, including contributing to Apache Kafka, Strimzi, Apache Flink, Vert.x, and other relevant open-source projects. Collaborate within a dynamic team environment to comprehend and dissect intricate requirements for event processing solutions. Translate architectural blueprints into actualized code, employing your technical expertise to implement innovative and effective solutions. Conduct comprehensive testing of the developed solutions, ensuring their reliability, efficiency, and seamless integration. Provide ongoing support for the implemented applications, responding promptly to customer inquiries, resolving issues, and optimizing performance. Serve as a subject matter expert, sharing insights and best practices related to product development and fostering knowledge sharing within the team. Continuously monitor the evolving landscape of event-driven technologies, remaining updated on the latest trends and advancements. Collaborate closely with cross-functional teams, including product managers, designers, and developers, to ensure a holistic and harmonious product development process. Take ownership of technical challenges and lead your team to ensure successful delivery, using your problem-solving skills to overcome obstacles. Mentor and guide junior developers, nurturing their growth and development by providing guidance, knowledge transfer, and hands-on training. Engage in agile practices, contributing to backlog grooming, sprint planning, stand-ups, and retrospectives to facilitate effective project management and iteration. Foster a culture of innovation and collaboration, contributing to brainstorming sessions and offering creative ideas to push the boundaries of event processing solutions. Maintain documentation for the developed solutions, ensuring comprehensive and up-to-date records for future reference and knowledge sharing. Be involved in building and orchestrating containerized services.

Required education: Bachelor's degree. Preferred education: Bachelor's degree.

Required technical and professional expertise: Proven 5+ years of experience as a full-stack developer (Java and React.js) with a strong portfolio of previous projects. Proficiency in Java, JavaScript, HTML, CSS, and related web technologies. Familiarity with RESTful APIs and their integration into applications. Knowledge of modern CI/CD pipelines and tools like Jenkins and Travis. Strong understanding of version control systems, particularly Git. Good communication skills and the ability to articulate technical concepts to both technical and non-technical team members. Familiarity with containerization and orchestration technologies like Docker and Kubernetes for deploying event processing applications. Proficiency in troubleshooting and debugging. Exceptional problem-solving and analytical abilities, with a knack for addressing technical challenges. Ability to work collaboratively in an agile and fast-paced development environment. Leadership skills to guide and mentor junior developers, fostering their growth and skill development. Strong organizational and time management skills to manage multiple tasks and priorities effectively. Adaptability to stay current with evolving event-driven technologies and industry trends. Customer-focused mindset, with a dedication to delivering solutions that meet or exceed customer expectations. Creative thinking and an innovation mindset to drive continuous improvement and explore new possibilities. Collaborative and team-oriented approach to work, valuing open communication and diverse perspectives. Preferred technical and professional experience
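Several of the responsibilities above center on event processing. As a minimal, library-free sketch (illustrative only, not the Kafka API), the core producer/consumer pattern can be shown with an in-process queue standing in for a topic:

```python
import queue
import threading

events = queue.Queue()  # stands in for a Kafka topic partition
processed = []

def consumer():
    # Blocking consumer loop, as a Kafka poll loop would be.
    while True:
        msg = events.get()
        if msg is None:          # sentinel: no more events
            break
        processed.append(msg.upper())  # placeholder "processing" step

t = threading.Thread(target=consumer)
t.start()
for m in ["order.created", "order.paid"]:
    events.put(m)                # producer side
events.put(None)
t.join()
print(processed)  # ['ORDER.CREATED', 'ORDER.PAID']
```

The design point the posting is after is the decoupling: producers never call consumers directly, so either side can be scaled or replaced independently.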

Posted 1 day ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


We are seeking a talented Full Stack Developer experienced in Java, Kotlin, Spring Boot, Angular, and Apache Kafka to join our dynamic engineering team. The ideal candidate will design, develop, and maintain end-to-end web applications and real-time data processing solutions, leveraging modern frameworks and event-driven architectures. Location: Noida/Pune/Bangalore/Hyderabad/Chennai. Timings: 2pm to 11pm. Experience: 4-6 years.

Key Responsibilities: Design, develop, and maintain scalable web applications using Java, Kotlin, Spring Boot, and Angular. Build and integrate RESTful APIs and microservices to connect frontend and backend components. Develop and maintain real-time data pipelines and event-driven features using Apache Kafka. Collaborate with cross-functional teams (UI/UX, QA, DevOps, Product) to define, design, and deliver new features. Write clean, efficient, and well-documented code following industry best practices and coding standards. Participate in code reviews, provide constructive feedback, and ensure code quality and consistency. Troubleshoot and resolve application issues, bugs, and performance bottlenecks in a timely manner. Optimize applications for maximum speed, scalability, and security. Stay updated with the latest industry trends, tools, and technologies, and proactively suggest improvements. Participate in Agile/Scrum ceremonies and contribute to continuous integration and delivery pipelines.

Required Qualifications: Experience with cloud-based technologies and deployment (Azure, GCP). Familiarity with containerization (Docker, Kubernetes) and microservices architecture. Proven experience as a Full Stack Developer with hands-on expertise in Java, Kotlin, Spring Boot, and Angular (Angular 2+). Strong understanding of object-oriented and functional programming principles. Experience designing and implementing RESTful APIs and integrating them with frontend applications. Proficiency in building event-driven and streaming applications using Apache Kafka. Experience with database systems (SQL/NoSQL) and ORM frameworks (e.g., Hibernate, JPA), with strong SQL skills. Familiarity with version control systems (Git) and CI/CD pipelines. Good understanding of HTML5, CSS3, JavaScript, and TypeScript. Experience with Agile development methodologies and working collaboratively in a team environment. Excellent problem-solving, analytical, and communication skills.
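The REST work described above reduces to serving structured JSON over HTTP. The role itself targets Spring Boot; this dependency-free Python sketch only illustrates the request/response shape, with a hypothetical `/health` route:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # silence per-request logging for the demo

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0: pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/health"
with urllib.request.urlopen(url) as resp:
    status, body = resp.status, json.load(resp)
print(status, body)  # 200 {'status': 'ok'}
server.shutdown()
```

A Spring Boot `@GetMapping` handler returning a DTO would produce the same wire format; the frontend only ever sees the JSON contract.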

Posted 1 day ago

Apply

14.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


General Information Job Role: Lead DevOps Engineer Functional Area: DevOps Job Location: Pan India Job Shift: General Indian/ UK Shift Education: B.Sc./ B.Tech/ B.E / MTech in Any Specialization Employment Type Full Time, Permanent About Unified Infotech Embark on a transformative journey with Unified Infotech, a beacon of innovation and excellence in the tech consulting and software development landscape for over 14 years. We are dedicated to designing custom, forward-thinking web, mobile, and software solutions for a diverse clientele, from burgeoning MSMEs to towering Enterprises. Our mission is to engineer products that not only solve complex challenges but also set new benchmarks in the digital realm. At Unified, a job is not simply a job. It is a pursuit of excellence, to build and create, to understand and consult, to imagine and be creative, to reformulate UX, to invent and redefine, to code for performance, to collaborate and communicate. Role Description We are seeking a highly skilled and motivated DevOps Lead with expertise in both AWS and Azure cloud platforms to join our dynamic team. The successful candidate will collaborate with solution architects, developers, project managers, customer technical teams, and internal stakeholders to drive results. Your primary focus will be ensuring seamless customer access to applications in the cloud, managing customer workload migrations, implementing robust backup policies, overseeing hybrid cloud deployments, and building solutions for service assurance with a strong emphasis on leveraging Azure's unique capabilities. Desired Experience Define architecture, design, implement, program manage, and lead technology teams in delivering complex technical solutions for our clients across both AWS and Azure platforms. Span across DevOps, Continuous Integration (CI), and Continuous Delivery (CD) areas, providing demonstrable implementation experience in shaping value-add consulting solutions. 
Deploy, automate, and maintain cloud infrastructure with leading public cloud vendors such as Amazon Web Services (AWS) and Microsoft Azure, with a keen focus on integrating Azure-specific services and tools. Set up backups, replication, and archiving, and implement disaster recovery measures leveraging Azure's resilience and geo-redundancy features. Utilize Azure DevOps services for better collaboration, reporting, and increased automation in the CI/CD pipelines.

Job Requirements: Detail-oriented with a holistic perspective on system architecture, including at least 1 year of hands-on experience with Azure cloud services. Strong shell scripting and Linux administration skills, with a deep understanding of Linux and virtualization. Expertise in server technologies like Apache, Nginx, and Node, including optimization experience. Knowledge of database technologies such as MySQL, Redis, and MongoDB, with proficiency in management, replication, and disaster recovery. Proven experience in medium to large-scale public cloud deployments on AWS and Azure, including the migration of complex, multi-tier applications to these platforms. In-depth working knowledge of AWS and Azure, showcasing the ability to leverage Azure-specific features such as Azure Active Directory, Azure Kubernetes Service (AKS), Azure Functions, and Azure Logic Apps. Familiarity with CI/CD, automation, and monitoring processes for production-level infrastructure, including the use of Azure Monitor, Azure Automation, and third-party tools. Practical experience in setting up full-stack monitoring solutions using Prometheus, Grafana, and Loki, including long-term storage, custom dashboard creation, alerting, and integration with Kubernetes clusters. Worked extensively with Azure Front Door, including custom routing, WAF policies, SSL/TLS certificate integration, and performance optimization for global traffic.
Experienced in multi-ingress-controller architecture setup and management, including namespace-specific ingress deployments. Hands-on experience in setting up, configuring, and managing Azure API Management (APIM). Deep understanding of system performance and the ability to analyze root causes using tools available in Azure. Experience with Azure-specific management and governance tools, such as Azure Policy, Azure Blueprints, and Azure Resource Manager (ARM) templates. Proficiency in CI/CD automation using tools like Jenkins, Travis CI, CircleCI, or Azure DevOps. Knowledge of security infrastructure and vulnerabilities, including Azure's security tools like Azure Security Center and Azure Sentinel. Capability to analyze costs for the entire infrastructure, including cost management and optimization in Azure environments. Hands-on experience with configuration management tools like Ansible, Puppet, Chef, or similar, with an emphasis on their integration in Azure environments. Experience with container orchestration tools such as Kubernetes, Docker Swarm, and Docker containers, with a preference for those proficient in Azure Kubernetes Service (AKS). Total experience: 6+ years. Cloud experience: AWS 3+ years, Azure 1+ years. Notice period: immediate to 30 days preferred.
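Much of the automation and incident-response work above involves making operations resilient to transient failures. A small, generic sketch of the exponential-backoff retry pattern such tooling typically relies on (the flaky operation here is invented for illustration):

```python
import time

def retry(op, attempts=4, base_delay=0.01):
    # Exponential backoff: wait 0.01s, 0.02s, 0.04s, ... between attempts.
    for i in range(attempts):
        try:
            return op()
        except Exception:
            if i == attempts - 1:
                raise  # out of attempts: surface the failure
            time.sleep(base_delay * (2 ** i))

calls = {"n": 0}

def flaky():
    # Hypothetical operation that fails twice, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

result = retry(flaky)
print(result, calls["n"])  # ok 3
```

Cloud SDKs and CI/CD tools bake this in, but the same shape is worth knowing when scripting against APIs directly in Bash or Python.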

Posted 1 day ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Provide support for data production systems in Nielsen Technology International Media (television and radio audience measurement), playing a critical role in ensuring their reliability, scalability, and security. Configure, implement, and deploy audience measurement solutions. Provide expert-level support, leading infrastructure automation initiatives, driving continuous improvement across our DevOps practices, and supporting Agile processes. Core Technologies: Linux, Airflow, Bash, CI/CD, AWS services (EC2, S3, RDS, EKS, VPC), PostgreSQL, Python, Kubernetes.

Responsibilities: Architect, manage, and optimize scalable and secure cloud infrastructure (AWS) using Infrastructure as Code (Terraform, CloudFormation, Ansible). Implement and maintain robust CI/CD pipelines to streamline software deployment and infrastructure changes. Identify and implement cost-optimization strategies for cloud resources. Ensure the smooth operation of production systems across 30+ countries, providing expert-level troubleshooting and incident response. Manage cloud-related migration changes and updates, supporting the secure implementation of changes and fixes. Participate in a 24/7 on-call rotation for emergency support.

Key Skills: Proficiency in Linux OS, particularly Fedora- and Debian-based distributions (AlmaLinux, Amazon Linux, Ubuntu). Strong proficiency in scripting languages (Bash, Python) and SQL. Well-versed in automation and DevOps principles, with an understanding of CI/CD concepts. Working knowledge of infrastructure-as-code tools like Terraform, CloudFormation, and Ansible. Solid experience with AWS core services (EC2, EKS, S3, RDS, VPC, IAM, Security Groups). Hands-on experience with Docker and Kubernetes for containerized workloads. Solid understanding of DevOps practices, including monitoring, security, and high-availability design. Hands-on experience with Apache Airflow for workflow automation and scheduling.
Strong troubleshooting skills, with experience in resolving issues and handling incidents in production environments. Foundational understanding of modern networking principles and cloud network architectures.
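Airflow, named in the skills above, models a workflow as a DAG of task dependencies and schedules tasks in dependency order. That ordering can be illustrated with the standard library alone (the task names are hypothetical):

```python
from graphlib import TopologicalSorter

# Dependencies in the Airflow sense: each task maps to its predecessors,
# i.e. extract >> transform >> load, with audit depending on extract only.
deps = {
    "transform": {"extract"},
    "load": {"transform"},
    "audit": {"extract"},
}

# A scheduler must run tasks in some topological order of this graph.
order = list(TopologicalSorter(deps).static_order())
print(order)  # extract first, load last; transform/audit in between
```

Airflow adds scheduling, retries, and operators on top, but the execution-order guarantee shown here is the core abstraction a DAG author reasons about.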

Posted 1 day ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site


Provide support for data production systems in Nielsen Technology International Media (television and radio audience measurement), playing a critical role in ensuring their reliability, scalability, and security. Configure, implement, and deploy audience measurement solutions. Provide expert-level support, leading infrastructure automation initiatives, driving continuous improvement across our DevOps practices, and supporting Agile processes. Core Technologies: Linux, Airflow, Bash, CI/CD, AWS services (EC2, S3, RDS, EKS, VPC), PostgreSQL, Python, Kubernetes.

Responsibilities: Architect, manage, and optimize scalable and secure cloud infrastructure (AWS) using Infrastructure as Code (Terraform, CloudFormation, Ansible). Implement and maintain robust CI/CD pipelines to streamline software deployment and infrastructure changes. Identify and implement cost-optimization strategies for cloud resources. Ensure the smooth operation of production systems across 30+ countries, providing expert-level troubleshooting and incident response. Manage cloud-related migration changes and updates, supporting the secure implementation of changes and fixes. Participate in a 24/7 on-call rotation for emergency support.

Key Skills: Proficiency in Linux OS, particularly Fedora- and Debian-based distributions (AlmaLinux, Amazon Linux, Ubuntu). Strong proficiency in scripting languages (Bash, Python) and SQL. Well-versed in automation and DevOps principles, with an understanding of CI/CD concepts. Working knowledge of infrastructure-as-code tools like Terraform, CloudFormation, and Ansible. Solid experience with AWS core services (EC2, EKS, S3, RDS, VPC, IAM, Security Groups). Hands-on experience with Docker and Kubernetes for containerized workloads. Solid understanding of DevOps practices, including monitoring, security, and high-availability design. Hands-on experience with Apache Airflow for workflow automation and scheduling.
Strong troubleshooting skills, with experience in resolving issues and handling incidents in production environments. Foundational understanding of modern networking principles and cloud network architectures.

Posted 1 day ago

Apply

7.0 years

0 Lacs

Udaipur, Rajasthan, India

On-site


Requirements: 7+ years of hands-on Python development experience. Proven experience designing and leading scalable backend systems. Expert knowledge of Python and at least one framework (e.g., Django, Flask). Familiarity with ORM libraries and server-side templating (Jinja2, Mako, etc.). Strong understanding of multi-threaded, multi-process, and event-driven programming. Proficient in user authentication, authorization, and security compliance. Skilled in frontend basics: JavaScript, HTML5, CSS3. Experience designing and implementing scalable backend architectures and microservices. Ability to integrate multiple databases, data sources, and third-party services. Proficient with version control systems (Git). Experience with deployment pipelines, server environment setup, and configuration. Ability to implement and configure queueing systems like RabbitMQ or Apache Kafka. Write clean, reusable, testable code with strong unit test coverage. Deep debugging skills and secure coding practices ensuring accessibility and data protection compliance. Optimize application performance for various platforms (web, mobile). Collaborate effectively with frontend developers, designers, and cross-functional teams. Lead deployment, configuration, and server environment efforts.
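The queueing and multi-threading requirements above meet in the competing-consumers pattern that brokers like RabbitMQ implement: several workers drain one task queue. A self-contained stdlib sketch (the squaring task is a placeholder for real work):

```python
import queue
import threading

tasks = queue.Queue()   # stands in for a broker work queue
results = []
lock = threading.Lock() # guard the shared results list

def worker():
    while True:
        item = tasks.get()
        if item is None:    # sentinel: shut this worker down
            break
        with lock:
            results.append(item * item)  # placeholder task

workers = [threading.Thread(target=worker) for _ in range(3)]
for w in workers:
    w.start()
for n in range(5):          # producer enqueues work
    tasks.put(n)
for _ in workers:           # one sentinel per worker
    tasks.put(None)
for w in workers:
    w.join()
print(sorted(results))  # [0, 1, 4, 9, 16]
```

With RabbitMQ or Kafka the queue lives out of process, so workers can run on separate machines; the completion order is nondeterministic either way, which is why the result is sorted before use.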

Posted 1 day ago

Apply

5.0 - 7.0 years

0 Lacs

Udaipur, Rajasthan, India

Remote


At GKM IT, we’re passionate about building seamless digital experiences powered by robust and intelligent data systems. We’re on the lookout for a Data Engineer - Senior II to architect and maintain high-performance data platforms that fuel decision-making and innovation. If you enjoy designing scalable pipelines, optimising data systems, and leading with technical excellence, you’ll thrive in our fast-paced, outcome-driven culture. You’ll take ownership of building reliable, secure, and scalable data infrastructure—from streaming pipelines to data lakes. Working closely with engineers, analysts, and business teams, you’ll ensure that data is not just available, but meaningful and impactful across the organization.

Requirements: 5 to 7 years of experience in data engineering. Architect and maintain scalable, secure, and reliable data platforms and pipelines. Design and implement data lake/data warehouse solutions such as Redshift, BigQuery, Snowflake, or Delta Lake. Build real-time and batch data pipelines using tools like Apache Airflow, Kafka, Spark, and DBT. Ensure data governance, lineage, quality, and observability. Collaborate with stakeholders to define data strategies, architecture, and KPIs. Lead code reviews and enforce best practices. Mentor junior and mid-level engineers. Optimize query performance, data storage, and infrastructure. Integrate CI/CD workflows for data deployment and automated testing. Evaluate and implement new tools and technologies as required. Demonstrate expert-level proficiency in Python and SQL. Possess deep knowledge of distributed systems and data processing frameworks. Be proficient in cloud platforms (AWS, GCP, or Azure), containerization, and CI/CD processes. Have experience with streaming platforms like Kafka or Kinesis and orchestration tools. Be highly skilled with Airflow, DBT, and data warehouse performance tuning. Exhibit strong leadership, communication, and mentoring skills.

Benefits: We don’t just hire employees—we invest in people.
At GKM IT, we’ve designed a benefits experience that’s thoughtful, supportive, and actually useful. Here’s what you can look forward to: Top-Tier Work Setup: You’ll be equipped with a premium MacBook and all the accessories you need. Great tools make great work. Flexible Schedules & Remote Support: Life isn’t 9-to-5. Enjoy flexible working hours, emergency work-from-home days, and utility support that makes remote life easier. Quarterly Performance Bonuses: We don’t believe in waiting a whole year to celebrate your success. Perform well, and you’ll see it in your paycheck—quarterly. Learning is Funded Here: Conferences, courses, certifications—if it helps you grow, we’ve got your back. We even offer a dedicated educational allowance. Family-First Culture: Your loved ones matter to us too. From birthday and anniversary vouchers (Amazon, BookMyShow) to maternity and paternity leaves—we’re here for life outside work. Celebrations & Gifting, The GKM IT Way: Onboarding hampers, festive goodies (Diwali, Holi, New Year), and company anniversary surprises—it’s always celebration season here. Team Bonding Moments: We love food, and we love people. Quarterly lunches, dinners, and fun company retreats help us stay connected beyond the screen. Healthcare That Has You Covered: Enjoy comprehensive health insurance for you and your family—because peace of mind shouldn’t be optional. Extra Rewards for Extra Effort: Weekend work doesn’t go unnoticed, and great referrals don’t go unrewarded. From incentives to bonuses—you’ll feel appreciated.

Posted 1 day ago

Apply

0.0 - 5.0 years

0 Lacs

Kochi, Kerala

On-site


Job Description: Highly skilled Laravel developer with a minimum of 4-5 years of Laravel experience, well-versed with current web technologies and the use of cutting-edge tools and third-party APIs. Strong knowledge of PHP, MySQL, HTML, CSS, JavaScript, and MVC architecture. Most importantly, the candidate should have experience with custom e-commerce websites. Familiarity with modern JavaScript frameworks like Vue.js, React, or Angular.

Responsibilities & Duties: Design, develop, test, deploy, and support new software solutions and changes to existing software solutions. Translate business requirements into components of complex, loosely coupled, distributed systems. Responsible for creating REST-based web services and APIs for consumption by mobile and web platforms. Responsible for systems analysis, code creation, testing, build/release, and technical support. Responsible for keeping excellent, organized project records and documentation. Strive for innovative solutions and quality code with on-time delivery. Manage multiple projects with timely deadlines.

Required Experience, Skills and Qualifications: Working experience in the Laravel framework; at least a few projects in Laravel, or a minimum of 3-4 years of Laravel development experience. Working knowledge of HTML5, CSS3, and AJAX/JavaScript, jQuery or similar libraries. Experience in application development in the LAMP stack (Linux, Apache, MySQL, and PHP) environment. Good working knowledge of object-oriented PHP (OOP) and MVC frameworks. Must know Laravel coding standards and best practices. Must have working experience with web service technologies such as REST and JSON, and writing REST APIs for consumption by mobile and web platforms. Working knowledge of Git version control. Exposure to responsive web design. Strong unit testing and debugging skills. Good experience with databases (MySQL) and query writing. Excellent teamwork and problem-solving skills, flexibility, and ability to handle multiple tasks.
Hands-on experience with project management tools like Desklog, Jira, or Asana. Understanding of server-side security, performance optimization, and cross-browser compatibility. Experience deploying applications on cloud platforms (AWS, Azure, or similar) is a plus.

How to Apply: Interested candidates are invited to submit their resume and a cover letter detailing relevant experience and achievements to hr.kochi@mightywarner.com. Please include "Laravel Developer" in the subject line. Job Type: Full-time. Pay: ₹35,000.00 - ₹45,000.00 per month. Benefits: Provident Fund. Schedule: Day shift, Monday to Friday. Ability to commute/relocate: Kochi, Kerala: Reliably commute or planning to relocate before starting work (Preferred). Application Questions: Are you ready to join immediately? How much custom development experience do you have? Do you have e-commerce development experience? Experience: Laravel Developer: 5 years (Preferred). Work Location: In person. Expected Start Date: 22/06/2025

Posted 1 day ago

Apply

4.0 years

0 Lacs

Thiruvananthapuram, Kerala, India

On-site


The world's top banks use Zafin's integrated platform to drive transformative customer value. Powered by an innovative AI-powered architecture, Zafin's platform seamlessly unifies data from across the enterprise to accelerate product and pricing innovation, automate deal management and billing, and create personalized customer offerings that drive expansion and loyalty. Zafin empowers banks to drive sustainable growth, strengthen their market position, and define the future of banking centered around customer value. Zafin is privately owned and operates out of multiple global locations including North America, Europe, and Australia. Zafin is backed by significant financial partners committed to accelerating the company's growth, fueling innovation and ensuring Zafin's enduring value as a leading provider of cloud-based solutions to the financial services sector. Zafin is proud to be recognized as a top employer. In Canada, UK and India, we are certified as a "Great Place to Work". The Great Place to Work program recognizes employers who invest in and value their people and organizational culture. The company's culture is driven by strategy and focused on execution. We make and keep our commitments. What is the opportunity? This role is at the intersection of banking and analytics.
It requires diving deep into the banking domain to understand and define the metrics, and into the technical domain to implement and present the metrics through business intelligence tools. We're building a next-generation analytics product to help banks maximize the financial wellness of their clients. The product is ambitious - that's why we're looking for a team member who is laterally skilled and comfortable with ambiguity. Reporting to the Senior Vice President, Analytics as part of the Zafin Product Team, you are a data-visualization subject matter expert who can define and implement the insights to be embedded in the product using data visualization tools (DataViz) and applying analytics expertise to make an impact. If storytelling with data is a passion of yours, and data visualization and analytics expertise is what has enabled you to reach your current level in your career - you should take a look at how we do it on one of the most advanced banking technology products in the market today - connect with us to learn more. Location: Chennai or Trivandrum, India.

Purpose of the Role: As a Software Engineer - APIs & Data Services, you will own the "last mile" that transforms data pipelines into polished, product-ready APIs and lightweight microservices. Working alongside data engineers and product managers, you will deliver features that power capabilities like Dynamic Cohorts, Signals, and our GPT-powered release notes assistant.

What You'll Build & Run (approximate focus: 60% API / 40% data):
- Product-Facing APIs: Design REST/GraphQL endpoints for cohort, feature-flag, and release-notes data. Build microservices in Java/Kotlin (Spring Boot or Vert.x) or Python (FastAPI) with production-grade SLAs.
- Schema & Contract Management: Manage JSON/Avro/Protobuf schemas, generate client SDKs, and enforce compatibility through CI/CD pipelines.
- Data-Ops Integration: Interface with Delta Lake tables in Databricks using Spark/JDBC. Transform datasets with PySpark or Spark SQL and surface them via APIs.
- Pipeline Stewardship: Extend Airflow 2.x DAGs (Python), orchestrate upstream Spark jobs, and manage downstream triggers. Develop custom operators as needed.
- DevOps & Quality: Manage GitHub Actions, Docker containers, Kubernetes manifests, and Datadog dashboards to ensure service reliability.
- LLM & AI Features: Enable prompt engineering and embeddings exposure via APIs; experiment with tools like OpenAI, LangChain, or LangChain4j to support product innovation.

About You: You're a language-flexible engineer with a solid grasp of system design and the discipline to ship robust, well-documented, and observable software. You're curious, driven, and passionate about building infrastructure that scales with evolving product needs.

Mandatory Skills: 4 to 6 years of professional experience in Java (11+) and Spring Boot. Solid command of API design principles (REST, OpenAPI, GraphQL). Proficiency in SQL databases. Experience with Docker, Git, and JUnit. Hands-on knowledge of low-level design (LLD) and system design fundamentals.

Highly Preferred / Optional Skills: Working experience with Apache Airflow. Familiarity with cloud deployment (e.g., Azure AKS, GCP, AWS). Exposure to Kubernetes and microservice orchestration. Frontend/UI experience in any modern framework (e.g., React, Angular). Experience with Python (FastAPI, Flask).

Good-to-Have Skills: CI/CD pipeline development using GitHub Actions. Familiarity with code reviews, HLD, and architectural discussions. Experience integrating with LLM APIs like OpenAI and building prompt-based systems. Exposure to schema validation tools such as Pydantic, Jackson, Protobuf. Monitoring and alerting with Datadog, Prometheus, or equivalent.

What's in it for you: Joining our team means being part of a culture that values diversity, teamwork, and high-quality work.
We offer competitive salaries, annual bonus potential, generous paid time off, paid volunteering days, wellness benefits, and robust opportunities for professional growth and career advancement. Want to learn more about what you can look forward to during your career with us? Visit our careers site and our openings: zafin.com/careers

Zafin welcomes and encourages applications from people with disabilities. Accommodations are available on request for candidates taking part in all aspects of the selection process.

Zafin is committed to protecting the privacy and security of the personal information collected from all applicants throughout the recruitment process. The methods by which Zafin collects, uses, stores, handles, retains, or discloses applicant information can be reviewed in Zafin's privacy policy at https://zafin.com/privacy-notice/. By submitting a job application, you confirm that you agree to the processing of your personal data by Zafin as described in the candidate privacy notice.
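The schema-and-contract work described in this role, enforcing compatibility through CI/CD pipelines, often reduces to an automated backward-compatibility check on each schema change. The sketch below is a minimal, hypothetical illustration using plain Python and JSON-schema-like dicts; real pipelines would typically rely on a schema registry or Avro's built-in compatibility rules rather than hand-rolled checks:

```python
# Minimal backward-compatibility check for JSON-schema-like dicts.
# Rule of thumb: a new schema version must not remove fields the old one had,
# and must not introduce new *required* fields (old clients won't send them).
# This is an illustration, not a full JSON Schema implementation.

def is_backward_compatible(old: dict, new: dict) -> list[str]:
    """Return a list of human-readable compatibility violations (empty = OK)."""
    violations = []
    old_props = old.get("properties", {})
    new_props = new.get("properties", {})

    # Removing a field breaks consumers that still read it.
    for field in old_props:
        if field not in new_props:
            violations.append(f"field removed: {field}")

    # A newly required field breaks producers built against the old schema.
    newly_required = set(new.get("required", [])) - set(old.get("required", []))
    for field in newly_required:
        if field not in old_props:
            violations.append(f"new required field: {field}")
    return violations


v1 = {"properties": {"id": {"type": "string"}, "name": {"type": "string"}},
      "required": ["id"]}
v2 = {"properties": {"id": {"type": "string"}, "name": {"type": "string"},
                     "email": {"type": "string"}},
      "required": ["id", "email"]}

print(is_backward_compatible(v1, v2))  # → ['new required field: email']
```

A CI job would run such a check against the previous released schema and fail the build on any violation, which is what "enforce compatibility through CI/CD pipelines" usually means in practice.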

Posted 1 day ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site


Job Purpose

As a key member of the support team, the Application Support Engineer is responsible for ensuring the stability and availability of critical applications. This role involves monitoring, troubleshooting, and resolving application issues, adhering to defined SLAs and processes.

Desired Skills and Experience

  • Experience in an application support or technical support role, with strong troubleshooting, problem-solving, and analytical skills.
  • Ability to work independently and effectively, and to thrive in a fast-paced, high-pressure environment.
  • Experience in either C# or Java preferred, to support effective troubleshooting and understanding of application code.
  • Knowledge of various operating systems (Windows, Linux, macOS) and familiarity with software applications and tools used in the industry.
  • Proficiency in programming languages such as Python, and in scripting languages like Bash or PowerShell.
  • Experience with database systems such as MySQL, Oracle, or SQL Server, and the ability to write and optimize SQL queries.
  • Understanding of network protocols and configurations, and the ability to troubleshoot network-related issues.
  • Skills in managing and configuring servers, including web servers (Apache, Nginx) and application servers. (Desirable)
  • Familiarity with ITIL incident management processes.
  • Familiarity with monitoring and logging tools like Nagios, Splunk, or the ELK stack to track application performance and issues.
  • Knowledge of version control systems like Git to manage code changes and collaborate with development teams. (Desirable)
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud for deploying and managing applications. (Desirable)
  • Experience in Fixed Income Markets or financial applications support is preferred.
  • Strong attention to detail and ability to follow processes.
  • Ability to adapt to changing priorities and client needs, with good verbal and written communication skills.
Key Responsibilities

  • Provide L1/L2 technical support for applications.
  • Monitor application performance and system health, proactively identifying potential issues.
  • Investigate, diagnose, and resolve application incidents and service requests within agreed SLAs.
  • Escalate complex or unresolved issues to the Service Manager or relevant senior teams.
  • Document all support activities, including incident details, troubleshooting steps, and resolutions.
  • Participate in shift handovers and knowledge sharing.
  • Perform routine maintenance tasks to ensure optimal application performance.
  • Collaborate with other support teams to ensure seamless issue resolution.
  • Develop and maintain technical documentation and knowledge base articles.
  • Assist in the implementation of new applications and updates.
  • Provide training and support to junior team members.
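Monitoring duties like those above often start with something as simple as summarizing status codes from an Apache access log. Below is a minimal stdlib-Python sketch; the sample lines are hypothetical, and in practice you would feed it the real log (e.g. /var/log/apache2/access.log) or use a tool like Splunk or the ELK stack:

```python
from collections import Counter

# Count HTTP status codes in Apache common/combined log format,
# where the status code is the 9th whitespace-separated field.
def status_counts(lines):
    counts = Counter()
    for line in lines:
        fields = line.split()
        if len(fields) >= 9 and fields[8].isdigit():
            counts[fields[8]] += 1
    return counts

# Hypothetical sample lines in Apache common log format.
sample = [
    '127.0.0.1 - - [10/Oct/2024:13:55:36 +0530] "GET / HTTP/1.1" 200 2326',
    '127.0.0.1 - - [10/Oct/2024:13:55:37 +0530] "GET /api HTTP/1.1" 500 512',
    '127.0.0.1 - - [10/Oct/2024:13:55:38 +0530] "GET /api HTTP/1.1" 500 512',
]
print(status_counts(sample))  # → Counter({'500': 2, '200': 1})
```

A spike in 5xx counts from a summary like this is the kind of signal an L1/L2 engineer would investigate further in the error log before escalating.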

Posted 1 day ago

Apply

Exploring Apache Jobs in India

The Apache Software Foundation maintains a wide range of open-source software, from the Apache HTTP Server to data platforms such as Hadoop, Spark, and Kafka. In India, the demand for professionals with expertise in Apache tools and technologies is on the rise, and job seekers pursuing Apache-related roles have plenty of opportunities across industries. Let's delve into the Apache job market in India to gain a better understanding of the landscape.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their thriving IT sectors and see a high demand for Apache professionals across different organizations.

Average Salary Range

The salary range for Apache professionals in India varies based on experience and skill level:

  • Entry-level: INR 3-5 lakhs per annum
  • Mid-level: INR 6-10 lakhs per annum
  • Experienced: INR 12-20 lakhs per annum

Career Path

In the Apache job market in India, a typical career path may progress as follows:

  1. Junior Developer
  2. Developer
  3. Senior Developer
  4. Tech Lead
  5. Architect

Related Skills

Besides expertise in Apache tools and technologies, professionals in this field are often expected to have skills in:

  • Linux
  • Networking
  • Database Management
  • Cloud Computing

Interview Questions

  • What is Apache HTTP Server and how does it differ from Apache Tomcat? (medium)
  • Explain the difference between Apache Hadoop and Apache Spark. (medium)
  • What is mod_rewrite in Apache and how is it used? (medium)
  • How do you troubleshoot common Apache server errors? (medium)
  • What is the purpose of .htaccess file in Apache? (basic)
  • Explain the role of Apache Kafka in real-time data processing. (medium)
  • How do you secure an Apache web server? (medium)
  • What is the significance of Apache Maven in software development? (basic)
  • Explain the concept of virtual hosts in Apache. (basic)
  • How do you optimize Apache web server performance? (medium)
  • Describe the functionality of Apache Solr. (medium)
  • What is the purpose of Apache Camel? (medium)
  • How do you monitor Apache server logs? (medium)
  • Explain the role of Apache ZooKeeper in distributed applications. (advanced)
  • How do you configure SSL/TLS on an Apache web server? (medium)
  • Discuss the advantages of using Apache Cassandra for data management. (medium)
  • What is the Apache Lucene library used for? (basic)
  • How do you handle high traffic on an Apache server? (medium)
  • Explain the concept of .htpasswd in Apache. (basic)
  • What is the role of Apache Thrift in software development? (advanced)
  • How do you troubleshoot Apache server performance issues? (medium)
  • Discuss the importance of Apache Flume in data ingestion. (medium)
  • What is the significance of Apache Storm in real-time data processing? (medium)
  • How do you deploy applications on Apache Tomcat? (medium)
  • Explain the concept of .htaccess directives in Apache. (basic)
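Several of the questions above (virtual hosts, SSL/TLS configuration, .htaccess) can be answered with a short httpd.conf excerpt. The directives below are standard Apache httpd 2.4 directives; the domain and file paths are placeholders, not a recommended production setup:

```apache
# Name-based virtual host serving example.com over HTTPS.
<VirtualHost *:443>
    ServerName example.com
    DocumentRoot /var/www/example
    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/example.com.crt
    SSLCertificateKeyFile /etc/ssl/private/example.com.key
    # Allow .htaccess overrides in the document root (relevant to the
    # .htaccess questions above; AllowOverride None is the safer default).
    <Directory /var/www/example>
        AllowOverride All
        Require all granted
    </Directory>
</VirtualHost>
```

In an interview, it helps to add that you would validate changes with `apachectl configtest` before reloading the server.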

Conclusion

As you embark on your journey to explore Apache jobs in India, it is essential to stay updated on the latest trends and technologies in the field. By honing your skills and preparing thoroughly for interviews, you can position yourself as a competitive candidate in the Apache job market. Stay motivated, keep learning, and pursue your dream career with confidence!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies