8532 Kafka Jobs - Page 49

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 years

0 Lacs

Kochi, Kerala, India

On-site

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public- and private-sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
Your Role and Responsibilities
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.
Responsibilities
Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases. Process the data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS. Experienced in developing efficient software code leveraging the Spark framework with Python or Scala for the various use cases built on the platform. Experience in developing streaming pipelines. Experience working with Hadoop / AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka.
Preferred Education
Master's degree.
Required Technical and Professional Expertise
Minimum 4 years of experience in big data technologies with extensive data engineering experience in Spark with Python or Scala. Minimum 3 years of experience on cloud data platforms on AWS. Experience with AWS EMR / AWS Glue / Databricks, AWS Redshift, and DynamoDB. Good to excellent SQL skills. Exposure to streaming solutions and message brokers such as Kafka.
Preferred Technical and Professional Experience
AWS certification; Databricks or Cloudera Spark certified developers.
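
For context, a batch pipeline of the kind this role describes, ingesting raw files, transforming them with Spark, and writing curated output, might look like the following minimal PySpark sketch. The paths, schema, and column names are illustrative assumptions, not taken from the posting.

```python
# Minimal PySpark batch-ingestion sketch; paths and column names
# are hypothetical, for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-orders").getOrCreate()

# Ingest: read raw CSV files landed on object storage or HDFS.
raw = spark.read.option("header", True).csv("s3a://raw-bucket/orders/")

# Process: cleanse and transform.
orders = (
    raw.dropDuplicates(["order_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
)

# Write curated output, partitioned for downstream queries.
orders.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://curated-bucket/orders/"
)
spark.stop()
```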

Posted 5 days ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description
Role Proficiency: Create and organise the testing process based on project requirements and manage test activities within the team.
Outcomes: Test estimates and schedules. Ensure test coverage. Produce test results, defect reports, test logs, and reports as evidence of testing. Publish RCA reports and preventive measures. Ensure quality of deliverables. Report project metrics and status. Ensure adherence to engineering practices, processes, and standards. Understand and contribute to test automation/performance testing. Work with the DevOps team when required to understand the testing framework and QA process for implementing continuous testing. Manage team utilization.
Measures of Outcomes: Test script creation and execution productivity. Defect leakage metrics (% of defects leaked, % of UAT defects, and % of production defects). % of test case reuse. Test execution coverage. Defect acceptance ratio. Test review efficiency. On-time delivery. Effort variance. Test automation coverage.
Outputs Expected:
Supporting Organization: Ensure utilization and quality of deliverables prepared by the team. Coordinate test environment and test data provisioning.
Test Design, Development, Execution: Participate in reviews, walkthroughs, and demos, and obtain sign-off from stakeholders. Prepare the test summary report for modules/features.
Requirements Management: Analyse, prioritize, and identify gaps; create workflow diagrams based on requirements/user stories.
Manage Project: Participate in test management; prepare, track, and report test progress based on the schedule.
Domain Relevance: Identify business processes, conduct risk analysis, and ensure test coverage.
Estimate: Prepare estimates and schedules; identify dependencies.
Knowledge Management: Consume, contribute, and review (best practices, lessons learned, retrospectives).
Test Design and Execution: Test plan preparation, test case/script creation, test execution.
Risk Identification: Identify risks/issues and prepare mitigation and contingency plans.
Test & Defect Management: Conduct root cause and trend analysis of defects.
Test Planning: Identify test scenarios with an understanding of systems, interfaces, and the application. Identify end-to-end business-critical scenarios with little support. Create and review test scenarios and prepare the RTM. Prepare estimates (time/effort) based on requirements/user stories. Identify the scope of testing.
Client Management: Define KPIs for the engagement and ensure adherence to these KPIs.
Stakeholder Connect: Handle monthly/weekly governance calls and represent issues for the team.
Skill Examples: Ability to create, review, and manage a test plan. Ability to prepare schedules based on estimates. Ability to track and report progress and take corrective measures as needed. Ability to identify test scenarios and prepare the RTM. Ability to analyze requirements/user stories and prioritize testing. Ability to carry out RCA. Ability to capture and report metrics. Ability to identify test data and test environment specifications.
Knowledge Examples: Knowledge of estimation techniques, testing standards, identifying the scope of testing, RCA techniques, test design techniques, test methodologies, scope identification and planning, and test automation tools and frameworks.
Additional Comments: Design, develop, and execute automated performance test scripts using tools such as Apache JMeter. Define test strategies and performance testing plans to validate the scalability, stability, and reliability of applications.
Collaborate with developers, architects, and DevOps teams to ensure applications meet performance expectations. Analyze test results and metrics, including CPU usage, memory consumption, garbage collection, throughput, and response time. Diagnose and troubleshoot performance issues in pre-production and production environments. Utilize AppDynamics, Elasticsearch, OpenSearch, Grafana, and Kafka for real-time monitoring and performance visualization. Perform root cause analysis to detect memory leaks, connection issues, and other system bottlenecks. Document findings, create performance reports, and present results to stakeholders with clear recommendations. Maintain performance baselines and monitor deviations over time. Drive performance tuning efforts across application layers, including database, services, and infrastructure. Participate in capacity planning and support system scaling efforts.
Required Skills & Experience: Proficiency in performance testing tools such as Apache JMeter or similar. Experience with performance monitoring tools like AppDynamics, Grafana, or OpenSearch. Deep understanding of microservices architectures and cloud environments (Azure). Strong experience with test planning, workload modeling, and test data management. Solid experience analyzing system performance metrics across app servers, databases, OS, and network layers. Demonstrated ability to communicate performance insights through clear, actionable reporting. Familiarity with CI/CD pipelines, integration with performance tests, and automated workflows. Good understanding of DB tuning, application server tuning, and best practices for scalable architecture. Ability to work independently and collaboratively in a fast-paced, agile environment. Understanding of Kafka-based event streaming architectures. Knowledge of scripting languages such as Python, Groovy, or Shell for automation.
Skills: Performance Engineering, Apache JMeter, Elasticsearch, Grafana
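
The analysis step described above, pulling throughput, error rate, and response-time percentiles out of a test run, is commonly scripted. A minimal sketch using pandas over a JMeter CSV results file (.jtl); the column names follow JMeter's default CSV output, and the file path is an assumption.

```python
# Summarize a JMeter .jtl (CSV) results file: throughput, error rate,
# and response-time percentiles. The file path is hypothetical.
import pandas as pd

df = pd.read_csv("results.jtl")  # default JMeter CSV columns

# timeStamp is epoch milliseconds; elapsed is response time in ms.
duration_s = (df["timeStamp"].max() - df["timeStamp"].min()) / 1000.0
throughput = len(df) / duration_s if duration_s else float("nan")
error_rate = 100.0 * (df["success"].astype(str).str.lower() != "true").mean()

print(f"samples:    {len(df)}")
print(f"throughput: {throughput:.1f} req/s")
print(f"error rate: {error_rate:.2f} %")
for p in (50, 90, 95, 99):
    print(f"p{p} latency: {df['elapsed'].quantile(p / 100):.0f} ms")
```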

Posted 5 days ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Technology @Dream11: Technology is at the core of everything we do. Our technology team helps us deliver a mobile-first experience across platforms (Android & iOS) while managing over 700 million rpm (requests per minute) at peak with user concurrency of over 16.5 million. We have over 190 microservices written in Java and built on the Vert.x framework. These serve isolated product features with discrete architectures to cater to their respective use cases. We work with terabytes of data, the infrastructure for which is built on top of Kafka, Redshift, Spark, Druid, etc., and it powers a number of use cases like machine learning and predictive analytics. Our tech stack is hosted on AWS, with distributed systems like Cassandra, Aerospike, Akka, VoltDB, Ignite, etc.
Your Role: Analyze requirements and design software solutions based on first principles (e.g., Object-Oriented Design and Analysis, E-R modeling). Build resilient, event-driven microservices using a reactive Java-based framework, SQL and NoSQL datastores, caches, messaging, and big-data processing frameworks. Deploy and configure cloud-native software services on a public cloud. Operate and support software services in production based on on-call schedules, using observability tools such as Datadog for logging, alerting, and monitoring.
Qualifiers: 3+ years of coding experience with at least one object-oriented programming language, preferably Java, plus relational databases, database modeling (E-R modeling), and SQL. Familiarity with NoSQL databases and caching frameworks preferred. Working experience with messaging frameworks such as Kafka or MQ. Familiarity with object-oriented design patterns. Working experience with AWS or any cloud infrastructure.
About Dream Sports: Dream Sports is India's leading sports technology company with 250 million users, housing brands such as Dream11, the world's largest fantasy sports platform; FanCode, a premier sports content & commerce platform; and DreamSetGo, a sports experiences platform. Dream Sports is based in Mumbai and has a workforce of close to 1,000 'Sportans'. Founded in 2008 by Harsh Jain and Bhavit Sheth, Dream Sports' vision is to 'Make Sports Better' for fans through the confluence of sports and technology. For more information: https://dreamsports.group/ Dream11 is the world's largest fantasy sports platform with 230 million users playing fantasy cricket, football, basketball & hockey on it. Dream11 is the flagship brand of Dream Sports, India's leading sports technology company, and has partnerships with several national & international sports bodies and cricketers.
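
The stack here is Java/Vert.x, but the event-driven pattern the role describes, services exchanging messages through Kafka, can be sketched in a few lines of Python with the kafka-python client. The broker address, topic name, and event fields below are placeholders.

```python
# Event-driven messaging sketch with kafka-python. This only
# illustrates the pattern; the Dream11 stack itself is Java/Vert.x.
# Broker address, topic, and payload fields are placeholders.
import json
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("contest-events", {"user_id": 42, "action": "join_contest"})
producer.flush()

consumer = KafkaConsumer(
    "contest-events",
    bootstrap_servers="localhost:9092",
    group_id="scoring-service",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for event in consumer:
    print(event.value)  # each service reacts to events independently
    break
```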

Posted 5 days ago

Apply

7.0 - 12.0 years

6 - 9 Lacs

Bengaluru

Work from Office

Experience: 7+ years. Location: Bangalore.
Must have in-depth working experience with Java/J2EE and web services using JSON. Experience with multithreaded systems and object-oriented design. Must have strong work experience with Spring, Spring Boot, and Hibernate, along with unit testing (JUnit classes, etc.). Experience working with microservices-based applications. Experience working with SQL databases. Hands-on experience in HTML, CSS, Agile & Scrum.
Good to have: Good knowledge of messaging systems like Kafka.
Education Required: Bachelor's degree or equivalent in business analysis/process analysis, organizational development, or a related discipline.
Perks & Benefits:
Health and Wellness: Healthcare policy covering your family and parents.
Food: Enjoy a scrumptious buffet lunch at the office every day (for Bangalore).
Professional Development: Learn and propel your career. We provide workshops, funded online courses, and other learning opportunities based on individual needs.
Rewards and Recognition: Recognition and rewards programs in place to celebrate your achievements and contributions.
Why join Relanto?
Health & Family: Comprehensive benefits for you and your loved ones, ensuring well-being.
Growth Mindset: Continuous learning opportunities to stay ahead in your field.
Dynamic & Inclusive: Vibrant culture fostering collaboration, creativity, and belonging.
Career Ladder: Internal promotions and a clear path for advancement.
Recognition & Rewards: Celebrate your achievements and contributions.
Work-Life Harmony: Flexible arrangements to balance your commitments.

Posted 6 days ago

Apply

2.0 - 7.0 years

25 - 30 Lacs

Pune

Work from Office

We are looking for a DevOps engineer with Azure experience who is passionate and ready to develop state-of-the-art technology solutions for digital platforms. This job will offer a variety of challenges on a daily basis: you will need to understand the business needs and, applying creative thinking, design and develop solutions in Azure with Kafka and DevOps capabilities, implementing them according to DevOps practices. This job is for someone who is excited to work with cutting-edge technologies and motivated to work with large amounts of complex cloud-based data.
Skills
Must have: Significant experience in designing and developing Azure (with Kubernetes) solutions. Strong knowledge and experience of working with Kafka. Comfortable working with large amounts of data. Knowledge of technologies such as Docker and Kubernetes. DevOps skills are also essential. Good Postgres DB knowledge. Microsoft Azure expertise and certification is a plus.
Nice to have: A positive attitude, willingness to learn, and desire to improve the environment around you. Knowledge of virtualization and containerization. Track record as an engineer working in a globally distributed team. On-the-job examples of working in a fast-paced Agile environment.
Other Languages: English, C2 Proficient. Seniority: Senior. Req. VR-114554, DevOps, BCM Industry, 23/05/2025.
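
Provisioning Kafka resources as part of a DevOps workflow, as this role implies, is usually scripted rather than done by hand. A minimal, idempotent sketch with kafka-python's admin client; the broker address, topic name, and sizing are assumptions.

```python
# Idempotent Kafka topic provisioning with kafka-python's admin
# client. Broker address, topic name, and sizing are assumptions.
from kafka.admin import KafkaAdminClient, NewTopic
from kafka.errors import TopicAlreadyExistsError

admin = KafkaAdminClient(bootstrap_servers="localhost:9092")
topic = NewTopic(name="orders", num_partitions=6, replication_factor=3)

try:
    admin.create_topics([topic])
    print("created topic 'orders'")
except TopicAlreadyExistsError:
    print("topic 'orders' already exists; nothing to do")
finally:
    admin.close()
```

Run from a CI/CD pipeline, this keeps topic configuration in version control alongside the rest of the infrastructure code.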

Posted 6 days ago

Apply

4.0 - 9.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Senior Software Engineer - DevOps
Bangalore, India
Who we are: INVIDI Technologies Corporation is the world's leading developer of software transforming television all over the world. Our two-time Emmy Award-winning technology is widely deployed by cable, satellite, and telco operators. We provide a device-agnostic solution delivering ads to the right household no matter what program or network you're watching, how you're watching, or whether you're in front of your TV, laptop, cell phone, or any other device. INVIDI created the multi-billion-dollar addressable television business that today is growing rapidly around the world. INVIDI is right at the heart of the very exciting and fast-paced world of commercial television; companies benefiting from our software include DirecTV and Dish Network, networks such as CBS/Viacom and A&E, advertising agencies such as Ogilvy and Publicis, and advertisers such as Chevrolet and Verizon. INVIDI's world-class technology solutions are known for their flexibility and adaptability. These traits allow INVIDI partners to transform their video content delivery network, revamping legacy systems without significant capital or hardware investments. Our clients count on us to provide superior capabilities, excellent service, and ease of use. The goal of developing a unified video ad tech platform is a big one, and the right DevOps engineer (like you) flourishes in INVIDI's creative, inspiring, and supportive culture. It is a demanding, high-energy, and fast-paced environment. INVIDI's developers are self-motivated quick studies, can-do individuals who embrace the challenge of solving difficult and complex problems.
About the role: We are a modern agile product organization looking for an excellent DevOps engineer who can support and offload a remote product development team. Our platform handles tens of thousands of requests per second with sub-second response times across the globe. We serve ads to some of the biggest live events in the world, providing reports and forecasts based on billions of log rows. These are some of the complex challenges that make development and operational work at INVIDI interesting and rewarding. To accomplish this, we use the best frameworks and tools out there or, when they are not good enough, we write our own. Most of the code we write is Java or Kotlin on top of Dropwizard, but every problem is unique, and we always evaluate the best tools for the job. We work with technologies such as Kafka, Google Cloud (GKE, Pub/Sub), Bigtable, Terraform, Jsonnet, and a lot more. The position will report directly to the Technical Manager of Software Development and will be based in our Chennai, India office.
Key responsibilities: You will maintain, deploy, and operate backend services in Java and Kotlin that are scalable, durable, and performant. You will proactively evolve deployment pipelines and artifact generation. You will have a commitment to Kubernetes and infrastructure maintenance. You will troubleshoot incoming issues from support and clients, fixing and resolving what you can. You will collaborate closely with peers and product owners in your team. You will help other team members grow as engineers through code review, pairing, and mentoring.
Our requirements: You are an outstanding DevOps engineer who loves to work with distributed high-volume systems. You care about the craft and cherish the opportunity to work with smart, supportive, and highly motivated colleagues. You are curious; you like to learn new things and to mentor and share knowledge with team members. Like us, you strive to handle complexity by keeping things simple and elegant. As part of the DevOps team, you will be on call for the services and clusters that the team owns, for one week approximately once or twice per month. While on call, you are required to be reachable by telephone and able to act on alarms using your laptop.
Skills and qualifications: Master's degree in computer science, or equivalent. 4+ years of experience in the computer science industry. Strong development and troubleshooting skill sets. Ability to support a SaaS environment to meet service objectives. Ability to collaborate effectively and work well in an Agile environment. Excellent oral and written communication skills in English. Ability to quickly learn new technologies and work in a fast-paced environment.
Highly preferred: Experience building service applications with Dropwizard/Spring Boot. Experience with cloud services such as GCP and/or AWS. Experience with Infrastructure as Code tools such as Terraform. Experience in a Linux environment. Experience working with technologies such as SQL, Kafka, and Kafka Streams. Experience with Docker. Experience with SCM and CI/CD tools such as Git and Bitbucket. Experience with build tools such as Gradle or Maven. Experience writing Kubernetes deployment manifests and troubleshooting cluster- and application-level issues.
Physical requirements: INVIDI is a conscious, clean, well-organized, and supportive office environment. Prolonged periods of sitting at a desk and working on a computer are normal.
Note: Final candidates must successfully pass INVIDI's background screening requirements. Final candidates must be legally authorized to work in India. INVIDI has reopened its offices on a flexible hybrid model.
Ready to join our team? Apply today!
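
As a rough illustration of the cluster-level troubleshooting called out above, the official Kubernetes Python client can list deployments and flag those whose replicas are not all ready. The namespace is an assumption.

```python
# Flag deployments with unready replicas, a starting point for
# cluster troubleshooting. Uses the official 'kubernetes' client;
# the namespace name is an assumption.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() in-cluster
apps = client.AppsV1Api()

for dep in apps.list_namespaced_deployment("production").items:
    desired = dep.spec.replicas or 0
    ready = dep.status.ready_replicas or 0
    if ready < desired:
        print(f"{dep.metadata.name}: {ready}/{desired} replicas ready")
```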

Posted 6 days ago

Apply

15.0 - 20.0 years

25 - 30 Lacs

Hyderabad

Work from Office

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.
We are seeking a highly skilled and experienced technical/solution architect to deliver technical architecture artifacts, solution summary matrices, solution intent diagrams, and cost estimates for solutions, ensuring seamless integration and alignment across applications on initiatives with multiple cross-application impacts. This is an individual-contributor role reporting to the Director of Architecture, working a 2-11 pm IST shift.
Primary Responsibilities: Identify impacted applications, size capabilities, and create new capabilities. Lead complex initiatives with multiple cross-application impacts, ensuring seamless integration. Drive innovation, optimize processes, and deliver high-quality architecture solutions. Understand business objectives, review business scenarios, and plan acceptance criteria for the proposed solution architecture. Discuss capabilities with individual applications, resolve dependencies and conflicts, and reach agreement on proposed high-level approaches and solutions. Group capabilities logically and check their high-level viability with impacted IT teams as per roadmap options; propose and justify the right tools and technologies needed to build solutions. Finalize capabilities as per phases and feature grooming with impacted applications. Participate in architecture reviews, present solutions, and review other solutions. Work with enterprise architects to learn and adopt standards and best practices. Design solutions adhering to applicable rules and compliance requirements. Stay updated with the latest technology trends to solve business problems with minimal change or impact. Be involved in solution prototyping, solution patterns, and reference architectures. Help derive a high-level picture for the business to achieve its goals within a stipulated timeframe using a multi-solution, multi-phase approach. Ensure strategic architecture alignment with the business roadmap and enterprise security compliance. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications: Graduate degree or equivalent experience. 15+ years of experience in a similar role, leading and mentoring a team of architects and technical leads. Experience in driving innovation, optimizing processes, and delivering high-quality solutions. Experience in complex initiatives with multiple cross-application impacts. Exposure to Java, Python, Spring, the Spring Boot framework, SQL, MongoDB, Kafka, React JS, Big Data, Dynatrace, and Power BI is needed. Solid understanding of the healthcare domain, and of AI platforms and high-level architecture for AI-based solutions. Exposure to cloud platforms and tools. Good knowledge of the latest happenings in the technology world. Proven ability to think from a long-term perspective and arrive at strategic architecture. Proven excellent communication and leadership skills.

Posted 6 days ago

Apply

6.0 - 8.0 years

11 - 16 Lacs

Gurugram

Work from Office

Amdocs helps those who build the future to make it amazing. With our market-leading portfolio of software products and services, we unlock our customers' innovative potential, empowering them to provide next-generation communication and media experiences for both individual end users and enterprise customers. Our approximately 30,000 employees around the globe are here to accelerate service providers' migration to the cloud, enable them to differentiate in the 5G era, and digitalize and automate their operations. Listed on the NASDAQ Global Select Market, Amdocs had revenue of $4.89 billion in fiscal 2023.

Posted 6 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Pune

Work from Office

Amdocs helps those who build the future to make it amazing. With our market-leading portfolio of software products and services, we unlock our customers' innovative potential, empowering them to provide next-generation communication and media experiences for both individual end users and enterprise customers. Our employees around the globe are here to accelerate service providers' migration to the cloud, enable them to differentiate in the 5G era, and digitalize and automate their operations. Listed on the NASDAQ Global Select Market, Amdocs had revenue of $5.00 billion in fiscal 2024. For more information, visit www.amdocs.com
In one sentence: Responsible for design, development, modification, debugging, and/or maintenance of software systems.
What will your job look like?
Design, develop, and maintain middleware and ESB-based integration services to enable seamless data exchange between IT systems, vendor platforms, and network applications. Develop and manage APIs, REST/SOAP web services, and messaging frameworks using Enterprise Service Bus (ESB) solutions. Implement message routing, transformation, orchestration, and error-handling patterns for both batch and near-real-time integrations. Collaborate with business, development, and infrastructure teams to gather integration requirements and deliver scalable and reliable middleware solutions. Troubleshoot integration issues across services, protocols, and systems; provide root cause analysis and implement permanent fixes.
Skillset: Enterprise Application Integration (EAI) & Service-Oriented Architecture (SOA). Full API lifecycle management. Integration using microservices. Designing and developing RESTful APIs using Spring Boot and Quarkus. Deploying and managing applications on Red Hat OpenShift. Integrating APIs with IBM API Connect (APIC) for enhanced security and management. Microservices architecture using Java, Node.js, React.js, MongoDB, Kafka, Splunk, Dynatrace, Jenkins, SonarQube, JFrog, and OpenShift/Kubernetes. ESB (Oracle Service Bus) development.
All you need is... Bachelor's degree in Science/IT/Computer Science or equivalent. 3+ years of Java experience (server side) on Linux/Unix/Windows. Demonstrable experience with Spring framework components - Spring Boot, MVC, Integration, Security, etc. Strong understanding of RESTful APIs and open systems.
Why you will love this job: You will be challenged to design and develop new software applications. You will have the opportunity to work in a growing organization, with ever-growing opportunities for personal growth.
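
By way of illustration, the receive-transform-publish pattern at the heart of the integration responsibilities above could be sketched as follows in Python with Flask and kafka-python. The posting's actual stack is Spring Boot/Quarkus on an ESB; the endpoint, topic, and field names here are invented.

```python
# Receive-transform-publish integration sketch (Flask + kafka-python).
# The posting's stack is Spring Boot/Quarkus plus an ESB; endpoint,
# topic, and field names are invented for illustration.
import json
from flask import Flask, request, jsonify
from kafka import KafkaProducer

app = Flask(__name__)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

@app.route("/orders", methods=["POST"])
def ingest_order():
    payload = request.get_json(force=True)
    # Transformation step: normalize the inbound message.
    event = {
        "order_id": payload["id"],
        "amount": float(payload["amount"]),
        "source": "partner-api",
    }
    # Routing step: publish onto the message backbone.
    producer.send("orders.normalized", event)
    return jsonify(status="accepted"), 202

if __name__ == "__main__":
    app.run(port=8080)
```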

Posted 6 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Gurugram

Work from Office

Amdocs helps those who build the future to make it amazing. With our market-leading portfolio of software products and services, we unlock our customers' innovative potential, empowering them to provide next-generation communication and media experiences for both individual end users and enterprise customers. Our employees around the globe are here to accelerate service providers' migration to the cloud, enable them to differentiate in the 5G era, and digitalize and automate their operations. Listed on the NASDAQ Global Select Market, Amdocs had revenue of $5.00 billion in fiscal 2024. For more information, visit www.amdocs.com
In one sentence: Responsible for design, development, modification, debugging, and/or maintenance of software systems.
What will your job look like?
Design, develop, and maintain middleware and ESB-based integration services to enable seamless data exchange between IT systems, vendor platforms, and network applications. Develop and manage APIs, REST/SOAP web services, and messaging frameworks using Enterprise Service Bus (ESB) solutions. Implement message routing, transformation, orchestration, and error-handling patterns for both batch and near-real-time integrations. Collaborate with business, development, and infrastructure teams to gather integration requirements and deliver scalable and reliable middleware solutions. Troubleshoot integration issues across services, protocols, and systems; provide root cause analysis and implement permanent fixes.
Skillset: Enterprise Application Integration (EAI) & Service-Oriented Architecture (SOA). Full API lifecycle management. Integration using microservices. Designing and developing RESTful APIs using Spring Boot and Quarkus. Deploying and managing applications on Red Hat OpenShift. Integrating APIs with IBM API Connect (APIC) for enhanced security and management. Microservices architecture using Java, Node.js, React.js, MongoDB, Kafka, Splunk, Dynatrace, Jenkins, SonarQube, JFrog, and OpenShift/Kubernetes. ESB (Oracle Service Bus) development.
All you need is... Bachelor's degree in Science/IT/Computer Science or equivalent. 3+ years of Java experience (server side) on Linux/Unix/Windows. Demonstrable experience with Spring framework components - Spring Boot, MVC, Integration, Security, etc. Strong understanding of RESTful APIs and open systems.
Why you will love this job: You will be challenged to design and develop new software applications. You will have the opportunity to work in a growing organization, with ever-growing opportunities for personal growth.

Posted 6 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Pune

Work from Office

Amdocs helps those who build the future to make it amazing. With our market-leading portfolio of software products and services, we unlock our customers' innovative potential, empowering them to provide next-generation communication and media experiences for both individual end users and enterprise customers. Our employees around the globe are here to accelerate service providers' migration to the cloud, enable them to differentiate in the 5G era, and digitalize and automate their operations. Listed on the NASDAQ Global Select Market, Amdocs had revenue of $5.00 billion in fiscal 2024. For more information, visit www.amdocs.com
In one sentence: Responsible for design, development, modification, debugging, and/or maintenance of software systems.
What will your job look like?
Design, develop, and maintain middleware and ESB-based integration services to enable seamless data exchange between IT systems, vendor platforms, and network applications. Develop and manage APIs, REST/SOAP web services, and messaging frameworks using Enterprise Service Bus (ESB) solutions. Implement message routing, transformation, orchestration, and error-handling patterns for both batch and near-real-time integrations. Collaborate with business, development, and infrastructure teams to gather integration requirements and deliver scalable and reliable middleware solutions. Troubleshoot integration issues across services, protocols, and systems; provide root cause analysis and implement permanent fixes.
Skillset: Enterprise Application Integration (EAI) & Service-Oriented Architecture (SOA). Full API lifecycle management. Integration using microservices. Designing and developing RESTful APIs using Spring Boot and Quarkus. Deploying and managing applications on Red Hat OpenShift. Integrating APIs with IBM API Connect (APIC) for enhanced security and management. Microservices architecture using Java, Node.js, React.js, MongoDB, Kafka, Splunk, Dynatrace, Jenkins, SonarQube, JFrog, and OpenShift/Kubernetes. ESB (Oracle Service Bus) development.
All you need is... Bachelor's degree in Science/IT/Computer Science or equivalent. 3+ years of Java experience (server side) on Linux/Unix/Windows. Demonstrable experience with Spring framework components - Spring Boot, MVC, Integration, Security, etc. Strong understanding of RESTful APIs and open systems.
Why you will love this job: You will be challenged to design and develop new software applications. You will have the opportunity to work in a growing organization, with ever-growing opportunities for personal growth.

Posted 6 days ago

Apply

4.0 - 9.0 years

8 - 12 Lacs

Pune

Work from Office

Amdocs helps those who build the future to make it amazing. With our market-leading portfolio of software products and services, we unlock our customers' innovative potential, empowering them to provide next-generation communication and media experiences for both individual end users and enterprise customers. Our employees around the globe are here to accelerate service providers' migration to the cloud, enable them to differentiate in the 5G era, and digitalize and automate their operations. Listed on the NASDAQ Global Select Market, Amdocs had revenue of $5.00 billion in fiscal 2024. For more information, visit www.amdocs.com
In one sentence: We are seeking a Data Engineer with advanced expertise in Databricks SQL, PySpark, Spark SQL, and workflow orchestration using Airflow. The successful candidate will lead critical projects, including migrating SQL Server stored procedures to Databricks notebooks, designing incremental data pipelines, and orchestrating workflows in Azure Databricks.
What will your job look like?
Migrate SQL Server stored procedures to Databricks notebooks, leveraging PySpark and Spark SQL for complex transformations. Design, build, and maintain incremental data load pipelines to handle dynamic updates from various sources, ensuring scalability and efficiency. Develop robust data ingestion pipelines to load data into the Databricks Bronze layer from relational databases, APIs, and file systems. Implement incremental data transformation workflows to update Silver and Gold layer datasets in near real time, adhering to Delta Lake best practices. Integrate Airflow with Databricks to orchestrate end-to-end workflows, including dependency management, error handling, and scheduling. Understand business and technical requirements, translating them into scalable Databricks solutions. Optimize Spark jobs and queries for performance, scalability, and cost-efficiency in a distributed environment. Implement robust data quality checks, monitoring solutions, and governance frameworks within Databricks. Collaborate with team members on Databricks best practices, reusable solutions, and incremental loading strategies.
All you need is... Bachelor's degree in Computer Science, Information Systems, or a related discipline. 4+ years of hands-on experience with Databricks, including expertise in Databricks SQL, PySpark, and Spark SQL. Proven experience in incremental data loading techniques into Databricks, leveraging Delta Lake features (e.g., time travel, MERGE INTO). Strong understanding of data warehousing concepts, including data partitioning and indexing for efficient querying. Proficiency in T-SQL and experience in migrating SQL Server stored procedures to Databricks. Solid knowledge of Azure cloud services, particularly Azure Databricks and Azure Data Lake Storage. Expertise in Airflow integration for workflow orchestration, including designing and managing DAGs. Familiarity with version control systems (e.g., Git) and CI/CD pipelines for data engineering workflows. Excellent analytical and problem-solving skills with a focus on detail-oriented development.
Preferred Qualifications: Advanced knowledge of Delta Lake optimizations, such as compaction, Z-ordering, and vacuuming. Experience with real-time streaming data pipelines using tools like Kafka or Azure Event Hubs. Familiarity with advanced Airflow features, such as SLA monitoring and external task dependencies. Certifications such as Databricks Certified Associate Developer for Apache Spark or equivalent. Experience in Agile development methodologies.
Why you will love this job: You will be able to use your specific insights to lead business change on a large scale and drive transformation within our organization. You will be a key member of a global, dynamic, and highly collaborative team with various possibilities for personal and professional development. You will have the opportunity to work in a multinational environment for the global market leader in its field! We offer a wide range of stellar benefits including health, dental, vision, and life insurance, as well as paid time off, sick time, and parental leave!
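
An incremental load of the kind described above, upserting changed rows into a Silver table with Delta Lake's MERGE, might look like this minimal PySpark sketch. The table paths and join key are assumptions, not from the posting.

```python
# Incremental upsert into a Delta table via MERGE (delta-spark API).
# Paths and the join key are assumptions for illustration.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("incremental-load").getOrCreate()

# Changed rows landed in the Bronze layer since the last run.
updates = spark.read.format("delta").load("/mnt/bronze/customers_changes")
silver = DeltaTable.forPath(spark, "/mnt/silver/customers")

(silver.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()      # update rows that already exist
       .whenNotMatchedInsertAll()   # insert genuinely new rows
       .execute())
```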

Posted 6 days ago

Apply

4.0 - 7.0 years

10 - 15 Lacs

Pune

Work from Office

Amdocs helps those who build the future to make it amazing. With our market-leading portfolio of software products and services, we unlock our customers' innovative potential, empowering them to provide next-generation communication and media experiences for both individual end users and enterprise customers. Our approximately 30,000 employees around the globe are here to accelerate service providers' migration to the cloud, enable them to differentiate in the 5G era, and digitalize and automate their operations. Listed on the NASDAQ Global Select Market, Amdocs had revenue of $4.89 billion in fiscal 2023.
In one sentence: Responsible for leading and mentoring a small development team within a specific task or project, side by side with hands-on development.
What will your job look like?
You will provide technical leadership to software engineers by coaching and mentoring throughout end-to-end software development, maintenance, and lifecycle to achieve project goals to the required level of quality, and promote team engagement and motivation. Provide recommendations to the software engineering manager on estimates, resource needs, breakthroughs, and risks; ensure effective delegation, supervising tasks, identifying risks, and handling mitigation and critical issues. Provide hands-on technical and functional mentorship for the design, maintenance, build, integration, and testing of sophisticated software components according to functional and technical design specifications; follow software development methodologies and release processes. You will analyze and report on requirements and provide impact assessments for new features or bug fixes. Make high-level designs and establish technical standards. You will represent and lead discussions related to the product/application/modules/team and build relationships with internal customers/partners. You will implement quality processes (such as performing technical root cause analysis and outlining corrective action for given problems), measure them, take corrective action in case of variances, and ensure all agreed project work is completed to the required level of quality.
All you need is... Very good experience with Java; must be hands-on, with good knowledge of Spring Boot. Hands-on experience/good knowledge/troubleshooting experience with Kubernetes, Docker, microservices, and Kafka. Experience/knowledge of REST. Experience working on a cloud platform (AWS/Azure). Knowledge of DevOps skills like CI/CD (Jenkins). Unix: experience with basic commands. Knowledge of the automation tool Cucumber (an added advantage). Knowledge of ELK (an added advantage). Experience with RDBMS - Oracle/Postgres (Cassandra an added advantage). Very good problem-solving and decision-making skills. Ability to work independently with less supervision. Ready to stretch working hours when necessary to support business needs. Ready to work on Product and Catalog as a config tool.
Why you will love this job: You will be challenged with leading and mentoring a small development team and owning the technical aspects of the project. You will have the opportunity to work in a growing organization, with ever-growing opportunities for personal growth. You will have the opportunity to work with the industry's most sophisticated technologies! We offer a wide range of stellar benefits including health, dental, vision, and life insurance, as well as paid time off, sick time, and parental leave!

Posted 6 days ago

Apply

7.0 years

0 Lacs

Greater Kolkata Area

On-site

About Agoda: Agoda is an online travel booking platform for accommodations, flights, and more. We build and deploy cutting-edge technology that connects travelers with a global network of 4.7M hotels and holiday properties worldwide, plus flights, activities, and more. Based in Asia and part of Booking Holdings, our 7,100+ employees representing 95+ nationalities in 27 markets foster a work environment rich in diversity, creativity, and collaboration. We innovate through a culture of experimentation and ownership, enhancing the ability for our customers to experience the world.
Our Purpose, Bridging the World Through Travel: We believe travel allows people to enjoy, learn, and experience more of the amazing world we live in. It brings individuals and cultures closer together, fostering empathy, understanding, and happiness. We are a skillful, driven, and diverse team from across the globe, united by a passion to make an impact. Harnessing our innovative technologies and strong partnerships, we aim to make travel easy and rewarding for everyone.
Get to Know our Team: In Agoda's Back End Engineering department, we build the scalable, fault-tolerant systems and APIs that host our core business logic. Our systems cover all major areas of our business: inventory and pricing, product information, customer data, communications, partner data, booking systems, payments, and more. These mission-critical systems change frequently with dozens of releases per day, so we must employ state-of-the-art CI/CD and testing techniques to make sure everything works without any downtime. We also ensure that our systems are self-healing, responding gracefully to extreme loads or unexpected input. To accomplish this, we use state-of-the-art languages like Scala and Go, data technologies like Kafka and Aerospike, and agile development practices. Most importantly, we hire great people from all around the world and empower them to be successful. Whether it's building new projects like Flights and Packages or reimagining our existing business, you'll make a big impact as part of the Back End Engineering team.
The Opportunity: Agoda is looking for developers to work on mission-critical systems involved in the design and development of APIs that serve millions of user search requests a day.
In this Role, you'll get to: Lead development of features, experiments, technical projects, and complex systems. Be a technical architect, mentor, and driver towards the right technology. Continue to evolve our architecture and build better software. Be a major contributor to our agile and scrum practices. Get involved with software engineering and collaborate with server, other client, and infrastructure technical team members to build the best solution. Constantly look for ways to improve our products, code base, and development practices. Write great code and help others write great code. Drive technical decisions in the organization.
What You'll Need To Succeed: 7+ years' experience developing performance-critical applications that run in a production environment using Scala, Java, or C#. Experience in leading projects, initiatives, and/or teams, with full ownership of the systems involved. Experience with data platforms like SQL, Cassandra, or Hadoop; you understand that different applications have different data requirements. Good understanding of algorithms and data structures. Strong coding ability. Passion for the craft of software development and constant work to improve your knowledge and skills. Excellent verbal and written English communication skills.
It's Great If You Have: Experience with Scrum/Agile development methodologies. Experience building large-scale distributed products. Core engineering infrastructure tools like Git for source control, TeamCity for continuous integration, and Puppet for deployment. Hands-on experience working with technologies like queueing systems (Kafka, RabbitMQ, ActiveMQ, MSMQ), Spark, Hadoop, NoSQL (Cassandra, MongoDB), the Play framework, and the Akka library.
Equal Opportunity Employer: At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics. We will keep your application on file so that we can consider you for future vacancies, and you can always ask to have your details removed from the file. For more details, please read our privacy policy.
Disclaimer: We do not accept any terms or conditions, nor do we recognize any agency's representation of a candidate, from unsolicited third-party or agency submissions. If we receive unsolicited or speculative CVs, we reserve the right to contact and hire the candidate directly without any obligation to pay a recruitment fee.

Posted 6 days ago

Apply

5.0 - 10.0 years

19 - 32 Lacs

Pune

Hybrid

BMC is looking for a C++ Specialist (Development & Maintenance) to join our product R&D support and patch development efforts. In this role, you'll be part of a new engineering team and tackle high-impact challenges to enhance product reliability and deliver exceptional value to our customers. From addressing critical issues in our software to collaborating with globally distributed teams, you'll play a pivotal role in ensuring our software meets the highest quality and performance standards. If you're passionate about problem-solving, working in a collaborative and supportive environment, and making a direct impact on product quality, this is the role for you! Here is how, through this exciting role, YOU will contribute to BMC's and your own success: Be part of a new engineering team, focused on product support and patch development for mission-critical software. Develop, test, and implement diagnostic tools and processes to enhance product performance. Contribute to patch development cycles, ensuring timely delivery and quality assurance. Take ownership of specific technical tasks and drive them to completion with a sense of urgency. Continuously learn and contribute to the growth of the team by sharing knowledge and best practices. Work cross-functionally to ensure software quality meets customer expectations. To ensure you're set up for success, you will bring the following skillset and experience: Bachelor's degree in Computer Science, Engineering, or a related field. 5+ years of experience in a similar role. 3+ years of experience in C++. Proficiency in Linux and Windows OS. Deep understanding of database technologies (PostgreSQL, MySQL, Oracle). Result-driven problem-solver at heart. Ability to work effectively both independently and as part of a team. Excellent communication and collaboration skills. Whilst these are nice to have, our team can help you develop the following skills: Experience with Java. Experience with messaging systems (Kafka or similar). Experience working in an enterprise product-based company.

Posted 6 days ago

Apply

6.0 - 10.0 years

10 - 14 Lacs

Chennai, Guindy

Work from Office

Overview: We are seeking an experienced Technical Lead with a strong background in Java software development. The ideal candidate should possess hands-on coding expertise, architectural understanding, and the leadership ability to drive development teams in building scalable, high-performance applications. This role involves technical mentorship, solution architecture, and ensuring best development practices are followed.
Roles and Responsibilities:
1. Technical Leadership & Solution Architecture: Define technical architecture and design for Java-based applications. Provide technical direction and mentor software engineers. Conduct code reviews to ensure high quality standards. Define best practices for development, security, and performance optimization. Guide the migration of legacy applications to modern frameworks.
2. Software Development & Deployment: Design, develop, and maintain scalable Java microservices. Work on database architecture and optimization. Implement automated CI/CD pipelines for seamless deployments. Optimize backend performance, caching, and data processing.
3. Cross-Team Collaboration: Work closely with product owners, UX/UI designers, and DevOps. Collaborate with cloud, security, and data engineering teams. Ensure alignment with business goals and technical feasibility.
4. Cloud & DevOps Implementation: Deploy applications to AWS, Azure, or GCP using containerization (Docker, Kubernetes). Manage scalability, monitoring, and logging (Azure Monitor, AWS CloudWatch, Prometheus, ELK Stack). Automate infrastructure provisioning and cloud resource management.
5. Agile & Team Management: Participate in sprint planning, standups, and retrospectives. Track and manage work using JIRA, Trello, or Azure DevOps. Train and mentor junior developers and ensure knowledge sharing.
Primary Skills: Core Java, Java 8+ (or latest version). Spring Boot, Spring Framework (Spring MVC, Spring Security, Spring Cloud). Microservices architecture & API development. RESTful web services, GraphQL (optional but preferred). Database management (MySQL, PostgreSQL, MongoDB). Message brokers (Kafka, RabbitMQ). Cloud services (AWS, Azure, or GCP; any one preferred). DevOps & CI/CD (Docker, Kubernetes, Jenkins, GitHub Actions, Terraform). Security & authentication (OAuth2, JWT, SSO, OpenID). Performance optimization & system scalability.
Secondary Skills: Frontend framework knowledge (React.js, Angular, or Vue.js). Containerization & orchestration (Docker, Kubernetes). Event-driven architecture (Kafka, RabbitMQ, ActiveMQ). Infrastructure as Code (Terraform, CloudFormation). Unit testing & automation (JUnit, Mockito, Cypress). Agile & Scrum practices (JIRA, Confluence, standups, sprint planning). Technical documentation & architectural design patterns. AI & machine learning basics (optional but good to have).
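
For the token-based authentication item above, the issue-and-verify cycle of a JWT can be sketched with the PyJWT library. The secret, claims, and token lifetime are placeholders; a Spring stack like the one in this posting would typically use Spring Security instead.

```python
# JWT issue/verify sketch with PyJWT; secret, claims, and lifetime
# are placeholders. A Spring stack would use Spring Security.
import datetime
import jwt  # PyJWT

SECRET = "change-me"  # in practice: an injected secret, or RS256 keys

def issue_token(user_id: str) -> str:
    now = datetime.datetime.now(datetime.timezone.utc)
    claims = {"sub": user_id, "iat": now,
              "exp": now + datetime.timedelta(minutes=15)}
    return jwt.encode(claims, SECRET, algorithm="HS256")

def verify_token(token: str) -> dict:
    # Raises jwt.ExpiredSignatureError / jwt.InvalidTokenError on failure.
    return jwt.decode(token, SECRET, algorithms=["HS256"])

print(verify_token(issue_token("user-42"))["sub"])  # -> user-42
```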

Posted 6 days ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Job requisition ID: JR1025488
Overall Responsibilities: The Node.js Developer (TypeScript and GraphQL) will be responsible for developing, implementing, and maintaining scalable and high-performance applications using Node.js technologies. The role involves working with TypeScript and GraphQL to develop RESTful APIs and other backend solutions. The developer will collaborate with cross-functional teams to ensure the timely delivery of high-quality software solutions.
Software: Coding experience in Node.js, JavaScript, and databases; hands-on experience in TypeScript. Proven expertise in performance tuning, debugging, and monitoring applications.
Technical Skills (category-wise):
API Development: Excellent knowledge of developing scalable and highly available RESTful APIs using Node.js technologies. Practical experience with GraphQL.
CI/CD and DevOps: Well-versed in CI/CD principles. Actively involved in solving and troubleshooting issues in distributed services ecosystems. Understanding of containerization; experienced in Docker and Kubernetes. Exposure to API gateway integrations like 3scale.
Authentication and Security: Understanding of single sign-on and token-based authentication (REST, JWT, OAuth). Expert knowledge of task/message queues, including but not limited to AWS, Microsoft Azure, Pushpin, and Kafka.
Functional Skills: Experience in following best coding, security, unit testing, and documentation standards and practices. Experience in banking, financial, and fintech environments preferred. Proficiency in Agile methodology. Ensure the quality of technical and application architecture and design of systems across the organization. Effectively research and benchmark technology against other best-in-class technologies.
Experience: Minimum 5 years of coding experience in Node.js, JavaScript, and databases. At least 3 years of hands-on experience in TypeScript. Hands-on experience in performance tuning, debugging, and monitoring applications.
Day-to-Day Activities: Develop and maintain scalable and highly available RESTful APIs using Node.js and TypeScript. Implement and manage GraphQL for efficient data fetching. Engage in performance tuning, debugging, and monitoring of applications. Collaborate with cross-functional teams to ensure the timely delivery of high-quality software solutions. Troubleshoot and resolve issues in distributed services ecosystems using CI/CD principles. Work with containerization technologies like Docker and Kubernetes. Integrate API gateways such as 3scale. Implement and manage single sign-on and token-based authentication mechanisms. Utilize task/message queues like AWS, Microsoft Azure, Pushpin, and Kafka. Follow best coding, security, unit testing, and documentation standards and practices. Participate in Agile development processes. Conduct research and benchmark technology against industry best practices.
Qualification: Degree or postgraduate degree in Computer Science or a related field (or equivalent industry experience).
Soft Skills: Able to influence multiple teams on technical considerations, increasing their productivity and effectiveness by sharing deep knowledge and experience. Self-motivated self-starter with the ability to own and drive tasks without supervision. Works collaboratively with teams across the organization. Excellent soft and interpersonal skills to interact with and present ideas to senior and executive management.
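
GraphQL is JSON over HTTP at the transport level, so the data-fetching pattern this role centres on can be exercised from any language. A minimal sketch with Python's requests; the endpoint, query, and fields are hypothetical, and the role's own resolvers would of course be Node.js/TypeScript.

```python
# Minimal GraphQL-over-HTTP call; endpoint, query, and fields are
# hypothetical. The role's own server side is Node.js/TypeScript.
import requests

QUERY = """
query Account($id: ID!) {
  account(id: $id) { name balance }
}
"""

resp = requests.post(
    "https://api.example.com/graphql",
    json={"query": QUERY, "variables": {"id": "42"}},
    timeout=10,
)
resp.raise_for_status()
data = resp.json()
# GraphQL reports field-level failures in an "errors" array,
# even on HTTP 200, so check it explicitly.
if data.get("errors"):
    raise RuntimeError(data["errors"])
print(data["data"]["account"])
```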

Posted 6 days ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Chennai, Bengaluru

Work from Office

Job requisition ID: JR1027452

Overall Responsibilities:
- Data Pipeline Development: Design, develop, and maintain highly scalable and optimized ETL pipelines using PySpark on the Cloudera Data Platform (CDP), ensuring data integrity and accuracy.
- Data Ingestion: Implement and manage data ingestion processes from a variety of sources (e.g., relational databases, APIs, file systems) to the data lake or data warehouse on CDP.
- Data Transformation and Processing: Use PySpark to process, cleanse, and transform large datasets into meaningful formats that support analytical needs and business requirements.
- Performance Optimization: Conduct performance tuning of PySpark code and Cloudera components, optimizing resource utilization and reducing the runtime of ETL processes.
- Data Quality and Validation: Implement data quality checks, monitoring, and validation routines to ensure data accuracy and reliability throughout the pipeline.
- Automation and Orchestration: Automate data workflows using tools like Apache Oozie, Airflow, or similar orchestration tools within the Cloudera ecosystem.
- Monitoring and Maintenance: Monitor pipeline performance, troubleshoot issues, and perform routine maintenance on the Cloudera Data Platform and associated data processes.
- Collaboration: Work closely with other data engineers, analysts, product managers, and other stakeholders to understand data requirements and support various data-driven initiatives.
- Documentation: Maintain thorough documentation of data engineering processes, code, and pipeline configurations.

Category-wise Technical Skills:
- PySpark: Advanced proficiency in PySpark, including working with RDDs, DataFrames, and optimization techniques.
- Cloudera Data Platform: Strong experience with CDP components, including Cloudera Manager, Hive, Impala, HDFS, and HBase.
- Data Warehousing: Knowledge of data warehousing concepts, ETL best practices, and experience with SQL-based tools (e.g., Hive, Impala).
- Big Data Technologies: Familiarity with Hadoop, Kafka, and other distributed computing tools.
- Orchestration and Scheduling: Experience with Apache Oozie, Airflow, or similar orchestration frameworks.
- Scripting and Automation: Strong scripting skills in Linux.

Experience:
- 5-12 years of experience as a Data Engineer, with a strong focus on PySpark and the Cloudera Data Platform.
- Proven track record of implementing data engineering best practices.
- Experience in data ingestion, transformation, and optimization on the Cloudera Data Platform.

Day-to-Day Activities:
- Design, develop, and maintain ETL pipelines using PySpark on CDP.
- Implement and manage data ingestion processes from various sources.
- Process, cleanse, and transform large datasets using PySpark.
- Conduct performance tuning and optimization of ETL processes.
- Implement data quality checks and validation routines.
- Automate data workflows using orchestration tools.
- Monitor pipeline performance and troubleshoot issues.
- Collaborate with team members to understand data requirements.
- Maintain documentation of data engineering processes and configurations.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Relevant certifications in PySpark and Cloudera technologies are a plus.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent verbal and written communication abilities.
- Ability to work independently and collaboratively in a team environment.
- Attention to detail and commitment to data quality.
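To make the PySpark ETL work described above concrete, here is a minimal sketch of a cleanse-and-transform step of the kind the posting calls for. The input path, table, and column names are hypothetical placeholders, not part of the posting:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders-etl")
    .enableHiveSupport()  # typical on a Cloudera/CDP cluster with a Hive metastore
    .getOrCreate()
)

# Ingest raw files from HDFS (could equally be a JDBC source or cloud storage).
raw = spark.read.option("header", True).csv("hdfs:///data/raw/orders/")

# Cleanse and transform: dedupe, enforce types, derive a partition column.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))
)

# A simple data-quality gate before publishing to the warehouse layer.
bad_rows = clean.filter(F.col("amount") < 0).count()
if bad_rows:
    raise ValueError(f"{bad_rows} rows failed validation")

# Publish as a partitioned Parquet table for Hive/Impala consumers.
(clean.write.mode("overwrite")
      .partitionBy("order_date")
      .format("parquet")
      .saveAsTable("analytics.orders_clean"))
```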

Posted 6 days ago

Apply

7.0 - 12.0 years

14 - 19 Lacs

Pune, Hinjewadi

Work from Office


Job requisition ID: JR1027427

Job Summary:
Synechron is seeking a highly experienced and technically proficient Lead Java Developer to lead the design, development, and deployment of enterprise-grade backend solutions. In this leadership role, you will spearhead critical projects involving microservices, multithreading, and message-driven architectures, contributing directly to the organization's digital transformation initiatives. Your expertise will ensure scalable, secure, and efficient systems that align with business objectives and industry best practices.

Required Skills:
- Java (Java 8+ preferred)
- Spring Framework (Spring Boot, Spring MVC, Spring Data)
- Messaging APIs: Kafka, Solace, Aeron, LBM, or similar
- Scripting: Shell scripting, Groovy, Perl (basic proficiency)
- NoSQL data storage: MongoDB, AMPS (advanced working knowledge)
- Multithreading and concurrency management in Java
- Version control tools such as Git
- Build tools: Maven or Gradle

Preferred Skills:
- Cloud platforms (AWS, Azure, GCP) knowledge
- Containerization and orchestration: Docker, Kubernetes
- API design and development
- Monitoring and logging tools

Overall Responsibilities:
- Lead and develop scalable Java backend systems, ensuring robust functional and non-functional requirements are met.
- Architect microservices and API-driven solutions with high availability and performance.
- Implement multithreading, concurrency, and asynchronous processing to optimize system responsiveness.
- Design and manage data schemas and workflows using MongoDB and related storage solutions.
- Integrate messaging systems such as Kafka, Solace, or Aeron to facilitate real-time data exchange.
- Collaborate with cross-functional teams, including product owners, QA, and DevOps, to ensure seamless delivery.
- Review code, establish best practices, and enforce coding standards to maintain high quality.
- Mentor junior developers and provide technical guidance across projects.
- Drive continuous improvement in system architecture, security, and performance.

Technical Skills (By Category):
- Programming Languages: Essential: Java (Java 8+), Shell scripting, Groovy. Preferred: Kotlin, Perl scripting.
- Databases/Data Management: Essential: MongoDB, AMPS. Preferred: Data modeling, indexing, and optimization.
- Messaging & Communication: Essential: Kafka, Solace, Aeron, LBM. Preferred: RabbitMQ, ActiveMQ.
- Cloud & Infrastructure: Preferred: Experience with cloud providers (AWS, Azure, GCP) and container orchestration tools.
- Frameworks & Libraries: Essential: Spring Boot, Spring MVC, Spring Data. Preferred: Reactor, WebFlux.
- Development Tools & Methodologies: Essential: Maven, Gradle, Git, CI/CD pipelines. Preferred: Jenkins, Azure DevOps, monitoring dashboards.
- Security Protocols: Basic understanding of OAuth, TLS, and secure API development.

Experience:
- Minimum 7 years of professional experience in Java development, with demonstrated leadership in complex backend systems.
- Proven experience in designing and implementing microservices architectures.
- Hands-on expertise with multithreading, concurrency, and messaging APIs.
- Domain experience in finance, banking, or enterprise solutions preferred.
- Demonstrable success in implementing scalable and high-performance systems.
- Experience mentoring peers and leading technical teams.

Day-to-Day Activities:
- Leading development efforts on backend systems, ensuring adherence to best practices.
- Designing and implementing microservices, APIs, and messaging-driven data flows.
- Collaborating with architecture teams to define service patterns and integration strategies.
- Conducting code reviews and providing technical mentorship.
- Monitoring application performance, diagnosing issues, and optimizing system throughput.
- Participating in Agile ceremonies, sprint planning, and stakeholder communication.
- Developing and maintaining documentation of system architecture, APIs, and processes.
- Staying current with emerging Java, microservices, and messaging technologies to drive innovation.

Qualifications:
- Educational: Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field, or equivalent professional experience in enterprise Java development.
- Certifications (preferred): Java, Spring, cloud (AWS, Azure), or messaging platforms.
- Training & Development: Commitment to continuous professional growth and staying updated with relevant technologies.

Professional Competencies:
- Strong analytical and problem-solving skills.
- Leadership qualities with the ability to influence teams and stakeholders.
- Effective communication skills for technical and non-technical audiences.
- Ability to prioritize tasks efficiently in a fast-paced environment.
- Adaptability to evolving technologies and project scopes.
- Commitment to high standards for quality, security, and system reliability.
- Collaboration and team-building skills to foster a productive work environment.
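The message-driven architecture this role describes centers on Kafka-style event flows. The role itself is Java-focused, but as a language-neutral illustration of the pattern, here is a minimal producer sketch using Python's confluent-kafka client; the broker address, topic, and payload are hypothetical:

```python
import json
from confluent_kafka import Producer

# Hypothetical broker; real deployments use the cluster's bootstrap list.
producer = Producer({"bootstrap.servers": "localhost:9092"})

def on_delivery(err, msg):
    # Asynchronous delivery report; production code would log and alert here.
    if err is not None:
        print(f"delivery failed: {err}")

event = {"trade_id": "T-1001", "symbol": "ACME", "qty": 100}
producer.produce(
    topic="trades",                        # hypothetical topic name
    key=event["trade_id"].encode(),
    value=json.dumps(event).encode(),
    on_delivery=on_delivery,
)
producer.flush()  # block until outstanding messages are acknowledged
```

In a Java/Spring stack like the one above, the equivalent would typically be a KafkaTemplate or the plain Kafka Java client; the delivery-callback and flush semantics are the same.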

Posted 6 days ago

Apply

7.0 - 12.0 years

16 - 20 Lacs

Chennai, Bengaluru

Work from Office


Job requisition ID: JR1027511

Job Summary:
We are seeking a highly skilled Senior Fullstack Developer with advanced expertise in Next.js and React to join our dynamic team in Bangalore. The ideal candidate will have industry experience focused on designing, developing, and maintaining scalable web applications that ensure high performance and responsiveness. This role is pivotal in driving Synechron's strategic objectives and contributing to the success of our digital transformation initiatives.

Required Proficiency:
- Next.js and React, with a focus on Server-Side Rendering (SSR) and Static Site Generation (SSG).
- RESTful APIs and microservices architecture.
- C# for backend development, emphasizing Object-Oriented Programming (OOP) concepts and design patterns.
- Source control using Git; CI/CD processes with tools such as TeamCity or Jenkins, and deployment using Octopus or similar tools.
- Relational databases (MSSQL/Oracle) and NoSQL databases (MongoDB).
- Basic knowledge of AWS; experience with Azure required.
- Monitoring application performance using Splunk and performance tooling like AppDynamics.

Preferred Proficiency:
- Familiarity with Kafka.
- Knowledge of GitHub Actions.

Overall Responsibilities:
- Develop and maintain web applications using Next.js and React, implementing SSR, SSG, and client-side navigation.
- Design and develop RESTful APIs and microservices to support application functionality.
- Perform unit, integration, and automation testing to ensure code quality and reliability.
- Collaborate with cross-functional teams in an Agile environment to deliver high-quality software solutions.
- Monitor and improve application performance and health using performance tools.
- Evaluate new technologies and make recommendations for adoption.
- Develop and maintain technology roadmaps, ensuring alignment with overall business strategy.

Technical Skills (By Category):
- Programming Languages: Required: JavaScript (React, Next.js), C#. Preferred: Additional JavaScript libraries or frameworks.
- Databases/Data Management: Required: MSSQL/Oracle, MongoDB.
- Cloud Technologies: Required: Azure. Preferred: AWS.
- Frameworks and Libraries: Required: Next.js, React.
- Development Tools and Methodologies: Required: Git, CI/CD (TeamCity, Jenkins, Octopus). Preferred: GitHub Actions.
- Monitoring Tools: Required: Splunk, AppDynamics.

Experience:
- 7-12 years of experience as a Full-Stack Engineer or in a similar role.
- Proven track record in developing scalable web applications and managing complex projects.
- Experience with Agile methodologies and project management tools.

Day-to-Day Activities:
- Engage in both front-end and back-end development tasks.
- Participate in regular team meetings and contribute to project planning and execution.
- Deliver high-quality code and contribute to system architecture discussions.
- Exercise decision-making authority in software design and development processes.

Qualifications:
- Required: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Preferred: Relevant certifications in web technologies or cloud platforms.
- Commitment to continuous professional development through industry certifications and training.

Professional Competencies:
- Critical thinking and problem-solving capabilities.
- Leadership and teamwork abilities.
- Excellent communication and stakeholder management skills.
- Adaptability and learning orientation.
- Innovation mindset to drive digital transformation initiatives.
- Effective time and priority management skills.

Posted 6 days ago

Apply

8.0 years

0 Lacs

India

On-site


Staff Software Engineer, Data Ingestion

The Staff Software Engineer, Data Ingestion will be a critical individual contributor responsible for designing, developing, and maintaining robust and scalable data pipelines. This role is at the heart of our data ecosystem, delivering new analytical software solutions that provide timely, accurate, and complete data for insights, products, and operational efficiency.

Key Responsibilities:
- Design, develop, and maintain high-performance, fault-tolerant data ingestion pipelines using Python.
- Integrate with diverse data sources (databases, APIs, streaming platforms, cloud storage, etc.).
- Implement data transformation and cleansing logic during ingestion to ensure data quality.
- Monitor and troubleshoot data ingestion pipelines, identifying and resolving issues promptly.
- Collaborate with database engineers to optimize data models for fast consumption.
- Evaluate and propose new technologies or frameworks to improve ingestion efficiency and reliability.
- Develop and implement self-healing mechanisms for data pipelines to ensure continuity.
- Define and uphold SLAs and SLOs for data freshness, completeness, and availability.
- Participate in the on-call rotation as needed for critical data pipeline issues.

Key Skills:
- 8+ years of experience, ideally with an engineering background, working in software product companies.
- Extensive Python expertise: experience developing robust, production-grade applications with Python.
- Data collection and integration: proven experience collecting data from various sources (REST APIs, OAuth, GraphQL, Kafka, S3, SFTP, etc.).
- Distributed systems and scalability: strong understanding of distributed systems concepts, designing for scale, performance optimization, and fault tolerance.
- Cloud platforms: experience with major cloud providers (AWS or GCP) and their data-related services (e.g., S3, EC2, Lambda, SQS, Kafka, Cloud Storage, GKE).
- Database fundamentals: solid understanding of relational databases (SQL, schema design, indexing, query optimization); OLAP database experience (e.g., Hadoop) is a plus.
- Monitoring and alerting: experience with monitoring tools (e.g., Prometheus, Grafana) and setting up effective alerts.
- Version control: proficiency with Git.
- Containerization (plus): experience with Docker and Kubernetes.
- Streaming technologies (plus): experience with real-time data processing using Kafka, Flink, or Spark Streaming.
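A minimal sketch of the kind of ingestion work described above: pulling records from a paginated REST API with retries and backoff. The endpoint, page parameter, and response shape are hypothetical:

```python
import time
import requests

def fetch_all(base_url: str, max_retries: int = 3):
    """Yield records from a paginated JSON API, retrying transient failures."""
    page = 1
    while True:
        for attempt in range(max_retries):
            try:
                resp = requests.get(base_url, params={"page": page}, timeout=30)
                resp.raise_for_status()
                break
            except requests.RequestException:
                if attempt == max_retries - 1:
                    raise  # exhausted retries; surface the failure
                time.sleep(2 ** attempt)  # exponential backoff
        batch = resp.json().get("results", [])
        if not batch:
            return  # no more pages
        yield from batch
        page += 1

for record in fetch_all("https://api.example.com/v1/events"):
    ...  # transform/cleanse here, then hand off to the loader
```

Production pipelines would add structured logging, metrics for the SLA/SLO monitoring the posting mentions, and checkpointing so a restart resumes rather than re-ingests.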

Posted 6 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description
Are you passionate about delivering high-quality enterprise applications and solutions built with cutting-edge technologies? Are you intrigued by the possibility of contributing to a large-scale suite of applications built for the cloud and used by major communications brands across the globe? Would you like to join a highly collaborative agile team where you will be empowered to tackle complex objectives while growing your skills and career?

The Oracle Communications Order & Service Management (OSM) team is looking for a talented Senior Quality Assurance Engineer to join our team, contributing to the OSM product's ambitious roadmap, including our integration into the Digital Business Experience application suite. The architecture includes a mix of advanced Java, Kubernetes, WebLogic, Oracle Database, Java MicroProfile (Helidon) microservices, REST, JSON, and Kafka. We use modern tools, including Oracle's Cloud, and are proud to run a profitable business while remaining passionate about quality, security, and our Agile practices. You will work from Oracle's Hyderabad, India office, collaborating with a talented local technical team as well as members in the North America time zone.

The successful candidate will have:
- B.Tech/B.E/M.Tech/M.E/MCA in Computer Science or Information Technology, or equivalent.
- 6+ years of software testing experience.
- Strong knowledge of software quality engineering methodologies, tools, and processes.
- Experience with test automation tools.
- Strong knowledge of Linux.
- Experience with relational database systems, especially Oracle Database.
- Knowledge of cloud-native principles, including experience in Kubernetes, Helm, Linux scripting, and related technologies, will be an asset.
- Thorough understanding of best software practices (e.g., design, testing, quality, security, performance).
- Experience with Agile (e.g., Scrum).
- Experience with Oracle OSM or competitors (e.g., Vlocity, Amdocs), or with integration of other fulfillment products into CRM systems, would be a great asset.
- Excellent analytical and problem-solving abilities.
- Ability to carry out test requirement analysis, design, testing, and documentation.
- A proven collaborator who is self-starting and can contribute to our collegiate culture.
- Good communication skills, verbal and written (English).
- Ability to work independently and as a member of a small team with a common goal.

Career Level: IC3

Responsibilities:
As a member of the OSM Quality Engineering team, you will:
- Actively participate in requirement analysis, test design, and execution of key functionalities, along with designing automated test cases.
- Set up complex deployments of OSM and configure its integration with other systems.
- Contribute to certification activities with new versions of software (e.g., database), new Kubernetes versions, new OS versions, or new public cloud services.
- Develop standards and procedures to provide quality guidance and methods.
- Automate operational procedures (incl. CI/CD) using scripting and pipeline frameworks.
- Administer and operate shared specialized servers, databases, and clusters.
- Contribute to accurate and useful technical documentation.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
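As a generic illustration of the automated test design this role emphasizes, here is a small pytest-style sketch against a hypothetical REST endpoint. It does not model OSM's actual APIs; the base URL, payload, and state names are invented for illustration:

```python
import pytest
import requests

BASE_URL = "http://test-env.example.com/orders"  # hypothetical test deployment

@pytest.fixture
def new_order():
    # Submit a test order and hand its representation to the test.
    resp = requests.post(BASE_URL, json={"product": "broadband", "qty": 1})
    resp.raise_for_status()
    return resp.json()

def test_order_leaves_initial_state(new_order):
    # Fetch the order back and assert it has moved past submission.
    resp = requests.get(f"{BASE_URL}/{new_order['id']}")
    resp.raise_for_status()
    assert resp.json()["state"] in {"in_progress", "completed"}
```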

Posted 6 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


JOB DESCRIPTION:
We are seeking an experienced Full Stack Java Developer skilled in designing, developing, and deploying scalable applications using Java/J2EE, Spring Boot, Microservices, and Angular. The ideal candidate will have hands-on experience with Gradle, Kafka, MySQL, Amazon S3, Docker, and AWS cloud services. You will play a key role in building robust backend systems and intuitive front-end interfaces, leveraging modern DevOps and cloud-native practices.

Key Responsibilities:
- Design and implement scalable microservices using Java/J2EE and Spring Boot frameworks.
- Develop RESTful APIs to facilitate communication between distributed services.
- Build and maintain front-end components using Angular (or similar frameworks).
- Integrate and manage message-driven architectures using Kafka for event streaming and processing.
- Use Gradle for dependency management and build automation.
- Implement and optimize relational database solutions using MySQL.
- Manage file storage and retrieval operations with Amazon S3.
- Containerize applications using Docker and orchestrate deployments on AWS.
- Collaborate with DevOps teams to automate CI/CD pipelines and infrastructure provisioning.
- Ensure application reliability, scalability, and security in a cloud-native (AWS) environment.
- Participate in code reviews, unit testing, and integration testing to ensure code quality.
- Document technical designs and implementation processes for future reference.
- Engage in Agile/Scrum ceremonies and collaborate with cross-functional teams.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience (typically 3+ years) in Java/J2EE application development.
- Proficiency in the Spring Framework, including Spring Boot and Spring Data JPA.
- Strong understanding of microservices architecture and RESTful API design.
- Hands-on experience with Gradle for build automation.
- Proficiency in integrating and configuring Kafka for messaging solutions.
- Experience with MySQL or other relational databases.
- Familiarity with Amazon S3 for cloud storage operations.
- Front-end development skills with Angular (TypeScript, HTML, CSS).
- Experience with Docker for containerization and AWS for cloud deployments.
- Knowledge of CI/CD tools and DevOps practices.
- Strong problem-solving, communication, and teamwork skills.

Preferred/Good to Have:
- Experience with additional AWS services (EC2, Lambda, EKS, etc.).
- Familiarity with security best practices (OAuth2, JWT, etc.).
- Exposure to observability tools (Prometheus, Grafana) and infrastructure-as-code (Terraform).
- Experience with Agile methodologies and mentoring junior developers.
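The posting above pairs Amazon S3 storage with a Java stack; the S3 access pattern itself is language-neutral. As a minimal sketch, shown here in Python with boto3 (the role would use the AWS SDK for Java), with a hypothetical bucket and key:

```python
import boto3

s3 = boto3.client("s3")

# Upload a generated artifact, then fetch it back.
s3.upload_file("report.pdf", "acme-app-artifacts", "reports/2024/report.pdf")
s3.download_file("acme-app-artifacts", "reports/2024/report.pdf", "/tmp/report.pdf")

# Pre-signed URL so a front-end client can read the object without AWS credentials.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "acme-app-artifacts", "Key": "reports/2024/report.pdf"},
    ExpiresIn=3600,  # seconds
)
print(url)
```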

Posted 6 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Hybrid mode of work: 3 days/week

Project description
The WMI Core stream provides Core Banking capabilities across WM International locations and works towards integration and synergies across WMI locations, driving a capability-driven and modular platform strategy for Core Banking. We are seeking a Technical Lead with strong experience in Temenos Transact development, particularly in core banking customizations, interfaces, and API integrations. This role requires hands-on leadership to deliver scalable and maintainable banking solutions.

Responsibilities
We are looking for a DevOps engineer with Azure experience who is passionate and ready to develop state-of-the-art technology solutions for their digital platforms. This job brings a variety of challenges on a daily basis: you will need to understand the business needs and, applying your creative thinking, design and develop solutions in Azure with Kafka and DevOps capabilities, implementing them according to DevOps practices. This job is for someone who is excited to work with cutting-edge technologies and motivated to work with large amounts of complex cloud-based data.

Skills
Must have:
- Significant experience in designing and developing Azure (with Kubernetes) solutions
- Strong knowledge and experience of working with Kafka
- Comfortable working with large amounts of data
- Knowledge of technologies such as Docker and Kubernetes
- DevOps skills are also essential
- Good Postgres DB knowledge
- Microsoft Azure expertise and certification is a plus

Nice to have:
- A positive attitude, willingness to learn, and desire to improve the environment around you
- Knowledge of virtualization and containerization
- Track record as an engineer working in a globally distributed team
- On-the-job examples of working in a fast-paced Agile environment

Languages
English: C1-C2
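To make the Kafka side of this role concrete, a minimal consumer-loop sketch with Python's confluent-kafka client. The broker, group, and topic names are hypothetical; in the Azure setup described above they would point at the managed cluster:

```python
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "kafka.internal:9092",  # hypothetical broker
    "group.id": "payments-processor",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["payments"])  # hypothetical topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue  # no message within the poll window
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        # Real handlers would deserialize, process, and commit offsets explicitly.
        print(msg.key(), msg.value())
finally:
    consumer.close()  # leave the group cleanly so partitions rebalance
```

In a Kubernetes deployment such as the one this posting describes, this loop would typically run as a pod with liveness probes and externally injected configuration.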

Posted 6 days ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY Strategy and Transactions (SaT) – DnA Associate Manager
EY's Data n' Analytics team is a multi-disciplinary technology team delivering client projects and solutions across Data Management, Visualization, Business Analytics and Automation. The assignments cover a wide range of countries and industry sectors.

The opportunity
We're looking for an Associate Manager - Data Engineering. The main objective of the role is to support cloud and on-prem platform analytics and data engineering projects initiated across engagement teams. The role will primarily involve conceptualizing, designing, developing, deploying and maintaining complex technology solutions which help EY solve business problems for its clients. You will work closely with technical architects, product and business subject matter experts (SMEs), back-end developers and other solution architects; the role is also on-shore facing. You will be instrumental in designing, developing, and evolving modern data warehousing solutions and data integration build-outs using cutting-edge tools and platforms for both on-prem and cloud architectures. In this role you will produce design specifications and documentation, develop data migration mappings and transformations for a modern data warehouse / data mart set-up, and define robust ETL processing to collect and scrub both structured and unstructured data, providing self-serve capabilities (OLAP) to create impactful decision analytics reporting.

Your Key Responsibilities:
- Evaluating and selecting data warehousing tools for business intelligence, data population, data management, metadata management and warehouse administration, for both on-prem and cloud-based engagements.
- Applying strong working knowledge across the technology stack, including ETL, ELT, data analysis, metadata, data quality, audit and design.
- Designing, developing, and testing in an ETL tool environment (GUI/canvas-driven tools to create workflows).
- Producing design documentation (data mapping, technical specifications, production support, data dictionaries, test cases, etc.).
- Providing technical leadership to a team of data warehouse and business intelligence developers.
- Coordinating with other technology users to design and implement matters of data governance, data harvesting, cloud implementation strategy, privacy, and security.
- Adhering to ETL/Data Warehouse development best practices.
- Owning data orchestration, ingestion, ETL and reporting architecture for both on-prem and cloud (MS Azure/AWS/GCP).
- Assisting the team with performance tuning for ETL and database processes.

Skills And Attributes For Success:
- Minimum of 7 years of total experience, with 3+ years in the data warehousing / business intelligence field.
- Solid hands-on 3+ years of professional experience with the creation and implementation of data warehouses on client engagements, including enhancements to an existing data warehouse.
- Strong knowledge of data architecture for staging and reporting schemas, data models and cutover strategies, using industry-standard tools and technologies.
- Architecture design and implementation experience with medium to complex on-prem to cloud migrations with any of the major cloud platforms (preferably AWS/Azure/GCP).
- Minimum 3+ years of experience in Azure database offerings (relational, NoSQL, data warehouse).
- 2+ years of hands-on experience in various Azure services preferred: Azure Data Factory, Kafka, Azure Data Explorer, Storage, Azure Data Lake, Azure Synapse Analytics, Azure Analysis Services and Databricks.
- Minimum of 3 years of hands-on database design, modeling and integration experience with relational data sources, such as SQL Server, Oracle/MySQL, Azure SQL and Azure Synapse.
- Strong in PySpark and SparkSQL.
- Knowledge of and direct experience using business intelligence reporting tools (Power BI, Alteryx, OBIEE, Business Objects, Cognos, Tableau, MicroStrategy, SSAS Cubes, etc.).
- Strong creative instincts related to data analysis and visualization, and the curiosity to learn the business methodology, data model and user personas.
- Strong understanding of BI and DWH best practices, analysis, visualization, and latest trends.
- Experience with the software development lifecycle (SDLC) and principles of product development such as installation, upgrade and namespace management.
- Willingness to mentor team members.
- Solid analytical, technical and problem-solving skills.
- Excellent written and verbal communication skills.

To qualify for the role, you must have:
- Bachelor's or equivalent degree in computer science or a related field (required); an advanced degree or equivalent business experience is preferred.
- A fact-driven and analytical mindset with excellent attention to detail.
- Hands-on experience with data engineering tasks such as building analytical data records, and experience manipulating and analyzing large volumes of data.
- Relevant work experience of a minimum of 6 to 8 years in a Big 4 or technology/consulting setup.

Ideally, you'll also have:
- The ability to think strategically and end-to-end with a result-oriented mindset.
- The ability to build rapport within the firm and win the trust of clients.
- Willingness to travel extensively and to work on client sites / practice office locations.
- Experience in Snowflake.

What We Look For:
- People with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment.
- An opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide.
- Opportunities to work with EY SaT practices globally with leading businesses across a range of industries.

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
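For a flavor of the PySpark/SparkSQL work this role calls for, a minimal sketch of an aggregation feeding a downstream reporting layer. The lake path, view, table, and column names are hypothetical:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dwh-reporting").getOrCreate()

# Register curated data (here, an Azure Data Lake Gen2 path) as a SQL view.
spark.read.parquet("abfss://curated@lake.dfs.core.windows.net/sales/") \
     .createOrReplaceTempView("sales")

# An OLAP-style rollup that a reporting tool (e.g. Power BI) could consume.
summary = spark.sql("""
    SELECT region,
           date_trunc('month', order_ts) AS month,
           SUM(amount)                   AS revenue,
           COUNT(DISTINCT customer_id)   AS customers
    FROM sales
    GROUP BY region, date_trunc('month', order_ts)
""")

summary.write.mode("overwrite").saveAsTable("reporting.monthly_sales")
```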

Posted 6 days ago

Apply