Jobs
Interviews

28 Streams Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

0 Lacs

Karnataka

On-site

You should have 5 to 10+ years of experience in C++ programming with a focus on memory management, file I/O, and streams concepts. Your expertise should also include a strong understanding of multithreading, including creating and managing threads, synchronization mechanisms like mutexes and condition variables, and kernel-level operations. Additionally, you should possess a good understanding of Linux development and triaging, including familiarity with command-line tools, POSIX, processes, and network operations. A solid foundation in building applications in a C++ environment is also crucial for this role.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a professional in the IT & Tech Engineering field at Allianz Technology, you will be expected to possess a diverse set of technical skills. Your role will involve understanding code management and release approaches, including knowledge of monorepo/multirepo concepts. It is essential to have a good understanding of functional programming principles and various code management methodologies such as SDLC, DRY, KISS, and SOLID. Moreover, familiarity with authorization and authentication mechanisms like ABAC, RBAC, JWT, SAML, AAD, and OIDC, and experience with NoSQL databases like DynamoDB, are highly valued.

Proficiency in UI development using technologies like React, hooks, and TypeScript is crucial. Additionally, expertise in event-driven architecture, including queues, streams, batches, and pub/sub, is necessary. You should also possess a solid grasp of functional programming concepts such as list, map, reduce, compose, and monads. Understanding scalability, concurrency, networking, proxies, CI/CD pipelines, GitFlow, GitHub, and GitOps tools like Flux and ArgoCD is required. Being a polyglot programmer proficient in at least two languages such as Python, TypeScript, or Golang at an expert level is preferred. Furthermore, you must be fluent in operating Kubernetes clusters from a development perspective, creating custom CRDs, operators, and controllers, and have experience in developing serverless cloud applications. Deep knowledge of AWS cloud services and a basic understanding of Azure cloud are advantageous.

Apart from technical skills, soft skills play a vital part in this role. Effective communication, leadership abilities, team supervision, task delegation, feedback issuance, risk evaluation, conflict resolution, project management, crisis management, problem-solving, innovation, ownership, and vision are key soft skills expected from you. Your responsibilities will also include providing technical guidance, making informed decisions, shaping solutions, enforcing development practices, and ensuring quality gates through activities like code reviews, pair programming, and team review meetings.
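The functional programming concepts the posting names (list, map, reduce, compose) can be sketched briefly. The posting leans toward Python/TypeScript/Golang, so this is only an illustrative Java analogue; all names here are made up for the example.

```java
import java.util.List;
import java.util.function.Function;

public class FunctionalBasics {
    // compose: f.andThen(g) applies f first, then g
    static final Function<Integer, Integer> DOUBLE = x -> x * 2;
    static final Function<Integer, Integer> INCREMENT = x -> x + 1;
    static final Function<Integer, Integer> DOUBLE_THEN_INCREMENT = DOUBLE.andThen(INCREMENT);

    // map + reduce over a list using the Streams API
    static int sumOfSquares(List<Integer> xs) {
        return xs.stream()
                 .map(x -> x * x)          // map: square each element
                 .reduce(0, Integer::sum); // reduce: fold into a single sum
    }

    public static void main(String[] args) {
        System.out.println(DOUBLE_THEN_INCREMENT.apply(5)); // (5*2)+1 = 11
        System.out.println(sumOfSquares(List.of(1, 2, 3))); // 1+4+9 = 14
    }
}
```

The same composition and fold idioms carry over directly to the TypeScript or Python codebases the role actually targets.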

Posted 3 days ago

Apply

4.0 - 8.0 years

3 - 12 Lacs

Bengaluru

Work from Office

Responsibilities:
• Teach Core Java, OOP, Java 8+, Collections, and Multithreading
• Deliver training on Spring Boot, REST APIs, and Hibernate/JPA
• Create coding exercises and projects, and support student queries
• Track learner progress and offer mentorship
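The Java 8+ and Streams topics above are the kind of material a trainer would demonstrate. A minimal illustrative exercise (the scenario and numbers are invented for the example):

```java
import java.util.List;
import java.util.stream.Collectors;

public class StreamsExercise {
    // Training exercise: from a list of scores, keep passing scores (>= 40),
    // curve them by +5 (capped at 100), and collect into a new list.
    static List<Integer> curvePassing(List<Integer> scores) {
        return scores.stream()
                     .filter(s -> s >= 40)           // keep passing scores only
                     .map(s -> Math.min(s + 5, 100)) // apply the curve, capped at 100
                     .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(curvePassing(List.of(35, 40, 98, 72))); // [45, 100, 77]
    }
}
```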

Posted 4 days ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Bengaluru

Work from Office

• Strong programming skills in Java 8 and above (Lambda Expressions, Streams, etc.)
• Multithreading and Collections (data structures)
• Web services (RESTful), Spring Boot, Spring MVC
• Java Messaging, Kafka
• Git, Maven, Agile, SCRUM
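The lambda-expression and Collections skills listed above can be sketched with a sorting example; the class, names, and data are hypothetical (Java 16+ for records and `toList()`):

```java
import java.util.Comparator;
import java.util.List;

public class LambdaSort {
    record Employee(String name, int years) {}

    // Lambda-friendly Comparator over a Collection: sort by experience,
    // descending, then project to names with the Streams API.
    static List<String> namesByExperienceDesc(List<Employee> team) {
        return team.stream()
                   .sorted(Comparator.comparingInt(Employee::years).reversed())
                   .map(Employee::name)
                   .toList();
    }

    public static void main(String[] args) {
        List<Employee> team = List.of(
            new Employee("Asha", 7),
            new Employee("Ravi", 5),
            new Employee("Meena", 10));
        System.out.println(namesByExperienceDesc(team)); // [Meena, Asha, Ravi]
    }
}
```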

Posted 1 week ago

Apply

6.0 - 10.0 years

14 - 19 Lacs

Mumbai, Pune

Work from Office

We are looking for a talented and experienced developer who is technically passionate, solution-focused, and able to design, develop, test, and maintain high-quality software. You will be working with one of our clients, a top-tier investment bank, to design and develop their risk technology platform. As a Senior Java Developer, you will:
• Work closely with data from sources like Bloomberg and Markit, and model and transform source data for specific applications
• Work with Java 8 and all its features
• Work with Spring Boot and other Spring modules (web, data, security, batch) or any other dependency injection framework
• Work with (and configure) distributed caching based on Redis and event-based Kafka streams
• Interact with event-based applications and microservices, with a strong focus on performance and real-time analytics
• Design and develop various database queries, scripts, and tables to pull, clean, arrange, and persist risk management data
• Own delivery and take responsibility for milestones
• Take a BDD approach (Cucumber), and design and develop automated unit, integration, and regression tests

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Kolkata, West Bengal

On-site

Genpact is a global professional services and solutions firm focused on delivering outcomes that shape the future. With over 125,000 employees across 30+ countries, our team is driven by curiosity, agility, and the desire to create lasting value for our clients. Our purpose, the relentless pursuit of a world that works better for people, underpins our work as we serve and transform leading enterprises worldwide, including Fortune Global 500 companies. We leverage deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI to drive innovation and success.

We are currently seeking applications for the role of Vice President, Enterprise Architecture Consulting - GCP Delivery Lead at Genpact. In this critical leadership position, you will be responsible for managing the delivery of complex Google Cloud Platform (GCP) projects, ensuring client satisfaction, team efficiency, and innovation. The ideal candidate will bring deep industry expertise, technical excellence, and strong business acumen to shape our organization's data and cloud transformation roadmap. As the Delivery Lead, your key responsibilities include overseeing the successful delivery of multimillion-dollar engagements involving GCP, managing client relationships, leading global project teams, ensuring adherence to delivery governance standards, and driving innovation within the scope of GCP initiatives. You will play a vital role in shaping the data and cloud transformation journey of our organization.

Key Responsibilities:
- Own and drive end-to-end delivery of GCP & Data Engineering programs across multiple geographies and industry verticals.
- Establish a best-in-class data delivery framework ensuring scalability, security, and efficiency in GCP-based transformations.
- Act as a trusted advisor to C-level executives, driving customer success, innovation, and business value.
- Lead executive-level stakeholder engagement, aligning with business strategy and IT transformation roadmaps.
- Drive account growth, supporting pre-sales, solutioning, and go-to-market (GTM) strategies for GCP and data-driven initiatives.
- Ensure customer satisfaction and build long-term strategic partnerships with enterprise clients.
- Shape the organization's Data & AI strategy, promoting the adoption of GCP, AI/ML, real-time analytics, and automation in enterprise data solutions.
- Establish data accelerators, reusable frameworks, and cost optimization strategies to enhance efficiency and profitability.
- Build and mentor a high-performing global team of cloud data professionals, including data engineers, architects, and analytics experts.
- Foster a culture of continuous learning and innovation, driving upskilling and certifications in GCP, AI/ML, and Cloud Data technologies.
- Stay informed about emerging trends in GCP, cloud data engineering, and analytics to drive innovation.

Minimum Qualifications:
- Experience in IT Services, with a focus on data engineering, GCP, and cloud transformation leadership.
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's or MBA preferred).

Preferred Qualifications/Skills:
- Proven track record in delivering large-scale, multimillion-dollar GCP & data engineering programs.
- Deep understanding of the GCP ecosystem, including Data Sharing, Streams, Tasks, Performance Tuning, and Cost Optimization.
- Strong expertise in cloud platforms (Azure, AWS) and data engineering pipelines.
- Proficiency in modern data architectures, AI/ML, IoT, and edge analytics.
- Experience in managing global, multi-disciplinary teams across multiple geographies.
- Exceptional leadership and executive presence, with the ability to influence C-suite executives and key decision-makers.

Preferred Certifications:
- Certified Google Professional Cloud Architect or equivalent.
- Cloud Certifications (Azure Data Engineer, AWS Solutions Architect, or equivalent).
- PMP, ITIL, or SAFe Agile certifications for delivery governance.

If you are a dynamic leader with a passion for driving innovation and transformation in the cloud and data space, we encourage you to apply for the Vice President, Enterprise Architecture Consulting - GCP Delivery Lead role at Genpact. Join us in shaping the future and delivering value to clients worldwide.

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Java Full Stack Developer with 7-10 years of experience, you will be responsible for designing, developing, and maintaining scalable Java-based backend systems. Your role will involve building dynamic and responsive user interfaces using React.js, as well as developing and integrating RESTful APIs and microservices. You will work with distributed systems and event-driven architecture, collaborating closely with cross-functional teams in an Agile environment. Your key responsibilities will include participating in code reviews, troubleshooting, and performance tuning. You will also be expected to integrate applications with cloud services (AWS preferred), work with containerized environments, and use Kubernetes for deployment. Ensuring code quality, scalability, and maintainability will be essential aspects of your work. To excel in this role, you should possess a Bachelor's degree in Computer Science or a related field (preferred) and have at least 8 years of experience in Java application development. Proficiency in Java 11+ is required, including Streams, Lambdas, and functional programming. Strong knowledge of Spring Boot, Spring Framework, and RESTful API development is essential, as well as experience with microservices architecture and monitoring tools. You should have a solid understanding of persistence layers such as JPA, Hibernate, MS-SQL, and PostgreSQL. Hands-on experience with React.js and strong frontend development skills with HTML, CSS3/Tailwind, and responsive design are also necessary. Experience with CI/CD pipelines (Jenkins, GitLab CI, GitHub Actions, or AWS DevOps) and familiarity with cloud platforms like AWS, Azure, or GCP (AWS preferred) are important. Exposure to container orchestration using EKS, AKS, or GKE, knowledge of Domain-Driven Design (DDD) and Backend-for-Frontend (BFF) patterns, and working knowledge of Kafka, MQ, or other event-driven technologies are advantageous. 
Strong problem-solving, debugging, and optimization skills, proficiency in Agile methodologies, version control (Git), and SDLC best practices are also required. Experience in the hospitality domain is a plus.

Posted 1 week ago

Apply

6.0 - 14.0 years

0 Lacs

Hyderabad, Telangana

On-site

You have a job opportunity for a Back End Engineer position requiring 6-14 years of experience in technologies such as Java, Spring Boot, Microservices, Python, AWS or cloud-native deployment, EventBridge, API Gateway, DynamoDB, and CloudWatch. The ideal candidate should have at least 7 years of experience in these technologies and be comfortable working with complex code and requirements. The essential functions of this position include working with a tech stack that includes Java, Spring Boot, Microservices, Python, AWS, EventBridge, API Gateway, DynamoDB, and CloudWatch.

The qualifications required for this role include expertise in Spring Boot (annotations, autowiring with reflection, Spring starters, auto-configuration vs. configuration), CI/CD tools, Gradle or Maven knowledge, Docker, containers, scale-up and scale-down, health checks, distributed tracing, exception handling in microservices, lambda expressions, threads, and streams. Candidates with knowledge of GraphQL, prior experience working on projects with a lot of PII data, or experience in the Financial Services industry are preferred.

The job offers an opportunity to work on bleeding-edge projects, collaborate with a highly motivated team, a competitive salary, a flexible schedule, a benefits package including medical insurance, sports and corporate social events, professional development opportunities, and a well-equipped office. Grid Dynamics (NASDAQ: GDYN) is the company offering this opportunity. They are a leading provider of technology consulting, platform and product engineering, AI, and advanced analytics services. With a focus on solving technical challenges and enabling positive business outcomes for enterprise companies undergoing business transformation, Grid Dynamics has expertise in enterprise AI, data, analytics, cloud & DevOps, application modernization, and customer experience. Founded in 2006, Grid Dynamics is headquartered in Silicon Valley with offices across the Americas, Europe, and India.

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

Guwahati, Assam

On-site

The Developer role entails overseeing the development, implementation, and technical aspects of software projects to ensure the successful realization of the technical vision and strategy. This includes upholding technical standards, ensuring code quality, and maintaining the overall technical integrity of the project. The position requires 7+ years of experience and a qualification of B.E./B.Tech in any specialization or MCA. The job location is Guwahati, Assam.

The ideal candidate should possess expertise in core Java concepts, object-oriented programming principles, Java features like lambda expressions and streams, as well as experience in developing enterprise-level applications using Java EE technologies. Proficiency in the Spring framework for building scalable applications, Spring Boot for rapid microservices development, and ORM concepts with frameworks like Hibernate is essential. Additionally, skills in web development using HTML, CSS, and JavaScript, along with experience in analyzing and optimizing Java applications for performance, are required. Experience working in Agile/Scrum environments, with relational databases like MariaDB, MySQL, PostgreSQL, or Oracle, and with version control systems is crucial. Proficiency in CI/CD pipeline implementation using tools like Jenkins, GitLab CI, or Travis CI, automated testing and deployment processes, and familiarity with containerization technologies like Docker are preferred. Knowledge of building microservices-based architectures and understanding service discovery, load balancing, and API gateways are advantageous.

Responsibilities include collaborating with stakeholders to grasp requirements and technical challenges, designing system architecture, writing and optimizing front-end and back-end code, integrating third-party services, implementing performance optimizations, and setting up CI/CD pipelines. Monitoring system health, providing maintenance, documenting code, working closely with the team, ensuring security best practices, and suggesting process improvements are also core duties. The Developer will be responsible for staying updated with new technologies, monitoring application response times, maintaining software documentation, recording support activities, collaborating with stakeholders, conducting feasibility studies, writing efficient code, executing tests, debugging and resolving issues, participating in team meetings and code reviews, and identifying areas for process improvement. Compliance with ISO 9001, ISO 20000, ISO 27001, and CMMI Level 5 standards is essential. Fluency in English and Hindi (speaking, reading, and writing) is required, with fluency in Assamese preferred. The position was posted on June 30, 2025, and the last date for submission is July 31, 2025.

Posted 1 week ago

Apply

3.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Thingworx Developer with a minimum of 3 to 9 years of experience, your primary responsibility will be to develop custom IoT applications using ThingWorx Foundation and Composer. You will be required to design and implement mashups, data models, and services using JavaScript. Furthermore, you will need to integrate ThingWorx with industrial systems such as PLC, SCADA, OPC, Kepware, as well as enterprise systems like ERP and MES. In this role, you will be tasked with developing and managing Things, ThingTemplates, ThingShapes, DataTables, and InfoTables. It will be essential to implement secure role-based access and permissions within the ThingWorx environment. Collaboration with stakeholders to understand requirements and effectively translate them into functional solutions will also be a key aspect of your responsibilities. Moreover, you will play a crucial role in optimizing application performance, troubleshooting issues, and ensuring high system availability. Integration of ThingWorx with external applications via REST/SOAP APIs will be part of your daily tasks. Additionally, you will work with Edge devices using ThingWorx Edge SDKs and communication protocols like MQTT. Supporting deployment, configuration, and version control across development and production environments will also fall under your purview. Your strong hands-on experience with the PTC ThingWorx platform and proficiency in JavaScript, ThingWorx Services, and Mashup Builder will be vital in excelling in this role. Experience with IoT protocols like MQTT, REST, and WebSockets is also required, along with a good understanding of data modeling using ThingTemplates, ThingShapes, and DataTables. Furthermore, integration knowledge with Kepware, SCADA, ERP, or MES systems, familiarity with Edge devices and ThingWorx Edge SDKs, as well as a solid understanding of InfoTables, Streams, and Value Streams are skills that will be beneficial for this position. 
This role will involve contract hiring and falls under the IT/Computers - Software industry type. If you are passionate about integration, JavaScript, and PTC platforms, this role offers an exciting opportunity to showcase your skills and contribute to the development of innovative IoT applications.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

15 - 20 Lacs

Hyderabad, Bengaluru

Hybrid

Hi, Greetings from Fifthgen.

Job Description: Java Developer / Senior Developer
Client Name: Techwave
Role: Java Developer / Senior Developer
Experience: 6+ Years
Work Location: Primary - Hyderabad; Secondary - Bangalore

Core Skills Required:
• Java 8 and above (especially Lambda Expressions, Streams)
• Strong programming skills (including timeseries problems from platforms like LeetCode)
• Multithreading and Data Structures (Collections framework)
• RESTful Web Services
• Spring Boot, Spring MVC
• Java Messaging
• Git, Maven
• Agile, SCRUM methodology

Additional Notes:
• Candidates who appeared for Techwave interviews in the last 4 months are not eligible
• Short-notice candidates (less than 30 days) are highly preferred
• Compensation is based on experience, feedback, and market standards

Drive Info: Virtual drives conducted every Saturday until all positions are filled

Regards, Rupam

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 15 Lacs

Bengaluru

Hybrid

Standard Job Requirements:
• 5+ years of experience in application development using Java and advanced technology tools
• Strong understanding of fundamental architecture and design principles, object-orientation principles, and coding standards
• Ability to design and build smart, scalable, and resilient solutions with tight deadlines, both high- and low-level
• Strong analytical and problem-solving skills
• Strong verbal and written communication skills
• Good knowledge of DevOps and CI/CD
• Understanding of source control, versioning, branching, etc.
• Experienced in Agile methodology and Waterfall models
• Strong experience in application delivery, including production support
• Very good presentation and documentation skills
• Ability to learn and adapt to new technologies and frameworks
• Awareness of release management
• Strong team player who can collaborate effectively with relevant stakeholders
• Recommend future technology capabilities and architecture design considering business objectives, technology strategy, trends, and regulatory requirements

Technical Competence - Must Have:
• Strong programming and hands-on skills in Java 8 or above (preferably Java 17)
• Good hands-on experience with Java Collections and Streams
• Good hands-on experience with data structures and algorithms
• Good experience in developing vulnerability-free Spring Framework applications
• Good knowledge of Spring DI/Blueprints, Spring Boot, etc.
• Good knowledge of design patterns and principles
• Good knowledge of OR frameworks like Hibernate, JPA, etc.
• Good knowledge of API building (Web Services, SOAP/REST)
• Good knowledge of unit testing and code coverage using JUnit/Mockito
• Good knowledge of code quality tools like SonarQube, security scans, etc.
• Good knowledge of containerized platforms like Kubernetes, OpenShift, EKS (AWS)
• Good knowledge of Enterprise Application Integration patterns (synchronous, asynchronous)
• Good knowledge of multi-threading and multi-processing implementations
• Experience in RDBMS (Oracle, PostgreSQL, MySQL) and knowledge of SQL queries
• Ability to work in a quick-paced, dynamic environment, adapting agile methodologies
• Ability to work with minimal guidance and/or high-level design input
• Knowledge of microservices-based development and implementation
• Knowledge of CI/CD patterns with related tools like Azure DevOps, Git, Bitbucket, etc.
• Knowledge of JSON libraries like Jackson/GSON
• Knowledge of basic Unix commands
• Good documentation and presentation skills; able to articulate ideas, designs, and suggestions
• Mentoring fellow team members and conducting code reviews

Good to Have:
• Hands-on skills in J2EE specifications like JAX-RS, JAX-WS
• Experience in working with and supporting OLTP and OLAP systems
• Good knowledge of Spring Batch and Spring Security
• Good knowledge of the Linux operating system (preferably RHEL)
• Good knowledge of NoSQL offerings (Cassandra, MongoDB, GraphDB, etc.)
• Knowledge of testing methodologies like performance testing, smoke testing, stress testing, endurance testing, etc.
• Knowledge of Python, Groovy
• Knowledge of middleware technologies like Kafka, Solace, etc.
• Knowledge of DSE DataStax or Neo4j
• Knowledge of cloud environments (AWS, Azure, etc.)
• Knowledge of IMDG (Hazelcast, Ignite)
• Knowledge of rule engines like Drools, OpenL Tablets, Easy Rules, etc.
• Experience in presenting solutions to architecture forums and following the principles and standards in implementation

Domain (Good to Have): Experience in application development for Client Due Diligence (CDD), On-boarding, FATCA & CRS, AML, KYC, and Screening. Good knowledge of cloud-native application development and cloud computing services.

Training, Qualifications, and Certifications: Training/qualifications and certifications in some of the functional and/or technical domains mentioned will be an added advantage.
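The Collections and Streams skills this posting emphasizes can be illustrated with a grouping-and-aggregation idiom; the record, names, and amounts are invented for the sketch:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class GroupingDemo {
    record Txn(String account, double amount) {}

    // Group transactions by account and total the amounts per account,
    // a common Collections + Streams exercise in Java interviews.
    static Map<String, Double> totalsByAccount(List<Txn> txns) {
        return txns.stream()
                   .collect(Collectors.groupingBy(
                       Txn::account,
                       Collectors.summingDouble(Txn::amount)));
    }

    public static void main(String[] args) {
        var txns = List.of(new Txn("A", 10.0), new Txn("B", 5.0), new Txn("A", 2.5));
        System.out.println(totalsByAccount(txns)); // totals per account, e.g. A -> 12.5
    }
}
```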

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a candidate for this role, you should possess knowledge and experience in migration projects, with a certification in Snowflake. Proficiency in Snowflake SnowSQL and Snowpipe is essential for this position. Additionally, you should have hands-on experience working with semi-structured XML and JSON data within the Snowflake environment. Familiarity with Snowflake components like Stages, Streams, Tasks, and External Tables is a requirement for this role. While a working knowledge of Python scripting would be beneficial, experience in database design, database modeling, and schema creation is crucial for success in this position.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Hyderabad

Work from Office

5 - 10 years of experience. Looking for a hands-on, experienced resource in the SAP Enterprise HANA space with technical expertise in SAP eHANA with XSA. Knowledge of any of the value streams such as RTR/PTP/OTC/EAM/MTD/FTS would be an added benefit.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

25 - 40 Lacs

Bengaluru

Remote

Key Responsibilities:
C++: 6-8+ years of working experience in C++ programming, memory management, and file I/O and streams concepts.
Multithreading: Strong understanding of multithreading (creating and managing threads, synchronization mechanisms such as mutexes and condition variables) and kernel-level operations.
Linux: Good understanding of developing and triaging on Linux (command-line tools, POSIX, processes, network).
Unit Test: Good understanding of writing unit tests for the developed application.
Coding Test: Evaluate coding tests and C++ coding standards.
Architecture: Strong understanding of building applications in a C++ environment.

Good to Have Skills:
SCM Tool & IDE: Good exposure to Agile & Scrum methodologies, Git, and Confluence; ability to integrate an IDE with a source control system such as ClearCase; ability to set up a Linux IDE.
Web Application: Good understanding of developing web applications on a C++ platform.
Project Exposure: Strong understanding of projects and the SDLC process.
Troubleshooting: Experience in debugging, troubleshooting, and performance optimization (such as reducing memory allocations, optimizing loops, and using inline functions).
Docker & Containers: Good understanding of Docker and containers for deployment.

Soft Skills:
Communication: Concise and articulate written and verbal communication.
Interpersonal Skills: Maintaining positive relationships through empathy, active listening, and emotional intelligence.
Attitude: A positive attitude, to be more adaptable, collaborative, and able to overcome challenges effectively.
Decision Making: Understanding the factors that influence decision making and employing appropriate strategies and techniques.
Collaboration: Working together with others to achieve a common goal or objective.

Posted 3 weeks ago

Apply

7.0 - 9.0 years

30 - 35 Lacs

Pune

Work from Office

Core Responsibilities: The candidate is expected to lead one of the key business areas end to end. This is a pure hands-on role, but he/she may need to mentor junior members of the team. Excellent team management skills are expected, and the candidate must hold experience leading a team in their current organization. Responsibilities include requirement gathering with the business and getting it prioritized in the sprint cycle, coming up with the project architecture design and getting it approved by the Tech Review committee, and ensuring quality and timely delivery.

Preference and Experience: Very strong fundamentals of OOP programming. Very strong at Java fundamentals, Multithreading, and Streams. Good understanding of data structures. Good knowledge of any distributed caching/computing framework or tools. Good at SQL queries and optimization. AWS Lambda (serverless), Redis, Spring Boot, NoSQL databases, UI technologies, JMS/SQS, AWS Cloud, NodeJS, and Python are good to have. Well versed with the latest technology stack for server-side programming.

Academic Qualifications: Must be a technical graduate, B.Tech/M.Tech from Tier 1/2 colleges.
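The combination of multithreading and caching frameworks the posting asks about often comes down to a thread-safe read-through cache. A minimal local sketch, assuming a made-up loader function (a distributed cache such as Redis applies the same read-through idea across processes):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

public class CachingSketch {
    // Thread-safe in-process cache keyed by String.
    private final Map<String, Integer> cache = new ConcurrentHashMap<>();

    // computeIfAbsent runs the loader at most once per key, even when
    // many threads ask for the same missing key concurrently.
    int getOrCompute(String key, Function<String, Integer> loader) {
        return cache.computeIfAbsent(key, loader);
    }

    public static void main(String[] args) {
        CachingSketch c = new CachingSketch();
        // String::length stands in for an expensive computation or remote call
        System.out.println(c.getOrCompute("len:hello", String::length)); // 9
    }
}
```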

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Chennai

Work from Office

5+ years (mandatory) of hands-on experience with Core Java, Java 8, Spring Boot, and Microservices architecture. Hands-on experience with REST APIs, lambda expressions, and functional interfaces. Hands-on experience with Java Collections (including implementation experience), exception handling, the Streams API, multithreading (CompletableFuture), JDBC, and SQL. Strong knowledge of software development methodologies (including Agile and associated tools) and best practices. Experience with testing frameworks: JUnit, Cucumber, Mockito. Experience with build tools: Jenkins, Maven, Gradle. Knowledge of data structures: LinkedList, arrays, sorting algorithms. Experience with cloud platforms (e.g., AWS, Azure, Google Cloud), OpenShift, and Kubernetes is a plus. Excellent problem-solving skills and attention to detail. Strong communication and interpersonal skills. Ability to work in a fast-paced and collaborative environment. Hands-on experience with front-end technologies like React.
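The CompletableFuture requirement above refers to composing asynchronous work. A minimal sketch, where the two "remote calls" are stand-in constants invented for the example:

```java
import java.util.concurrent.CompletableFuture;

public class CompletableFutureDemo {
    // Two independent lookups run concurrently on the common pool,
    // then their results are combined into one value.
    static int combinedScore() {
        CompletableFuture<Integer> creditScore =
            CompletableFuture.supplyAsync(() -> 40); // pretend remote call #1
        CompletableFuture<Integer> riskScore =
            CompletableFuture.supplyAsync(() -> 2);  // pretend remote call #2

        // thenCombine waits for both futures and merges their results
        return creditScore.thenCombine(riskScore, Integer::sum).join();
    }

    public static void main(String[] args) {
        System.out.println(combinedScore()); // 42
    }
}
```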

Posted 3 weeks ago

Apply

6.0 - 10.0 years

6 - 15 Lacs

Gurugram

Work from Office

Requirements Elicitation, Understanding, Analysis, & Management Understand the project's Vision and requirements, and contribute to the creation of the supplemental requirements, building the low-level technical specifications for a particular platform and/or service solution. Project Planning, Tracking, & Reporting Estimate the tasks and resources required to design, create (build), and test the code for assigned module(s). Provide inputs in creating the detailed schedule for the project. Support the team in project planning activities, in evaluating risks, and shuffle priorities based on unresolved issues. During development and testing, ensure that assigned parts of the project/modules are on track with respect to schedules and quality. Note scope changes within the assigned modules and work with the team to shuffle priorities accordingly. Communicate regularly with the team about development changes, scheduling, and status. Participate in project review meetings. Tracking and reporting progress for assigned modules Design: Create a detailed (LLD) design for the assigned piece(s) with possible alternate solutions. Ensure that LLD design meets business requirements. Submit the LLD design for review. Fix the detailed (LLD) design for the assigned piece(s) for the comments received from team. Development & Support Build the code of high-priority and complex systems according to the functional specifications, detailed design, maintainability, and coding and efficiency standards. Use code management processes and tools to avoid versioning problems. Ensure that the code does not affect the functioning of any external or internal systems. Perform peer reviews of code to ensure it meets coding and efficiency standards. Act as the primary reviewer to review the application code created by software engineers to ensure compliance to defined standards. Recommend changes to the code as required. 
Testing & Debugging
Attend Test Design walkthroughs to help verify that the plans and conditions will test all functions and features effectively. Perform impact analysis for issues assigned to yourself and to software engineers. Actively assist with project- and code-level problem solving, such as suggesting paths to explore when testing engineers or software engineers encounter a debugging problem, and escalate urgent issues.

Documentation
Review the code's technical documentation for accuracy, completeness, and usability. Document and maintain the reviews conducted and the unit test results.

Process Management
Adhere to the project and support processes. Adhere to best practices and comply with approved policies, procedures, and methodologies, such as the SDLC cycle for different project sizes. Show responsibility for corporate funds, materials, and resources. Ensure adherence to SDLC and audit requirements.

Position Summary
As a Lead Collaboration Engineer at Guardian Life Insurance, you will be responsible for designing, building, testing, deploying, and supporting Microsoft 365 collaboration capabilities for 16,000 users globally.

You are
An excellent problem solver
A strong collaborator with team members and other teams
A strong communicator, documenter, and presenter
Strong in project ownership and execution, ensuring timely and quality delivery.
A continuous self-learner and subject matter expert for Microsoft 365.

You have
Bachelor's degree in Computer Science, Information Technology, or significant relevant experience.
5+ years of experience, preferably in a large financial services enterprise.
Expert-level experience with Microsoft 365: administration, Outlook/Exchange Online/Exchange Server, Teams, SharePoint Online/OneDrive, Power Automate, Viva Engage (Yammer), Stream, PowerShell scripting, advanced troubleshooting diagnostics, Copilot, Word, Excel, PowerPoint, OneNote, Visio, Project, Whiteboard, To Do, Planner, Lists, Viva Insights, Power Apps, Loop, Azure.
Intermediate-level experience with Proofpoint E-mail Protection or a similar e-mail security service: administration, routing, allow/block lists, encryption, DLP, Send Securely, Secure Portal, SPF/DKIM/DMARC, delivery troubleshooting, incident response.
Knowledge of other complementary collaboration applications is desired: Zoom, BitTitan MigrationWiz, or ShareGate.
Strong knowledge of IT Service Management and ITIL, preferably using ServiceNow: incidents, tasks, problems, knowledge, CMDB, reporting, dashboards.
Proven ability to manage support and request tickets within SLAs, and to drive Microsoft support cases to closure.
Knowledge of project management using waterfall and agile frameworks, and a proven ability to complete projects reliably and with quality.
Knowledge of networking and security: DNS, Active Directory, Entra ID (Azure AD) including conditional access policies, certificates, firewalls, proxies, cloud access security brokers (CASB), single sign-on (SSO), multi-factor authentication (MFA), data loss prevention (DLP), and identity and access management (IAM).
Knowledge of endpoints, servers, and cloud: devices, operating systems, browsers, Intune, System Center, Nexthink, Amazon AWS, Azure.
Microsoft certifications are desired, preferably MS-900, MS-700, MS-721, MS-102.

You will
Deliver excellent support for collaboration capabilities to achieve service level agreements. Participation in the team's on-call support rotation is required.
Design, build, test, and deploy new collaboration capabilities to achieve strategic goals and key deliverables reliably and with quality. Current goals are focused on Copilot and service improvements.

Reporting Relationships
As our Collaboration Engineer, you will report administratively to our Delivery Manager / Head of IT, who reports to our Head of Infrastructure IT, and functionally to the Head of Collaboration Technology.

Location: This position can be based in any of the following locations: Gurgaon.

For internal use only: R000106866

Posted 1 month ago

Apply

12.0 - 15.0 years

55 - 60 Lacs

Ahmedabad, Chennai, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Data Platform Engineer to build and maintain scalable, secure, and reliable data infrastructure for analytics and real-time processing.

Key Responsibilities:
Design and manage data pipelines, storage layers, and ingestion frameworks.
Build platforms for batch and streaming data processing (Spark, Kafka, Flink).
Optimize data systems for scalability, fault tolerance, and performance.
Collaborate with data engineers, analysts, and DevOps to enable data access.
Enforce data governance, access controls, and compliance standards.

Required Skills & Qualifications:
Proficiency with distributed data systems (Hadoop, Spark, Kafka, Airflow).
Strong SQL and experience with cloud data platforms (Snowflake, BigQuery, Redshift).
Knowledge of data warehousing, lakehouse, and ETL/ELT pipelines.
Experience with infrastructure as code and automation.
Familiarity with data quality, security, and metadata management.

Soft Skills:
Strong troubleshooting and problem-solving skills.
Ability to work independently and in a team.
Excellent communication and documentation skills.

Note: If interested, please share your updated resume and your preferred time for a discussion. If shortlisted, our HR team will contact you.

Srinivasa Reddy Kandi
Delivery Manager
Integra Technologies
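To make the pipeline responsibilities above concrete, here is a minimal, hypothetical sketch (not part of the posting; all names illustrative) of the extract-transform-load flow such a platform encodes, in plain Python rather than Spark or Kafka:

```python
# Minimal batch-pipeline sketch: extract -> transform -> load.
# All names are illustrative; a real platform would use Spark/Kafka/Flink.

def extract(raw_rows):
    """Parse raw CSV-like rows into dicts, skipping malformed ones."""
    out = []
    for row in raw_rows:
        parts = row.split(",")
        if len(parts) == 3:
            out.append({"id": parts[0], "event": parts[1], "value": int(parts[2])})
    return out

def transform(records):
    """Keep only positive values and tag each surviving record."""
    return [dict(r, valid=True) for r in records if r["value"] > 0]

def load(records, store):
    """Idempotent upsert keyed by id, so re-runs are fault tolerant."""
    for r in records:
        store[r["id"]] = r
    return store

store = {}
raw = ["1,click,5", "2,view,-1", "bad row", "3,click,7"]
load(transform(extract(raw)), store)
print(sorted(store))  # ids that survived parsing and validation
```

The idempotent keyed load is what lets a failed batch be replayed safely, which is the essence of the "fault tolerance" requirement.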

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 40 Lacs

Chennai

Work from Office

Architect & Build Scalable Systems: Design and implement petabyte-scale lakehouse architectures to unify data lakes and warehouses.
Real-Time Data Engineering: Develop and optimize streaming pipelines using Kafka, Pulsar, and Flink.

Required Candidate Profile:
Data engineering experience with large-scale systems.
Expert proficiency in Java for data-intensive applications.
Hands-on experience with lakehouse architectures, stream processing, and event streaming.
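As an illustration of the stream-processing work described above, here is a hypothetical sketch (not part of the posting) of a tumbling-window aggregation, the kind of computation a Flink or Kafka Streams job performs, reduced to plain Python:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Count events per key within fixed, non-overlapping time windows.

    An illustrative stand-in for a windowed aggregation in Flink or
    Kafka Streams; events are (timestamp_ms, key) pairs.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_ms) * window_ms  # align to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(1000, "a"), (1500, "a"), (2500, "b"), (3100, "a")]
counts = tumbling_window_counts(events, 1000)
print(counts)
```

A real streaming engine adds what this sketch omits: out-of-order handling via watermarks, state checkpointing, and incremental emission as windows close.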

Posted 1 month ago

Apply

9.0 - 14.0 years

15 - 20 Lacs

Hyderabad

Work from Office

Job Description:

SQL & Database Management: Deep knowledge of relational databases (PostgreSQL), cloud-hosted data platforms (AWS, Azure, GCP), and data warehouses like Snowflake.
ETL/ELT Tools: Extensive experience building and maintaining data pipelines with tools such as SnapLogic, StreamSets, or DBT.
Data Modeling & Optimization: Strong understanding of data modeling, OLAP systems, query optimization, and performance tuning.
Cloud & Security: Familiarity with cloud platforms and SQL security techniques (e.g., data encryption, TDE).
Data Warehousing: Experience managing large datasets and data marts, and optimizing databases for performance.
Agile & CI/CD: Knowledge of Agile methodologies and CI/CD automation tools.

Role & Responsibilities:
Build data pipelines for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud database technologies.
Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data needs.
Work with data and analytics experts to strive for greater functionality in our data systems.
Assemble large, complex data sets that meet functional and non-functional business requirements.
Quickly analyze existing SQL code and make improvements to enhance performance, take advantage of new SQL features, close security gaps, and increase robustness and maintainability of the code.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery for greater scalability, etc.
Unit test databases and perform bug fixes.
Develop best practices for database design and development activities.
Take on technical leadership responsibilities for database projects across various scrum teams.
Manage exploratory data analysis to support dashboard development (desirable).

Required Skills:
Strong experience in SQL, with expertise in a relational database (PostgreSQL preferred, cloud-hosted in AWS/Azure/GCP) or any cloud-based data warehouse (such as Snowflake or Azure Synapse).
Competence in data preparation and/or ETL/ELT tools like SnapLogic, StreamSets, DBT, etc. (preferably strong working experience in one or more) to build and maintain complex data pipelines and flows handling large volumes of data.
Understanding of data modelling techniques and working knowledge of OLAP systems.
Deep knowledge of databases, data marts, and enterprise data warehouse systems, and of handling large datasets.
In-depth knowledge of ingestion techniques, data cleaning, de-duplication, etc.
Ability to fine-tune report-generating queries.
Solid understanding of normalization and denormalization of data, database exception handling, query profiling, performance counters, debugging, and database & query optimization techniques.
Understanding of index design and performance-tuning techniques.
Familiarity with SQL security techniques such as column-level data encryption, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions.
Experience understanding source data from various platforms and mapping it into Entity Relationship (ER) models for data integration and reporting (desirable).
Adherence to standards for all database artifacts, e.g., data models, data architecture, and naming conventions.
Exposure to source control such as Git or Azure DevOps.
Understanding of Agile methodologies (Scrum, Kanban).
Experience with NoSQL databases and migrating data into other types of databases with real-time replication (desirable).
Experience with CI/CD automation tools (desirable).
Programming experience in Golang, Python, or any other programming language, plus visualization tools (Power BI/Tableau) (desirable).
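To illustrate the index-design and query-tuning skills above, here is a small, hypothetical sketch (table and index names invented for the example) using Python's built-in sqlite3 module, where EXPLAIN QUERY PLAN shows whether the optimizer picks up an index:

```python
import sqlite3

# Illustrative index-tuning sketch; a production system would be
# PostgreSQL or a cloud warehouse, but the principle is the same.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders (customer, amount) VALUES (?, ?)",
                 [("alice", 10.0), ("bob", 20.0), ("alice", 30.0)])
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")

# EXPLAIN QUERY PLAN reveals whether the WHERE clause uses the index
# (an indexed search) rather than a full table scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE customer = ?",
    ("alice",),
).fetchall()
total = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE customer = ?", ("alice",)
).fetchone()[0]
print(plan, total)
```

Reading the plan before and after adding an index is the basic loop of the "fine-tune report-generating queries" work the role describes.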

Posted 1 month ago

Apply

6.0 - 10.0 years

15 - 25 Lacs

Bengaluru

Work from Office

Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role
Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips.

This is an AWS Data/API Gateway Pipeline Engineer role, responsible for designing, building, and maintaining real-time, serverless data pipelines and API services. It requires extensive hands-on experience with Java, Python, Redis, DynamoDB Streams, and PostgreSQL, along with working knowledge of AWS Lambda and AWS Glue for data processing and orchestration. The position involves collaboration with architects, backend developers, and DevOps engineers to deliver scalable, event-driven data solutions and secure API services across cloud-native systems.

Key Responsibilities

API & Backend Engineering
Build and deploy RESTful APIs using AWS API Gateway, Lambda, Java, and Python.
Integrate backend APIs with Redis for low-latency caching and pub/sub messaging.
Use PostgreSQL for structured data storage and transactional processing.
Secure APIs using IAM, OAuth2, and JWT, and implement throttling and versioning strategies.

Data Pipeline & Streaming
Design and develop event-driven data pipelines using DynamoDB Streams to trigger downstream processing.
Use AWS Glue to orchestrate ETL jobs for batch and semi-structured data workflows.
Build and maintain Lambda functions to process real-time events and orchestrate data flows.
Ensure data consistency and resilience across services, queues, and databases.
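As a concrete illustration of the event-driven pipeline work described above, here is a hypothetical sketch (not Kyndryl's code; key and attribute names invented) of a Lambda-style handler shaped for DynamoDB Streams records, which arrive as typed attribute maps:

```python
# Hypothetical sketch of a Lambda handler for DynamoDB Streams events.
# The record shape follows the DynamoDB Streams event format; the
# routing logic and the "pk" attribute name are illustrative only.

def handler(event, context=None):
    """Process INSERT/MODIFY records, ignore REMOVE; return processed keys."""
    processed = []
    for record in event.get("Records", []):
        if record.get("eventName") in ("INSERT", "MODIFY"):
            new_image = record["dynamodb"].get("NewImage", {})
            # DynamoDB attribute values are typed maps, e.g. {"S": "order-1"}
            key = new_image.get("pk", {}).get("S")
            if key:
                processed.append(key)
    return {"processed": processed}

sample_event = {
    "Records": [
        {"eventName": "INSERT", "dynamodb": {"NewImage": {"pk": {"S": "order-1"}}}},
        {"eventName": "REMOVE", "dynamodb": {"OldImage": {"pk": {"S": "order-0"}}}},
    ]
}
result = handler(sample_event)
print(result)
```

In production such a handler would also batch writes, handle partial failures, and route poison records to a dead-letter queue, per the resilience responsibilities listed above.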
Cloud Infrastructure & DevOps
Deploy and manage cloud infrastructure using CloudFormation, Terraform, or AWS CDK.
Monitor system health and service metrics using CloudWatch, SNS, and structured logging.
Contribute to CI/CD pipeline development for testing and deploying Lambda/API services.

So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth.

Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.

Required Skills and Experience
Bachelor's degree in Computer Science, Engineering, or a related field.
Over 6 years of experience developing backend or data pipeline services using Java and Python.
Strong hands-on experience with:
AWS API Gateway, Lambda, DynamoDB Streams
Redis (caching, messaging)
PostgreSQL (schema design, tuning, SQL)
AWS Glue for ETL jobs and data transformation
Solid understanding of REST API design principles, serverless computing, and real-time architecture.
Preferred Skills and Experience
Familiarity with Kafka, Kinesis, or other message streaming systems
Swagger/OpenAPI for API documentation
Docker and Kubernetes (EKS)
Git and CI/CD tools (e.g., GitHub Actions)
Experience with asynchronous event processing, retries, and dead-letter queues (DLQs)
Exposure to data lake architectures (S3, Glue Data Catalog, Athena)

Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone that works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.

Posted 1 month ago

Apply

5.0 - 7.0 years

35 - 55 Lacs

Bengaluru

Work from Office

Serko is a cutting-edge tech platform in global business travel & expense technology. When you join Serko, you become part of a team of passionate travellers and technologists bringing people together, using the world's leading business travel marketplace. We are proud to be an equal opportunity employer; we embrace the richness of diversity, showing up authentically to create a positive impact. There's an exciting road ahead of us, where travel needs real, impactful change. With offices in New Zealand, Australia, North America, and China, we are thrilled to be expanding our global footprint, landing our new hub in Bengaluru, India. With a rapid growth plan in place for India, we're hiring people from different backgrounds, experiences, abilities, and perspectives to help us build a world-class team and product.

As a Senior Principal Engineer, you'll play a key role in shaping our technical vision and driving engineering excellence across our product streams. Your leadership will foster a high-performance culture that empowers teams to build innovative solutions with real-world impact.

Requirements
Working closely with stream leadership—including the Head of Engineering, Senior Engineering Managers, Architects, and domain specialists—you'll provide hands-on technical guidance and help solve complex engineering challenges. As a Senior Principal Engineer, you'll also lead targeted projects and prototypes, shaping new technical approaches and ensuring our practices stay ahead of the curve.
What you'll do
Champion best practices across engineering teams, embedding them deeply within the stream
Proactively resolve coordination challenges within and across streams to keep teams aligned and unblocked
Partner with Product Managers to ensure customer value is delivered in the most pragmatic and impactful way
Lead or contribute to focused technical projects that solve high-priority problems
Collaborate with cross-functional teams to define clear requirements, objectives, and timelines for key initiatives
Explore innovative solutions through research and analysis, bringing fresh thinking to technical challenges
Mentor engineers and share technical expertise to uplift team capability and growth
Continuously evaluate and enhance system performance, reliability, and scalability
Stay ahead of the curve by tracking industry trends, emerging technologies, and evolving best practices
Drive continuous improvement across products and processes to boost quality, efficiency, and customer satisfaction
Maintain strong communication with stakeholders to gather insights, provide updates, and incorporate feedback

What you'll bring to the team
Strong proficiency in stream-specific technologies, tools, and programming languages
Demonstrated expertise in specific areas of specialization related to the stream
Excellent problem-solving skills and attention to detail
Ability to lead teams through complex changes to engineering-related areas, and to maintain alignment across Product and Technology teams
Effective communication and interpersonal skills
Proven ability to work independently and collaboratively in a fast-paced environment
Tertiary-level qualification in a relevant engineering discipline, or equivalent

Benefits
At Serko we aim to create a place where people can come and do their best work.
This means you'll be operating in an environment with great tools and support to enable you to perform at the highest level of your abilities, producing high-quality work and delivering innovative and efficient results. Our people are fully engaged, continuously improving, and encouraged to make an impact. Some of the benefits of working at Serko are:
A competitive base pay
Medical benefits
A discretionary incentive plan based on individual and company performance
A focus on development: access to a learning & development platform and the opportunity to own your career pathways
A flexible work policy

Apply
Hit the 'apply' button now, or explore more about what it's like to work at Serko and all our global opportunities at www.Serko.com.

Posted 2 months ago

Apply

5.0 - 10.0 years

0 - 1 Lacs

Ahmedabad, Chennai, Bengaluru

Hybrid

Job Summary:
We are seeking an experienced Snowflake Data Engineer to design, develop, and optimize data pipelines and data architecture using the Snowflake cloud data platform. The ideal candidate will have a strong background in data warehousing, ETL/ELT processes, and cloud platforms, with a focus on creating scalable and high-performance solutions for data integration and analytics.

Key Responsibilities:
* Design and implement data ingestion, transformation, and loading processes (ETL/ELT) using Snowflake.
* Build and maintain scalable data pipelines using tools such as dbt, Apache Airflow, or similar orchestration tools.
* Optimize data storage and query performance in Snowflake using best practices in clustering, partitioning, and caching.
* Develop and maintain data models (dimensional/star schema) to support business intelligence and analytics initiatives.
* Collaborate with data analysts, scientists, and business stakeholders to gather data requirements and translate them into technical solutions.
* Manage Snowflake environments, including security (roles, users, privileges), performance tuning, and resource monitoring.
* Integrate data from multiple sources, including cloud storage (AWS S3, Azure Blob), APIs, third-party platforms, and streaming data.
* Ensure data quality, reliability, and governance through testing and validation strategies.
* Document data flows, definitions, processes, and architecture.

Required Skills and Qualifications:
* 3+ years of experience as a Data Engineer or in a similar role working with large-scale data systems.
* 2+ years of hands-on experience with Snowflake, including SnowSQL, Snowpipe, Streams, Tasks, and Time Travel.
* Strong experience in SQL and performance tuning for complex queries and large datasets.
* Proficiency with ETL/ELT tools such as dbt, Apache NiFi, Talend, Informatica, or custom scripts.
* Solid understanding of data modeling concepts (star schema, snowflake schema, normalization, etc.).
* Experience with cloud platforms (AWS, Azure, or GCP), particularly using services like S3, Redshift, Lambda, Azure Data Factory, etc.
* Familiarity with Python, Java, or Scala for data manipulation and pipeline development.
* Experience with CI/CD processes and tools like Git, Jenkins, or Azure DevOps.
* Knowledge of data governance, data quality, and data security best practices.
* Bachelor's degree in Computer Science, Information Systems, or a related field.

Preferred Qualifications:
* Snowflake SnowPro Core Certification or Advanced Architect Certification.
* Experience integrating BI tools like Tableau, Power BI, or Looker with Snowflake.
* Familiarity with real-time streaming technologies (Kafka, Kinesis, etc.).
* Knowledge of Data Vault 2.0 or other advanced data modeling methodologies.
* Experience with data cataloging and metadata management tools (e.g., Alation, Collibra).
* Exposure to machine learning pipelines and data science workflows is a plus.
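To illustrate the ELT and data-quality work above, here is a hypothetical pure-Python stand-in (field names invented) for a last-write-wins deduplication step; in Snowflake this would typically be a MERGE statement fed by a Stream of staged changes:

```python
# Illustrative last-write-wins dedupe, the logical core of an ELT upsert.
# In Snowflake this is usually a MERGE driven by a Stream; names are hypothetical.

def latest_by_key(rows):
    """Keep only the most recent row per business key."""
    latest = {}
    for row in rows:
        key = row["id"]
        if key not in latest or row["updated_at"] > latest[key]["updated_at"]:
            latest[key] = row
    return sorted(latest.values(), key=lambda r: r["id"])

staged = [
    {"id": 1, "status": "new", "updated_at": 100},
    {"id": 2, "status": "new", "updated_at": 110},
    {"id": 1, "status": "shipped", "updated_at": 120},  # later change wins
]
deduped = latest_by_key(staged)
print(deduped)
```

A validation strategy for such a step is simple: assert that keys are unique after the merge and that no row's timestamp regressed.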

Posted 2 months ago

Apply

7.0 - 10.0 years

15 - 22 Lacs

Pune

Work from Office

As an experienced member of our Core Banking Base Development / Professional Services Group, you will be responsible for effective microservice development in Scala and for delivery of our NextGen transformation / professional services projects and programs.

What You Will Do:
• Adhere to the processes followed for development in the program.
• Report status, and proactively identify issues to the Tech Lead and management team.
• Take personal ownership of and accountability for delivering assigned tasks and deliverables within the established schedule.
• Facilitate a strong and supportive team environment that enables the team, as well as individual team members, to overcome any political, bureaucratic, and/or resource barriers to participation.
• Recommend and implement solutions. Be totally hands-on and able to work independently.

What You Will Need to Have:
• 4 to 8 years of recent hands-on experience in Scala and the Akka framework.
• Technical skillset required:
o Hands-on experience in Scala development, including the Akka framework.
o Good understanding of Akka Streams.
o Test-driven development.
o Awareness of message brokers.
o Hands-on experience in the design and development of microservices.
o Good awareness of event-driven microservices architecture.
o gRPC protocol and Protocol Buffers.
o Hands-on experience with Docker containers.
o Hands-on experience with Kubernetes.
o Awareness of cloud-native applications.
o Jira, Confluence, Ansible, Terraform.
o Good knowledge of cloud platforms (preferably AWS) and their IaaS, PaaS, and SaaS solutions.
o Good knowledge of and hands-on experience with scripting languages like Batch and Bash; hands-on experience with Python would be a plus.
o Knowledge of integration and unit testing, and of Behavior-Driven Development.
o Good problem-solving skills.
o Good communication skills.

What Would Be Great to Have:
• Experience integrating with third-party applications.
• Agile knowledge.
• Good understanding of configuration management.
• Financial industry and core banking integration experience.
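To illustrate the event-driven, broker-fed processing pattern this role centres on, here is a hypothetical sketch in Python (an in-process queue standing in for a real broker topic; an actual implementation would use Akka Streams in Scala):

```python
import queue
import threading

# Minimal event-driven consumer sketch; the queue stands in for a broker
# topic and all names are illustrative, not from the posting.
broker = queue.Queue()
results = []

def consumer():
    """Drain messages until a poison pill (None) shuts the stream down."""
    while True:
        msg = broker.get()
        if msg is None:
            break
        results.append(msg.upper())  # stand-in for real message processing
        broker.task_done()

t = threading.Thread(target=consumer)
t.start()
for m in ["created", "updated"]:
    broker.put(m)
broker.put(None)  # signal end of stream
t.join()
print(results)
```

The poison-pill shutdown mirrors how a bounded stream completes; in Akka Streams the same flow would be a `Source` feeding a processing stage with backpressure handled by the framework.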

Posted 2 months ago

Apply