Jobs
Interviews

48 Streams Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 6.0 years

4 - 6 Lacs

Bengaluru

Remote

Hello everyone, ixie is conducting a virtual drive on 12th Sep for the role of Automation Tester. Kindly apply only if you have experience in Automation Testing.

Role: Automation Tester
Experience: 2 - 6 years
Mode: Remote (with reporting to office based on client needs)
Shift: Noon
Base Location: Bangalore / Chennai
Note: Candidates should be open to reporting to the office if required.
Interested candidates, please share your CV at gamingtag@indium.tech

Virtual Drive Details
Date: 12th Sep 2025
Time: 2:30 PM - 4:30 PM
Link: https://indiumsoft.zoom.us/j/94886246260?pwd=IzhCoMn7VdJPmb8HxvcbhiP8iHbbiw.1
Meeting ID: 948 8624 6260
Passcode: 674598

Job Description
Must-have skills:
- Strong knowledge of Java (Core Java; basic design patterns such as singleton, factory, builder, chain of responsibility)
- Hands-on experience in at least one of Mobile Automation with Appium or API Automation with Rest Assured, along with web automation
- SCM: Git (CLI experience); build management tool: Maven; CI/CD: Jenkins
- TestNG

Good to have:
- Java 8 features (Streams and lambda expressions)
- GitHub Actions
- Good knowledge of C#, or readiness to learn it
- Allure Report
- Gradle build management tool

Note: Freshers should not apply. Candidates must have strong experience in Core Java + Appium + TestNG + Git.

Posted 1 week ago

Apply

8.0 - 10.0 years

20 - 22 Lacs

Pune

Work from Office

Core Java, Collections, Java 8 (Lambda, Streams), Spring Framework (Core / Boot / Integration), Apache Flink, Apache Kafka, ELK stack (Elasticsearch, Logstash & Kibana), BPMN/CMMN, Angular / JavaScript / React / Redux, CI/CD, Git, Agile SDLC

Posted 1 week ago

Apply

8.0 - 13.0 years

18 - 22 Lacs

Pune

Hybrid

Java full-stack software engineer with 8+ years of experience:
- Strong in Core Java and Collections, with 8+ years of experience
- Preferred experience with Java 8 features such as lambda expressions and Streams
- Extensive experience with the Spring Framework (Core / Boot / Integration)
- Good knowledge of the design patterns applicable to data streaming
- Experience with Apache Flink / Apache Kafka and the ELK stack (Elasticsearch, Logstash & Kibana) is highly desirable
- Experience with Flowable or similar BPMN/CMMN tooling is also highly desirable
- Knowledge of front-end technologies such as Angular / JavaScript / React / Redux is also valuable
- Familiarity with CI/CD (TeamCity / Jenkins) and Git / GitHub / GitLab
- Familiarity with Docker / containerization technologies
- Familiarity with Microsoft Azure
- Proven track record in an Agile SDLC in a large-scale enterprise environment
- Knowledge of post-trade processing in large financial institutions is an added bonus

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Delhi

On-site

As a Software Engineering Team Lead at AdPushup, you will play a crucial role in leading a team of talented engineers to deliver high-quality software solutions. Your responsibilities will include mentoring team members, driving successful project deliveries, and fostering an environment of collaboration and innovation.

To excel in this role, you should possess a strong technical background with a minimum of 5 years of experience in software development. A Bachelor's or Master's degree in Computer Science or a related field is required. You should have a proven track record of delivering backend solutions in production, with expertise in Core Java, the Java concurrency framework, and unit testing frameworks. Additionally, hands-on experience with Java frameworks such as Spring Boot or Vert.x, cloud platforms like AWS, Azure, or Google Cloud, and database technologies including SQL and NoSQL databases is highly desirable. Familiarity with DevOps practices, containerization and orchestration tools like Docker and Kubernetes, and infrastructure-as-code tools like Terraform will be advantageous.

Join us at AdPushup, where we value expertise, ownership, and collaboration. If you are passionate about software engineering, leadership, and driving innovation, we look forward to having you on board as a key member of our dynamic team.

Posted 1 week ago

Apply

3.0 - 7.0 years

5 - 15 Lacs

Hyderabad

Work from Office

Roles and Responsibilities
- Design, develop, test, deploy, and maintain large-scale data pipelines using Snowflake as the primary database engine.
- Collaborate with cross-functional teams to gather requirements and design solutions that meet business needs.
- Develop complex SQL queries to extract insights from large datasets stored in Snowflake tables.
- Troubleshoot issues related to data quality, performance tuning, and security compliance.
- Participate in code reviews to ensure adherence to coding standards and best practices.

Desired Candidate Profile
- 3-7 years of experience working with Snowflake as a Data Engineer or in a similar role.
- Strong understanding of SQL, with the ability to write efficient queries for large datasets.
- Proficiency in Python, with experience using popular libraries such as Pandas and NumPy.
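The SQL side of the responsibilities above centers on aggregating large datasets. As a hypothetical illustration only (the function and data are invented, and real work here would be SQL in Snowflake or a Pandas groupby), here is the same GROUP BY / SUM idea as a standard-library Python sketch:

```python
from collections import defaultdict

def sum_by_key(rows, key, value):
    """Aggregate rows like `SELECT key, SUM(value) ... GROUP BY key`."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)

# Invented sample data for illustration.
orders = [
    {"region": "south", "amount": 120.0},
    {"region": "north", "amount": 80.0},
    {"region": "south", "amount": 30.0},
]
print(sum_by_key(orders, "region", "amount"))  # {'south': 150.0, 'north': 80.0}
```

The equivalent Snowflake query would be `SELECT region, SUM(amount) FROM orders GROUP BY region`.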

Posted 1 week ago

Apply

2.0 - 6.0 years

1 - 6 Lacs

Chennai

Remote

Hello everyone, ixie is conducting a virtual drive on 4th Sep for the role of Automation Tester. Kindly apply only if you have experience in Automation Testing.

Role: Automation Tester
Experience: 2 - 4 years
Mode: Remote (with reporting to office based on client needs)
Shift: Noon
Base Location: Bangalore / Chennai
Note: Candidates should be open to reporting to the office if required.
Interested candidates, please share your CV at gamingtag@indium.tech

Virtual Drive Details
Date: 4th Sep 2025
Time: 2:30 PM - 4:30 PM
Link: https://indiumsoft.zoom.us/j/93512975140pwd=X7c4xpHPp2BqMRIoxsM6NQ1K1kDMds.1
Meeting ID: 935 1297 5140
Passcode: 660194

Job Description
Must-have skills:
- Strong knowledge of Java (Core Java; basic design patterns such as singleton, factory, builder, chain of responsibility)
- Hands-on experience in at least one of Mobile Automation with Appium or API Automation with Rest Assured, along with web automation
- SCM: Git (CLI experience); build management tool: Maven; CI/CD: Jenkins
- TestNG

Good to have:
- Java 8 features (Streams and lambda expressions)
- GitHub Actions
- Good knowledge of C#, or readiness to learn it
- Allure Report
- Gradle build management tool

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a DBT professional, you will be responsible for designing, developing, and defining technical architecture for data pipelines and performance scaling in a big data environment. Your expertise in PL/SQL, including queries, procedures, and JOINs, will be crucial for integrating Talend data and ensuring data quality. You will also be proficient in Snowflake SQL, writing SQL queries against Snowflake and developing scripts in Unix, Python, etc., to facilitate Extract, Load, and Transform operations. Hands-on experience with Talend is advantageous, and candidates with previous experience in production support will be given preference.

Your role will involve working with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures. You will be responsible for data analysis, troubleshooting data issues, and providing technical support to end users.

In this position, you will develop and maintain data warehouse and ETL processes, ensuring data quality and integrity. Your problem-solving skills will be put to the test, and you will be expected to take a continuous-improvement approach. A Talend or Snowflake certification is desirable. Excellent SQL coding skills, effective communication, and documentation skills are essential, and knowledge of the Agile delivery process is preferred. You must be analytical, creative, and self-motivated to excel in this role.

Collaboration within a global team environment is key, necessitating excellent communication skills. Your contribution to Virtusa will be valued: teamwork, quality of life, and professional development are the core values the company upholds. By joining a global team of 27,000 professionals, you will have access to exciting projects and opportunities to work with cutting-edge technologies throughout your career. Virtusa provides an environment that nurtures new ideas, fosters excellence, and encourages personal and professional growth.
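Snowflake Streams, mentioned among the utilities above, expose row-level changes (inserts, updates, deletes) that a Task then merges into a target table. As a rough, hypothetical model of that merge logic only (the record shapes and names are invented, not Snowflake's API), in plain Python:

```python
def apply_changes(target, changes):
    """Apply a list of change records (like rows read from a change
    stream) to a target table modeled as {primary_key: row}."""
    for change in changes:
        action, key, row = change["action"], change["pk"], change.get("row")
        if action in ("INSERT", "UPDATE"):
            target[key] = row          # upsert the new image of the row
        elif action == "DELETE":
            target.pop(key, None)      # drop the row if present
    return target

table = {1: {"status": "new"}}
stream = [
    {"action": "UPDATE", "pk": 1, "row": {"status": "shipped"}},
    {"action": "INSERT", "pk": 2, "row": {"status": "new"}},
    {"action": "DELETE", "pk": 1},
]
print(apply_changes(table, stream))  # {2: {'status': 'new'}}
```

In Snowflake itself this would be a `MERGE` statement driven by a Stream and scheduled with a Task.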

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be responsible for designing and implementing scalable data models using Snowflake to support business intelligence and analytics solutions. This will involve implementing ETL/ELT solutions with complex business transformations and handling end-to-end data warehousing solutions. Additionally, you will be tasked with migrating data from legacy systems to Snowflake and writing complex SQL queries to extract, transform, and load data with a focus on high performance and accuracy. Your role will also include optimizing SnowSQL queries for better processing speeds and integrating Snowflake with third-party applications.

To excel in this role, you should have a strong understanding of Snowflake architecture, features, and best practices. Experience using Snowpipe and Snowpark/Streamlit, as well as familiarity with cloud platforms such as AWS, Azure, or GCP and other cloud-based data technologies, will be beneficial. Knowledge of data modeling concepts such as star schema, snowflake schema, and data partitioning is essential. Experience with tools like dbt, Matillion, or Airbyte for data transformation and automation is preferred, along with familiarity with Snowflake's Time Travel, Streams, and Tasks features. Proficiency in data pipeline orchestration using tools like Airflow or Prefect, as well as scripting and automation skills in Python or Java, is required. Additionally, experience with data visualization tools like Tableau, Power BI, QlikView/QlikSense, or Looker will be advantageous.
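Star-schema modeling, one of the concepts listed above, maps natural business keys to surrogate dimension keys before fact rows are loaded. A minimal, hypothetical sketch of that lookup (the names and data are invented; real dimension loads also track attributes and history):

```python
def get_surrogate_key(dimension, natural_key):
    """Return the surrogate key for a natural key, inserting a new
    dimension entry when the key has not been seen before."""
    if natural_key not in dimension:
        dimension[natural_key] = len(dimension) + 1  # next surrogate id
    return dimension[natural_key]

customer_dim = {}   # natural key -> surrogate key
fact_rows = []
for customer, amount in [("alice", 10), ("bob", 5), ("alice", 7)]:
    fact_rows.append({
        "customer_sk": get_surrogate_key(customer_dim, customer),
        "amount": amount,
    })
print(fact_rows)  # alice resolves to key 1 both times, bob to key 2
```

The fact table then joins back to the dimension on `customer_sk` rather than the raw business key.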

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

We are seeking a skilled and experienced Senior Backend Developer with over 8 years of experience to take the lead in developing scalable, high-performance backend systems. The ideal candidate will possess expertise in Java, Spring Boot, and Microservices architecture, as well as a profound understanding of data structures and algorithms, multi-threading, Collections, Streams, and Hibernate. As a senior member of our development team based in Bangalore, you will play a crucial role in designing, implementing, and maintaining backend systems that serve as the foundation for our enterprise applications.

Key Responsibilities:
- Lead the design and development of high-performance, scalable, and robust backend systems utilizing Java, Spring Boot, and Microservices.
- Architect and implement solutions that leverage multi-threading and Java collections/streams to meet performance and scalability requirements.
- Utilize Hibernate and JPA for efficient data management, ensuring proper mapping, querying, and performance optimization in database interactions.
- Optimize backend services for performance, reliability, and scalability while ensuring adherence to high quality standards.
- Mentor and guide junior and mid-level developers, offering technical leadership, code reviews, and best practices.
- Collaborate with cross-functional teams, including product managers, front-end developers, DevOps, and QA, to ensure seamless integration of backend systems.
- Design and develop RESTful APIs and microservices in compliance with security, performance, and reliability standards.
- Lead troubleshooting and performance optimization efforts, identifying bottlenecks and keeping backend systems smooth and efficient.
- Implement and enforce development best practices, including automated testing (unit and integration), continuous integration / continuous deployment (CI/CD), and agile methodologies.
- Stay abreast of emerging technologies, trends, and practices in backend development and software architecture.

Required Skills and Qualifications:
- 8+ years of backend development experience with strong proficiency in Java and associated frameworks, particularly Spring Boot.
- Demonstrated expertise in designing and developing Microservices and distributed systems.
- Strong grasp of data structures, algorithms, and multi-threading concepts.
- Proficiency in Java Collections, Streams, and lambda expressions to optimize system performance.
- Extensive experience with Hibernate or JPA for database management, including entity modeling and query optimization.
- Strong problem-solving skills and the ability to optimize code for performance, scalability, and maintainability.
- Deep knowledge of RESTful APIs, service-oriented architecture (SOA), and microservices best practices.
- Experience with version control tools like Git.
- Solid understanding of database design, SQL, and transaction management.
- Experience in code review processes, mentoring junior developers, and leading technical initiatives.
- Sound understanding of cloud-based infrastructure and deployment, such as AWS, GCP, or Azure.
- Proven experience working in an Agile/Scrum development environment.

Preferred Skills:
- Experience with containerization tools like Docker and orchestration tools such as Kubernetes.
- Familiarity with CI/CD pipelines (e.g., Jenkins, GitLab CI).
- Experience with message brokers (e.g., Kafka, RabbitMQ) and event-driven architectures.
- Knowledge of monitoring and logging frameworks (e.g., ELK stack, Prometheus).
- Familiarity with modern testing frameworks and methodologies (e.g., JUnit, TestNG, TDD).
- Familiarity with enterprise-level security practices and tools.

Education:
- BTech, MTech, or MCA only.

Benefits:
- Opportunity to work with one of the Big 4 companies in India.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Snowflake SQL Developer, you will be responsible for writing SQL queries against Snowflake and developing scripts in Unix, Python, and other languages to facilitate the Extract, Load, and Transform (ELT) process. Your role will involve hands-on experience with various Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures.

Your primary objective will be to design and implement scalable and performant data pipelines that ingest, process, and transform data from diverse sources into Snowflake. You should have proven experience in configuring and managing Fivetran connectors for data integration; familiarity with dbt is considered a plus.

To excel in this role, you must possess excellent SQL coding skills, along with strong communication and documentation abilities. Your complex problem-solving capabilities, coupled with a continuous-improvement approach, will be crucial in delivering high-quality solutions. Analytical thinking, creativity, and self-motivation are key attributes that will drive your success in this position.

Collaboration is essential in our global team environment, and your ability to work effectively with colleagues worldwide will be valued. A Snowflake certification is preferred; familiarity with Agile delivery processes and outstanding communication skills are also highly desirable.
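Connector-based ingestion of the kind Fivetran provides, mentioned above, is typically incremental: each sync pulls only rows past the last stored high-water mark. A simplified, hypothetical sketch of that pattern (field names and data are invented, not Fivetran's API):

```python
def incremental_extract(source_rows, last_watermark):
    """Return rows newer than the stored watermark, plus the new
    watermark, mimicking one incremental connector sync."""
    new_rows = [r for r in source_rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in new_rows),
                        default=last_watermark)
    return new_rows, new_watermark

rows = [{"id": 1, "updated_at": 100}, {"id": 2, "updated_at": 205}]
batch, wm = incremental_extract(rows, 100)   # only id=2 is newer
print(len(batch), wm)  # 1 205
```

Persisting `wm` between runs is what keeps repeated syncs cheap compared with full reloads.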

Posted 2 weeks ago

Apply

5.0 - 7.0 years

40 - 55 Lacs

Bengaluru

Work from Office

Serko is a cutting-edge tech platform in global business travel & expense technology. When you join Serko, you become part of a team of passionate travellers and technologists bringing people together, using the world’s leading business travel marketplace. We are proud to be an equal opportunity employer; we embrace the richness of diversity, showing up authentically to create a positive impact. There's an exciting road ahead of us, where travel needs real, impactful change. With offices in New Zealand, Australia, North America, and China, we are thrilled to be expanding our global footprint, landing our new hub in Bengaluru, India. With a rapid growth plan in place for India, we’re hiring people from different backgrounds, experiences, abilities, and perspectives to help us build a world-class team and product.

As a Senior Principal Engineer, you’ll play a key role in shaping our technical vision and driving engineering excellence across our product streams. Your leadership will foster a high-performance culture that empowers teams to build innovative solutions with real-world impact.

Requirements

Working closely with stream leadership, including the Head of Engineering, Senior Engineering Managers, Architects, and domain specialists, you’ll provide hands-on technical guidance and help solve complex engineering challenges. As a Senior Principal Engineer, you'll also lead targeted projects and prototypes, shaping new technical approaches and ensuring our practices stay ahead of the curve.
What you'll do
- Champion best practices across engineering teams, embedding them deeply within the stream
- Proactively resolve coordination challenges within and across streams to keep teams aligned and unblocked
- Partner with Product Managers to ensure customer value is delivered in the most pragmatic and impactful way
- Lead or contribute to focused technical projects that solve high-priority problems
- Collaborate with cross-functional teams to define clear requirements, objectives, and timelines for key initiatives
- Explore innovative solutions through research and analysis, bringing fresh thinking to technical challenges
- Mentor engineers and share technical expertise to uplift team capability and growth
- Continuously evaluate and enhance system performance, reliability, and scalability
- Stay ahead of the curve by tracking industry trends, emerging technologies, and evolving best practices
- Drive continuous improvement across products and processes to boost quality, efficiency, and customer satisfaction
- Maintain strong communication with stakeholders to gather insights, provide updates, and incorporate feedback

Experience: 14+ years

What you'll bring to the team
- Strong proficiency in stream-specific technologies, tools, and programming languages
- Demonstrated expertise in specific areas of specialization related to the stream
- Excellent problem-solving skills and attention to detail
- Ability to lead teams through complex changes to engineering-related areas and maintain alignment across Product and Technology teams
- Effective communication and interpersonal skills
- Proven ability to work independently and collaboratively in a fast-paced environment
- Tertiary-level qualification in a relevant engineering discipline, or equivalent

Benefits
At Serko we aim to create a place where people can come and do their best work. This means you’ll be operating in an environment with great tools and support to enable you to perform at the highest level of your abilities, producing high-quality work and delivering innovative, efficient results. Our people are fully engaged, continuously improving, and encouraged to make an impact.

Some of the benefits of working at Serko are:
- A competitive base pay
- Medical benefits
- A discretionary incentive plan based on individual and company performance
- A focus on development: access to a learning & development platform and the opportunity to own your career pathways
- A flexible work policy

Apply
Hit the ‘apply’ button now, or explore more about what it’s like to work at Serko and all our global opportunities at www.Serko.com.

Posted 3 weeks ago

Apply

9.0 - 14.0 years

10 - 17 Lacs

Ahmedabad, Chennai, Bengaluru

Work from Office

Educational Requirements
MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc

Service Line
Data & Analytics Unit

Responsibilities
A day in the life of an Infoscion:
- As part of the Infosys delivery team, your primary role is to ensure effective design, development, validation, and support activities, so that our clients are satisfied with the high levels of service in the technology domain.
- You will gather requirements and specifications, understand the client's needs in detail, and translate them into system requirements.
- You will play a key role in the overall estimation of work requirements, providing the right information on project estimations to Technology Leads and Project Managers.
- You will be a key contributor to building efficient programs and systems.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of the project domain
- Ability to translate functional / non-functional requirements into system requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on the specifications
- Good understanding of SDLC and agile methodologies
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate

Technical and Professional Requirements:
- Primary skills: Technology -> Data on Cloud - DataStore -> Snowflake

Preferred Skills:
- Technology -> Data on Cloud - DataStore -> Snowflake

Posted 3 weeks ago

Apply

7.0 - 10.0 years

12 - 22 Lacs

Bengaluru

Work from Office

RTS Engineer - Redis

Required Skills & Experience
- Design, deploy, and maintain Redis clusters for high availability, fault tolerance, and scalability.
- Configure and optimize Redis persistence, replication, sharding, and failover.
- Manage Redis in both on-premises and cloud environments (AWS ElastiCache, Azure Cache for Redis, GCP Memorystore).
- Proven experience in Redis administration (cluster setup, scaling, performance tuning).
- Strong understanding of Redis internals (data structures, eviction policies, persistence modes).
- Hands-on experience with Redis Streams or other streaming mechanisms.
- Proficiency with DevOps tools (Docker, Kubernetes, Jenkins, GitLab CI/CD).
- Experience in real-time streaming architectures and event-driven systems.
- Strong scripting skills in Bash, Python, or similar languages.
- Familiarity with monitoring and logging solutions for distributed systems.
- Knowledge of cloud-based Redis services.
- Automate infrastructure provisioning and configuration using Infrastructure as Code (IaC) tools such as Terraform, Ansible, or CloudFormation.
- Knowledge of security controls for designing source and data transfers, including CRON jobs, ETLs, and JDBC/ODBC scripts.
- Experience working with SQL and NoSQL databases.
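Among the Redis internals listed above are eviction policies: under allkeys-lru, Redis drops the least-recently-used key when memory is full. A toy, hypothetical model of that behaviour using Python's `OrderedDict` (this is an illustration of the concept, not Redis's actual implementation):

```python
from collections import OrderedDict

class LRUCache:
    """Tiny in-memory model of an allkeys-lru eviction policy."""

    def __init__(self, max_keys):
        self.max_keys = max_keys
        self.data = OrderedDict()    # oldest entry first

    def set(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)       # refresh recency on write
        self.data[key] = value
        if len(self.data) > self.max_keys:
            self.data.popitem(last=False)    # evict least recently used

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)           # refresh recency on read
        return self.data[key]

cache = LRUCache(2)
cache.set("a", 1)
cache.set("b", 2)
cache.get("a")          # "a" becomes most recently used
cache.set("c", 3)       # capacity exceeded: "b" is evicted
print(list(cache.data))  # ['a', 'c']
```

In Redis itself the equivalent is configuration (`maxmemory` plus `maxmemory-policy allkeys-lru`), not application code.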

Posted 3 weeks ago

Apply

7.0 - 12.0 years

14 - 20 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office

About the Role
We are seeking a Senior Data Engineer with strong expertise in Snowflake to design, build, and optimize enterprise-scale data pipelines. You will work closely with analysts, data scientists, and application teams to deliver reliable, high-performance, and secure data solutions that drive critical business outcomes.

Key Responsibilities
- Design and develop scalable, secure, and high-performance data pipelines on Snowflake using Snowpipe, Streams, Tasks, and dbt.
- Build and maintain ELT processes ingesting data from diverse sources into Snowflake using cloud-native services and orchestration tools.
- Manage and optimize multi-account Snowflake environments for cost, performance, and security.
- Write efficient SQL transformations and automate workflows using dbt and orchestration frameworks (e.g., Airflow, Prefect).
- Collaborate with analysts, data scientists, and application teams to deliver business-critical data products.
- Implement and enforce data governance and quality frameworks.
- Integrate Python scripts (Pandas, PySpark, SQLAlchemy) for advanced processing and ML model integration.

Preferred Skills & Experience
- 7+ years of data engineering experience, with 3+ years hands-on with Snowflake in enterprise environments.
- Strong expertise in Python for data engineering.
- Experience with metadata management, data cataloging, and column-level lineage.
- Exposure to BI tools (Tableau, Power BI).
- Familiarity with data governance frameworks.
- Snowflake SnowPro Core Certification preferred.

Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
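Orchestration frameworks such as Airflow and Prefect, named above, run pipeline tasks in dependency order. A minimal, hypothetical sketch of that scheduling idea (the task names are invented), using Kahn's topological sort in plain Python:

```python
from collections import deque

def run_order(deps):
    """Return a valid execution order for tasks given
    {task: [upstream_task, ...]}, the way an orchestrator resolves
    a DAG before running it."""
    remaining = {task: set(upstream) for task, upstream in deps.items()}
    ready = deque(sorted(t for t, up in remaining.items() if not up))
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        # Unblock any task that was waiting on the one just "run".
        for t, up in remaining.items():
            if task in up:
                up.remove(task)
                if not up:
                    ready.append(t)
    if len(order) != len(deps):
        raise ValueError("cycle detected in task graph")
    return order

dag = {"extract": [], "transform": ["extract"],
       "load": ["transform"], "report": ["load"]}
print(run_order(dag))  # ['extract', 'transform', 'load', 'report']
```

Real orchestrators add scheduling, retries, and parallel execution of independent tasks on top of exactly this ordering.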

Posted 3 weeks ago

Apply

7.0 - 12.0 years

10 - 15 Lacs

Bengaluru

Remote

Snowflake Developer / Data Engineer

Primary Responsibilities:
- Design and develop scalable, secure, and high-performance data pipelines on Snowflake using Snowpipe, Streams, Tasks, and dbt.
- Develop and maintain ELT processes that ingest data from diverse sources into Snowflake using cloud-native services and orchestration tools.
- Work across multiple Snowflake accounts under a single organization setup, optimizing for cost, performance, and data security.
- Write efficient SQL transformations and automate workflows using dbt and orchestration frameworks (e.g., Airflow, Prefect).
- Collaborate with analysts, data scientists, and application teams to deliver business-critical data products.
- Implement and enforce data governance and quality frameworks.
- Integrate Python scripts for advanced data processing or machine learning model serving when needed.

Preferred Skills:
- Hands-on with Python, including libraries like Pandas, PySpark, or SQLAlchemy.
- Experience with data cataloging, metadata management, and column-level lineage.
- Exposure to BI tools like Tableau or Power BI.

Certifications: Snowflake SnowPro Core Certification preferred.
Experience: 7+ years of data engineering experience, with 3+ years on Snowflake in enterprise-scale environments.

Posted 3 weeks ago

Apply

9.0 - 12.0 years

5 - 5 Lacs

Thiruvananthapuram

Work from Office

Tech Lead - Azure/Snowflake & AWS Migration

Key Responsibilities
- Design and develop scalable data pipelines using Snowflake as the primary data platform, integrating with tools like Azure Data Factory, Synapse Analytics, and AWS services.
- Build robust, efficient SQL and Python-based data transformations for cleansing, enrichment, and integration of large-scale datasets.
- Lead migration initiatives from AWS-based data platforms to a Snowflake-centered architecture, including:
  - Rebuilding AWS Glue pipelines in Azure Data Factory or using Snowflake-native ELT approaches.
  - Migrating EMR Spark jobs to Snowflake SQL or Python-based pipelines.
  - Migrating Redshift workloads to Snowflake with schema conversion and performance optimization.
  - Transitioning S3-based data lakes (Hudi, Hive) to Snowflake external tables via ADLS Gen2 or Azure Blob Storage.
  - Redirecting Kinesis/MSK streaming data to Azure Event Hubs, followed by ingestion into Snowflake using Streams & Tasks or Snowpipe.
- Support database migrations from AWS RDS (Aurora PostgreSQL, MySQL, Oracle) to Snowflake, focusing on schema translation, compatibility handling, and data movement at scale.
- Design modern Snowflake lakehouse-style architectures that incorporate raw, staging, and curated zones, with support for Time Travel, cloning, zero-copy restore, and data sharing.
- Integrate Azure Functions or Logic Apps with Snowflake for orchestration and trigger-based automation.
- Implement security best practices, including Azure Key Vault integration and Snowflake role-based access control, data masking, and network policies.
- Optimize Snowflake performance and costs using clustering, multi-cluster warehouses, materialized views, and result caching.
- Support CI/CD processes for Snowflake pipelines using Git, Azure DevOps or GitHub Actions, and SQL code versioning.
- Maintain well-documented data engineering workflows, architecture diagrams, and technical documentation to support collaboration and long-term platform maintainability.

Required Qualifications
- 9+ years of data engineering experience, with 3+ years on the Microsoft Azure stack and hands-on Snowflake expertise.
- Proficiency in:
  - Python for scripting and ETL orchestration
  - SQL for complex data transformation and performance tuning in Snowflake
  - Azure Data Factory and Synapse Analytics (SQL Pools)
- Experience in migrating workloads from AWS to Azure/Snowflake, including services such as Glue, EMR, Redshift, Lambda, Kinesis, S3, and MSK.
- Strong understanding of cloud architecture and hybrid data environments across AWS and Azure.
- Hands-on experience with database migration, schema conversion, and tuning in PostgreSQL, MySQL, and Oracle RDS.
- Familiarity with Azure Event Hubs, Logic Apps, and Key Vault.
- Working knowledge of CI/CD, version control (Git), and DevOps principles applied to data engineering workloads.

Preferred Qualifications
- Extensive experience with Snowflake Streams, Tasks, Snowpipe, external tables, and data sharing.
- Exposure to MSK-to-Event Hubs migration and streaming data integration into Snowflake.
- Familiarity with Terraform or ARM templates for Infrastructure-as-Code (IaC) in Azure environments.
- Certification such as SnowPro Core, Azure Data Engineer Associate, or equivalent.

Required Skills: Azure, AWS Redshift, Athena, Azure Data Lake
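Schema conversion during a Redshift-to-Snowflake migration, one of the responsibilities above, largely comes down to translating column types. A rough, hypothetical sketch of that idea; the mapping below is an invented, illustrative subset, not an official conversion table, and real migrations need fuller type, precision, and default handling:

```python
# Illustrative subset of Redshift -> Snowflake type translations.
TYPE_MAP = {
    "SMALLINT": "NUMBER(5,0)",
    "INTEGER": "NUMBER(10,0)",
    "BIGINT": "NUMBER(19,0)",
    "DOUBLE PRECISION": "FLOAT",
    "VARCHAR": "VARCHAR",
    "TIMESTAMP": "TIMESTAMP_NTZ",
}

def convert_column(name, redshift_type):
    """Translate one column declaration to a Snowflake-style type."""
    base = redshift_type.upper().split("(")[0].strip()
    args = redshift_type[len(base):]               # keep e.g. "(255)"
    snowflake_type = TYPE_MAP.get(base, redshift_type.upper())
    if base == "VARCHAR":
        snowflake_type += args                     # preserve declared length
    return f"{name} {snowflake_type}"

print(convert_column("order_ts", "timestamp"))     # order_ts TIMESTAMP_NTZ
print(convert_column("email", "varchar(255)"))     # email VARCHAR(255)
```

Running such a mapping over an information-schema dump produces the DDL skeleton that performance tuning then refines.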

Posted 4 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Vadodara, Gujarat

On-site

As a Java Quarkus Developer at QuantaLynk, you will leverage your expertise in Java, particularly with the Quarkus framework, to contribute to the development of cutting-edge technology solutions. Your role will involve working with RESTful APIs, OAuth2, and WebSockets to build efficient and scalable systems that align with business goals.

With over 4 years of experience in Java and more than 2 years of hands-on experience with Quarkus, you will bring a deep understanding of event-driven architecture to the table. Your proficiency with RESTful APIs, OAuth2, and WebSockets (3-4 years) and event-driven architecture (3+ years) will be instrumental in developing solutions that maximize profitability and create long-term value for our clients. Additionally, your familiarity with Kafka, Streams, and gRPC, along with basic knowledge of MongoDB, Docker, and CI/CD, will enable you to design and implement software and automation solutions that are future-ready and scalable.

This role offers a long-term opportunity to work on challenging projects that require a high level of technical expertise. We are looking for a candidate with a good level of English communication skills who is passionate about leveraging technology to drive business transformation. If you are eager to work in a dynamic environment where innovation and collaboration are encouraged, we invite you to join our team at QuantaLynk and be part of our mission to help businesses scale, optimize, and transform.

Posted 4 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Haryana

On-site

As an employee at Max Life Insurance, you will be part of a dynamic and forward-thinking company that offers comprehensive protection and long-term savings life insurance solutions through various distribution channels. With a customer-centric approach and a focus on trained human capital, Max Life has established itself as a leader in the industry over the past two decades. During the financial year 2019-20, Max Life achieved a gross written premium of Rs. 16,184 crore and had assets under management of Rs. 68,471 crore as of 31st March 2020. The company's commitment to excellence is reflected in its Sum Assured in Force of Rs. 913,660 crore. At Max Life Insurance, we are guided by our core values: Caring: We believe in appreciating diversity, eliminating biases, and promoting meritocracy. As a compassionate leader, you will inspire your team to excel and foster a culture of high performance. Collaboration: We value teamwork and collaboration, leveraging the expertise of team members to achieve outstanding results. By addressing challenges with a solution-oriented approach, you will create win-win partnerships within and outside the organization. Customer Obsession: Putting the customer at the core of all deliverables, we strive to provide the best customer experience by anticipating their needs and implementing proactive strategies. Growth Mindset: We encourage ambitious leaders who challenge the status quo, sponsor innovative ideas, and rally their teams to achieve high-impact goals. By pushing boundaries and raising performance standards, you will drive growth and success. People Leadership: As a people leader, you will inspire your team to reach their full potential, creating a culture of empowerment and superior business outcomes through coaching and motivation. Max Life Insurance is an Equal Opportunity Employer that values inclusion and diversity in the workplace. 
Key Responsibilities:
- Collaborate with business partners to develop future-proof solutions in digital, automation, APIs, integration, and data
- Provide technical expertise in solving performance and non-functional requirements
- Design integrations and drive changes to standards based on input from service partners
- Support critical projects in all phases of delivery as needed
- Analyze the current IT ecosystem and identify opportunities for improvement in Application, Integration, and Solution Architecture
- Conduct prototypes to explore new technologies and maintain technical relationships with partners

Other Responsibilities:
- Define and review continuous delivery, continuous integration, and continuous testing pipelines
- Manage stakeholders at strategic levels in technical and business functions
- Drive continuous service improvement and strategic initiatives to achieve business goals

Measures of Success:
- Alignment of IT landscape with overall vision and blueprints
- Delivery of applications with improved speed, quality, and cost-efficiency
- Exceptional user experience and operational efficiency through cutting-edge technology solutions
- Trusted partnership with IT and business departments
- Staying updated on emerging technologies, industry trends, and best practices

Key Skills Required:
- Proficiency in Java frameworks, databases, AWS cloud, and other relevant technologies
- Experience in designing hybrid cloud applications and migrating workloads to the cloud
- Knowledge of application and data integration platforms and patterns
- Understanding of the BFSI domain and application integration best practices
- Ability to suggest architectural changes for cost control and resource optimization
- Prior experience in AI and Data Analytics implementation is a plus

If you are passionate about leveraging technology to drive innovation and deliver exceptional results, we invite you to join our team at Max Life Insurance.
For more information, visit our website at www.maxlifeinsurance.com.

Posted 1 month ago

Apply

5.0 - 10.0 years

0 Lacs

karnataka

On-site

You should have 5 to 10+ years of experience in C++ programming, with a focus on memory management, file I/O, and streams concepts. Your expertise should also include a strong understanding of multithreading: creating and managing threads, synchronization mechanisms such as mutexes and condition variables, and kernel-level operations. Additionally, you should have a good understanding of Linux development and triaging, including familiarity with command-line tools, POSIX, processes, and network operations. A solid foundation in building applications in a C++ environment is also crucial for this role.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a professional in the IT & Tech Engineering field at Allianz Technology, you will be expected to possess a diverse set of technical skills. Your role will involve understanding code management and release approaches, including monorepo/multirepo concepts. It is essential to have a good understanding of functional programming principles and code management methodologies such as SDLC, DRY, KISS, and SOLID.

Moreover, familiarity with authorization and authentication mechanisms (ABAC, RBAC, JWT, SAML, AAD, OIDC) and experience with NoSQL databases such as DynamoDB are highly valued. Proficiency in UI development using technologies like React, hooks, and TypeScript is crucial. Additionally, expertise in event-driven architecture (queues, streams, batches, and pub/subs) is necessary, along with a solid grasp of functional programming concepts such as list, map, reduce, compose, and monads. Understanding scalability, concurrency, networking, proxies, CI/CD pipelines, GitFlow, GitHub, and GitOps tools like Flux and ArgoCD is required. Being a polyglot programmer, proficient at an expert level in at least two languages such as Python, TypeScript, or Golang, is preferred.

Furthermore, you must be fluent in operating Kubernetes clusters from a development perspective, creating custom CRDs, operators, and controllers, and have experience in developing serverless cloud applications. Deep knowledge of AWS cloud services and a basic understanding of Azure cloud are advantageous.

Apart from technical skills, soft skills play a vital role in this position. Effective communication, leadership, team supervision, task delegation, giving feedback, risk evaluation, conflict resolution, project management, crisis management, problem-solving, innovation, ownership, and vision are key soft skills expected from you.

Your responsibilities will also include providing technical guidance, making informed decisions, shaping solutions, enforcing development practices, and ensuring quality gates through activities such as code reviews, pair programming, and team review meetings.
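Although this posting prefers Python, TypeScript, or Golang, the map/reduce/compose concepts it names are language-agnostic; here is a minimal sketch in Java's functional API (all names and values are illustrative):

```java
import java.util.List;
import java.util.function.Function;

// Sketch of the functional-programming vocabulary from the posting:
// map (transform each element), reduce (fold to one value), and
// compose (chain functions into a new function).
public class FunctionalDemo {
    public static int sumOfDoubledEvens(List<Integer> xs) {
        return xs.stream()
                 .filter(x -> x % 2 == 0)   // keep even numbers
                 .map(x -> x * 2)           // map: double each
                 .reduce(0, Integer::sum);  // reduce: fold into a sum
    }

    public static Function<Integer, Integer> composed() {
        Function<Integer, Integer> addOne = x -> x + 1;
        Function<Integer, Integer> square = x -> x * x;
        return addOne.andThen(square);      // compose: square(addOne(x))
    }

    public static void main(String[] args) {
        System.out.println(sumOfDoubledEvens(List.of(1, 2, 3, 4))); // prints 12
        System.out.println(composed().apply(3));                    // prints 16
    }
}
```

The same pipeline translates almost one-to-one into Python comprehensions with `functools.reduce` or TypeScript's `Array.prototype.map`/`reduce`.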

Posted 1 month ago

Apply

4.0 - 8.0 years

3 - 12 Lacs

Bengaluru

Work from Office

Responsibilities:
- Teach Core Java, OOP, Java 8+, Collections, and Multithreading
- Deliver training on Spring Boot, REST APIs, and Hibernate/JPA
- Create coding exercises and projects, and support student queries
- Track learner progress and offer mentorship
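A typical exercise from such a Core Java multithreading module might look like this minimal sketch (class and method names are illustrative): two threads increment a shared counter, and `synchronized` keeps the final count deterministic.

```java
// Classic teaching exercise: without synchronization the two threads
// would race on count++ and lose updates; with synchronized methods
// the result is always exactly 20000.
public class SafeCounter {
    private int count = 0;

    public synchronized void increment() { count++; }
    public synchronized int get() { return count; }

    public static void main(String[] args) throws InterruptedException {
        SafeCounter counter = new SafeCounter();
        Runnable task = () -> {
            for (int i = 0; i < 10_000; i++) counter.increment();
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start(); t2.start();
        t1.join(); t2.join();   // wait for both workers to finish
        System.out.println(counter.get()); // prints 20000
    }
}
```

Removing the `synchronized` keyword and re-running a few times is a good way to let learners observe the race condition first-hand.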

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Bengaluru

Work from Office

• Strong programming skills in Java 8 and above: Lambda Expressions, Streams, etc.
• Multithreading and Collections (data structures)
• Web services (RESTful), Spring Boot, Spring MVC
• Java Messaging, Kafka
• Git, Maven, Agile, SCRUM
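The Java 8 Lambda Expressions and Streams features this role asks for can be sketched in a few lines (the data is illustrative):

```java
import java.util.List;
import java.util.stream.Collectors;

// Java 8 Streams pipeline: filter with a lambda, transform with a
// method reference, sort, and collect into a new list.
public class StreamsDemo {
    public static List<String> shortNamesUpper(List<String> names) {
        return names.stream()
                    .filter(n -> n.length() <= 4)   // lambda predicate
                    .map(String::toUpperCase)       // method reference
                    .sorted()
                    .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> names = List.of("Asha", "Bartholomew", "Ravi", "Meenakshi");
        System.out.println(shortNamesUpper(names)); // prints [ASHA, RAVI]
    }
}
```

The pipeline is lazy until the terminal `collect` call, which is why intermediate operations like `filter` and `map` compose without creating intermediate lists.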

Posted 1 month ago

Apply

6.0 - 10.0 years

14 - 19 Lacs

Mumbai, Pune

Work from Office

We are looking for a talented and experienced developer who is technically passionate, solution-focused, and able to design, develop, test, and maintain high-quality software. You will be working with one of our clients, a top-tier investment bank, to design and develop their risk technology platform.

As a Senior Java Developer, you will:
- Work closely with data from sources like Bloomberg and Markit, and model and transform source data for specific applications
- Work with Java 8 and all its features
- Work with Spring Boot and other Spring modules (web, data, security, batch) or any other dependency injection framework
- Work with (and configure) distributed caching based on Redis and event-based Kafka streams
- Interact with event-based applications and microservices, with a strong focus on performance and real-time analytics
- Design and develop various database queries, scripts, and tables to pull, clean, arrange, and persist risk management data
- Own delivery and take responsibility for milestones
- Follow a BDD approach (Cucumber), and design and develop automated unit, integration, and regression tests
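As a toy sketch of the "model and transform source data" step, quotes can be aggregated by ticker with the Java 8 Streams API. The `Quote` class and its fields are invented for illustration; they are not a Bloomberg or Markit schema.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Toy transformation in Java 8 style: average raw price quotes per
// ticker using Collectors.groupingBy with a downstream collector.
public class QuoteTransformer {
    static class Quote {
        final String ticker;
        final double price;
        Quote(String ticker, double price) { this.ticker = ticker; this.price = price; }
    }

    public static Map<String, Double> averagePriceByTicker(List<Quote> quotes) {
        return quotes.stream().collect(
            Collectors.groupingBy(q -> q.ticker,
                Collectors.averagingDouble(q -> q.price)));
    }

    public static void main(String[] args) {
        List<Quote> quotes = Arrays.asList(
            new Quote("ABC", 100.0),
            new Quote("ABC", 102.0),
            new Quote("XYZ", 50.0));
        System.out.println(averagePriceByTicker(quotes));
        // ABC -> 101.0, XYZ -> 50.0 (map iteration order may vary)
    }
}
```

In a real risk pipeline the same grouping shape would typically run continuously over a Kafka stream rather than a batch list, but the aggregation logic is the same.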

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

kolkata, west bengal

On-site

Genpact is a global professional services and solutions firm focused on delivering outcomes that shape the future. With over 125,000 employees across 30+ countries, our team is driven by curiosity, agility, and the desire to create lasting value for our clients. Our purpose, the relentless pursuit of a world that works better for people, underpins our work as we serve and transform leading enterprises worldwide, including Fortune Global 500 companies. We leverage deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI to drive innovation and success.

We are currently seeking applications for the role of Vice President, Enterprise Architecture Consulting - GCP Delivery Lead at Genpact. In this critical leadership position, you will be responsible for managing the delivery of complex Google Cloud Platform (GCP) projects, ensuring client satisfaction, team efficiency, and innovation. The ideal candidate will bring deep industry expertise, technical excellence, and strong business acumen to shape our organization's data and cloud transformation roadmap. As the Delivery Lead, your key responsibilities include overseeing the successful delivery of multimillion-dollar engagements involving GCP, managing client relationships, leading global project teams, ensuring adherence to delivery governance standards, and driving innovation within the scope of GCP initiatives. You will play a vital role in shaping the data and cloud transformation journey of our organization.

Key Responsibilities:
- Own and drive end-to-end delivery of GCP & Data Engineering programs across multiple geographies and industry verticals.
- Establish a best-in-class data delivery framework ensuring scalability, security, and efficiency in GCP-based transformations.
- Act as a trusted advisor to C-level executives, driving customer success, innovation, and business value.
- Lead executive-level stakeholder engagement, aligning with business strategy and IT transformation roadmaps.
- Drive account growth, supporting pre-sales, solutioning, and go-to-market (GTM) strategies for GCP and data-driven initiatives.
- Ensure customer satisfaction and build long-term strategic partnerships with enterprise clients.
- Shape the organization's Data & AI strategy, promoting the adoption of GCP, AI/ML, real-time analytics, and automation in enterprise data solutions.
- Establish data accelerators, reusable frameworks, and cost optimization strategies to enhance efficiency and profitability.
- Build and mentor a high-performing global team of cloud data professionals, including data engineers, architects, and analytics experts.
- Foster a culture of continuous learning and innovation, driving upskilling and certifications in GCP, AI/ML, and Cloud Data technologies.
- Stay informed about emerging trends in GCP, cloud data engineering, and analytics to drive innovation.

Qualifications:

Minimum Qualifications:
- Experience in IT Services, with a focus on data engineering, GCP, and cloud transformation leadership.
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's or MBA preferred).

Preferred Qualifications / Skills:
- Proven track record in delivering large-scale, multimillion-dollar GCP & data engineering programs.
- Deep understanding of the GCP ecosystem, including Data Sharing, Streams, Tasks, Performance Tuning, and Cost Optimization.
- Strong expertise in cloud platforms (Azure, AWS) and data engineering pipelines.
- Proficiency in modern data architectures, AI/ML, IoT, and edge analytics.
- Experience in managing global, multi-disciplinary teams across multiple geographies.
- Exceptional leadership and executive presence, with the ability to influence C-suite executives and key decision-makers.

Preferred Certifications:
- Certified Google Professional Cloud Architect or equivalent.
- Cloud certifications (Azure Data Engineer, AWS Solutions Architect, or equivalent).
- PMP, ITIL, or SAFe Agile certifications for delivery governance.

If you are a dynamic leader with a passion for driving innovation and transformation in the cloud and data space, we encourage you to apply for the Vice President, Enterprise Architecture Consulting - GCP Delivery Lead role at Genpact. Join us in shaping the future and delivering value to clients worldwide.

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

hyderabad, telangana

On-site

As a Java Full Stack Developer with 7-10 years of experience, you will be responsible for designing, developing, and maintaining scalable Java-based backend systems. Your role will involve building dynamic and responsive user interfaces using React.js, as well as developing and integrating RESTful APIs and microservices. You will work with distributed systems and event-driven architecture, collaborating closely with cross-functional teams in an Agile environment. Your key responsibilities will include participating in code reviews, troubleshooting, and performance tuning. You will also be expected to integrate applications with cloud services (AWS preferred), work with containerized environments, and use Kubernetes for deployment. Ensuring code quality, scalability, and maintainability will be essential aspects of your work.

To excel in this role, you should possess a Bachelor's degree in Computer Science or a related field (preferred) and have at least 8 years of experience in Java application development. Proficiency in Java 11+ is required, including Streams, Lambdas, and functional programming. Strong knowledge of Spring Boot, the Spring Framework, and RESTful API development is essential, as is experience with microservices architecture and monitoring tools. You should have a solid understanding of persistence layers such as JPA, Hibernate, MS SQL, and PostgreSQL. Hands-on experience with React.js and strong frontend development skills with HTML, CSS3/Tailwind, and responsive design are also necessary.

Experience with CI/CD pipelines (Jenkins, GitLab CI, GitHub Actions, or AWS DevOps) and familiarity with cloud platforms like AWS, Azure, or GCP (AWS preferred) are important. Exposure to container orchestration using EKS, AKS, or GKE, knowledge of Domain-Driven Design (DDD) and Backend-for-Frontend (BFF) patterns, and working knowledge of Kafka, MQ, or other event-driven technologies are advantageous. Strong problem-solving, debugging, and optimization skills, proficiency in Agile methodologies, version control (Git), and SDLC best practices are also required. Experience in the hospitality domain is a plus.
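A small sketch of the Java 11+ functional features this posting mentions: `var` local type inference, `String::isBlank`/`strip()`, and `Predicate.not` inside a Streams pipeline (the sample log lines are invented):

```java
import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;

// Java 11 touches: Predicate.not (new in 11), String.isBlank and
// String.strip (also new in 11), and 'var' in main.
public class Java11Demo {
    public static List<String> nonBlank(List<String> lines) {
        return lines.stream()
                    .filter(Predicate.not(String::isBlank)) // drop blank lines
                    .map(String::strip)                     // trim whitespace
                    .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        var lines = List.of("  GET /health ", "   ", "POST /orders");
        System.out.println(nonBlank(lines)); // prints [GET /health, POST /orders]
    }
}
```

`Predicate.not` turns a method reference into a negated predicate without writing an explicit lambda, which keeps pipelines like this readable.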

Posted 1 month ago

Apply