
6963 Kafka Jobs - Page 31

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Global Technology Solutions (GTS) at ResMed is a division dedicated to creating innovative, scalable, and secure platforms and services for patients, providers, and people across ResMed. The primary goal of GTS is to accelerate well-being and growth by transforming the core, enabling patient, people, and partner outcomes, and building future-ready operations. The GTS strategy focuses on aligning goals and promoting collaboration across all organizational areas: fostering shared ownership, developing flexible platforms that scale to meet global demands, and implementing global standards for key processes to ensure efficiency and consistency.

Role Overview
As a Data Engineering Lead, you will oversee and guide the data engineering team in developing, optimizing, and maintaining our data infrastructure. You will play a critical role in ensuring the seamless integration and flow of data across the organization, enabling data-driven decision-making and analytics.

Key Responsibilities
• Data Integration: Coordinate with various teams to ensure seamless data integration across the organization's systems.
• ETL Processes: Develop and implement efficient data transformation and ETL (Extract, Transform, Load) processes.
• Performance Optimization: Optimize data flow and system performance for enhanced functionality and efficiency.
• Data Security: Ensure adherence to data security protocols and compliance standards to protect sensitive information.
• Infrastructure Management: Oversee the development and maintenance of the data infrastructure, ensuring scalability and reliability.
• Collaboration: Work closely with data scientists, analysts, and other stakeholders to support data-driven initiatives.
• Innovation: Stay updated with the latest trends and technologies in data engineering and implement best practices.

Qualifications
• Experience: Proven experience in data engineering, with a strong background in leading and managing teams.
• Technical Skills: Proficiency in programming languages such as Python, Java, and SQL, along with experience in big data technologies like Hadoop, Spark, and Kafka.
• Data Management: In-depth understanding of data warehousing, data modeling, and database management systems.
• Analytical Skills: Strong analytical and problem-solving skills with the ability to handle complex data challenges.
• Communication: Excellent communication and interpersonal skills, capable of working effectively with cross-functional teams.
• Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Why Join Us?
Work on cutting-edge data projects and contribute to the organization's data strategy in a collaborative, innovative environment that values creativity and continuous learning. If you are a strategic thinker with a passion for data engineering and leadership, we would love to hear from you. Apply now to join our team and make a significant impact on our data-driven journey.

Joining us is more than saying "yes" to making the world a healthier place. It's discovering a career that's challenging, supportive, and inspiring, where a culture driven by excellence helps you not only meet your goals but also create new ones. We focus on creating a diverse and inclusive culture, encourage individual expression in the workplace, and thrive on the innovative ideas this generates. If this sounds like the workplace for you, apply now! We commit to respond to every applicant.
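The ETL (Extract, Transform, Load) work this role calls for can be sketched generically. The following is a minimal illustration in Python, not any company's actual pipeline; the record fields and in-memory "warehouse" are invented stand-ins for real source and target systems.

```python
# Minimal ETL sketch: pull rows from a source, normalize them, load to a target.
# "source" and "target" here are in-memory stand-ins, not a real warehouse.

def extract(source):
    """Extract: read raw records from the source system."""
    return list(source)

def transform(rows):
    """Transform: normalize fields and drop records failing basic validation."""
    out = []
    for row in rows:
        if row.get("id") is None:
            continue  # skip incomplete records
        out.append({"id": row["id"], "name": str(row.get("name", "")).strip().lower()})
    return out

def load(rows, target):
    """Load: upsert by primary key so reruns are idempotent."""
    for row in rows:
        target[row["id"]] = row
    return target

raw = [{"id": 1, "name": "  Alice "}, {"id": None, "name": "ghost"}, {"id": 2, "name": "BOB"}]
warehouse = load(transform(extract(raw)), {})
```

The upsert-by-key load step is what makes a rerun after a partial failure safe, which is the usual reason postings like this emphasize "efficient" ETL design.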

Posted 3 days ago

Apply

9.0 - 14.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Greetings from TCS! TCS is hiring for Big Data Architect.

Location: PAN India
Years of Experience: 9-14 years

Job Description:
• Experience with Python, Spark, and Hive data pipelines using ETL processes
• Apache Hadoop development and implementation
• Experience with streaming frameworks such as Kafka
• Hands-on experience with Azure/AWS/Google data services
• Work with big data technologies (Spark, Hadoop, BigQuery, Databricks) for data preprocessing and feature engineering

Posted 3 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Client: Our client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with revenue of $1.8B and 35,000+ associates worldwide, it specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media, and is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia.

Job Description:
A Google Cloud Platform (GCP) Infrastructure Engineer is responsible for designing, deploying, and managing cloud infrastructure on GCP, working with various GCP services to build and maintain scalable, secure, and reliable solutions. This includes infrastructure automation, CI/CD pipeline implementation, performance monitoring, and security implementation. Good knowledge of Kafka/Cassandra is required.

Skills and Experience:
• GCP Expertise: Strong understanding of GCP services, including Compute Engine, Kubernetes Engine, BigQuery, and other relevant services.
• Infrastructure as Code: Experience with tools like Terraform, Ansible, or other infrastructure automation technologies.
• CI/CD: Familiarity with CI/CD pipelines and tools for automating deployments.
• Cloud Security: Knowledge of cloud security best practices and experience implementing security measures.
• Linux/Unix: Proficiency in Linux system administration tasks.
• Networking: Understanding of networking concepts and cloud networking services.
• Scripting: Ability to write scripts for automation and deployment.
• Troubleshooting: Experience troubleshooting and resolving technical issues.
• Collaboration: Strong communication and collaboration skills.

Job Title: GCP Cloud Infra Engineer
Key Skills: GCP services (Compute Engine, Kubernetes Engine, BigQuery, and other relevant services), Terraform, Ansible, CI/CD pipelines, Linux system administration
Job Locations: Any Virtusa
Experience: 4-6 years
Education Qualification: Any Graduation
Work Mode: Hybrid
Employment Type: Contract
Notice Period: Immediate - 10 Days
Payroll: People Prime Worldwide

Posted 3 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Client: Our client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with revenue of $1.8B and 35,000+ associates worldwide, it specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media, and is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia.

Job Title: Data Engineer
Key Skills: Python, ETL, Snowflake, Apache Airflow
Job Locations: Pan India
Experience: 6-7 years
Education Qualification: Any Graduation
Work Mode: Hybrid
Employment Type: Contract
Notice Period: Immediate

Job Description:
• 6 to 10 years of experience in data engineering roles with a focus on building scalable data solutions.
• Proficiency in Python for ETL, data manipulation, and scripting.
• Hands-on experience with Snowflake or equivalent cloud-based data warehouses.
• Strong knowledge of orchestration tools such as Apache Airflow or similar.
• Expertise in implementing and managing messaging queues like Kafka, AWS SQS, or similar.
• Demonstrated ability to build and optimize data pipelines at scale, processing terabytes of data.
• Experience in data modeling, data warehousing, and database design.
• Proficiency in working with cloud platforms like AWS, Azure, or GCP.
• Strong understanding of CI/CD pipelines for data engineering workflows.
• Experience working in an Agile development environment, collaborating with cross-functional teams.

Preferred Skills:
• Familiarity with other programming languages like Scala or Java for data engineering tasks.
• Knowledge of containerization and orchestration technologies (Docker, Kubernetes).
• Experience with stream processing frameworks like Apache Flink.
• Experience with Apache Iceberg for data lake optimization and management.
• Exposure to machine learning workflows and integration with data pipelines.

Soft Skills:
• Strong problem-solving skills with a passion for solving complex data challenges.
• Excellent communication and collaboration skills to work with cross-functional teams.
• Ability to thrive in a fast-paced, innovative environment.

Posted 3 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Company: They balance innovation with an open, friendly culture and the backing of a long-established parent company known for its ethical reputation. We guide customers from what's now to what's next by unlocking the value of their data and applications to solve their digital challenges, achieving outcomes that benefit both business and society.

About Client: Our client is a global digital solutions and technology consulting company headquartered in Mumbai, India. The company generates annual revenue of over $4.29 billion (₹35,517 crore), reflecting 4.4% year-over-year growth in USD terms. It has a workforce of around 86,000 professionals operating in more than 40 countries and serves a global client base of over 700 organizations. It operates across several major industry sectors, including Banking, Financial Services & Insurance (BFSI), Technology, Media & Telecommunications (TMT), Healthcare & Life Sciences, and Manufacturing & Consumer. In the past year, the company achieved a net profit of $553.4 million (₹4,584.6 crore), a 1.4% increase from the previous year, and recorded a strong order inflow of $5.6 billion, up 15.7% year-over-year, highlighting growing demand across its service lines. Key focus areas include Digital Transformation, Enterprise AI, Data & Analytics, and Product Engineering, reflecting its strategic commitment to driving innovation and value for clients across industries.

Job Title: Kafka Administrator
Location: Pan India
Experience: 5+ years
Job Type: Contract to hire
Notice Period: Immediate joiners
Mandatory Skills: Kafka Connect, Kafka Connect clusters, Schema Registry, ksqlDB, Kafka Streams, Confluent Kafka

Job Summary:
• Proficient in designing and implementing a robust Kafka cluster on Azure, considering factors such as scalability for future growth, fault tolerance, performance, and multi-zone DR.
• Expertise in developing integration pipelines using Kafka connectors to facilitate seamless communication between applications, including enabling Kafka sources and sinks using different connectors.
• Actively monitor the Kafka cluster's health and performance and address issues promptly.
• Experience with Confluent Kafka.
• Sound communication and presentation skills.

Seniority Level: Mid-Senior level
Industry: IT Services and IT Consulting
Employment Type: Contract
Job Functions: Business Development, Consulting
Skills: Kafka Connect, Kafka Connect clusters, Schema Registry, ksqlDB, Kafka Streams, Confluent Kafka
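For the Kafka Connect work this listing describes, connectors are typically registered by POSTing a JSON config to the Connect REST API (port 8083 by default). A hedged sketch follows: the connector class is Confluent's JDBC source connector, but the connection URL, column, and topic prefix are illustrative placeholders, not a real deployment.

```python
import json

# Illustrative JDBC source connector config. The connector.class is Confluent's
# JDBC source connector; the DB URL, column, and topic prefix are placeholders.
connector = {
    "name": "orders-jdbc-source",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:postgresql://db:5432/orders",  # placeholder
        "mode": "incrementing",                # poll only rows with a larger id
        "incrementing.column.name": "id",
        "topic.prefix": "db-",                 # tables land on topics db-<table>
        "tasks.max": "1",
    },
}

payload = json.dumps(connector)
# Typically registered with something like:
#   curl -X POST -H "Content-Type: application/json" \
#        --data @connector.json http://<connect-host>:8083/connectors
```

A sink connector is configured the same way, with a sink `connector.class` and a `topics` list instead of a `topic.prefix`.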

Posted 3 days ago

Apply

4.0 years

0 Lacs

Kochi, Kerala, India

On-site

Java Full-stack Developer. Overall IT experience including the following:
• Strong Angular development experience, including leading a team of front-end developers
• Experience with microservices architecture
• Experience working in an Agile-based development environment
• Strong core Java development experience, with 4+ years leading a team of Java developers
• 4 years of development experience with Spring Boot
• 4+ years of development experience with MongoDB & Kafka

Posted 3 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description: The next evolution of AI-powered cyber defense is here. With the rise of cloud and modern technologies, organizations struggle with the vast amount of data, and thereby security alerts, generated by their existing security tools. Cyberattacks continue to get more sophisticated and harder to detect in the sea of alerts and false positives. According to the Forrester 2023 Enterprise Breach Benchmark Report, a security breach costs organizations an average of $3M and takes organizations over 200 days to investigate and respond. AiStrike's platform aims at reducing the time to investigate and respond to threats by over 90%. Our approach is to leverage the power of AI and machine learning to adopt an attacker mindset to prioritize and automate cyberthreat investigation and response. The platform reduces alerts by 100:5 and provides detailed context and link analysis capabilities to investigate the alert. The platform also provides collaborative workflow and no-code automation to cut down the time to respond to threats significantly. If you have the desire to join the next evolution of cyber defense, are willing to work hard and learn fast, and want to be part of building something special, this is the company for you.

We are seeking a highly skilled and experienced hands-on Principal Software Engineer with 10+ years of proven expertise in the field. As a Principal Architect, you will play a crucial role in leading the architecture, design, and implementation of scalable cloud solutions for our cloud-native SaaS products. The ideal candidate will have a strong background in object-oriented design and coding skills, with hands-on experience in Java and Python.

Roles and Responsibilities:
• Manage the overarching product/platform architecture and technology selection, and ensure that the design and development of all projects follow the architectural vision.
• Design and architect scalable cloud solutions for cloud-native SaaS development projects in line with the latest technology and practices.
• Successfully communicate, evangelize, and implement the architectural vision across teams and products.
• Design and coordinate projects of significant size and complexity.
• Work with containerization technologies and orchestration software such as Kubernetes on cloud platforms like AWS and Azure.
• Develop and implement microservices-based architecture using Java, Spring Boot, ReactJS, NextJS, and other relevant technologies.
• Implement secure design principles and practices, ensuring the integrity and confidentiality of our systems.
• Collaborate with cross-geography, cross-functional teams to define and refine requirements and specifications.
• Deploy workloads at scale in AWS EKS/ECS environments and others as needed.
• Create automation and use monitoring tools to efficiently build, deploy, and support cloud implementations.
• Implement DevOps methodologies and tools for continuous integration and delivery.
• Utilize APM and monitoring tools like ELK, Splunk, Datadog, Dynatrace, and AppDynamics for cloud-scale monitoring.
• Work with potential customers to understand their environment.
• Provide technical leadership, architecture guidance, and mentorship to the teams.
• Maintain a clear focus on scale, cost, security, and maintainability.
• Stay updated on industry best practices, emerging technologies, and cybersecurity trends.

Skills and Qualifications:
• 10+ years of overall experience in software development and architecture.
• In-depth knowledge and experience in cloud-native SaaS development and architecture.
• Proficient in Java, Python, RESTful APIs, API Gateway, Kafka, and microservices communications.
• Experience with RDBMS and NoSQL databases (e.g., Neo4j, MongoDB, Redis).
• Experience working with graph databases like Neo4j.
• Expertise in containerization technologies (Docker) and Kubernetes.
• Hands-on experience with secure DevOps practices.
• Familiarity with Multi-Factor Authentication and Single Sign-On principles.
• Excellent verbal and written communication skills.
• Self-starter with strong organizational and problem-solving skills.
• Prior experience in deploying workloads at scale in AWS EKS/ECS/Fargate.
• Knowledge of cloud-scale APM and monitoring tools (ELK, Splunk, Datadog, etc.).
• Previous experience in cybersecurity products is desirable but not mandatory.
• Preferred: AWS Certified Solutions Architect - Professional or similar certification, including certifications on other cloud platforms.
• Commitment, team play, integrity, and customer focus.

AiStrike is committed to providing equal employment opportunities. All qualified applicants and employees will be considered for employment and advancement without regard to race, color, religion, creed, national origin, ancestry, sex, gender, gender identity, gender expression, physical or mental disability, age, genetic information, sexual or affectional orientation, marital status, status regarding public assistance, familial status, military or veteran status, or any other status protected by applicable law.

Posted 3 days ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site

At Aramya, we're redefining fashion for India's underserved Gen X/Y women, offering size-inclusive, comfortable, and stylish ethnic wear at affordable prices. Launched in 2024, we've already achieved ₹40 Cr in revenue in our first year, driven by a unique blend of data-driven design, in-house manufacturing, and a proprietary supply chain. Today, with an ARR of ₹100 Cr, we're scaling rapidly with ambitious growth plans for the future. Our vision is bold: to build the most loved fashion and lifestyle brands across the world while empowering individuals to express themselves effortlessly. Backed by marquee investors like Accel and Z47, we're on a mission to make high-quality ethnic wear accessible to every woman. We've built a community of loyal customers who love our weekly design launches, impeccable quality, and value-for-money offerings. With a fast-moving team driven by creativity, technology, and customer obsession, Aramya is more than a fashion brand: it's a movement to celebrate every woman's unique journey.

We're looking for a passionate Data Engineer with a strong foundation. The ideal candidate should have a solid understanding of D2C or e-commerce platforms and be able to work across the stack to build high-performing, user-centric digital experiences.

Key Responsibilities
• Design, build, and maintain scalable ETL/ELT pipelines using tools like Apache Airflow, Databricks, and Spark.
• Own and manage data lakes/warehouses on AWS Redshift (or Snowflake/BigQuery).
• Optimize SQL queries and data models for analytics, performance, and reliability.
• Develop and maintain backend APIs using Python (FastAPI/Django/Flask) or Node.js.
• Integrate external data sources (APIs, SFTP, third-party connectors) and ensure data quality & validation.
• Implement monitoring, logging, and alerting for data pipeline health.
• Collaborate with stakeholders to gather requirements and define data contracts.
• Maintain infrastructure-as-code (Terraform/CDK) for data workflows and services.

Must-Have Skills
• Strong in SQL and data modeling (OLTP and OLAP).
• Solid programming experience in Python, preferably for both ETL and backend.
• Hands-on experience with Databricks, Redshift, or Spark.
• Experience building and managing ETL pipelines using tools like Airflow, dbt, or similar.
• Deep understanding of REST APIs, microservices architecture, and backend design patterns.
• Familiarity with Docker, Git, and CI/CD pipelines.
• Good grasp of cloud platforms (preferably AWS) and services like S3, Lambda, ECS/Fargate, and CloudWatch.

Nice-to-Have Skills
• Exposure to streaming platforms like Kafka, Kinesis, or Flink.
• Experience with Snowflake, BigQuery, or Delta Lake.
• Proficiency in data governance, security best practices, and PII handling.
• Familiarity with GraphQL, gRPC, or event-driven systems.
• Knowledge of data observability tools (Monte Carlo, Great Expectations, Datafold).
• Experience working in a D2C/e-commerce or analytics-heavy product environment.
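One responsibility this role names is ensuring "data quality & validation" on integrated sources. A minimal sketch of such a gate in plain Python follows; the rules (non-null order_id, positive amount) are hypothetical examples, not Aramya's actual checks, and a real pipeline would run this step before loading and quarantine the rejects.

```python
# Minimal data-quality gate: partition incoming rows into valid/rejected,
# so bad records can be quarantined instead of silently loaded.

def validate(rows):
    valid, rejected = [], []
    for row in rows:
        errors = []
        if row.get("order_id") is None:
            errors.append("missing order_id")
        if not isinstance(row.get("amount"), (int, float)) or row["amount"] <= 0:
            errors.append("non-positive amount")
        (rejected if errors else valid).append({**row, "errors": errors})
    return valid, rejected

valid, rejected = validate([
    {"order_id": "A1", "amount": 499.0},
    {"order_id": None, "amount": 120.0},
    {"order_id": "A3", "amount": -5},
])
```

Tools named in the listing, such as Great Expectations, formalize exactly this pattern with declarative rule suites.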

Posted 3 days ago

Apply

0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Job Title: Senior Java Developer
Location: Airoli, Mumbai (Onsite)
Industry: BFSI / Fintech

About the Role
We are looking for a highly skilled and passionate Senior Java Developer with strong hands-on experience in developing scalable, high-performance applications. You will play a critical role in building low-latency, high-throughput systems focused on risk and fraud monitoring in the BFSI domain.

Key Responsibilities
• Design and develop microservices using Java (latest versions), Spring Boot, and RESTful APIs
• Build robust data streaming solutions using Apache Flink and Kafka
• Implement business rules using the Drools rule engine
• Contribute to the development of low-latency, high-throughput platforms for fraud detection and risk monitoring
• Participate in Agile development, code reviews, and CI/CD pipelines with a strong focus on Test-Driven Development (TDD)
• Debug complex issues and take full ownership from design to deployment
• Collaborate with cross-functional teams and participate in cloud-native development using AWS (IaaS/PaaS)

Required Skills
• Java, Spring Boot, REST APIs, virtual threads
• Apache Flink, Kafka: real-time data stream processing
• Drools rule engine
• Strong grasp of J2EE, OOP principles, and design patterns
• Experience with CI/CD tools, Git, Quay, and TDD
• Familiarity with cloud-native solutions, especially in AWS environments

Preferred Experience
• BFSI/Fintech domain experience building risk and fraud monitoring applications
• Exposure to Agile methodology and tools like JIRA
• Solid communication skills and a strong sense of ownership
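This listing names Flink, Kafka, and Drools for low-latency fraud rules. As a language-agnostic illustration (deliberately not the actual stack), here is a sliding-window velocity check in plain Python: flag an account when it exceeds a threshold number of transactions inside a time window, which is the kind of rule a Flink keyed window or a Drools rule would express. The window length and threshold are invented.

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 60
MAX_TXNS = 3  # hypothetical threshold: more than 3 txns/minute is suspicious

class VelocityRule:
    """Keep a per-account deque of recent timestamps; flag bursts."""
    def __init__(self):
        self.recent = defaultdict(deque)

    def check(self, account, ts):
        q = self.recent[account]
        while q and ts - q[0] > WINDOW_SECONDS:
            q.popleft()           # evict events that fell out of the window
        q.append(ts)
        return len(q) > MAX_TXNS  # True => raise a fraud alert

rule = VelocityRule()
# Four transactions within 30 seconds, then one after a quiet gap:
alerts = [rule.check("acct-1", t) for t in (0, 10, 20, 30, 120)]
```

In a streaming engine the per-account state and eviction would be handled by the framework's keyed windows rather than by hand, but the rule logic is the same.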

Posted 3 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

• 5 years of experience as a Data Engineer or in a similar role.
• Bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related field.
• Strong knowledge of data engineering tools and technologies (e.g. SQL, ETL, data warehousing).
• Experience with data pipeline frameworks and data processing platforms (e.g. Apache Kafka, Apache Spark).
• Proficiency in programming languages such as Python, Java, or Scala.
• Experience with cloud platforms (e.g. AWS, Google Cloud Platform, Azure).
• Knowledge of data modeling, database design, and data governance.
• MongoDB is a must.

Posted 3 days ago

Apply

10.0 years

0 Lacs

India

Remote

Job Title: Senior Backend Engineer – Python & Microservices
Location: Remote
Experience Required: 8-10+ years

🚀 About the Role:
We're looking for a Senior Backend Engineer (Python & Microservices) to join a high-impact engineering team focused on building scalable internal tools and enterprise SaaS platforms. You'll play a key role in designing cloud-native services, leading microservices architecture, and collaborating closely with cross-functional teams in a fully remote environment.

🔧 Responsibilities:
• Design and build scalable microservices using Python (Flask, FastAPI, Django)
• Develop production-grade RESTful APIs and background job systems
• Architect modular systems and drive microservice decomposition
• Manage SQL & NoSQL data models (PostgreSQL, MongoDB, DynamoDB, ClickHouse)
• Implement distributed data pipelines using Kafka, RabbitMQ, and SQS
• Apply best practices in rate limiting, security, performance optimisation, logging, and observability (Grafana, Datadog, CloudWatch)
• Deploy services in cloud environments (AWS preferred, Azure/GCP acceptable) using Docker, Kubernetes, and EKS
• Contribute to CI/CD and Infrastructure as Code (Jenkins, Terraform, GitHub Actions)

✅ Requirements:
• 8-10+ years of hands-on backend development experience
• Strong proficiency in Python (Flask, FastAPI, Django, etc.)
• Solid experience with microservices and containerised environments (Docker, Kubernetes, EKS)
• Expertise in REST API design, rate limiting, and performance tuning
• Familiarity with SQL & NoSQL (PostgreSQL, MongoDB, DynamoDB, ClickHouse)
• Experience with cloud platforms (AWS preferred; Azure/GCP also considered)
• CI/CD and IaC knowledge (GitHub Actions, Jenkins, Terraform)
• Exposure to distributed systems and event-based architectures (Kafka, SQS)
• Excellent written and verbal communication skills

🎯 Preferred Qualifications:
• Bachelor's or Master's degree in Computer Science or a related field
• Certifications in Cloud Architecture or System Design
• Experience integrating with tools like Zendesk, Openfire, or similar chat/ticketing platforms
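"Rate limiting" appears in both the responsibilities and the requirements of this listing. A minimal token-bucket sketch in Python is below; the capacity and refill rate are illustrative, and a production service would typically back this with shared state (e.g. Redis) rather than per-process memory.

```python
class TokenBucket:
    """Allow up to `capacity` burst requests, refilled at `rate` tokens/sec."""
    def __init__(self, capacity, rate):
        self.capacity = capacity
        self.rate = rate
        self.tokens = float(capacity)
        self.last = 0.0

    def allow(self, now):
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True   # request admitted
        return False      # request rejected (HTTP 429 in an API)

bucket = TokenBucket(capacity=2, rate=1.0)  # 2-request burst, 1 req/sec steady
decisions = [bucket.allow(t) for t in (0.0, 0.1, 0.2, 1.5)]
```

The third request at t=0.2 is rejected because the burst allowance is spent; by t=1.5 enough tokens have refilled to admit traffic again.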

Posted 3 days ago

Apply

5.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Hiring for Java Developers for Pune Location

Key Requirements:
1. Experience: 5-7 years of experience in Java development.
2. Skills: Strong knowledge of Java and Kafka.
3. Location: Pune.

Notice Period: Immediate to 15 days.
CTC: Up to 30 LPA.

How to Apply: Please send your CV to Shabnam.s@liveconnections.in

#liveconnections #livec #weplacepeoplefirst

Posted 3 days ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Company Description
Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally-enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting, and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting, and customer obsession to accelerate our clients' businesses through designing the products and services their customers truly value.

Job Description
Your Impact:
• Drive the design, planning, and implementation of multifaceted applications, giving you breadth and depth of knowledge across the entire project lifecycle.
• Combine your technical expertise and problem-solving passion to work closely with clients, turning complex ideas into end-to-end solutions that transform our clients' business.
• Constantly innovate and evaluate emerging technologies and methods to provide scalable and elegant solutions that help clients achieve their business goals.

Qualifications
• 4+ years of experience in Java development only.
• Strong development skills in Java JDK 1.8 or above.
• Java fundamentals like exception handling, serialization/deserialization, and immutability concepts.
• Good fundamental knowledge of enums, collections, annotations, generics, autoboxing, and data structures.
• Database: RDBMS/NoSQL (SQL, joins, indexing).
• Multithreading (re-entrant lock, fork & join, sync, Executor framework).
• Spring Core & Spring Boot, security, transactions.
• Hands-on experience with JMS (ActiveMQ, RabbitMQ, Kafka, etc.).
• Memory management (JVM configuration, profiling, GC), performance tuning, and testing with JMeter or similar tools.
• DevOps (CI/CD: Maven/Gradle, Jenkins, quality plugins, Docker and containerization).
• Logical/analytical skills.
• Thorough understanding of OOPS concepts, design principles, and implementation of different types of design patterns.
• Hands-on experience with any of the logging frameworks (SLF4J/Logback/Log4j).
• Experience writing JUnit test cases using Mockito/PowerMock frameworks.
• Practical experience with Maven/Gradle and knowledge of version control systems like Git/SVN.
• Good communication skills and ability to work with global teams to define and deliver on projects.
• Sound understanding of and experience in the software development process and test-driven development.
• Cloud: AWS/Azure/GCP.
• Experience in microservices.

Posted 3 days ago

Apply

0.0 years

0 Lacs

Gurugram, Haryana

Remote

Position: GCP Data Engineer

Company Info: Prama (HQ: Chandler, AZ, USA)
Prama specializes in AI-powered and Generative AI solutions for Data, Cloud, and APIs. We collaborate with businesses worldwide to develop platforms and AI-powered products that offer valuable insights and drive business growth. Our comprehensive services include architectural assessment, strategy development, and execution to create secure, reliable, and scalable systems. We are experts in creating innovative platforms for various industries and help clients overcome complex business challenges. Our team is dedicated to delivering cutting-edge solutions that elevate the digital experience for corporations. Prama is headquartered in Phoenix with offices in the USA, Canada, Mexico, Brazil, and India.

Location: Bengaluru | Gurugram | Hybrid
Benefits: 5-Day Working | Career Growth | Flexible Working | Potential On-site Opportunity
Kindly send your CV or resume to careers@prama.ai
Primary skills: GCP, PySpark, Python, SQL, ETL

Job Description:
We are seeking a highly skilled and motivated GCP Data Engineer to join our team. As a GCP Data Engineer, you will play a crucial role in designing, developing, and maintaining robust data pipelines and data warehousing solutions on the Google Cloud Platform (GCP). You will work closely with data analysts, data scientists, and other stakeholders to ensure the efficient collection, transformation, and analysis of large datasets.

Responsibilities:
· Design, develop, and maintain scalable data pipelines using GCP tools such as Dataflow, Dataproc, and Cloud Functions.
· Implement ETL processes to extract, transform, and load data from various sources into BigQuery.
· Optimize data pipelines for performance, cost-efficiency, and reliability.
· Collaborate with data analysts and data scientists to understand their data needs and translate them into technical solutions.
· Design and implement data warehouses and data marts using BigQuery.
· Model and structure data for optimal performance and query efficiency.
· Develop and maintain data quality checks and monitoring processes.
· Use SQL and Python (PySpark) to analyze large datasets and generate insights.
· Create visualizations using tools like Data Studio or Looker to communicate data findings effectively.
· Manage and maintain GCP resources, including virtual machines, storage, and networking.
· Implement best practices for security, cost optimization, and scalability.
· Automate infrastructure provisioning and management using tools like Terraform.

Qualifications:
· Strong proficiency in SQL, Python, and PySpark.
· Hands-on experience with GCP services, including BigQuery, Dataflow, Dataproc, Cloud Storage, and Cloud Functions.
· Experience with data warehousing concepts and methodologies.
· Understanding of data modeling techniques and best practices.
· Strong analytical and problem-solving skills.
· Excellent communication and collaboration skills.
· Experience with data quality assurance and monitoring.
· Knowledge of cloud security best practices.
· A passion for data and a desire to learn new technologies.

Preferred Qualifications:
· Google Cloud Platform certification.
· Experience with machine learning and AI.
· Knowledge of data streaming technologies (Kafka, Pub/Sub).
· Experience with data visualization tools (Looker, Tableau, Data Studio).

Job Type: Full-time
Pay: ₹1,200,000.00 - ₹2,000,000.00 per year
Benefits: Flexible schedule, health insurance, leave encashment, paid sick time, Provident Fund, work from home
Ability to commute/relocate: Gurugram, Haryana: Reliably commute or plan to relocate before starting work (Required)
Application Question(s): CTC; Expected CTC; Notice Period (days); Experience in GCP; Total Experience
Work Location: Hybrid remote in Gurugram, Haryana

Posted 3 days ago

Apply

15.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Linkedin logo

Exciting Opportunity for Java Microservices Architecture with LTIMindtree
Location: Pune, Mumbai, Chennai, Bangalore, Hyderabad
Work Mode: Hybrid
Notice Period: Immediate joiner or 30 days
Mandatory Skills: Java, Microservices, Kubernetes, Kafka, Camel
Experience: 12-16 years
Note: Please do not share profiles if your experience does not match or your notice period is 90 days.
Job description:
Bachelor's degree mandatory; Computer Science or Engineering degrees preferred. Hands-on architect who is comfortable working with engineers at the code/detailed-design level. 15 years of experience in software engineering and architecture. Coding experience and skills preferred in Java, .NET, Python, Kafka, MQ, JavaScript, Angular, Node.js, and Azure. Ability to guide and mentor development teams on sound design and coding principles and best practices in API-based microservices and cloud environments. Ability to work closely with development teams to review code, recommend best practices, and mentor junior engineers. Experience in application modularization and modernization, including adoption of microservice architecture models. Experience in integration architectures, from service bus to API mediation and event bus models. Ability to abstract up to harvest patterns and articulate design decisions with a senior audience (CIO level). Knowledge and experience of insurance underwriting platform architectures is a plus. Experience in information security and risk management. Experience in cloud solution design and implementation of IaaS, PaaS, SaaS, and hybrid models. Understanding of platform design and technology considerations to build out an ecosystem for producers, clients, and internal staff.
If interested, please send your updated resume to Pooja.chakravarti@ltimindtree.com

Posted 3 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Project Description: In Securities Operations IT, we are looking for individuals who are passionate about what they do and who value excellence, learning, and integrity. Our culture places emphasis on teamwork, collaboration, and delivering business value while working at a sustainable pace. Most importantly, we are looking for someone who is technically excellent and can aspire to our values and culture, and inspire others to do the same.
Responsibilities:
• Develop, test, automate, release, and maintain Java web/backend applications using Java, Spring Boot, Kafka, and Azure.
• Collaborate with cross-functional teams to define and implement new features.
• Write clean, efficient, and well-documented code that meets industry standards and best practices.
• Troubleshoot and debug issues in existing applications and provide timely resolutions.
• Participate in code reviews and provide constructive feedback to team members.
• Stay up to date with emerging trends and technologies in software development, and apply them to improve the quality and performance of applications.
• Write and execute unit tests and regression BDD tests to ensure the reliability and functionality of code.
• Implement security measures to protect applications from potential threats and vulnerabilities.
• Document technical specifications, user manuals, and other relevant documentation to ensure effective communication and knowledge transfer.
Mandatory Skills Description:
• Hands-on experience with the Java programming language, Spring Boot, Kafka, Azure, containers, and Kubernetes (5+ years)
• Experience with app deployment on Azure cloud / AKS.
• Understanding of software development principles and methodologies.
• Ability to work in agile Scrum methodology.
• Excellent problem-solving and analytical skills to build highly performant solutions.
• Good communication and teamwork abilities.
Nice-to-Have Skills Description:
• Knowledge of front-end technologies like HTML, CSS, JavaScript, and React is a plus.

Posted 3 days ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra

Remote

Indeed logo

R020800 Pune, Maharashtra, India Engineering Regular
Location Details: Pune (Maharashtra) - Hybrid
At GoDaddy the future of work looks different for each team. Some teams work in the office full-time; others have a hybrid arrangement (they work remotely some days and in the office some days) and some work entirely remotely.
Hybrid: This is a hybrid position. You’ll divide your time between working remotely from your home and an office, so you should live within commuting distance. Hybrid teams may work in-office as much as a few times a week or as little as once a month or quarter, as decided by leadership. The hiring manager can share more about what hybrid work might look like for this team.
Join Our Team...
The Terminal Management team in the GoDaddy Commerce Division is responsible for the maintenance and development of Poynt Smart Terminal Management System (aka Mothership) capabilities that monitor, provision, and update all Smart Terminals, mobile apps, and third-party PoyntOS-enabled hardware. It plays a vital role in the entire lifecycle of a payment terminal - from the factory, through fulfillment, at the customer site (e.g. retail stores), and our repair centers. This system enables real-time control of and insight into each terminal, enabling customer service, fulfillment providers, resellers, banking partners, and ISOs to remotely manage their customers’ devices, manage various device settings, and fine-tune the device for a customer’s in-store environment (network settings, timeouts, EMV parameters, etc.). If you love building impactful products that set the path for future growth and expansion, join our team!
What you'll get to do...
As a member of the Terminal Management team, you will play a pivotal role in enhancing various critical aspects of our Terminal Management platform, contributing to a seamless and efficient ecosystem for managing payment terminals globally:
Comprehensive API Development: Build and maintain a robust set of APIs to manage payment terminals, supporting both GoDaddy’s proprietary devices and third-party OEM hardware across diverse markets worldwide.
Over-the-Air (OTA) Management System: Design and improve a scalable OTA solution to efficiently distribute OS and application updates to hundreds of thousands of terminals globally, ensuring devices are always up-to-date with the latest features and security enhancements.
Real-Time Data Collection and Diagnostics: Implement advanced systems for collecting and managing logs, crash reports, and critical system diagnostics in real-time, enabling rapid troubleshooting and proactive issue resolution.
Data Integration and Analytics: Work closely with data ingestion pipelines to streamline the collection and processing of vital telemetry data, feeding into business analytics, product insights, and machine learning models that inform strategic decision-making.
Global Device Provisioning and Configuration: Develop systems for seamless provisioning and configuration of devices at scale, allowing easy customization for diverse customer environments, including network settings, EMV parameters, and region-specific requirements.
Enhanced Lifecycle Management: Contribute to features that support every phase of a terminal's lifecycle, from factory production and fulfillment to deployment at retail locations and service centers, ensuring operational excellence and customer satisfaction.
Partner and Reseller Enablement: Enable customer service teams, resellers, banking partners, and ISOs with tools to remotely manage and fine-tune devices, driving efficiency and reducing customer downtime.
Your experience should include...
3+ years of experience in server-side programming, preferably with Java / Golang.
Proficient in developing secure, high-performance cloud applications on AWS (ECS, EC2).
Expertise in designing and implementing external-facing, highly organized APIs.
Skilled in building large-scale cloud services, distributed systems, and event-driven architectures.
Strong knowledge of databases (SQL, NoSQL) and scalable data management solutions.
You might also have...
At least 2 years' experience with Java / Golang backend development.
Knowledge of integrating messaging systems like Kafka, RabbitMQ, or AWS SNS/SQS.
Familiarity with AWS Lambda or similar platforms for building lightweight, event-driven applications.
We've got your back...
We offer a range of total rewards that may include paid time off, retirement savings (e.g., 401k, pension schemes), bonus/incentive eligibility, equity grants, participation in our employee stock purchase plan, competitive health benefits, and other family-friendly benefits including parental leave. GoDaddy’s benefits vary based on individual role and location and can be reviewed in more detail during the interview process. We also embrace our diverse culture and offer a range of Employee Resource Groups (Culture). Have a side hustle? No problem. We love entrepreneurs! Most importantly, come as you are and make your own way.
About us...
GoDaddy is empowering everyday entrepreneurs around the world by providing the help and tools to succeed online, making opportunity more inclusive for all. GoDaddy is the place people come to name their idea, build a professional website, attract customers, sell their products and services, and manage their work. Our mission is to give our customers the tools, insights, and people to transform their ideas and personal initiative into success. To learn more about the company, visit About Us. At GoDaddy, we know diverse teams build better products—period.
Our people and culture reflect and celebrate that sense of diversity and inclusion in ideas, experiences and perspectives. But we also know that’s not enough to build true equity and belonging in our communities. That’s why we prioritize integrating diversity, equity, inclusion and belonging principles into the core of how we work every day—focusing not only on our employee experience, but also our customer experience and operations. It’s the best way to serve our mission of empowering entrepreneurs everywhere, and making opportunity more inclusive for all. To read more about these commitments, as well as our representation and pay equity data, check out our Diversity and Pay Parity annual report which can be found on our Diversity Careers page. GoDaddy is proud to be an equal opportunity employer . GoDaddy will consider for employment qualified applicants with criminal histories in a manner consistent with local and federal requirements. Refer to our full EEO policy. Our recruiting team is available to assist you in completing your application. If they could be helpful, please reach out to myrecruiter@godaddy.com. GoDaddy doesn’t accept unsolicited resumes from recruiters or employment agencies.
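The OTA management system described in this posting distributes updates to hundreds of thousands of terminals, and a common technique for doing that safely is a staged rollout with deterministic hash bucketing: each device's serial number maps to a stable bucket, and raising the rollout percentage only ever adds devices. The sketch below illustrates the general pattern only; the function and the `TERM-` serial format are assumptions for illustration, not GoDaddy's actual API.

```python
import hashlib

def in_rollout(serial: str, rollout_percent: int) -> bool:
    """Deterministically decide whether a terminal gets the update yet.

    Hashing the serial assigns every device a stable bucket in [0, 100),
    so increasing rollout_percent is monotonic: a device already in the
    rollout never drops back out when the percentage is raised.
    """
    digest = hashlib.sha256(serial.encode()).digest()
    bucket = int.from_bytes(digest[:4], "big") % 100
    return bucket < rollout_percent

# Simulate a fleet and two rollout stages (serial format is made up).
fleet = [f"TERM-{i:05d}" for i in range(1000)]
at_10 = {s for s in fleet if in_rollout(s, 10)}
at_50 = {s for s in fleet if in_rollout(s, 50)}
print(len(at_10), len(at_50))  # roughly 100 and 500
```

The key property is that the decision is a pure function of the serial, so no per-device rollout state needs to be stored or synchronized across the management backend.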

Posted 3 days ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Indeed logo

GE Healthcare Healthcare Science & Technology Organization Category Digital Technology / IT Mid-Career Job Id R4021877 Relocation Assistance No Location Bengaluru, Karnataka, India, 560066
Job Description Summary
This critical position in a globally distributed organization presents a unique opportunity to develop products that define healthcare solutions globally. You will be responsible for setting technical direction, overseeing the architecture and implementation of cloud services, and driving continuous improvement in development processes. Your expertise in cloud technologies and leadership skills will be crucial in shaping the future of our product offerings. GE HealthCare is a leading global medical technology and digital solutions innovator. Our purpose is to create a world where healthcare has no limits. Unlock your ambition, turn ideas into world-changing realities, and join an organization where every voice makes a difference, and every difference builds a healthier world.
Job Description
Roles and Responsibilities
In this role, you will:
Be responsible for defining, developing, and evolving software in a fast-paced and agile development environment using the latest software development technologies and infrastructure
Provide technical leadership to an agile team of 3-5 engineers
Work with Product Line Leaders (PLLs) to understand product requirements and vision
Translate requirements/vision into a prioritized list of user stories and deliver to required timelines and quality standards
Work with product line leaders and architects to develop multi-generation software technology plans
Drive increased efficiency across the teams, eliminating duplication and leveraging product and technology reuse
Support process improvements which guide the development, sustaining, and support activities
Work cross-functionally with other business departments to align activities and deliverables
Drive world-class quality in the development and support of products
Engage subject matter experts in successful transfer of complex domain knowledge
Apply principles of SDLC and methodologies like Lean/Agile/XP, CI, Software and Product Security, Scalability, Documentation Practices, refactoring, and Testing Techniques
Maintain code quality through best practices, unit testing, and automation
Establish coding standards and conduct regular code reviews to ensure delivery of a high-quality product
Work on core data structures and algorithms and implement them using the chosen technology
Educational Qualification
Bachelor's Degree in Computer Science or “STEM” Majors (Science, Technology, Engineering and Math) with 6+ years of experience.
Technical Expertise
AWS expert; certification preferred
Python expert
Experience with containers: Docker/Kubernetes, Helm
Strong knowledge of design patterns and unit testing frameworks
Expert in Kafka and/or related messaging frameworks such as ActiveMQ
Hands-on experience in web services (REST), SQL, and Hibernate on databases such as Oracle, MySQL, PostgreSQL
Business Acumen
Demonstrates the initiative to explore alternate technology and approaches to solving problems
Skilled in breaking down problems, documenting problem statements, and estimating efforts
Demonstrates awareness of competitors and industry trends
Has the ability to analyze the impact of technology choices
Leadership
Ability to take ownership of the project and deliver while mentoring and helping team members
Ensures understanding of issues and presents clear rationale. Able to speak to mutual needs and win-win solutions. Uses two-way communication to influence outcomes and ongoing results
Identifies misalignments with goals, objectives, and work direction against the organizational strategy. Makes suggestions to course correct
Continuously measures deliverables of self and team against scheduled commitments. Effectively balances different, competing objectives
Personal Attributes
Strong oral and written communication skills
Strong interpersonal skills
Effective team-building and problem-solving abilities
Persists to completion, especially in the face of overwhelming odds and setbacks.
Inclusion and Diversity
GE HealthCare is an Equal Opportunity Employer where inclusion matters. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law. We expect all employees to live and breathe our behaviors: to act with humility and build trust; lead with transparency; deliver with focus, and drive ownership – always with unyielding integrity.
Our total rewards are designed to unlock your ambition by giving you the boost and flexibility you need to turn your ideas into world-changing realities. Our salary and benefits are everything you’d expect from an organization with global strength and scale, and you’ll be surrounded by career opportunities in a culture that fosters care, collaboration and support.
Disclaimer: GE HealthCare will never ask for payment to process documents, refer you to a third party to process applications or visas, or ask you to pay costs. Never send money to anyone suggesting they can provide employment with GE HealthCare. If you suspect you have received a fraudulent call, please fill out the form below: https://www.ge.com/careers/fraud
#LI-MA6
Additional Information
Relocation Assistance Provided: No
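The qualifications above call out unit testing frameworks and maintaining code quality through unit testing; in Python (also a required skill here), that can be as lightweight as the standard-library `unittest` module. The sketch below is a generic illustration with a made-up utility function, not code from this role.

```python
import unittest

def dedupe_preserving_order(items):
    """Drop duplicates while keeping first-seen order; the kind of small
    utility worth pinning down with tests before any refactoring."""
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

class DedupeTest(unittest.TestCase):
    # Each behavior gets its own small, named test case.
    def test_removes_duplicates(self):
        self.assertEqual(dedupe_preserving_order([3, 1, 3, 2, 1]), [3, 1, 2])

    def test_empty_input(self):
        self.assertEqual(dedupe_preserving_order([]), [])

# Run the suite programmatically (avoids unittest.main()'s sys.exit).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(DedupeTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In a CI pipeline the same tests would normally be discovered and run with `python -m unittest` or pytest rather than an inline runner.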

Posted 3 days ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Indeed logo

GE Healthcare Healthcare Science & Technology Organization Category Digital Technology / IT Mid-Career Job Id R4021878 Relocation Assistance No Location Bengaluru, Karnataka, India, 560066
Job Description Summary
As a Senior Software Engineer, you will be responsible for setting technical direction, overseeing the architecture and implementation of cloud services, and driving continuous improvement in development processes. Your expertise in cloud technologies and leadership skills will be crucial in shaping the future of our product offerings. GE Healthcare is a leading global medical technology and digital solutions innovator. Our mission is to improve lives in the moments that matter. Unlock your ambition, turn ideas into world-changing realities, and join an organization where every voice makes a difference, and every difference builds a healthier world.
Job Description
Roles and Responsibilities
In this role, you will:
Be responsible for defining, developing, and evolving software in a fast-paced and agile development environment using the latest software development technologies and infrastructure
Provide technical leadership to an agile team of 3-5 engineers
Work with Product Line Leaders (PLLs) to understand product requirements and vision
Translate requirements/vision into a prioritized list of user stories and deliver to required timelines and quality standards
Work with product line leaders and architects to develop multi-generation software technology plans
Drive increased efficiency across the teams, eliminating duplication and leveraging product and technology reuse
Support process improvements which guide the development, sustaining, and support activities
Work cross-functionally with other business departments to align activities and deliverables
Drive world-class quality in the development and support of products
Engage subject matter experts in successful transfer of complex domain knowledge
Apply principles of SDLC and methodologies like Lean/Agile/XP, CI, Software and Product Security, Scalability, Documentation Practices, refactoring, and Testing Techniques
Maintain code quality through best practices, unit testing, and automation
Establish coding standards and conduct regular code reviews to ensure delivery of a high-quality product
Work on core data structures and algorithms and implement them using the chosen technology
Educational Qualification
Bachelor's Degree in Computer Science or “STEM” Majors (Science, Technology, Engineering and Math) with a minimum of 6 years of experience in Software Development.
Technical Expertise
AWS expert; certification preferred
Expertise in Python
Proficiency in design patterns and unit testing frameworks
Experience with containers: Docker/Kubernetes, Helm
Expert in Kafka and/or related messaging frameworks such as ActiveMQ
Hands-on experience in web services (REST), EJBs
Experience in SQL and Hibernate on databases such as Oracle, MySQL, PostgreSQL
Business Acumen
Demonstrates the initiative to explore alternate technology and approaches to solving problems
Skilled in breaking down problems, documenting problem statements, and estimating efforts
Demonstrates awareness of competitors and industry trends
Has the ability to analyze the impact of technology choices
Leadership
Ability to take ownership of the project and deliver while mentoring and helping team members
Ensures understanding of issues and presents clear rationale. Able to speak to mutual needs and win-win solutions. Uses two-way communication to influence outcomes and ongoing results
Identifies misalignments with goals, objectives, and work direction against the organizational strategy. Makes suggestions to course correct
Continuously measures deliverables of self and team against scheduled commitments. Effectively balances different, competing objectives
Personal Attributes
Excellent oral and written communication skills
Good interpersonal skills
Effective team-building and problem-solving abilities
Persists to completion, especially in the face of overwhelming odds and setbacks.
Inclusion and Diversity
GE Healthcare is an Equal Opportunity Employer where inclusion matters. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law. We expect all employees to live and breathe our behaviors: to act with humility and build trust; lead with transparency; deliver with focus, and drive ownership – always with unyielding integrity. Our total rewards are designed to unlock your ambition by giving you the boost and flexibility you need to turn your ideas into world-changing realities. Our salary and benefits are everything you’d expect from an organization with global strength and scale, and you’ll be surrounded by career opportunities in a culture that fosters care, collaboration and support.
#LI-AM11 #LI-Hybrid
Additional Information
Relocation Assistance Provided: No

Posted 3 days ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana

On-site

Indeed logo

Engineer III, Database Engineering
Gurgaon, India; Hyderabad, India Information Technology 316332
Job Description
About The Role:
Grade Level (for internal use): 10
Role: As a Senior Database Engineer, you will work on multiple datasets that will enable S&P Capital IQ Pro to serve up value-added Ratings, Research, and related information to institutional clients.
The Team: Our team is responsible for gathering data from multiple sources spread across the globe using different mechanisms (ETL/GG/SQL Rep/Informatica/Data Pipeline) and converting it to a common format which can be used by client-facing UI tools and other data-providing applications. This application is the backbone of many S&P applications and is critical to our clients' needs. You will get to work on a wide range of technologies and tools like Oracle/SQL/.Net/Informatica/Kafka/Sonic. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe. We craft strategic implementations by using the broader capacity of the data and product. Do you want to be part of a team that executes cross-business solutions within S&P Global?
Impact: Our team is responsible for delivering essential and business-critical data with applied intelligence to power the market of the future. This enables our customers to make decisions with conviction. Contribute significantly to the growth of the firm by:
Developing innovative functionality in existing and new products
Supporting and maintaining high-revenue productionized products
Achieving the above intelligently and economically using best practices
Career: This is the place to hone your existing database skills while having the chance to become exposed to fresh technologies.
As an experienced member of the team, you will have the opportunity to mentor and coach developers who have recently graduated and collaborate with developers, business analysts, and product managers who are experts in their domain.
Your skills: You should be able to demonstrate outstanding knowledge and hands-on experience in the areas below:
Complete SDLC: architecture, design, development, and support of tech solutions
Play a key role in the development team to build high-quality, high-performance, scalable code
Engineer components and common services based on standard corporate development models, languages, and tools
Produce technical design documents and conduct technical walkthroughs
Collaborate effectively with technical and non-technical stakeholders
Be part of a culture that continuously improves the technical design and code base
Document and demonstrate solutions using technical design docs, diagrams, and stubbed code
Our Hiring Manager says: I’m looking for a person who gets excited about technology and is motivated by seeing how our individual contributions and teamwork on world-class web products affect the workflow of thousands of clients, resulting in revenue for the company.
Qualifications
Required:
Bachelor’s degree in Computer Science, Information Systems, or Engineering.
7+ years of experience with transactional databases like SQL Server, Oracle, and PostgreSQL, and NoSQL databases like Amazon DynamoDB and MongoDB
Strong database development skills on SQL Server and Oracle
Strong knowledge of database architecture, data modeling, and data warehousing
Knowledge of object-oriented design and design patterns; familiar with various design and architectural patterns
Strong development experience with Microsoft SQL Server
Experience in cloud-native development and AWS is a big plus
Experience with Kafka/Sonic broker messaging systems
Nice to have:
Experience in developing data pipelines using Java or C# is a significant advantage.
Strong knowledge of ETL tools such as Informatica and SSIS; exposure to Informatica is an advantage.
Familiarity with Agile and Scrum models
Working knowledge of VSTS
Working knowledge of AWS cloud is an added advantage
Understanding of fundamental design principles for building a scalable system
Understanding of financial markets and asset classes like Equity, Commodity, Fixed Income, Options, and Index/Benchmarks is desirable
Additionally, experience with Scala, Python, and Spark applications is a plus.
About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.
What’s In It For You?
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it.
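A recurring requirement in this role's data-gathering pipelines is the idempotent load: re-running a feed must update rows in place rather than fail on a primary key. The sketch below illustrates that pattern using the standard-library `sqlite3` as a convenient stand-in for SQL Server or Oracle; the `ratings` table and its columns are illustrative assumptions, not the actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE ratings (
        entity_id TEXT PRIMARY KEY,
        rating    TEXT NOT NULL,
        as_of     TEXT NOT NULL
    )
""")

def upsert_rating(conn, entity_id, rating, as_of):
    """Idempotent load: replaying the same feed updates in place
    instead of raising a primary-key violation."""
    with conn:  # one transaction per row here; real loaders batch
        conn.execute(
            """INSERT INTO ratings (entity_id, rating, as_of)
               VALUES (?, ?, ?)
               ON CONFLICT(entity_id) DO UPDATE
               SET rating = excluded.rating, as_of = excluded.as_of""",
            (entity_id, rating, as_of),
        )

upsert_rating(conn, "E1", "AA", "2025-06-01")
upsert_rating(conn, "E1", "AA-", "2025-06-16")  # same key, later feed wins
row = conn.execute(
    "SELECT rating, as_of FROM ratings WHERE entity_id = 'E1'"
).fetchone()
print(row)  # ('AA-', '2025-06-16')
```

SQL Server expresses the same idea with `MERGE` and Oracle with `MERGE INTO`; the `ON CONFLICT ... DO UPDATE` form shown here is the SQLite/PostgreSQL spelling.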
We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. 
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. 
Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 316332 Posted On: 2025-06-16 Location: Gurgaon, Haryana, India

Posted 3 days ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana

On-site

Indeed logo

Location Gurugram, Haryana, India
Job Id GGN00001744 Information Technology
Job Type Full-Time
Posted Date 06/16/2025
Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together.
Description
United's Digital Technology team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions.
Our Values: At United Airlines, we believe that inclusion propels innovation and is the foundation of all that we do. Our Shared Purpose: "Connecting people. Uniting the world." drives us to be the best airline for our employees, customers, and everyone we serve, and we can only do that with a truly diverse and inclusive workforce. Our team spans the globe and is made up of diverse individuals all working together with cutting-edge technology to build the best airline in the history of aviation. With multiple employee-run "Business Resource Group" communities and world-class benefits like health insurance, parental leave, and space-available travel, United is truly a one-of-a-kind place to work that will make you feel welcome and accepted. Come join our team and help us make a positive impact on the world.
Job overview and responsibilities
United Airlines is seeking talented people to join the Data Engineering team. The Data Engineering organization is responsible for driving data-driven insights and innovation to support the data needs of commercial and operational projects with a digital focus.
You will work as a Senior Engineer - Machine Learning and collaborate with data scientists and data engineers to:
  • Build high-performance, cloud-native machine learning infrastructure and services to enable rapid innovation across United
  • Build complex data ingestion and transformation pipelines for batch and real-time data
  • Support large-scale model training and serving pipelines in distributed and scalable environments

This position is offered on local terms and conditions within United’s wholly owned subsidiary United Airlines Business Services Pvt. Ltd. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. United Airlines is an equal opportunity employer. United Airlines recruits, employs, trains, compensates, and promotes regardless of race, religion, color, national origin, gender identity, sexual orientation, physical ability, age, veteran status, and other protected status as required by applicable law.
Qualifications

Required:
  • BS/BA in Computer Science, Data Science, Mathematics, Engineering, or a related discipline required
  • Strong software engineering experience with Python and at least one additional language such as Go, Java, or C/C++
  • Familiarity with ML methodologies and frameworks (e.g., PyTorch, TensorFlow), preferably including building and deploying production ML pipelines
  • Experience developing cloud-native solutions with Docker and Kubernetes
  • Cloud-native DevOps and CI/CD experience using tools such as Jenkins or AWS CodePipeline; preferably experience with GitOps using tools such as ArgoCD, Flux, or Jenkins X
  • Experience building real-time and event-driven stream processing pipelines with technologies such as Kafka, Flink, and Spark
  • Experience setting up and optimizing data stores (RDBMS/NoSQL) for production use in ML applications
  • Strong desire to stay aligned with the latest developments in cloud-native and ML ops/engineering and to experiment with and learn new technologies

Experience:
  • 3+ years of software engineering experience with languages such as Python, Go, Java, Scala, Kotlin, or C/C++
  • 2+ years of experience working in cloud environments (AWS preferred)
  • 2+ years of experience with Big Data technologies such as Spark and Flink
  • 2+ years of experience with cloud-native DevOps and CI/CD
  • At least one year of experience with Docker and Kubernetes in a production environment
  • Must be legally authorized to work in India for any employer without sponsorship
  • Must be fluent in English and Hindi (written and spoken)
  • Successful completion of interview required to meet job qualification
  • Reliable, punctual attendance is an essential function of the position

Preferred: Master's in Computer Science or a related STEM field

Posted 3 days ago

Apply

6.0 - 8.0 years

0 Lacs

Baner, Pune, Maharashtra

Remote


Baner, Pune, Maharashtra, India | Department: Delivery Unit 2 | Job posted on: Jun 16, 2025 | Employment type: Permanent

Associate Tech Specialist - Java, Vue.js & React
Experience: 6 to 8 years | Project-specific work timing: 2 PM to 11 PM | Job Location: Pune (Hybrid) / Remote

Job Summary: We are seeking a highly skilled Senior Software Developer with full-stack expertise to join and mentor a dynamic team of engineers. This role involves end-to-end software development across backend and frontend components. You will design, develop, document, test, and deploy modern applications leveraging Java, MySQL, CouchDB, React, and REST APIs. Experience with ActiveMQ is a plus. The ideal candidate thrives in an agile environment, contributes to design discussions, and promotes engineering best practices.

Key Skills:
  • Bachelor’s degree in Computer Science, Information Systems, or a related field, or equivalent work experience.
  • 6+ years of professional experience in software development, with a strong command of Java (backend development).
  • 3+ years of experience in cloud technologies (Azure preferred) including, but not limited to, App Services, API Gateway, Service Bus, Azure Storage, Azure Functions, and Application Insights.
  • Experience building event-driven microservices leveraging high-throughput, real-time stream/batch processing and interacting with RESTful APIs.
  • Hands-on experience with MySQL and at least one NoSQL database (CouchDB preferred).
  • Frontend development experience using React and related tools (e.g., Redux, Webpack).
  • Proven ability to implement the React framework from scratch, including project setup, state management, and component architecture.
  • Experience building and deploying full-stack applications in production.
  • Exposure to messaging systems such as ActiveMQ, Kafka, or RabbitMQ (ActiveMQ preferred).
  • Experience creating CI/CD pipelines.
  • Demonstrable experience with Agile processes (Kanban, Scrum) using popular Agile software such as Jira.
  • Solid understanding of object-oriented programming, design patterns, and software architecture principles.
  • Knowledge of Test-Driven Development and common testing frameworks.
  • General availability during primary hours of operation.

Duties and Responsibilities:
  • Lead software development efforts for on-premise and cloud-based applications through the entire software development lifecycle.
  • Design and implement scalable backend services using Java and RESTful APIs.
  • Develop modern, responsive UIs using Vue.js.
  • Work with MySQL and CouchDB to design and optimize data storage solutions.
  • Preferred: Integrate and utilize ActiveMQ or similar messaging systems for asynchronous processing.
  • Participate in Agile ceremonies, contribute to sprint planning, and ensure timely delivery.
  • Conduct code reviews and mentor junior developers, providing technical guidance and constructive feedback.
  • Participate in architectural planning and risk assessments, ensuring scalable and secure solutions.
  • Identify performance bottlenecks and advocate for platform improvements.
  • Collaborate across cross-functional teams, including QA, DevOps, and Product Management.
  • Produce technical documentation for the software you have contributed to.
  • Communicate clearly about project status with your managers.

Posted 3 days ago

Apply

5.0 - 8.0 years

0 Lacs

Baner, Pune, Maharashtra

Remote


Baner, Pune, Maharashtra, India | Department: Delivery Unit 2 | Job posted on: Jun 16, 2025 | Employment type: Permanent

Associate Tech Specialist - Java, Vue.js & React
Experience: 5 to 8 years | Project-specific work timing: 2 PM to 11 PM | Job Location: Pune (Hybrid) / Remote

Job Summary: We are seeking a highly skilled Senior Software Developer with full-stack expertise to join and mentor a dynamic team of engineers. This role involves end-to-end software development across backend and frontend components. You will design, develop, document, test, and deploy modern applications leveraging Java, MySQL, CouchDB, React, and REST APIs. Experience with ActiveMQ is a plus. The ideal candidate thrives in an agile environment, contributes to design discussions, and promotes engineering best practices.

Key Skills:
  • Bachelor’s degree in Computer Science, Information Systems, or a related field, or equivalent work experience.
  • 6+ years of professional experience in software development, with a strong command of Java (backend development).
  • 3+ years of experience in cloud technologies (Azure preferred) including, but not limited to, App Services, API Gateway, Service Bus, Azure Storage, Azure Functions, and Application Insights.
  • Experience building event-driven microservices leveraging high-throughput, real-time stream/batch processing and interacting with RESTful APIs.
  • Hands-on experience with MySQL and at least one NoSQL database (CouchDB preferred).
  • Frontend development experience using React and related tools (e.g., Redux, Webpack).
  • Proven ability to implement the React framework from scratch, including project setup, state management, and component architecture.
  • Experience building and deploying full-stack applications in production.
  • Exposure to messaging systems such as ActiveMQ, Kafka, or RabbitMQ (ActiveMQ preferred).
  • Experience creating CI/CD pipelines.
  • Demonstrable experience with Agile processes (Kanban, Scrum) using popular Agile software such as Jira.
  • Solid understanding of object-oriented programming, design patterns, and software architecture principles.
  • Knowledge of Test-Driven Development and common testing frameworks.
  • General availability during primary hours of operation.

Duties and Responsibilities:
  • Lead software development efforts for on-premise and cloud-based applications through the entire software development lifecycle.
  • Design and implement scalable backend services using Java and RESTful APIs.
  • Develop modern, responsive UIs using Vue.js.
  • Work with MySQL and CouchDB to design and optimize data storage solutions.
  • Preferred: Integrate and utilize ActiveMQ or similar messaging systems for asynchronous processing.
  • Participate in Agile ceremonies, contribute to sprint planning, and ensure timely delivery.
  • Conduct code reviews and mentor junior developers, providing technical guidance and constructive feedback.
  • Participate in architectural planning and risk assessments, ensuring scalable and secure solutions.
  • Identify performance bottlenecks and advocate for platform improvements.
  • Collaborate across cross-functional teams, including QA, DevOps, and Product Management.
  • Produce technical documentation for the software you have contributed to.
  • Communicate clearly about project status with your managers.

Posted 3 days ago

Apply

0 years

0 Lacs

India

On-site


SmalBlu is the world’s first cross-layer AI platform that autonomously optimises enterprise data infrastructure across user, app, compute, network, database, and storage layers. Powered by an LLM-based agentic system and proprietary optimisation and compression technologies, SmalBlu helps enterprises cut cloud costs by 40%, carbon footprint by 35%, and query latency by 30%, while enabling deep infrastructure intelligence for CXOs.

Preference: Fast learners and track owners with 2+ years of industry experience in their field. Part-time also works if you are currently working somewhere (30 hours per week minimum).

AI Engineering Scope of Work: You will be responsible for building the core agentic intelligence layer of SmalBlu. This includes developing and fine-tuning ML models for optimising compute, network, and storage layers across enterprises. You’ll integrate with LangGraph workflows, vector databases, and cost-performance datasets to enable AI-driven decision-making across the platform.

Responsibilities:
  • Build intelligent agent workflows using LangChain or LangGraph
  • Train and fine-tune models for resource optimisation, cost prediction, and carbon analytics
  • Integrate custom LLM pipelines with APIs and event systems (Kafka, REST)
  • Work with Pinecone or other vector DBs to power infra-aware memory

Skills Required:
  • Proficiency in Python
  • Experience with TensorFlow/PyTorch, LangChain, Pinecone
  • Familiarity with LLMs, RAG models, vector search
  • Strong grasp of optimisation, compute, or data infra modeling

About the Position: Duration: 6-month paid internship with compensation of up to ₹50k per month + ESOPs, followed by full-time employment based on performance with competitive pay.
Compensation & Perks:
✅ Full-time employment opportunity with full-time pay + incentives post-internship
✅ Hybrid work model (Ahmedabad preferred)
✅ Be a core part of a fast-growing, impact-driven startup

Preferred background: Artificial Intelligence or a related field

Application Link: https://forms.gle/VUhCvyPHrw8czdkv8 You can reach out to us at garv@smalblu.ai via mail or at +91 6354057091/+91 8488006997 for any questions.

Posted 3 days ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Project Description: The Financial Market Digital Channels team is driven to provide world-class technology to support the bank’s Financial Markets business, working specifically on the bank’s in-house-built pricing, execution, and trade processing platform. We bring a deep understanding of the domain, a scientific approach, and innovative solutions to bear on the challenges of best servicing our customers in a highly competitive environment. This is a rare opportunity to join an organization working with smart technologists globally in the financial markets domain. The culture in the team is open, intellectual, and fun. Learning opportunities are plentiful, and career advancement is always waiting for high-energy talents willing and able to step up.

Responsibilities:
• Be a proactive, self-starting developer who can quickly understand the requirements, technology platform, and external integrations and start designing and contributing to the delivery
• Liaise with DevOps and the wider team to effectively manage resource usage
• Apply your experience of financial markets electronic trade capture, booking, and full lifecycle to build these features into our platform
• Work within the ever-evolving framework of Financial Markets specifications and regulations to ensure CAT is using best-in-class technology within the boundaries of approved technologies
• Build services with an API-first approach and a generic-pattern mindset for scale across multiple dimensions - optimized compute/performance, functional, visualization, quality control
• Implement and enforce safe defensive coding, intelligent design, and access control in low-maintenance, supportable applications
• Support database integration/migration (Postgres/Mongo), caching (Hazelcast), communication (Solace), and security (HashiCorp, SSO tokens, Kong integration)
• Effectively and proactively manage resource usage
• Do whatever it takes to maintain stability, availability, and performance
• Provide Production L3 support as required, to this delivery and the wider platform. We have users from Asia to New York, and you'll be part of a global team of developers
• Face off and partner with stakeholders to capture requirements and deliver solutions - a deep understanding of the problem and context, along with ownership, are key to success in this role
• Question/challenge anything you see from your peers or the wider squad that isn't up to standard

The successful candidate is expected to be an experienced developer in designing, building, deploying, and supporting critical solutions in the Financial Markets domain. This role requires a wide variety of strengths and capabilities.

Mandatory Skills:
• 6+ years of experience in Java development
• Microservices development using Spring Boot
• Technical Stack (Back End): Core Java, Spring Boot, Kafka, REST APIs
• Technical Tools: Confluence/Jira/Bitbucket or Git, CI/CD (Maven, Git, Jenkins), Eclipse or IntelliJ IDEA

Nice-to-Have Skills:
• Technical Stack (UI): JavaScript, React JS, Angular, CSS/SCSS, HTML5, Node.js, Git
• Experience in event-driven architectures (CQRS and SAGA patterns)
• Build tools (Gulp and webpack), Jenkins, Docker, Automation, Bash, Redis, Elasticsearch, Kibana
• Experience in Agile (Scrum) projects an added plus

Languages: English: C2 Proficient

Posted 3 days ago

Apply

Exploring Kafka Jobs in India

Kafka, a popular distributed streaming platform, has gained significant traction in the tech industry in recent years. Job opportunities for Kafka professionals in India have been on the rise, with many companies looking to leverage Kafka for real-time data processing and analytics. If you are a job seeker interested in Kafka roles, here is a comprehensive guide to help you navigate the job market in India.
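Kafka's core abstraction, and the source of many common interview questions, is an append-only, partitioned log that consumers read by offset. A minimal in-memory sketch of that idea (illustrative only; the class and method names here are invented for the sketch, not the real client API):

```python
class TopicPartition:
    """Toy append-only log mimicking a single Kafka partition."""

    def __init__(self):
        self._log = []

    def append(self, record: bytes) -> int:
        """Append a record and return its offset, like a producer write."""
        self._log.append(record)
        return len(self._log) - 1

    def read(self, offset: int, max_records: int = 10) -> list:
        """Read records starting at an offset, like a consumer poll."""
        return self._log[offset:offset + max_records]


tp = TopicPartition()
tp.append(b"evt-1")
tp.append(b"evt-2")
# A new consumer starting at offset 0 sees everything;
# one resuming from a committed offset of 1 sees only later records.
assert tp.read(0) == [b"evt-1", b"evt-2"]
assert tp.read(1) == [b"evt-2"]
```

Because records are never mutated in place and offsets only grow, many consumers can read the same partition independently, each tracking its own position.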

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Gurgaon

These cities are known for their thriving tech industries and have a high demand for Kafka professionals.

Average Salary Range

The average salary range for Kafka professionals in India varies by experience level. Entry-level positions may start at around INR 6-8 lakhs per annum, while experienced professionals can earn INR 12-20 lakhs per annum.

Career Path

Career progression in Kafka typically follows a path from Junior Developer to Senior Developer, and then to a Tech Lead role. As you gain more experience and expertise in Kafka, you may also explore roles such as Kafka Architect or Kafka Consultant.

Related Skills

In addition to Kafka expertise, employers often look for professionals with skills in:

  • Apache Spark
  • Apache Flink
  • Hadoop
  • Java/Scala programming
  • Data engineering and data architecture

Interview Questions

  • What is Apache Kafka and how does it differ from other messaging systems? (basic)
  • Explain the role of Zookeeper in Apache Kafka. (medium)
  • How does Kafka guarantee fault tolerance? (medium)
  • What are the key components of a Kafka cluster? (basic)
  • Describe the process of message publishing and consuming in Kafka. (medium)
  • How can you achieve exactly-once message processing in Kafka? (advanced)
  • What is the role of Kafka Connect in Kafka ecosystem? (medium)
  • Explain the concept of partitions in Kafka. (basic)
  • How does Kafka handle consumer offsets? (medium)
  • What is the role of a Kafka Producer API? (basic)
  • How does Kafka ensure high availability and durability of data? (medium)
  • Explain the concept of consumer groups in Kafka. (basic)
  • How can you monitor Kafka performance and throughput? (medium)
  • What is the purpose of Kafka Streams API? (medium)
  • Describe the use cases where Kafka is not a suitable solution. (advanced)
  • How does Kafka handle data retention and cleanup policies? (medium)
  • Explain the Kafka message delivery semantics. (medium)
  • What are the different security features available in Kafka? (medium)
  • How can you optimize Kafka for high throughput and low latency? (advanced)
  • Describe the role of a Kafka Broker in a Kafka cluster. (basic)
  • How does Kafka handle data replication across brokers? (medium)
  • Explain the significance of serialization and deserialization in Kafka. (basic)
  • What are the common challenges faced while working with Kafka? (medium)
  • How can you scale Kafka to handle increased data loads? (advanced)
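Several of the questions above (partitions, ordering, consumer groups) reduce to one mechanism: a record's key determines its partition, so all records sharing a key land on the same partition and keep their relative order. A dependency-free sketch of key-based partition assignment; note that Kafka's real default partitioner hashes keys with murmur2, so the md5 used here is only a stand-in to keep the example self-contained:

```python
import hashlib

def assign_partition(key: bytes, num_partitions: int) -> int:
    """Map a record key to a partition index.

    Kafka's default partitioner uses murmur2; md5 is substituted here
    purely so the sketch needs nothing beyond the standard library.
    """
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Same key -> same partition, which is how Kafka preserves per-key ordering.
assert assign_partition(b"order-42", 6) == assign_partition(b"order-42", 6)
# Every result is a valid partition index.
assert all(0 <= assign_partition(k, 6) < 6 for k in (b"a", b"b", b"c"))
```

Records without a key are instead spread across partitions (round-robin or sticky, depending on client version), which improves throughput at the cost of ordering guarantees.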

Closing Remark

As you explore Kafka job opportunities in India, remember to showcase your expertise in Kafka and related skills during interviews. Prepare thoroughly, demonstrate your knowledge confidently, and stay updated with the latest trends in Kafka to excel in your career as a Kafka professional. Good luck with your job search!

cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies